Sample records for residential forecasting database

  1. Forecasting residential electricity demand in provincial China.

    PubMed

    Liao, Hua; Liu, Yanan; Gao, Yixuan; Hao, Yu; Ma, Xiao-Wei; Wang, Kan

    2017-03-01

    In China, more than 80% of electricity comes from coal, which dominates CO2 emissions. Residential electricity demand forecasting plays a significant role in electricity infrastructure planning and energy policy design, but making accurate forecasts for developing countries is challenging. This paper forecasts the provincial residential electricity consumption of China over the 13th Five-Year Plan (2016-2020) period using panel data. To overcome the limitations of widely used prediction models, which depend on unreliable prior knowledge of functional forms, a robust piecewise linear model in reduced form is used to capture the non-deterministic relationship between income and residential electricity consumption. The forecast results suggest that the growth rates of developed provinces will slow down, while less developed provinces will continue to grow rapidly. National residential electricity demand will increase at 6.6% annually during 2016-2020, and populous provinces such as Guangdong will be the main contributors to the increments.
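
    The reduced-form piecewise linear idea above can be sketched with a hinge term; the breakpoint, coefficients, and data below are invented for illustration, not taken from the paper:

```python
import numpy as np

def fit_piecewise_linear(x, y, breakpoint):
    """Least-squares fit of a continuous two-segment linear model.

    Hypothetical illustration: y = a + b*x + c*max(x - breakpoint, 0),
    so the slope changes from b to b + c at the breakpoint.
    """
    hinge = np.maximum(x - breakpoint, 0.0)
    X = np.column_stack([np.ones_like(x), x, hinge])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # (intercept, slope below breakpoint, slope change above)

# Synthetic income (x) vs consumption (y): growth flattens past the breakpoint
x = np.linspace(0.0, 10.0, 200)
y = 1.0 + 2.0 * x + (-1.5) * np.maximum(x - 5.0, 0.0)
a, b, c = fit_piecewise_linear(x, y, breakpoint=5.0)
```

    In practice the breakpoint itself would be estimated (e.g. by a grid search over candidate breakpoints), rather than fixed in advance.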

  2. Residential Saudi load forecasting using analytical model and Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Al-Harbi, Ahmad Abdulaziz

    In recent years, load forecasting has become one of the main fields of study and research. Short Term Load Forecasting (STLF) is an important part of electrical power system operation and planning. This work investigates the applicability of two approaches, Artificial Neural Networks (ANNs) and hybrid analytical models, to forecasting residential load in the Kingdom of Saudi Arabia (KSA). Both techniques are based on formulating models of human behavior modes, which represent the impact of social, religious, and official occasions as well as environmental parameters. The analysis is carried out on residential areas in three regions across two countries with distinct human activities and weather conditions. The collected data are for Al-Khubar and Yanbu industrial city in KSA, in addition to Seattle, USA, to show the validity of the proposed models applied to residential load. For each region, two models are proposed: the first forecasts next-hour load, while the second forecasts next-day load. Both models are analyzed using the two techniques. The ANN next-hour models yield very accurate results for all areas, while the hybrid analytical model achieves relatively reasonable results. For next-day load forecasting, both approaches yield satisfactory results. Comparative studies were conducted to demonstrate the effectiveness of the proposed models.

  3. Model documentation report: Residential sector demand module of the national energy modeling system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code. This reference document provides a detailed description for energy analysts, other users, and the public. The NEMS Residential Sector Demand Module is currently used for mid-term forecasting purposes and energy policy analysis over the forecast horizon of 1993 through 2020. The model generates forecasts of energy demand for the residential sector by service, fuel, and Census Division. Policy impacts resulting from new technologies, market incentives, and regulatory changes can be estimated using the module. 26 refs., 6 figs., 5 tabs.

  4. Short-Term Energy Outlook Model Documentation: Regional Residential Propane Price Model

    EIA Publications

    2009-01-01

    The regional residential propane price module of the Short-Term Energy Outlook (STEO) model is designed to provide residential retail price forecasts for the four Census regions: Northeast, South, Midwest, and West.

  5. Short-Term Energy Outlook Model Documentation: Regional Residential Heating Oil Price Model

    EIA Publications

    2009-01-01

    The regional residential heating oil price module of the Short-Term Energy Outlook (STEO) model is designed to provide residential retail price forecasts for the four Census regions: Northeast, South, Midwest, and West.

  6. Energy supply and demand modeling. (Latest citations from the NTIS bibliographic database). Published Search

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-01-01

    The bibliography contains citations concerning the use of mathematical models in trend analysis and forecasting of energy supply and demand factors. Models are presented for the industrial, transportation, and residential sectors. Aspects of long term energy strategies and markets are discussed at the global, national, state, and regional levels. Energy demand and pricing, and econometrics of energy, are explored for electric utilities and natural resources, such as coal, oil, and natural gas. Energy resources are modeled both for fuel usage and for reserves. (Contains 250 citations and includes a subject term index and title list.)

  7. Energy supply and demand modeling. (Latest citations from the NTIS bibliographic database). Published Search

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-12-01

    The bibliography contains citations concerning the use of mathematical models in trend analysis and forecasting of energy supply and demand factors. Models are presented for the industrial, transportation, and residential sectors. Aspects of long term energy strategies and markets are discussed at the global, national, state, and regional levels. Energy demand and pricing, and econometrics of energy, are explored for electric utilities and natural resources, such as coal, oil, and natural gas. Energy resources are modeled both for fuel usage and for reserves. (Contains 250 citations and includes a subject term index and title list.)

  8. Forecasting generation of urban solid waste in developing countries--a case study in Mexico.

    PubMed

    Buenrostro, O; Bocco, G; Vence, J

    2001-01-01

    Based on a study of the composition of urban solid waste (USW) and of socioeconomic variables in Morelia, Mexico, generation rates were estimated. In addition, the generation of residential solid waste (RSW) and nonresidential solid waste (NRSW) was forecasted by means of a multiple linear regression (MLR) analysis. For residential sources, the independent variables analyzed were monthly wages, persons per dwelling, and the age and educational level of the heads of household. For nonresidential sources, the variables analyzed were number of employees, area of facilities, number of working days, and working hours per day. The forecasted values for residential waste were similar to those observed. This approach may be applied to areas in which available data are scarce and in which there is an urgent need to plan adequate management of USW.
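
    The MLR step can be sketched as ordinary least squares over socioeconomic predictors; the variable names, coefficients, and data below are hypothetical stand-ins for the study's survey data:

```python
import numpy as np

# Hypothetical residential records: columns are monthly wage, persons per
# dwelling, and head-of-household education (years). Values are synthetic.
rng = np.random.default_rng(0)
X = rng.uniform([100, 1, 6], [1000, 8, 18], size=(50, 3))
true_coef = np.array([0.002, 0.35, 0.01])
y = 0.5 + X @ true_coef  # kg of waste per capita per day (noise-free toy data)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
forecast = A @ coef  # fitted generation rates for the same records
```

    With real survey data one would also report R-squared and residual diagnostics before using the fitted model to forecast.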

  9. Forecasting relative impacts of land use on anadromous fish habitat to guide conservation planning.

    PubMed

    Lohse, Kathleen A; Newburn, David A; Opperman, Jeff J; Merenlender, Adina M

    2008-03-01

    Land use change can adversely affect water quality and freshwater ecosystems, yet our ability to predict how systems will respond to different land uses, particularly rural-residential development, is limited by data availability and our understanding of biophysical thresholds. In this study, we use spatially explicit parcel-level data to examine the influence of land use (including urban, rural-residential, and vineyard) on salmon spawning substrate quality in tributaries of the Russian River in California. We develop a land use change model to forecast the probability of losses in high-quality spawning habitat and recommend priority areas for incentive-based land conservation efforts. Ordinal logistic regression results indicate that all three land use types were negatively associated with spawning substrate quality, with urban development having the largest marginal impact. For two reasons, however, forecasted rural-residential and vineyard development have much larger influences on decreasing spawning substrate quality relative to urban development. First, the land use change model estimates 10 times greater land use conversion to both rural-residential and vineyard compared to urban. Second, forecasted urban development is concentrated in the most developed watersheds, which already have poor spawning substrate quality, such that the marginal response to future urban development is less significant. To meet the goals of protecting salmonid spawning habitat and optimizing investments in salmon recovery, we suggest investing in watersheds where future rural-residential development and vineyards threaten high-quality fish habitat, rather than the most developed watersheds, where land values are higher.
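
    A proportional-odds (ordinal logistic) model like the one used here maps a linear predictor and a set of cutpoints to ordered-category probabilities; the cutpoints and predictor value below are illustrative only, not the paper's estimates:

```python
import numpy as np

def ordinal_logit_probs(eta, cutpoints):
    """Proportional-odds model: P(Y <= k) = sigmoid(cutpoint_k - eta).

    Returns a probability for each ordered outcome category (e.g. spawning
    substrate quality classes). Cutpoints must be increasing.
    """
    cdf = 1.0 / (1.0 + np.exp(-(np.asarray(cutpoints) - eta)))
    cdf = np.concatenate([[0.0], cdf, [1.0]])
    return np.diff(cdf)

# eta = linear predictor from land-use covariates (e.g. urban share);
# three cutpoints give four ordered quality categories.
probs = ordinal_logit_probs(eta=0.8, cutpoints=[-1.0, 0.5, 2.0])
```

    Raising eta (more intensive land use, under a negative association) shifts probability mass toward the lower-quality categories.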

  10. Developing a Mixed Neural Network Approach to Forecast the Residential Electricity Consumption Based on Sensor Recorded Data

    PubMed Central

    Bâra, Adela; Stănică, Justina-Lavinia; Coculescu, Cristina

    2018-01-01

    In this paper, we report a study whose main goal is to obtain a method that can provide an accurate forecast of residential electricity consumption, refined down to the appliance level, using sensor-recorded data for residential smart-home complexes that cover part of their consumption with renewable energy sources. The method overcomes two practical limitations: historical meteorological data were not available, and the contractor was unwilling to periodically purchase accurate short-term forecasts from a specialized institute in the future due to the implied costs. To this end, we developed a mixed artificial neural network (ANN) approach using both non-linear autoregressive with exogenous input (NARX) ANNs and function fitting neural networks (FITNETs). We used a large dataset containing detailed electricity consumption data recorded by sensors monitoring a series of individual appliances; in the NARX case, we also used timestamp datasets as exogenous variables. After developing and validating the forecasting method, we compiled it for incorporation into a cloud solution delivered to the contractor, who can provide it as a service for a monthly fee to both operators and residential consumers. PMID:29734761

  11. Developing a Mixed Neural Network Approach to Forecast the Residential Electricity Consumption Based on Sensor Recorded Data.

    PubMed

    Oprea, Simona-Vasilica; Pîrjan, Alexandru; Căruțașu, George; Petroșanu, Dana-Mihaela; Bâra, Adela; Stănică, Justina-Lavinia; Coculescu, Cristina

    2018-05-05

    In this paper, we report a study whose main goal is to obtain a method that can provide an accurate forecast of residential electricity consumption, refined down to the appliance level, using sensor-recorded data for residential smart-home complexes that cover part of their consumption with renewable energy sources. The method overcomes two practical limitations: historical meteorological data were not available, and the contractor was unwilling to periodically purchase accurate short-term forecasts from a specialized institute in the future due to the implied costs. To this end, we developed a mixed artificial neural network (ANN) approach using both non-linear autoregressive with exogenous input (NARX) ANNs and function fitting neural networks (FITNETs). We used a large dataset containing detailed electricity consumption data recorded by sensors monitoring a series of individual appliances; in the NARX case, we also used timestamp datasets as exogenous variables. After developing and validating the forecasting method, we compiled it for incorporation into a cloud solution delivered to the contractor, who can provide it as a service for a monthly fee to both operators and residential consumers.
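
    The NARX input layout (lagged consumption plus an exogenous timestamp feature) can be sketched as follows; a linear readout stands in for the neural network here, and all data are synthetic:

```python
import numpy as np

def make_narx_features(series, exog, lags):
    """Build rows of [y_{t-1..t-lags}, exog_t]: the NARX input layout."""
    rows, targets = [], []
    for t in range(lags, len(series)):
        rows.append(np.concatenate([series[t - lags:t], [exog[t]]]))
        targets.append(series[t])
    return np.array(rows), np.array(targets)

# Toy consumption series driven by hour-of-day (the exogenous timestamp input)
hours = (np.arange(200) % 24).astype(float)
series = 1.0 + 0.1 * hours  # deterministic for the sketch

X, y = make_narx_features(series, hours, lags=2)

# A linear least-squares readout replaces the ANN in this simplification
A = np.column_stack([np.ones(len(X)), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ w
```

    A real NARX network would replace the linear readout with a nonlinear hidden layer and feed predictions back recursively for multi-step forecasts.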

  12. Procedures and Standards for Residential Ventilation System Commissioning: An Annotated Bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stratton, J. Chris; Wray, Craig P.

    2013-04-01

    Beginning with the 2008 version of Title 24, new homes in California must comply with ANSI/ASHRAE Standard 62.2-2007 requirements for residential ventilation. Where installed, the limited data available indicate that mechanical ventilation systems do not always perform optimally or even as many codes and forecasts predict. Commissioning such systems when they are installed or during subsequent building retrofits is a step towards eliminating deficiencies and optimizing the tradeoff between energy use and acceptable IAQ. Work funded by the California Energy Commission about a decade ago at Berkeley Lab documented procedures for residential commissioning, but did not focus on ventilation systems. Since then, standards and approaches for commissioning ventilation systems have been an active area of work in Europe. This report describes our efforts to collect new literature on commissioning procedures and to identify information that can be used to support the future development of residential-ventilation-specific procedures and standards. We recommend that a standardized commissioning process and a commissioning guide for practitioners be developed, along with a combined energy and IAQ benefit assessment standard and tool, and a diagnostic guide for estimating continuous pollutant emission rates of concern in residences (including a database that lists emission test data for commercially-available labeled products).

  13. Forecasting residential solar photovoltaic deployment in California

    DOE PAGES

    Dong, Changgui; Sigrin, Benjamin; Brinkman, Gregory

    2016-12-06

    Residential distributed photovoltaic (PV) deployment in the United States has experienced robust growth, and policy changes impacting the value of solar are likely to occur at the federal and state levels. To establish a credible baseline and evaluate impacts of potential new policies, this analysis employs multiple methods to forecast residential PV deployment in California, including a time-series forecasting model, a threshold heterogeneity diffusion model, a Bass diffusion model, and National Renewable Energy Laboratory's dSolar model. As a baseline, the residential PV market in California is modeled to peak in the early 2020s, with a peak annual installation of 1.5-2 GW across models. We then use the baseline results from the dSolar model and the threshold model to gauge the impact of the recent federal investment tax credit (ITC) extension, the newly approved California net energy metering (NEM) policy, and a hypothetical value-of-solar (VOS) compensation scheme. We find that the recent ITC extension may increase annual PV installations by 12%-18% (roughly 500 MW) for the California residential sector in 2019-2020. The new NEM policy only has a negligible effect in California due to the relatively small new charges (< 100 MW in 2019-2020). Moreover, impacts of the VOS compensation scheme (0.12 cents per kilowatt-hour) are larger, reducing annual PV adoption by 32% (or 900-1300 MW) in 2019-2020.
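
    Of the four forecasting methods named, the Bass diffusion model has the simplest closed form; a minimal discrete-time sketch, with parameters invented rather than calibrated to California, looks like:

```python
import numpy as np

def bass_adoptions(p, q, m, periods):
    """Discrete-time Bass diffusion: a(t) = (p + q*F(t)/m) * (m - F(t)).

    p = innovation coefficient, q = imitation coefficient, m = market
    potential, F(t) = cumulative adopters. Values here are illustrative.
    """
    cumulative, adopts = 0.0, []
    for _ in range(periods):
        a = (p + q * cumulative / m) * (m - cumulative)
        adopts.append(a)
        cumulative += a
    return np.array(adopts)

annual = bass_adoptions(p=0.03, q=0.38, m=1_000_000, periods=15)
peak_year = int(np.argmax(annual))  # adoption rises, peaks, then declines
```

    The rise-peak-decline shape is what produces the "market peaks in the early 2020s" pattern described in the abstract; calibration would fit p, q, and m to observed installation history.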

  14. Forecasting residential solar photovoltaic deployment in California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dong, Changgui; Sigrin, Benjamin; Brinkman, Gregory

    Residential distributed photovoltaic (PV) deployment in the United States has experienced robust growth, and policy changes impacting the value of solar are likely to occur at the federal and state levels. To establish a credible baseline and evaluate impacts of potential new policies, this analysis employs multiple methods to forecast residential PV deployment in California, including a time-series forecasting model, a threshold heterogeneity diffusion model, a Bass diffusion model, and National Renewable Energy Laboratory's dSolar model. As a baseline, the residential PV market in California is modeled to peak in the early 2020s, with a peak annual installation of 1.5-2 GW across models. We then use the baseline results from the dSolar model and the threshold model to gauge the impact of the recent federal investment tax credit (ITC) extension, the newly approved California net energy metering (NEM) policy, and a hypothetical value-of-solar (VOS) compensation scheme. We find that the recent ITC extension may increase annual PV installations by 12%-18% (roughly 500 MW) for the California residential sector in 2019-2020. The new NEM policy only has a negligible effect in California due to the relatively small new charges (< 100 MW in 2019-2020). Moreover, impacts of the VOS compensation scheme (0.12 cents per kilowatt-hour) are larger, reducing annual PV adoption by 32% (or 900-1300 MW) in 2019-2020.

  15. Study on Battery Capacity for Grid-connection Power Planning with Forecasts in Clustered Photovoltaic Systems

    NASA Astrophysics Data System (ADS)

    Shimada, Takae; Kawasaki, Norihiro; Ueda, Yuzuru; Sugihara, Hiroyuki; Kurokawa, Kosuke

    This paper aims to clarify the battery capacity required by a residential area with densely grid-connected photovoltaic (PV) systems. It proposes a method for planning the next day's grid-connection power exchanged with the external electric power system, using demand power forecasting and insolation forecasting for PV power prediction, and defines an operation method for the electricity storage device that controls the grid-connection power as planned. A residential area consisting of 389 houses consuming 2390 MWh/year of electricity with 2390 kW of PV systems is simulated based on measured data and actual forecasts. The simulation results show that 8.3 MWh of battery capacity is required under the conditions of half-hour planning and a planning error ratio and PV output limiting loss ratio of 1% or less. The results also show that existing forecasting technologies reduce the required battery capacity to 49%, and increase the allowable installed PV capacity to 210%.
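
    The core accounting (a battery absorbing the gap between planned and actual grid exchange) can be sketched as follows; the half-hourly profiles and the 90% forecast-accuracy assumption are invented for illustration:

```python
import numpy as np

# Toy half-hourly profiles (kW) for one day: a plan built from an imperfect
# PV forecast, versus the actual PV output and demand.
t = np.arange(48)
pv_actual = np.maximum(0.0, np.sin((t - 12) * np.pi / 24)) * 1500.0
demand = 800.0 + 200.0 * np.sin(t * np.pi / 24)
planned_grid = demand - 0.9 * pv_actual  # plan assumes 90% of actual PV

# The battery absorbs the planning error so grid exchange stays as planned
error_kw = (demand - pv_actual) - planned_grid
energy_kwh = np.cumsum(error_kw) * 0.5   # half-hour time steps
required_capacity_kwh = energy_kwh.max() - energy_kwh.min()
```

    The required capacity is the peak-to-trough swing of the cumulative error energy, which is why better forecasts (smaller errors) shrink the battery the area needs.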

  16. [Demography perspectives and forecasts of the demand for electricity].

    PubMed

    Roy, L; Guimond, E

    1995-01-01

    Demographic perspectives form an integral part of the development of electric load forecasts. These forecasts in turn are used to justify the addition and repair of generating facilities that will supply power in the coming decades. The goal of this article is to present how demographic perspectives are incorporated into electric load forecasting in Quebec. The first part presents the methods, hypotheses, and results of population and household projections used by Hydro-Quebec in updating its latest development plan. The second section demonstrates applications of such demographic projections for forecasting the electric load, with a focus on the residential sector.
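
    The basic mechanics of driving a load forecast from household projections can be sketched as households times unit consumption; all figures below are hypothetical, not Hydro-Quebec data:

```python
# Minimal sketch: residential load = projected households x unit consumption.
# Every number here is invented for illustration.
households_now = 3_200_000
household_growth = 0.011      # annual growth rate from demographic projections
kwh_per_household = 16_000    # average annual residential use per household

def residential_load_forecast(years):
    """Forecast annual residential load (TWh) 'years' ahead."""
    households = households_now * (1 + household_growth) ** years
    return households * kwh_per_household / 1e9  # kWh -> TWh

forecast_10y = residential_load_forecast(10)
```

    Real forecasts also project the unit consumption itself (appliance stocks, heating fuel shares) rather than holding it constant.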

  17. 76 FR 52854 - Energy Conservation Program: Energy Conservation Standards for Residential Clothes Dryers and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-24

    ... Equipment Price Forecasting in Energy Conservation Standards Analysis (76 FR 9696, Feb. 22, 2011), has not... with such switching (e.g., the need to install a new dedicated electrical outlet). 3. Energy Price Forecast AGA stated that DOE's use of the Annual Energy Outlook (AEO) 2010 Reference Case for energy prices...

  18. Energy data sourcebook for the US residential sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenzel, T.P.; Koomey, J.G.; Sanchez, M.

    Analysts assessing policies and programs to improve energy efficiency in the residential sector require disparate input data from a variety of sources. This sourcebook, which updates a previous report, compiles these input data into a single location. The data provided include information on end-use unit energy consumption (UEC) values of appliances and equipment efficiency; historical and current appliance and equipment market shares; appliance and equipment efficiency and sales trends; appliance and equipment efficiency standards; cost vs. efficiency data for appliances and equipment; product lifetime estimates; thermal shell characteristics of buildings; heating and cooling loads; shell measure cost data for new and retrofit buildings; baseline housing stocks; forecasts of housing starts; and forecasts of energy prices and other economic drivers. This report is the essential sourcebook for policy analysts interested in residential sector energy use. The report can be downloaded from the Web at http://enduse.lbl.gov/Projects/RED.html. Future updates to the report, errata, and related links will also be posted at this address.

  19. Electricity forecasting on the individual household level enhanced based on activity patterns

    PubMed Central

    Gajowniczek, Krzysztof; Ząbkowski, Tomasz

    2017-01-01

    Leveraging smart metering solutions to support energy efficiency on the individual household level poses novel research challenges in monitoring usage and providing accurate load forecasting. Forecasting electricity usage is an especially important component that can provide intelligence to smart meters. In this paper, we propose an enhanced approach for load forecasting at the household level. The impacts of residents’ daily activities and appliance usages on the power consumption of the entire household are incorporated to improve the accuracy of the forecasting model. The contributions of this paper are threefold: (1) we addressed short-term electricity load forecasting for 24 hours ahead, not on the aggregate but on the individual household level, which fits into the Residential Power Load Forecasting (RPLF) methods; (2) for the forecasting, we utilized a household specific dataset of behaviors that influence power consumption, which was derived using segmentation and sequence mining algorithms; and (3) an extensive load forecasting study using different forecasting algorithms enhanced by the household activity patterns was undertaken. PMID:28423039

  20. Electricity forecasting on the individual household level enhanced based on activity patterns.

    PubMed

    Gajowniczek, Krzysztof; Ząbkowski, Tomasz

    2017-01-01

    Leveraging smart metering solutions to support energy efficiency on the individual household level poses novel research challenges in monitoring usage and providing accurate load forecasting. Forecasting electricity usage is an especially important component that can provide intelligence to smart meters. In this paper, we propose an enhanced approach for load forecasting at the household level. The impacts of residents' daily activities and appliance usages on the power consumption of the entire household are incorporated to improve the accuracy of the forecasting model. The contributions of this paper are threefold: (1) we addressed short-term electricity load forecasting for 24 hours ahead, not on the aggregate but on the individual household level, which fits into the Residential Power Load Forecasting (RPLF) methods; (2) for the forecasting, we utilized a household specific dataset of behaviors that influence power consumption, which was derived using segmentation and sequence mining algorithms; and (3) an extensive load forecasting study using different forecasting algorithms enhanced by the household activity patterns was undertaken.
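
    The benefit of activity-derived features can be sketched by comparing a regression fit with and without a binary activity indicator; the data are synthetic and the effect size invented:

```python
import numpy as np

# Synthetic household load: a base level plus a jump when an activity
# (e.g. cooking, derived from appliance events) is detected.
rng = np.random.default_rng(1)
n = 300
activity = (rng.random(n) < 0.3).astype(float)   # binary activity indicator
load = 0.4 + 1.2 * activity + 0.05 * rng.standard_normal(n)  # kW

def fit_rmse(X, y):
    """Least-squares fit with intercept; return in-sample RMSE."""
    A = np.column_stack([np.ones(len(X)), X])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sqrt(np.mean((A @ w - y) ** 2)))

rmse_without = fit_rmse(np.zeros((n, 1)), load)      # intercept-only baseline
rmse_with = fit_rmse(activity.reshape(-1, 1), load)  # activity feature added
```

    The same comparison extends to the paper's setting by adding many activity indicators (from segmentation and sequence mining) alongside lagged load features.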

  1. Validation of a 20-year forecast of US childhood lead poisoning: Updated prospects for 2010.

    PubMed

    Jacobs, David E; Nevin, Rick

    2006-11-01

    We forecast childhood lead poisoning and residential lead paint hazard prevalence for 1990-2010, based on a previously unvalidated model that combines national blood lead data with three different housing data sets. The housing data sets, which describe trends in housing demolition, rehabilitation, window replacement, and lead paint, are the American Housing Survey, the Residential Energy Consumption Survey, and the National Lead Paint Survey. Blood lead data are principally from the National Health and Nutrition Examination Survey. New data now make it possible to validate the midpoint of the forecast time period. For the year 2000, the model predicted 23.3 million pre-1960 housing units with lead paint hazards, compared to an empirical HUD estimate of 20.6 million units. Further, the model predicted 498,000 children with elevated blood lead levels (EBL) in 2000, compared to a CDC empirical estimate of 434,000. The model predictions were well within 95% confidence intervals of empirical estimates for both residential lead paint hazard and blood lead outcome measures. The model shows that window replacement explains a large part of the dramatic reduction in lead poisoning that occurred from 1990 to 2000. Here, the construction of the model is described and updated through 2010 using new data. Further declines in childhood lead poisoning are achievable, but the goal of eliminating children's blood lead levels ≥10 µg/dL by 2010 is unlikely to be achieved without additional action. A window replacement policy will yield multiple benefits of lead poisoning prevention, increased home energy efficiency, decreased power plant emissions, improved housing affordability, and other previously unrecognized benefits. Finally, combining housing and health data could be applied to forecasting other housing-related diseases and injuries.

  2. Validation of a 20-year forecast of US childhood lead poisoning: Updated prospects for 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobs, David E.; Nevin, Rick

    2006-11-15

    We forecast childhood lead poisoning and residential lead paint hazard prevalence for 1990-2010, based on a previously unvalidated model that combines national blood lead data with three different housing data sets. The housing data sets, which describe trends in housing demolition, rehabilitation, window replacement, and lead paint, are the American Housing Survey, the Residential Energy Consumption Survey, and the National Lead Paint Survey. Blood lead data are principally from the National Health and Nutrition Examination Survey. New data now make it possible to validate the midpoint of the forecast time period. For the year 2000, the model predicted 23.3 million pre-1960 housing units with lead paint hazards, compared to an empirical HUD estimate of 20.6 million units. Further, the model predicted 498,000 children with elevated blood lead levels (EBL) in 2000, compared to a CDC empirical estimate of 434,000. The model predictions were well within 95% confidence intervals of empirical estimates for both residential lead paint hazard and blood lead outcome measures. The model shows that window replacement explains a large part of the dramatic reduction in lead poisoning that occurred from 1990 to 2000. Here, the construction of the model is described and updated through 2010 using new data. Further declines in childhood lead poisoning are achievable, but the goal of eliminating children's blood lead levels ≥10 µg/dL by 2010 is unlikely to be achieved without additional action. A window replacement policy will yield multiple benefits of lead poisoning prevention, increased home energy efficiency, decreased power plant emissions, improved housing affordability, and other previously unrecognized benefits. Finally, combining housing and health data could be applied to forecasting other housing-related diseases and injuries.

  3. The Eruption Forecasting Information System: Volcanic Eruption Forecasting Using Databases

    NASA Astrophysics Data System (ADS)

    Ogburn, S. E.; Harpel, C. J.; Pesicek, J. D.; Wellik, J.

    2016-12-01

    Forecasting eruptions, including their onset, size, duration, location, and impacts, is vital for hazard assessment and risk mitigation. The Eruption Forecasting Information System (EFIS) project is a new initiative of the US Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) and will advance VDAP's ability to forecast the outcome of volcanic unrest. The project supports probability estimation for eruption forecasting by creating databases useful for pattern recognition, identifying monitoring data thresholds beyond which eruptive probabilities increase, and answering common forecasting questions. A major component of the project is a global relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest. This module allows us to query eruption chronologies, monitoring data, descriptive information, operational data, and eruptive phases alongside other global databases, such as WOVOdat and the Global Volcanism Program. The EFIS database is in the early stages of development and population; thus, this contribution also is a request for feedback from the community. Preliminary data are already benefiting several research areas. For example, VDAP provided a forecast of the likely remaining eruption duration for Sinabung volcano, Indonesia, using global data taken from similar volcanoes in the DomeHaz database module, in combination with local monitoring time-series data. In addition, EFIS seismologists used a beta-statistic test and empirically-derived thresholds to identify distal volcano-tectonic earthquake anomalies preceding Alaska volcanic eruptions during 1990-2015 to retrospectively evaluate Alaska Volcano Observatory eruption precursors. This has identified important considerations for selecting analog volcanoes for global data analysis, such as differences between closed and open system volcanoes.
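
    The probabilistic event trees mentioned above multiply conditional probabilities along a branch; the node probabilities below are invented for illustration only:

```python
# Sketch of one branch of a probabilistic event tree: the probability of a
# large eruption is the product of conditional probabilities at each node.
# All values are hypothetical, not EFIS estimates.
p_unrest_is_magmatic = 0.6
p_eruption_given_magmatic = 0.45
p_vei_ge_3_given_eruption = 0.25

p_large_eruption = (p_unrest_is_magmatic
                    * p_eruption_given_magmatic
                    * p_vei_ge_3_given_eruption)
```

    Databases like EFIS supply the conditional probabilities at each node from the frequencies observed at analog volcanoes.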

  4. Econometrics 101: forecasting demystified

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crow, R.T.

    1980-05-01

    Forecasting by econometric modeling is described in a commonsense way which omits much of the technical jargon. A trend of continuous growth is no longer an adequate forecasting tool. Today's forecasters must consider rapid changes in price, policies, regulations, capital availability, and the cost of being wrong. A forecasting model is designed by identifying future influences on electricity purchases and quantifying their relationships to each other. A record is produced which can be evaluated and used to make corrections in the models. Residential consumption is used to illustrate how this works and to demonstrate how power consumption is also related to the purchase and use of equipment. While models can quantify behavioral relationships, they cannot account for the impacts of non-price factors because of limited data. (DCK)

  5. Improved Dust Forecast Products for Southwest Asia Forecasters through Dust Source Database Advancements

    NASA Astrophysics Data System (ADS)

    Brooks, G. R.

    2011-12-01

    Dust storm forecasting is a critical part of military theater operations in Afghanistan and Iraq as well as other strategic areas of the globe. The Air Force Weather Agency (AFWA) has been using the Dust Transport Application (DTA) as a forecasting tool since 2001. Initially developed by The Johns Hopkins University Applied Physics Laboratory (JHUAPL), output products include dust concentration and reduction of visibility due to dust. The performance of the products depends on several factors including the underlying dust source database, treatment of soil moisture, parameterization of dust processes, and validity of the input atmospheric model data. Over many years of analysis, seasonal dust forecast biases of the DTA have been observed and documented. As these products are unique and indispensable for U.S. and NATO forces, amendments were required to provide the best forecasts possible. One of the quickest ways to scientifically address the dust concentration biases noted over time was to analyze the weaknesses in, and adjust, the dust source database. Dust source database strengths and weaknesses, the satellite analysis and adjustment process, and tests which confirmed the resulting improvements in the final dust concentration and visibility products will be shown.

  6. Air pollution forecasting in Ankara, Turkey using air pollution index and its relation to assimilative capacity of the atmosphere.

    PubMed

    Genc, D Deniz; Yesilyurt, Canan; Tuncel, Gurdal

    2010-07-01

    Spatial and temporal variations in concentrations of CO, NO, NO(2), SO(2), and PM(10), measured between 1999 and 2000 at traffic-impacted and residential stations in Ankara, were investigated. Air quality in residential areas was found to be influenced by traffic activities in the city. Pollutant ratios were proven to be reliable tracers to differentiate between different sources. An air pollution index (API) for the whole city was calculated to evaluate the level of air quality in Ankara. A multiple linear regression model was developed for forecasting API in Ankara; the correlation coefficients were found to be 0.79 and 0.63 for different time periods. The assimilative capacity of the Ankara atmosphere was calculated in terms of the ventilation coefficient (VC). The relation between API and VC was investigated, and it was found that air quality in Ankara is determined by meteorology rather than emissions.
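The regression approach described above can be sketched with ordinary least squares. The predictors (previous-day API and ventilation coefficient) and all numbers below are illustrative assumptions, not values from the study.

```python
import numpy as np

def fit_mlr(X, y):
    """Ordinary least squares with an intercept column."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_mlr(coef, X):
    A = np.column_stack([np.ones(len(X)), X])
    return A @ coef

# Illustrative predictors: previous-day API and ventilation coefficient (m^2/s)
rng = np.random.default_rng(0)
vc = rng.uniform(500, 6000, 200)          # ventilation coefficient
api_lag = rng.uniform(20, 120, 200)       # previous-day air pollution index
api = 0.6 * api_lag - 0.008 * vc + 40 + rng.normal(0, 5, 200)

X = np.column_stack([api_lag, vc])
coef = fit_mlr(X, api)
r = np.corrcoef(predict_mlr(coef, X), api)[0, 1]
```

With synthetic data like this the in-sample correlation coefficient is high; on real monitoring data it would be closer to the 0.79 and 0.63 the abstract reports.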

  7. 18 CFR 1304.201 - Applicability.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... application, TVA-owned shorelands designated in TVA's property forecast system as “reservoir operations... the following TVA-reservoir shoreland classifications: (1) TVA-owned shorelands over which the... in current TVA Reservoir Land Management Plans as open for consideration of residential development...

  8. 18 CFR 1304.201 - Applicability.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... application, TVA-owned shorelands designated in TVA's property forecast system as “reservoir operations... the following TVA-reservoir shoreland classifications: (1) TVA-owned shorelands over which the... in current TVA Reservoir Land Management Plans as open for consideration of residential development...

  9. Short-term Power Load Forecasting Based on Balanced KNN

    NASA Astrophysics Data System (ADS)

    Lv, Xianlong; Cheng, Xingong; YanShuang; Tang, Yan-mei

    2018-03-01

    To improve forecasting accuracy, a short-term load forecasting model based on the balanced KNN algorithm is proposed. According to load characteristics, the massive historical power-load data are divided into scenes by the K-means algorithm. To handle unbalanced load scenes, the balanced KNN algorithm is proposed to classify scenes accurately. The local weighted linear regression algorithm is then used to fit and predict the load. Adopting the Apache Hadoop programming framework for cloud computing, the proposed model is parallelized to improve its ability to handle massive, high-dimensional data. Household electricity consumption data for a residential district were analyzed on a 23-node cloud computing cluster; experimental results show that the forecasting accuracy and execution time of the proposed model are better than those of traditional forecasting algorithms.
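A minimal sketch of the class-weighting idea behind a "balanced" KNN classifier, assuming votes are weighted by inverse class frequency so that rare load scenes are not outvoted by common ones. This is one plausible reading of the abstract; the paper's exact algorithm, the K-means scene division, and the weighted regression step are omitted.

```python
import numpy as np

def balanced_knn_predict(X_train, y_train, x, k=5):
    """KNN vote weighted by inverse class frequency, so rare load
    scenes are not swamped by common ones (an illustrative reading
    of 'balanced KNN'; details differ from the paper)."""
    d = np.linalg.norm(X_train - x, axis=1)
    nn = np.argsort(d)[:k]
    classes, counts = np.unique(y_train, return_counts=True)
    w = {c: 1.0 / n for c, n in zip(classes, counts)}
    votes = {}
    for i in nn:
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + w[y_train[i]]
    return max(votes, key=votes.get)

# Imbalanced toy scenes: 90 'weekday' load profiles near 1.0, 10 'holiday' near 2.0
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(1.0, 0.1, (90, 4)), rng.normal(2.0, 0.1, (10, 4))])
y = np.array(["weekday"] * 90 + ["holiday"] * 10)
pred = balanced_knn_predict(X, y, np.full(4, 1.9), k=7)
```

A query near the rare "holiday" cluster is classified correctly even though that class has nine times fewer training examples.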

  10. Seasonal Forecasting of Fire Weather Based on a New Global Fire Weather Database

    NASA Technical Reports Server (NTRS)

    Dowdy, Andrew J.; Field, Robert D.; Spessa, Allan C.

    2016-01-01

    Seasonal forecasting of fire weather is examined based on a recently produced global database of the Fire Weather Index (FWI) system beginning in 1980. Seasonal average values of the FWI are examined in relation to measures of the El Nino-Southern Oscillation (ENSO) and the Indian Ocean Dipole (IOD). The results are used to examine seasonal forecasts of fire weather conditions throughout the world.

  11. Regional early flood warning system: design and implementation

    NASA Astrophysics Data System (ADS)

    Chang, L. C.; Yang, S. N.; Kuo, C. L.; Wang, Y. F.

    2017-12-01

    This study proposes a prototype regional early flood inundation warning system for Tainan City, Taiwan. AI technology is used to forecast multi-step-ahead regional flood inundation maps during storm events. The computing time is only a few seconds, which enables real-time regional flood inundation forecasting. A database is built to organize data and information for building the real-time forecasting models, maintaining the relations between forecasted points, and displaying forecasted results, while real-time data acquisition is another key task: the model requires immediate access to rain gauge information to provide forecast services. All database-related programs are built on Microsoft SQL Server using Visual C# to extract real-time hydrological data, manage data, store forecasted results, and feed the visual map-based display. The regional early flood inundation warning system uses up-to-date web technologies, driven by the database and real-time data acquisition, to display online forecasted flood inundation depths in the study area. The friendly interface sequentially shows the inundation area on Google Maps together with the maximum inundation depth and its location, and provides a KMZ download of the results for viewing in Google Earth. The developed system provides the relevant information and online forecast results that help city authorities make decisions during typhoon events and take actions to mitigate losses.

  12. WOVOdat, A Worldwide Volcano Unrest Database, to Improve Eruption Forecasts

    NASA Astrophysics Data System (ADS)

    Widiwijayanti, C.; Costa, F.; Win, N. T. Z.; Tan, K.; Newhall, C. G.; Ratdomopurbo, A.

    2015-12-01

    WOVOdat is the World Organization of Volcano Observatories' Database of Volcanic Unrest: an international effort to develop common standards for compiling and storing data on volcanic unrest in a centralized database, freely accessible on the web for reference during volcanic crises, comparative studies, and basic research on pre-eruption processes. WOVOdat will be to volcanology what an epidemiological database is to medicine. Despite the large spectrum of monitoring techniques, interpreting monitoring data throughout the evolution of unrest and making timely forecasts remain the most challenging tasks for volcanologists. The field of eruption forecasting is becoming more quantitative, based on understanding of pre-eruptive magmatic processes and the dynamic interactions between variables at play in a volcanic system. Such forecasts must also acknowledge and express their uncertainties; most current research in this field therefore focuses on applying event-tree analysis to reflect multiple possible scenarios and the probability of each. Such forecasts are critically dependent on comprehensive and authoritative global volcano unrest data sets - the very information currently collected in WOVOdat. As the database becomes more complete, Boolean searches, side-by-side digital (and thus scalable) comparisons of unrest, and pattern recognition will generate reliable results. Statistical distributions obtained from WOVOdat can then be used to estimate the probability of each scenario given a specific pattern of unrest. We have established the main web interface for data submission and visualization, and have now incorporated ~20% of worldwide unrest data into the database, covering more than 100 eruptive episodes. In the upcoming years we will concentrate on acquiring data from volcano observatories, developing a robust data query interface, optimizing data mining, and creating tools by which WOVOdat can be used for probabilistic eruption forecasting. The more data in WOVOdat, the more useful it will be.

  13. Residential Simulation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Starke, Michael R; Abdelaziz, Omar A; Jackson, Rogerick K

    The Residential Simulation Tool was developed to understand the impact of residential load consumption on utilities, including the role of demand response. This is complicated because many different residential loads exist and are used for different purposes. The tool models human behavior and its contribution to load utilization, which in turn drives the tool's electrical consumption prediction. The tool integrates a number of databases from the Department of Energy and other government websites to support the load consumption prediction.

  14. Residential energy demand models: Current status and future improvements

    NASA Astrophysics Data System (ADS)

    Peabody, G.

    1980-12-01

    Two models currently used to analyze energy use by the residential sector are described. The ORNL model is used to forecast energy use by fuel type for various end uses on a yearly basis. The MATH/CHRDS model analyzes variations in energy expenditures by households of various socioeconomic and demographic characteristics. The essential features of the ORNL and MATH/CHRDS models are retained in a proposed model and integrated into a framework that is more flexible than either model. The important determinants of energy use by households are reviewed.

  15. Relation of land use/land cover to resource demands

    NASA Technical Reports Server (NTRS)

    Clayton, C.

    1981-01-01

    Predictive models for forecasting residential energy demand are investigated. The models are examined in the context of implementation through manipulation of geographic information systems containing land use/cover information. Remotely sensed data is examined as a possible component in this process.

  16. Forecasting of municipal solid waste quantity in a developing country using multivariate grey models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Intharathirat, Rotchana, E-mail: rotchana.in@gmail.com; Abdul Salam, P., E-mail: salam@ait.ac.th; Kumar, S., E-mail: kumar@ait.ac.th

    Highlights: • Grey models can forecast MSW quantity accurately with limited data. • Prediction intervals effectively address the uncertainty of MSW forecasts. • A multivariate model gives accuracy associated with factors affecting MSW quantity. • Population, urbanization, employment and household size drive MSW quantity. - Abstract: In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to produce reliable estimates with existing models because of the limited data available in developing countries. This study aims to forecast the MSW collected in Thailand, with prediction intervals, over the long term, using an optimized multivariate grey model. For the multivariate models, the representative factors of the residential and commercial sectors affecting waste collected are identified, classified and quantified based on the statistics and mathematics of grey system theory. Results show that GMC (1, 5), the grey model with convolution integral, is the most accurate, with the least error of 1.16% MAPE. MSW collected would increase 1.40% per year, from 43,435-44,994 tonnes per day in 2013 to 55,177-56,735 tonnes per day in 2030. The model also shows that population density is the most important factor affecting MSW collected, followed by urbanization, proportion employed and household size, respectively. This suggests that the representative factors of the commercial sector may affect MSW collected more than those of the residential sector. The results can help decision makers develop long-term waste management measures and policies.
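The grey-model mechanics behind GMC (1, 5) can be illustrated with the simpler univariate GM(1,1): accumulate the series (AGO), fit the whitening equation, and difference back. The waste series below is synthetic, and the univariate model is a deliberate simplification of the paper's multivariate version.

```python
import numpy as np

def gm11_forecast(x0, steps=3):
    """Univariate GM(1,1): accumulate (AGO), fit the whitening
    equation dx1/dt + a*x1 = b by least squares, then difference
    the fitted accumulated series back to the original scale."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                       # accumulated generating operation
    z = 0.5 * (x1[1:] + x1[:-1])             # background values
    B = np.column_stack([-z, np.ones(len(z))])
    (a, b), *_ = np.linalg.lstsq(B, x0[1:], rcond=None)
    n = len(x0) + steps
    k = np.arange(n)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
    return x0_hat[len(x0):]

# Illustrative waste series growing ~1.4%/yr (tonnes/day)
series = [40000 * 1.014 ** t for t in range(8)]
fc = gm11_forecast(series, steps=3)
```

On a smooth near-exponential series like this, GM(1,1) recovers the growth rate almost exactly; its appeal is that it remains usable with the short data records typical of developing countries.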

  17. Wind Power Forecasting Error Frequency Analyses for Operational Power System Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Florita, A.; Hodge, B. M.; Milligan, M.

    2012-08-01

    The examination of wind power forecasting errors is crucial for optimal unit commitment and economic dispatch of power systems with significant wind power penetrations. This scheduling process includes both renewable and nonrenewable generators, and the incorporation of wind power forecasts will become increasingly important as wind fleets constitute a larger portion of generation portfolios. This research considers the Western Wind and Solar Integration Study database of wind power forecasts and numerical actualizations. This database comprises more than 30,000 locations spread over the western United States, with a total wind power capacity of 960 GW. Error analyses for individual sites and for specific balancing areas are performed using the database, quantifying the fit to theoretical distributions through goodness-of-fit metrics. Insights into wind power forecasting error distributions are established for various levels of temporal and spatial resolution, contrasts are made among the frequency distribution alternatives, and recommendations are put forth for harnessing the results. Empirical data are used to produce more realistic site-level forecasts than previously employed, such that higher-resolution operational studies are possible. This research feeds into a larger body of work on renewable integration through the links wind power forecasting has with various operational issues, such as stochastic unit commitment and flexible reserve level determination.
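One hedged sketch of the goodness-of-fit idea: compare empirical forecast errors against a moment-fitted normal using a one-sample Kolmogorov-Smirnov distance. The data are synthetic; the study's actual metrics and candidate distributions may differ.

```python
import numpy as np
from math import erf, sqrt

def ks_vs_normal(errors):
    """One-sample KS distance between the empirical error distribution
    and a normal fitted by moments -- a simple goodness-of-fit metric."""
    e = np.sort(np.asarray(errors, float))
    mu, sd = e.mean(), e.std()
    cdf = np.array([0.5 * (1 + erf((x - mu) / (sd * sqrt(2)))) for x in e])
    n = len(e)
    ecdf_hi = np.arange(1, n + 1) / n
    ecdf_lo = np.arange(0, n) / n
    return max(np.max(ecdf_hi - cdf), np.max(cdf - ecdf_lo))

rng = np.random.default_rng(2)
d_norm = ks_vs_normal(rng.normal(0, 1, 2000))     # well matched by a normal
d_heavy = ks_vs_normal(rng.laplace(0, 1, 2000))   # heavier tails, worse fit
```

A heavier-tailed error sample (a common finding for wind power forecast errors) shows a visibly larger KS distance than a genuinely normal one.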

  18. Distributed Generation Market Demand Model | NREL

    Science.gov Websites

    The Distributed Generation Market Demand (dGen) model simulates the potential adoption of distributed energy resources (DERs) by residential, commercial, and industrial entities. The dGen model can help develop deployment forecasts for distributed resources, including sensitivity to …

  19. Do quantitative decadal forecasts from GCMs provide decision relevant skill?

    NASA Astrophysics Data System (ADS)

    Suckling, E. B.; Smith, L. A.

    2012-04-01

    It is widely held that only physics-based simulation models can capture the dynamics required to provide decision-relevant probabilistic climate predictions. This fact in itself provides no evidence that predictions from today's GCMs are fit for purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales, where it is argued that these 'physics free' forecasts provide a quantitative 'zero skill' target for the evaluation of forecasts based on more complicated models. It is demonstrated that these zero-skill models are competitive with GCMs on decadal scales for probability forecasts evaluated over the last 50 years. Complications of statistical interpretation due to the 'hindcast' nature of this experiment, and the likely relevance of arguments that the lack of hindcast skill is irrelevant as the signal will soon 'come out of the noise', are discussed. A lack of decision-relevant quantitative skill does not bring the science-based insights of anthropogenic warming into doubt, but it does call for a clear quantification of limits, as a function of lead time, for spatial and temporal scales on which decisions based on such model output are expected to prove maladaptive. Failing to do so may risk the credibility of science in support of policy in the long term. The performance of a collection of simulation models is evaluated, having transformed ensembles of point forecasts into probability distributions through the kernel dressing procedure [1], according to a selection of proper skill scores [2], and contrasted with purely data-based empirical models. Data-based models are unlikely to yield realistic forecasts for future climate change if the Earth system moves away from the conditions observed in the past, upon which the models are constructed; in this sense the empirical model defines zero skill. When should a decision-relevant simulation model be expected to significantly outperform such empirical models? 
Probability forecasts up to ten years ahead (decadal forecasts) are considered, on both global and regional spatial scales, for surface air temperature. Such decadal forecasts are not only important in terms of providing information on the impacts of near-term climate change, but also from the perspective of climate model validation, as hindcast experiments and a sufficient database of historical observations allow standard forecast verification methods to be used. Simulation models from the ENSEMBLES hindcast experiment [3] are evaluated and contrasted with static forecasts of the observed climatology, persistence forecasts, and simple statistical models called dynamic climatology (DC). It is argued that DC is a more appropriate benchmark in the case of a non-stationary climate. It is found that the ENSEMBLES models do not demonstrate a significant increase in skill relative to the empirical models, even at global scales, over any lead time up to a decade ahead. It is suggested that the construction of, and co-evaluation with, data-based models become a regular component of the reporting of large simulation-model forecasts. The methodology presented may easily be adapted to other forecasting experiments and is expected to influence the design of future experiments. The inclusion of comparisons with dynamic climatology and other data-based approaches provides important information to both scientists and decision makers on which aspects of state-of-the-art simulation forecasts are likely to be fit for purpose. [1] J. Bröcker and L. A. Smith. From ensemble forecasts to predictive distributions, Tellus A, 60(4), 663-678 (2007). [2] J. Bröcker and L. A. Smith. Scoring probabilistic forecasts: The importance of being proper, Weather and Forecasting, 22, 382-388 (2006). [3] F. J. Doblas-Reyes, A. Weisheimer, T. N. Palmer, J. M. Murphy and D. Smith. Forecast quality assessment of the ENSEMBLES seasonal-to-decadal stream 2 hindcasts, ECMWF Technical Memorandum, 621 (2010).
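The kernel dressing step and a proper score can be sketched briefly: each ensemble member is dressed with a Gaussian kernel, and the resulting predictive density is evaluated with the ignorance (negative log) score at the verifying observation. The ensemble values and bandwidths below are illustrative, not from the hindcast experiment.

```python
import numpy as np

def kernel_dressed_logscore(ensemble, outcome, sigma):
    """Dress each ensemble member with a Gaussian kernel of width sigma,
    then return the ignorance (negative log2) score of the resulting
    predictive density at the verifying outcome (lower is better)."""
    ens = np.asarray(ensemble, float)
    dens = np.mean(np.exp(-0.5 * ((outcome - ens) / sigma) ** 2)
                   / (sigma * np.sqrt(2 * np.pi)))
    return -np.log2(dens)

ens = np.array([14.1, 14.4, 14.3, 14.6, 14.2])   # toy temperature ensemble, deg C
sharp = kernel_dressed_logscore(ens, 14.3, sigma=0.2)  # well-tuned bandwidth
blunt = kernel_dressed_logscore(ens, 14.3, sigma=2.0)  # over-dispersed forecast
```

Because the ignorance score is proper, the sharper forecast that still covers the outcome scores better (lower) than the needlessly over-dispersed one.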

  20. 78 FR 39290 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-01

    ... project, known as the ``National Mortgage Database,'' which is a joint effort of FHFA and the Consumer... a database of timely and otherwise unavailable residential mortgage market information to be made... Mortgage Database. The key purpose of the National Mortgage Database is to make accessible accurate...

  1. Disaggregating residential water demand for improved forecasts and decision making

    NASA Astrophysics Data System (ADS)

    Woodard, G.; Brookshire, D.; Chermak, J.; Krause, K.; Roach, J.; Stewart, S.; Tidwell, V.

    2003-04-01

    Residential water demand is the product of population and per capita demand. Estimates of per capita demand often are based on econometric models of demand, usually based on time series data of demand aggregated at the water provider level. Various studies have examined the impact of such factors as water pricing, weather, and income, with many other factors and details of water demand remaining unclear. Impacts of water conservation programs often are estimated using simplistic engineering calculations. Partly as a result of this, policy discussions regarding water demand management often focus on water pricing, water conservation, and growth control. Projecting water demand is often a straightforward, if fairly uncertain, process of forecasting population and per capita demand rates. SAHRA researchers are developing improved forecasts of residential water demand by disaggregating demand to the level of individuals, households, and specific water uses. Research results based on high-resolution water meter loggers, household-level surveys, economic experiments and recent census data suggest that changes in wealth, household composition, and individual behavior may affect demand more than changes in population or the stock of landscape plants, water-using appliances and fixtures, generally considered the primary determinants of demand. Aging populations and lower fertility rates are dramatically reducing household size, thereby increasing the number of households and residences for a given population. Recent prosperity and low interest rates have raised home ownership rates to unprecedented levels. These two trends are leading to increased per capita outdoor water demand. Conservation programs have succeeded in certain areas, such as promoting drought-tolerant native landscaping, but have failed in other areas, such as increasing irrigation efficiency or curbing swimming pool water usage. 
Individual behavior often is more important than the household's stock of water-using fixtures, and ranges from hedonism (installing pools and whirlpool tubs) to satisficing (adjusting irrigation timers only twice per year) to acting on deeply-held conservation ethics in ways that not only fail any benefit-cost test, but are discouraged, or even illegal (reuse of gray water and black water). Research findings are being captured in dynamic simulation models that integrate social and natural science to create tools to assist water resource managers in providing sustainable water supplies and improving residential water demand forecasts. These models feature simple, graphical user interfaces and output screens that provide decision makers with visual, easy-to-understand information at the basin level. The models reveal connections between various supply and demand components, and highlight direct impacts and feedback mechanisms associated with various policy options.

  2. Short-term energy outlook, Annual supplement 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-07-25

    This supplement is published once a year as a complement to the Short-Term Energy Outlook, Quarterly Projections. The purpose of the Supplement is to review the accuracy of the forecasts published in the Outlook, make comparisons with other independent energy forecasts, and examine current energy topics that affect the forecasts. Chap. 2 analyzes the response of the US petroleum industry to the four recent Federal environmental rules on motor gasoline. Chap. 3 compares the EIA base or mid case energy projections for 1995 and 1996 (as published in the first quarter 1995 Outlook) with recent projections made by four other major forecasting groups. Chap. 4 evaluates the overall accuracy of the forecasts. Chap. 5 presents the methodology used in the Short-Term Integrated Forecasting Model for oxygenate supply/demand balances. Chap. 6 reports theoretical and empirical results from a study of non-transportation energy demand by sector; the empirical analysis covers short-run energy demand in the residential, commercial, industrial, and electric utility sectors of the US.

  3. 24 CFR 3400.1 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... participate in a nationwide mortgage licensing system and registry database of residential mortgage loan... charged with establishing and maintaining a licensing and registry database for loan originators. (b...

  4. Forecasting the use of elderly care: a static micro-simulation model.

    PubMed

    Eggink, Evelien; Woittiez, Isolde; Ras, Michiel

    2016-07-01

    This paper describes a model suitable for forecasting the use of publicly funded long-term elderly care, taking into account both ageing and changes in the health status of the population. In addition, the impact of socioeconomic factors on care use is included in the forecasts. The model is also suitable for the simulation of possible implications of some specific policy measures. The model is a static micro-simulation model, consisting of an explanatory model and a population model. The explanatory model statistically relates care use to individual characteristics. The population model mimics the composition of the population at future points in time. The forecasts of care use are driven by changes in the composition of the population in terms of relevant characteristics instead of dynamics at the individual level. The results show that a further 37 % increase in the use of elderly care (from 7 to 9 % of the Dutch 30-plus population) between 2008 and 2030 can be expected due to a further ageing of the population. However, the use of care is expected to increase less than if it were based on the increasing number of elderly only (+70 %), due to decreasing disability levels and increasing levels of education. As an application of the model, we simulated the effects of restricting access to residential care to elderly people with severe physical disabilities. The result was a lower growth of residential care use (32 % instead of 57 %), but a somewhat faster growth in the use of home care (35 % instead of 32 %).
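The static micro-simulation structure (an explanatory model plus a population model) can be sketched as a logistic care-use model applied to reweighted person records. All coefficients and weights below are invented for illustration; they are not the paper's estimates.

```python
import numpy as np

def expected_users(ages, disabled, weights, coef):
    """Explanatory model: logistic care-use probability from age and
    disability. Population model: person weights mimic the (future)
    population composition. Expected users = weighted sum of
    probabilities. Coefficients are purely illustrative."""
    b0, b_age, b_dis = coef
    logit = b0 + b_age * ages + b_dis * disabled
    p = 1.0 / (1.0 + np.exp(-logit))
    return np.sum(weights * p)

ages = np.array([70, 75, 80, 85, 90], float)
disabled = np.array([0, 0, 1, 1, 1], float)
coef = (-12.0, 0.12, 1.5)                      # hypothetical logistic coefficients

now = expected_users(ages, disabled, np.full(5, 1000.0), coef)
# Future composition: the same records reweighted toward older ages
aged = expected_users(ages, disabled, np.array([800, 900, 1100, 1200, 1300.0]), coef)
```

Reweighting the same person records toward older age groups raises expected care use, which is exactly how a static micro-simulation lets composition changes, rather than individual-level dynamics, drive the forecast.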

  5. Changes in Natural Gas Monthly Consumption Data Collection and the Short-Term Energy Outlook

    EIA Publications

    2010-01-01

    Beginning with the December 2010 issue of the Short-Term Energy Outlook (STEO), the Energy Information Administration (EIA) will present natural gas consumption forecasts for the residential and commercial sectors that are consistent with recent changes to the Form EIA-857 monthly natural gas survey.

  6. Effect of initial conditions of a catchment on seasonal streamflow prediction using ensemble streamflow prediction (ESP) technique for the Rangitata and Waitaki River basins on the South Island of New Zealand

    NASA Astrophysics Data System (ADS)

    Singh, Shailesh Kumar; Zammit, Christian; Hreinsson, Einar; Woods, Ross; Clark, Martyn; Hamlet, Alan

    2013-04-01

    Increased access to water is a key pillar of the New Zealand government's plan for economic growth. Variable climatic conditions, coupled with market drivers and increased demand on water resources, mean that water managers make critical decisions based on climate and streamflow forecasts. Because many of these decisions have serious economic implications (e.g., for irrigated agriculture and electricity generation), accurate forecasts of climate and streamflow are of paramount importance. New Zealand currently does not have a centralized, comprehensive, state-of-the-art system in place for providing operational seasonal-to-interannual streamflow forecasts to guide water resources management decisions. As a pilot effort, we implement and evaluate an experimental ensemble streamflow forecasting system for the Waitaki and Rangitata River basins on New Zealand's South Island using a hydrologic simulation model (TopNet) and the familiar ensemble streamflow prediction (ESP) paradigm for estimating forecast uncertainty. To provide a comprehensive database for evaluating the forecasting system, a set of retrospective model states simulated by the hydrologic model on the first day of each month was first archived for 1972-2009. Then, using the hydrologic simulation model, each of these historical model states was paired with the retrospective temperature and precipitation time series from each historical water year to create a database of retrospective hindcasts. Using the resulting database, the relative importance of initial state variables (such as soil moisture and snowpack) as fundamental drivers of forecast uncertainty was evaluated for different seasons and lead times. The analysis indicates that the sensitivity of flow forecasts to initial condition uncertainty depends on the hydrological regime and the season of the forecast; however, initial conditions do not have a large impact on seasonal flow uncertainties for snow-dominated catchments. 
Further analysis indicates that this result remains valid when the hindcast database is conditioned on ENSO classification. As a result, hydrological forecasts based on the ESP technique, in which present initial conditions are paired with historical forcing data, may be plausible for New Zealand catchments.
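The ESP paradigm itself is simple to sketch: hold the current initial conditions fixed and run one forecast member per historical forcing year. The one-bucket hydrology below is a toy stand-in for TopNet, and all numbers are synthetic.

```python
import numpy as np

def toy_seasonal_flow(initial_storage, precip_series, k=0.3):
    """One-bucket toy hydrology: each step, precipitation refills the
    bucket and a fraction k of storage is released as streamflow."""
    s, total = initial_storage, 0.0
    for p in precip_series:
        s += p
        q = k * s
        s -= q
        total += q
    return total

def esp_ensemble(initial_storage, historical_forcings):
    """ESP: one forecast member per historical forcing year,
    all started from the same current initial conditions."""
    return np.array([toy_seasonal_flow(initial_storage, f)
                     for f in historical_forcings])

rng = np.random.default_rng(3)
hist = [rng.gamma(2.0, 5.0, 12) for _ in range(30)]   # 30 years of monthly precip
wet_init = esp_ensemble(200.0, hist)   # wet current initial condition
dry_init = esp_ensemble(50.0, hist)    # dry current initial condition
```

Comparing the two ensembles shows how initial conditions shift the whole forecast distribution; the spread across members, by contrast, reflects forcing (climate) uncertainty.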

  7. REGRESSION MODELS OF RESIDENTIAL EXPOSURE TO CHLORPYRIFOS AND DIAZINON

    EPA Science Inventory

    This study examines the ability of regression models to predict residential exposures to chlorpyrifos and diazinon, based on the information from the NHEXAS-AZ database. The robust method was used to generate "fill-in" values for samples that are below the detection l...

  8. Forecasting of municipal solid waste quantity in a developing country using multivariate grey models.

    PubMed

    Intharathirat, Rotchana; Abdul Salam, P; Kumar, S; Untong, Akarapong

    2015-05-01

    In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to produce reliable estimates with existing models because of the limited data available in developing countries. This study aims to forecast the MSW collected in Thailand, with prediction intervals, over the long term, using an optimized multivariate grey model. For the multivariate models, the representative factors of the residential and commercial sectors affecting waste collected are identified, classified and quantified based on the statistics and mathematics of grey system theory. Results show that GMC (1, 5), the grey model with convolution integral, is the most accurate, with the least error of 1.16% MAPE. MSW collected would increase 1.40% per year, from 43,435-44,994 tonnes per day in 2013 to 55,177-56,735 tonnes per day in 2030. The model also shows that population density is the most important factor affecting MSW collected, followed by urbanization, proportion employed and household size, respectively. This suggests that the representative factors of the commercial sector may affect MSW collected more than those of the residential sector. The results can help decision makers develop long-term waste management measures and policies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. 78 FR 24420 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-25

    ... academics and other interested parties outside of the government. Generally, the National Mortgage Database... project, known as the ``National Mortgage Database,'' which is a joint effort of FHFA and the Consumer... a database of timely and otherwise unavailable residential mortgage market information to be made...

  10. Forecasting Safe or Dangerous Space Weather from HMI Magnetograms

    NASA Technical Reports Server (NTRS)

    Falconer, David; Barghouty, Abdulnasser F.; Khazanov, Igor; Moore, Ron

    2011-01-01

    We have developed a space-weather forecasting tool using an active-region free-energy proxy measured from MDI line-of-sight magnetograms. To develop this forecasting tool (Falconer et al. 2011, Space Weather Journal, in press), we used a database of 40,000 MDI magnetograms of 1,300 active regions observed by MDI during the previous solar cycle (cycle 23). From each magnetogram we measured our free-energy proxy, and for each active region we determined its history of major flare, CME, and Solar Particle Event (SPE) production. From the value of an active region's free-energy proxy, this database determines the active region's expected rate of production of 1) major flares, 2) CMEs, 3) fast CMEs, and 4) SPEs during the next few days. This tool was delivered to NASA/SRAG in 2010. With MDI observations ending, we have to be able to use HMI magnetograms instead of MDI magnetograms. One of the difficulties is that the measured value of the free-energy proxy is sensitive to the spatial resolution of the magnetogram: the 0.5 arcsec/pixel resolution of HMI gives a different value for the free-energy proxy than the 2 arcsec/pixel resolution of MDI. To use our MDI-database forecasting curves until a comparably large HMI database is accumulated, we smooth HMI line-of-sight magnetograms to MDI resolution, so that we can use HMI to find the value of the free-energy proxy that MDI would have measured, and then use the forecasting curves given by the MDI database. The new version for use with HMI magnetograms was delivered to NASA/SRAG in March 2011. It can also use GONG magnetograms as a backup.
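Smoothing a high-resolution magnetogram to a coarser instrument's pixel scale can be sketched as block averaging (here a factor of 4, as from 0.5 to 2 arcsec/pixel). The actual pipeline's smoothing kernel may differ, and the array below is synthetic.

```python
import numpy as np

def rebin_to_coarser(mag, factor=4):
    """Block-average a magnetogram to coarser pixels, e.g. HMI's
    0.5 arcsec/pixel down to MDI's 2 arcsec/pixel (factor 4).
    Averaging flux density preserves the field's mean value."""
    ny, nx = mag.shape
    ny, nx = ny - ny % factor, nx - nx % factor   # trim to a multiple of factor
    m = mag[:ny, :nx].reshape(ny // factor, factor, nx // factor, factor)
    return m.mean(axis=(1, 3))

rng = np.random.default_rng(4)
hmi = rng.normal(0, 100, (64, 64))       # synthetic line-of-sight field, gauss
mdi_like = rebin_to_coarser(hmi, 4)
```

A proxy measured on `mdi_like` is then directly comparable with the MDI-era forecasting curves, which is the point of the cross-calibration described above.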

  11. Projected electric power demands for the Potomac Electric Power Company. Volume 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estomin, S.; Kahal, M.

    1984-03-01

    This three-volume report presents the results of an econometric forecast of peak and electric power demands for the Potomac Electric Power Company (PEPCO) through the year 2002. Volume I describes the methodology, the results of the econometric estimations, the forecast assumptions and the calculated forecasts of peak demand and energy usage. Separate sets of models were developed for the Maryland Suburbs (Montgomery and Prince George's counties), the District of Columbia and Southern Maryland (served by a wholesale customer of PEPCO). For each of the three jurisdictions, energy equations were estimated for residential and commercial/industrial customers for both summer and winter seasons. For the District of Columbia, summer and winter equations for energy sales to the federal government were also estimated. Equations were also estimated for street lighting and energy losses. Noneconometric techniques were employed to forecast energy sales to the Northern Virginia suburbs, Metrorail and federal government facilities located in Maryland.

  12. NOAA Propagation Database Value in Tsunami Forecast Guidance

    NASA Astrophysics Data System (ADS)

    Eble, M. C.; Wright, L. M.

    2016-02-01

    The National Oceanic and Atmospheric Administration (NOAA) Center for Tsunami Research (NCTR) has developed a tsunami forecasting capability that combines a graphical user interface with data ingestion and numerical models to produce estimates of tsunami wave arrival times, amplitudes, current or water flow rates, and flooding at specific coastal communities. The capability integrates several key components: deep-ocean observations of tsunamis in real time, a basin-wide pre-computed propagation database of water levels and flow velocities based on potential pre-defined seismic unit sources, an inversion or fitting algorithm to refine the tsunami source based on observations during an event, and tsunami forecast models. As tsunami waves propagate across the ocean, observations from the deep ocean are automatically ingested into the application in real time to better define the source of the tsunami itself. Since passage of tsunami waves over a deep-ocean reporting site is not immediate, we explore the value of the NOAA propagation database in providing placeholder forecasts in advance of deep-ocean observations. The propagation database consists of water elevations and flow velocities pre-computed for 50 × 100 km unit sources in a continuous series along all known ocean subduction zones. The 2011 Japan Tohoku tsunami is presented as the case study.
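
    The inversion step described above, fitting pre-computed unit-source solutions to deep-ocean observations, is at heart a linear least-squares problem. The unit-source responses and weights below are synthetic stand-ins, not values from the actual NOAA database:

    ```python
    import numpy as np

    # Stand-in for the propagation database: pre-computed water-level time
    # series at one deep-ocean (DART) site for three unit sources.
    t = np.linspace(0, 2 * np.pi, 200)
    G = np.column_stack([np.sin(t), np.sin(2 * t), np.cos(t)])  # unit-source responses

    true_w = np.array([1.8, 0.0, 0.6])          # "true" slip weights for the event
    obs = G @ true_w + 0.01 * np.random.default_rng(1).normal(size=t.size)

    # Inversion: fit unit-source weights to the deep-ocean observation.
    w, *_ = np.linalg.lstsq(G, obs, rcond=None)
    forecast = G @ w      # refined source feeds the coastal forecast models
    print(np.round(w, 2))
    ```

    Before any buoy reports arrive, a placeholder forecast can be issued from seismically estimated weights alone, which is exactly the value of the pre-computed database the abstract explores.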

  13. Consumption of Energy in New York State: 1972 (with Estimates for 1973).

    ERIC Educational Resources Information Center

    Hausgaard, Olaf

    This report contains tabular data on energy consumption for the calendar year 1972 and a forecast of natural gas requirements for the period 1973 to 1976. Broad sector categories used in the tables are electric utilities, residential, commercial, industrial, and transportation. Tables show energy consumption by primary source and major sector for…

  14. 24 CFR 3400.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... participate in a nationwide mortgage licensing system and registry database of residential mortgage loan... charged with establishing and maintaining a licensing and registry database for loan originators. (b.... Subpart D provides minimum requirements for the administration of the Nationwide Mortgage Licensing System...

  15. 24 CFR 3400.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... participate in a nationwide mortgage licensing system and registry database of residential mortgage loan... charged with establishing and maintaining a licensing and registry database for loan originators. (b.... Subpart D provides minimum requirements for the administration of the Nationwide Mortgage Licensing System...

  16. The Eruption Forecasting Information System (EFIS) database project

    NASA Astrophysics Data System (ADS)

    Ogburn, Sarah; Harpel, Chris; Pesicek, Jeremy; Wellik, Jay; Pallister, John; Wright, Heather

    2016-04-01

    The Eruption Forecasting Information System (EFIS) project is a new initiative of the U.S. Geological Survey-USAID Volcano Disaster Assistance Program (VDAP) with the goal of enhancing VDAP's ability to forecast the outcome of volcanic unrest. The EFIS project seeks to: (1) move away from relying on collective memory toward probability estimation using databases; (2) create databases useful for pattern recognition and for answering common VDAP questions, e.g., How commonly does unrest lead to eruption? How commonly do phreatic eruptions portend magmatic eruptions, and what is the range of antecedence times?; (3) create generic probabilistic event trees using global data for different volcano 'types'; (4) create background, volcano-specific probabilistic event trees for frequently active or particularly hazardous volcanoes in advance of a crisis; and (5) quantify and communicate uncertainty in probabilities. A major component of the project is the global EFIS relational database, which contains multiple modules designed to aid in the construction of probabilistic event trees and to answer common questions that arise during volcanic crises. The primary module contains chronologies of volcanic unrest, including the timing of phreatic eruptions, column heights, eruptive products, etc., and will be initially populated using chronicles of eruptive activity from Alaskan volcanic eruptions in the GeoDIVA database (Cameron et al. 2013). This database module allows us to query across other global databases such as the WOVOdat database of monitoring data and the Smithsonian Institution's Global Volcanism Program (GVP) database of eruptive histories and volcano information. The EFIS database is in the early stages of development and population; thus, this contribution also serves as a request for feedback from the community.
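
    Questions like "how commonly does unrest lead to eruption?" reduce to conditional probabilities estimated from database counts. A minimal sketch, with hypothetical counts and a uniform Beta prior so that the uncertainty the project aims to quantify can be reported alongside the point estimate:

    ```python
    import numpy as np

    def unrest_to_eruption_prob(n_unrest, n_eruption, n_samples=100_000, seed=0):
        """Estimate P(eruption | unrest) from database counts with a
        Beta(1, 1) prior, returning a posterior mean and a 90% credible
        interval -- the kind of event-tree node an EFIS-style query feeds."""
        rng = np.random.default_rng(seed)
        samples = rng.beta(1 + n_eruption, 1 + n_unrest - n_eruption, n_samples)
        lo, hi = np.percentile(samples, [5, 95])
        return (1 + n_eruption) / (2 + n_unrest), (lo, hi)

    # Hypothetical counts: 60 unrest episodes in the database, 21 ended in eruption.
    p, (lo, hi) = unrest_to_eruption_prob(60, 21)
    print(round(p, 3))          # posterior mean of P(eruption | unrest)
    ```

    Reporting the interval alongside the point probability addresses goal (5), communicating uncertainty, rather than quoting a single branch probability.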

  17. 12 CFR 1008.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... participate in a nationwide mortgage licensing system and registry database of residential mortgage loan... requirements, the Bureau is charged with establishing and maintaining a licensing and registry database for... administration of the Nationwide Mortgage Licensing System and Registry. (5) Subpart E clarifies the Bureau's...

  18. 12 CFR 1008.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... participate in a nationwide mortgage licensing system and registry database of residential mortgage loan... requirements, the Bureau is charged with establishing and maintaining a licensing and registry database for... administration of the Nationwide Mortgage Licensing System and Registry. (5) Subpart E clarifies the Bureau's...

  19. 12 CFR 1008.1 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... participate in a nationwide mortgage licensing system and registry database of residential mortgage loan... requirements, the Bureau is charged with establishing and maintaining a licensing and registry database for... administration of the Nationwide Mortgage Licensing System and Registry. (5) Subpart E clarifies the Bureau's...

  20. 77 FR 39985 - Information Collection; Forest Industries and Residential Fuelwood and Post Data Collection Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ...-pulp or composite panel, primary wood-using mills, including small, part-time mills, as well as large... 1978 require the Forest Service to evaluate trends in the use of logs and wood chips, to forecast anticipated levels of logs and wood chips, and to analyze changes in the harvest of these resources from the...

  1. Release of ToxCastDB and ExpoCastDB databases

    EPA Science Inventory

    EPA has released two databases - the Toxicity Forecaster database (ToxCastDB) and a database of chemical exposure studies (ExpoCastDB) - that scientists and the public can use to access chemical toxicity and exposure data. ToxCastDB users can search and download data from over 50...

  2. Mission and Assets Database

    NASA Technical Reports Server (NTRS)

    Baldwin, John; Zendejas, Silvino; Gutheinz, Sandy; Borden, Chester; Wang, Yeou-Fang

    2009-01-01

    Mission and Assets Database (MADB) Version 1.0 is an SQL database system with a Web user interface to centralize information. The database stores flight project support resource requirements, view periods, antenna information, schedule, and forecast results for use in mid-range and long-term planning of Deep Space Network (DSN) assets.
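
    A database of this shape can be sketched with an in-memory SQL store; the table and column names below are invented for illustration, not MADB's actual schema:

    ```python
    import sqlite3

    # Hypothetical, simplified schema for a MADB-like store of DSN planning data.
    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE support_requirements (
        mission TEXT, antenna TEXT, week TEXT, hours_requested REAL)""")
    con.execute("""CREATE TABLE view_periods (
        mission TEXT, antenna TEXT, rise_utc TEXT, set_utc TEXT)""")

    con.execute("INSERT INTO support_requirements VALUES ('MRO', 'DSS-14', '2009-W30', 12.5)")
    con.execute("INSERT INTO support_requirements VALUES ('MRO', 'DSS-43', '2009-W30', 8.0)")

    # A mid-range planning query: total antenna hours requested by one mission.
    total, = con.execute(
        "SELECT SUM(hours_requested) FROM support_requirements WHERE mission = 'MRO'"
    ).fetchone()
    print(total)        # 20.5
    ```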

  3. Housing price gradient and immigrant population: Data from the Italian real estate market.

    PubMed

    Antoniucci, Valentina; Marella, Giuliano

    2018-02-01

    The database presented here was collected by Antoniucci and Marella to analyze the correlation between the housing price gradient and the immigrant population in Italy during 2016. It may also be useful in other statistical analyses, whether on the real estate market or in other branches of social science. The data sample relates to 112 Italian provincial capitals. It provides accurate information on urban structure, and specifically on urban density. The two most significant variables are original indicators constructed from official data sources: the housing price gradient, i.e., the ratio between average prices in the center and the suburbs of each city; and building density, the average number of housing units per residential building. The housing price gradient is calculated for the two residential sub-markets, new-build and existing units, providing an original and detailed sample of the Italian residential market. Rather than average prices, the housing price gradient helps to identify potential divergences in residential market trends. As well as house prices, two other data clusters are considered: socio-economic variables, which provide a framework for each city in terms of demographic and economic information; and various data on urban structure, which are rarely included in the same database.
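
    The two derived indicators can be sketched in a few lines; the prices and building counts below are hypothetical, not values from the dataset:

    ```python
    def price_gradient(center_price, suburb_price):
        """Ratio of average center price to average suburb price for a city."""
        return center_price / suburb_price

    def building_density(housing_units, residential_buildings):
        """Average number of housing units per residential building."""
        return housing_units / residential_buildings

    # Hypothetical EUR/m^2 prices for one provincial capital, per sub-market.
    grad_new = price_gradient(3400.0, 2000.0)       # new-build sub-market
    grad_existing = price_gradient(2600.0, 1600.0)  # existing-unit sub-market
    print(grad_new, grad_existing)                  # 1.7 1.625
    print(round(building_density(128_000, 31_000), 2))
    ```

    A gradient well above 1 in one sub-market but not the other is the kind of divergence between residential market trends that the ratio is designed to expose, which raw average prices would hide.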

  4. The analysis of Taiwan's residential electricity demand under the electricity tariff policy

    NASA Astrophysics Data System (ADS)

    Chen, Po-Jui

    In October 2013, the Taiwan Power Company (Taipower), the state-owned monopoly utility in Taiwan, implemented an electricity tariff adjustment policy to reduce residential electricity demand. Using bi-monthly billing data from 6,932 electricity consumers, this study examines how consumers respond to an increase in electricity prices. The study employs an empirical approach that takes advantage of quasi-random variation over a period when household bills were affected by a change in electricity price. It finds that the price increase caused a 1.78% decline in residential electricity consumption, implying a price elasticity of -0.19 for summer-season months and -0.15 for non-summer-season months. The demand for electricity is therefore relatively inelastic, likely because it is hard for people to change their electricity consumption behavior in the short term. The results highlight that demand-side management cannot be the only lever used to address Taiwan's forecast decrease in electricity supply.
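
    The elasticity figures above follow from the usual definition: the percentage change in quantity divided by the percentage change in price. A sketch with the study's 1.78% consumption decline and an assumed price rise of about 9.4% (the abstract does not report the tariff increase itself):

    ```python
    def price_elasticity(pct_change_quantity, pct_change_price):
        """Elasticity as %ΔQ / %ΔP."""
        return pct_change_quantity / pct_change_price

    # 1.78% consumption drop against a hypothetical ~9.4% price rise
    # reproduces an elasticity near the study's summer-season -0.19.
    print(round(price_elasticity(-1.78, 9.4), 2))   # -0.19
    ```

    Any elasticity with magnitude below 1 means demand is inelastic: a given percentage price rise produces a smaller percentage drop in consumption.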

  5. Energy performance standards for new buildings: Economic analysis

    NASA Astrophysics Data System (ADS)

    1980-01-01

    The major economic impacts of the implementations of the standards on affected groups were assessed and the effectiveness of the standards as an investment in energy conservation was evaluated. The methodology used to evaluate the standards for the various building types and perspectives is described. The net economic effect of changes in building cost and energy use are discussed for three categories of buildings: single family residential, commercial and multifamily residential, and mobile homes. Forecasts of energy savings and national costs and benefits both with and without implementation of the standards are presented. The effects of changes in energy consumption and construction of new buildings on the national economy, including such factors as national income, investment, employment, and balance of trade are assessed.

  6. Application research for 4D technology in flood forecasting and evaluation

    NASA Astrophysics Data System (ADS)

    Li, Ziwei; Liu, Yutong; Cao, Hongjie

    1998-08-01

    To monitor regions of China where disastrous floods occur frequently, to meet provincial governments' need for high-accuracy disaster monitoring and evaluation data, and to improve the efficiency of disaster response, a method was developed for flood forecasting and evaluation using satellite and aerial remotely sensed imagery together with ground monitoring data. An effective and practicable flood forecasting and evaluation system was established under the Ninth Five-Year National Key Technologies Programme, with DongTing Lake selected as the test site. The system uses modern digital photogrammetry, remote sensing and GIS technology, so that disastrous floods can be forecast and losses evaluated against a '4D' disaster background database (DEM -- Digital Elevation Model, DOQ -- Digital Orthophoto Quads, DRG -- Digital Raster Graph, DTI -- Digital Thematic Information). The techniques for gathering and building the '4D' disaster environment background database, the application of the '4D' background data to flood forecasting and evaluation, and the experimental results for the DongTing Lake test site are described in detail in this paper.
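
    The DEM-based part of the evaluation reduces, at its core, to comparing a forecast water level with cell elevations to flag inundated area and estimate flood volume. A minimal sketch with an invented 3 × 3 DEM and grid spacing:

    ```python
    import numpy as np

    # Hypothetical DEM tile (metres) and a forecast water level.
    dem = np.array([[31.0, 30.5, 30.2],
                    [30.8, 30.1, 29.7],
                    [30.4, 29.9, 29.5]])
    water_level = 30.3
    cell_area = 25.0 * 25.0                 # 25 m grid cells, m^2

    depth = np.clip(water_level - dem, 0.0, None)   # water depth per cell
    inundated = depth > 0
    print(int(inundated.sum()))             # 5 flooded cells
    print(float(depth.sum() * cell_area))   # flood volume, m^3
    ```

    In a real system the inundated mask would then be intersected with the DTI thematic layers (land use, settlements) to turn flooded area into loss estimates.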

  7. The National Solar Radiation Database (NSRDB)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Manajit; Habte, Aron; Lopez, Anthony

    This presentation provides a high-level overview of the National Solar Radiation Database (NSRDB), including sensing, measurement and forecasting, and discusses observations that are needed for research and product development.

  8. The Use of Artificial Neural Networks for Forecasting the Electric Demand of Stand-Alone Consumers

    NASA Astrophysics Data System (ADS)

    Ivanin, O. A.; Direktor, L. B.

    2018-05-01

    The problem of short-term forecasting of electric power demand of stand-alone consumers (small inhabited localities) situated outside centralized power supply areas is considered. The basic approaches to modeling the electric power demand depending on the forecasting time frame and the problems set, as well as the specific features of such modeling, are described. The advantages and disadvantages of the methods used for the short-term forecast of the electric demand are indicated, and difficulties involved in the solution of the problem are outlined. The basic principles of arranging artificial neural networks are set forth; it is also shown that the proposed method is preferable when the input information necessary for prediction is lacking or incomplete. The selection of the parameters that should be included into the list of the input data for modeling the electric power demand of residential areas using artificial neural networks is validated. The structure of a neural network is proposed for solving the problem of modeling the electric power demand of residential areas. The specific features of generation of the training dataset are outlined. The results of test modeling of daily electric demand curves for some settlements of Kamchatka and Yakutia based on known actual electric demand curves are provided. The reliability of the test modeling has been validated. A high value of the deviation of the modeled curve from the reference curve obtained in one of the four reference calculations is explained. The input data and the predicted power demand curves for the rural settlement of Kuokuiskii Nasleg are provided. The power demand curves were modeled for four characteristic days of the year, and they can be used in the future for designing a power supply system for the settlement. To enhance the accuracy of the method, a series of measures based on specific features of a neural network's functioning are proposed.
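
    The approach described above can be sketched, under heavy simplification, as a small one-hidden-layer network trained by gradient descent on a synthetic daily demand curve. The architecture, inputs and data here are illustrative assumptions, not the authors' network:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    hours = np.arange(24)
    temp = 5 + 3 * np.sin(2 * np.pi * (hours - 14) / 24)    # fake temperature input
    load = 40 + 25 * np.exp(-(hours - 19) ** 2 / 8) + 10 * np.exp(-(hours - 8) ** 2 / 6)

    # Features: time of day and normalized temperature; target: normalized load.
    X = np.column_stack([hours / 24.0, (temp - temp.mean()) / temp.std()])
    y = (load - load.mean()) / load.std()

    W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)       # hidden layer, 8 units
    W2 = rng.normal(0, 0.5, 8);      b2 = 0.0               # linear output

    def forward(X):
        h = np.tanh(X @ W1 + b1)
        return h, h @ W2 + b2

    _, pred0 = forward(X)
    loss0 = ((pred0 - y) ** 2).mean()

    lr = 0.05
    for _ in range(3000):                   # plain full-batch gradient descent
        h, pred = forward(X)
        err = pred - y
        gW2 = h.T @ err / len(y); gb2 = err.mean()
        dh = np.outer(err, W2) * (1 - h ** 2)               # backprop through tanh
        gW1 = X.T @ dh / len(y); gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    _, pred = forward(X)
    loss = ((pred - y) ** 2).mean()
    print(loss < loss0)                     # training reduced the fitting error
    ```

    The same structure, with more inputs (day type, occasion flags, weather) and a training set of measured settlement load curves, is the kind of model the abstract describes.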

  9. 42 CFR 436.407 - Types of acceptable documentary evidence of citizenship.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... perjury by a residential care facility director or administrator on behalf of an institutionalized... State database subsequent changes in eligibility should not require repeating the documentation of... databases to verify that the individual already established citizenship. (6) CMS requires that as a check...

  10. Irish persons with intellectual disability moving from family care to residential accommodation in a period of austerity.

    PubMed

    McConkey, Roy; Kelly, Fionnola; Craig, Sarah; Keogh, Fiona

    2018-02-09

    Ireland has a growing population of adult persons living with family carers, thereby increasing the demand for residential places. Simultaneously, government policy aimed to reprovision residents living in congregated settings, but at a time when funding was curtailed due to the economic crisis. This study examines the movements of people into and among three types of residential options between 2009 and 2014. A cohort of 20,163 persons recorded on the National Intellectual Disability Database in 2009 was identified and tracked to the 2014 database. An estimated 200 persons per annum (approximately 1.6% of those living with families) moved from family care, although the number of places available nationally fell by 9%. Moreover, transfers of existing residents into vacated places tended to exceed those from families. More people will have to continue living with their families, and for longer, if funding for new places remains curtailed. © 2018 John Wiley & Sons Ltd.

  11. Optimization Based Data Mining Approach for Forecasting Real-Time Energy Demand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Omitaomu, Olufemi A; Li, Xueping; Zhou, Shengchao

    The worldwide concern over environmental degradation, increasing pressure on electric utility companies to meet peak energy demand, and the requirement to avoid purchasing power from the real-time energy market are motivating utility companies to explore new approaches for forecasting energy demand. Until now, most approaches for forecasting energy demand have relied on monthly electrical consumption data. The emergence of smart meter data is changing the data space for electric utility companies, creating opportunities to collect and analyze energy consumption data at a much finer temporal resolution of at least 15-minute intervals. While the data granularity provided by smart meters is important, there are still other challenges in forecasting energy demand, including the lack of information about appliance usage and occupant behavior. Consequently, in this paper, we develop an optimization-based data mining approach for forecasting real-time energy demand using smart meter data. The objective of our approach is to develop a robust estimate of energy demand without access to these other building and behavior data. Specifically, the forecasting problem is formulated as a quadratic programming problem and solved using the so-called support vector machine (SVM) technique in an online setting. The parameters of the SVM technique are optimized using a simulated annealing approach. The proposed approach is applied to hourly smart meter data for several residential customers over several days.
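
    The paper's core combination, a forecaster whose hyperparameters are tuned by simulated annealing against held-out data, can be sketched with a simpler stand-in model (ridge regression on synthetic 15-minute data) in place of the SVM; everything below is an illustrative assumption:

    ```python
    import math
    import random
    import numpy as np

    rng = np.random.default_rng(7)
    t = np.arange(96) / 96.0                    # one day at 15-minute steps

    def day(noise):
        """Synthetic smart-meter demand profile for one day."""
        return 1.0 + 0.6 * np.sin(2 * np.pi * (t - 0.3)) + noise * rng.normal(size=t.size)

    X = np.column_stack([np.ones_like(t), np.sin(2*np.pi*t), np.cos(2*np.pi*t),
                         np.sin(4*np.pi*t), np.cos(4*np.pi*t)])
    y_train, y_val = day(0.05), day(0.05)       # training day, validation day

    def val_mse(log_lam):
        """Held-out error of a ridge forecaster with penalty exp(log_lam)."""
        lam = math.exp(log_lam)
        w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y_train)
        return float(((X @ w - y_val) ** 2).mean())

    random.seed(0)
    state, energy = 0.0, val_mse(0.0)
    best, best_e = state, energy
    T = 1.0
    for _ in range(200):                        # simulated annealing loop
        cand = state + random.gauss(0, 1.0)     # perturb the hyperparameter
        e = val_mse(cand)
        if e < energy or random.random() < math.exp((energy - e) / T):
            state, energy = cand, e             # accept (sometimes uphill)
            if e < best_e:
                best, best_e = cand, e
        T *= 0.97                               # geometric cooling schedule
    print(best_e <= val_mse(0.0))               # annealed choice is no worse
    ```

    The annealer's occasional uphill acceptances are what let it escape poor hyperparameter regions, which is why the paper pairs it with the SVM rather than a plain grid search.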

  12. Military, Charter, Unreported Domestic Traffic and General Aviation 1976, 1984, 1992, and 2015 Emission Scenarios

    NASA Technical Reports Server (NTRS)

    Mortlock, Alan; VanAlstyne, Richard

    1998-01-01

    The report describes the development of databases estimating aircraft engine exhaust emissions for the years 1976 and 1984 from global operations of Military, Charter, historic Soviet and Chinese, Unreported Domestic traffic, and General Aviation (GA). These databases were developed under the National Aeronautics and Space Administration's (NASA) Advanced Subsonic Assessment (AST). McDonnell Douglas Corporation (MDC), now part of the Boeing Company, had previously estimated engine exhaust emissions databases for the baseline year of 1992 and a 2015 forecast-year scenario. Since their original creation (Ward, 1994; Metwally, 1995), revised technology algorithms have been developed. Additionally, GA databases have been created, and all past MDC emission inventories have been updated to reflect the new technology algorithms. Revised data (Baughcum, 1996; Baughcum, 1997) for the scheduled inventories have been used in this report to provide a comparison of the total aviation emission forecasts from the various components. Global results for two historic years (1976 and 1984), a baseline year (1992) and a forecast year (2015) are presented. Since engine emissions are directly related to fuel usage, an overview of each inventory component's annual global aviation fuel use is also given in this report.

  13. AIR QUALITY FORECAST DATABASE AND ANALYSIS

    EPA Science Inventory

    In 2003, NOAA and EPA signed a Memorandum of Agreement to collaborate on the design and implementation of a capability to produce daily air quality modeling forecast information for the U.S. NOAA's ETA meteorological model and EPA's Community Multiscale Air Quality (CMAQ) model ...

  14. Air pollutant emissions from Chinese households: A major and underappreciated ambient pollution source.

    PubMed

    Liu, Jun; Mauzerall, Denise L; Chen, Qi; Zhang, Qiang; Song, Yu; Peng, Wei; Klimont, Zbigniew; Qiu, Xinghua; Zhang, Shiqiu; Hu, Min; Lin, Weili; Smith, Kirk R; Zhu, Tong

    2016-07-12

    As part of the 12th Five-Year Plan, the Chinese government has developed air pollution prevention and control plans for key regions with a focus on the power, transport, and industrial sectors. Here, we investigate the contribution of residential emissions to regional air pollution in highly polluted eastern China during the heating season, and find that dramatic improvements in air quality would also result from reductions in residential emissions. We use the Weather Research and Forecasting model coupled with Chemistry to evaluate potential residential emission controls in Beijing and in the Beijing, Tianjin, and Hebei (BTH) region. In January and February 2010, relative to the base case, eliminating residential emissions in Beijing reduced daily average surface PM2.5 (particulate matter with aerodynamic diameter of 2.5 μm or smaller) concentrations by 14 ± 7 μg⋅m⁻³ (22 ± 6% of a baseline concentration of 67 ± 41 μg⋅m⁻³; mean ± SD). Eliminating residential emissions in the BTH region reduced concentrations by 28 ± 19 μg⋅m⁻³ (40 ± 9% of 67 ± 41 μg⋅m⁻³), 44 ± 27 μg⋅m⁻³ (43 ± 10% of 99 ± 54 μg⋅m⁻³), and 25 ± 14 μg⋅m⁻³ (35 ± 8% of 70 ± 35 μg⋅m⁻³) in Beijing, Tianjin, and Hebei provinces, respectively. Annually, elimination of residential sources in the BTH region reduced emissions of primary PM2.5 by 32%, compared with 5%, 6%, and 58% achieved by eliminating emissions from the transportation, power, and industry sectors, respectively. We also find that air quality in Beijing would benefit substantially from reductions in residential emissions through regional controls in Tianjin and Hebei, indicating the value of policies at the regional level.

  15. Forecasting in foodservice: model development, testing, and evaluation.

    PubMed

    Miller, J L; Thompson, P A; Orabella, M M

    1991-05-01

    This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spreadsheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
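
    The two-stage forecast described above can be sketched directly; the customer counts, seasonal index and preference fraction below are invented for illustration:

    ```python
    def ses(series, alpha=0.3):
        """Simple exponential smoothing; returns the final smoothed level,
        which serves as the one-step-ahead forecast."""
        level = series[0]
        for x in series[1:]:
            level = alpha * x + (1 - alpha) * level
        return level

    def mape(actual, forecast):
        """Mean absolute percentage error, one of the evaluation metrics used."""
        return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

    counts = [820, 790, 845, 860, 815, 830, 870]    # daily customer counts
    seasonal_index = 1.05                           # e.g. this weekday runs 5% high

    # Step 1: deseasonalize, smooth, then reseasonalize the count forecast.
    deseason = [c / seasonal_index for c in counts]
    count_forecast = ses(deseason) * seasonal_index

    # Step 2: menu-item demand = count forecast x predicted preference fraction.
    pref_lasagna = 0.22                             # 22% historically choose it
    item_forecast = count_forecast * pref_lasagna
    print(round(count_forecast), round(item_forecast))
    ```

    With the evaluation functions alongside, the same spreadsheet-scale workflow the authors envision for managers fits comfortably on a personal computer.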

  16. 42 CFR 435.407 - Types of acceptable documentary evidence of citizenship.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... under penalty of perjury by a residential care facility director or administrator on behalf of an... documented and recorded in a State database subsequent changes in eligibility should not require repeating.... The State need only check its databases to verify that the individual already established citizenship...

  17. Probabilistic flood warning using grand ensemble weather forecasts

    NASA Astrophysics Data System (ADS)

    He, Y.; Wetterhall, F.; Cloke, H.; Pappenberger, F.; Wilson, M.; Freer, J.; McGregor, G.

    2009-04-01

    As the severity of floods increases, possibly due to climate and landuse change, there is urgent need for more effective and reliable warning systems. The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. An ensemble of weather forecasts from one Ensemble Prediction System (EPS), when used on catchment hydrology, can provide improved early flood warning as some of the uncertainties can be quantified. EPS forecasts from a single weather centre only account for part of the uncertainties originating from initial conditions and stochastic physics. Other sources of uncertainties, including numerical implementations and/or data assimilation, can only be assessed if a grand ensemble of EPSs from different weather centres is used. When various models that produce EPS from different weather centres are aggregated, the probabilistic nature of the ensemble precipitation forecasts can be better retained and accounted for. The availability of twelve global EPSs through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a new opportunity for the design of an improved probabilistic flood forecasting framework. This work presents a case study using the TIGGE database for flood warning on a meso-scale catchment. The upper reach of the River Severn catchment located in the Midlands Region of England is selected due to its abundant data for investigation and its relatively small size (4,062 km²) compared to the resolution of the NWPs.
This choice was deliberate, as we hypothesize that the uncertainty in the forcing of smaller catchments cannot be represented by a single EPS with a very limited number of ensemble members, but only through the variance given by a large number of ensembles and ensemble systems. A coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts is set up to study the potential benefits of using the TIGGE database in early flood warning. The physically based and fully distributed LISFLOOD suite of models is selected to simulate discharge and flood inundation consecutively. The results show the TIGGE database is a promising tool for producing forecasts of discharge and flood inundation comparable with the observed discharge and with the simulated inundation driven by the observed discharge. The spread of discharge forecasts varies from centre to centre, but it is generally large, implying a significant level of uncertainty. Precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial variability of precipitation over a comparatively small catchment. This perhaps indicates the need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. It is not necessarily true that early flood warning becomes more reliable when more ensemble forecasts are employed. It is difficult to identify the best forecast centre(s), but in general the chance of detecting floods is increased by using the TIGGE database. Only one flood event was studied because most of the TIGGE data became available after October 2007. It is necessary to test the TIGGE ensemble forecasts on other flood events in other catchments with different hydrological and climatic regimes before general conclusions can be made about its robustness and applicability.
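
    Turning a grand ensemble into a probabilistic warning amounts to counting the members whose simulated discharge exceeds a warning threshold. A sketch with invented peak-discharge values standing in for TIGGE-driven cascade output:

    ```python
    import numpy as np

    # Hypothetical peak discharge (m^3/s) per ensemble member from two
    # forecast centres, concatenated into a grand ensemble.
    rng = np.random.default_rng(3)
    centre_a = rng.normal(250.0, 60.0, size=51)
    centre_b = rng.normal(270.0, 80.0, size=20)
    peak_q = np.concatenate([centre_a, centre_b])

    threshold = 300.0                           # flood warning level
    p_exceed = float((peak_q > threshold).mean())   # probabilistic warning
    spread = float(peak_q.std())                    # ensemble spread (uncertainty)
    print(round(p_exceed, 2), round(spread, 1))
    ```

    A warning issued when the exceedance probability passes a chosen level trades false alarms against missed events, which is exactly the calibration question the TIGGE case study probes.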

  18. Trends of multiple air pollutants emissions from residential coal combustion in Beijing and its implication on improving air quality for control measures

    NASA Astrophysics Data System (ADS)

    Xue, Yifeng; Zhou, Zhen; Nie, Teng; Wang, Kun; Nie, Lei; Pan, Tao; Wu, Xiaoqing; Tian, Hezhong; Zhong, Lianhong; Li, Jing; Liu, Huanjia; Liu, Shuhan; Shao, Panyang

    2016-10-01

    Residential coal combustion is considered to be an important source of air pollution in Beijing. However, knowledge regarding the emission characteristics of residential coal combustion and the related impacts on air quality is very limited. In this study, we developed an emission inventory for multiple hazardous air pollutants (HAPs) associated with residential coal combustion in Beijing for the period 2000-2012. Furthermore, a widely used regional air quality model, the Community Multi-Scale Air Quality model (CMAQ), is applied to analyze the impact of residential coal combustion on air quality in Beijing in 2012. The results show that emissions of primary air pollutants from residential coal combustion have remained at essentially the same levels during the past decade; however, with strict emission controls imposed on major industrial sources, the contribution of residential coal combustion to overall emissions from anthropogenic sources has increased markedly. In particular, the contributions of residential coal combustion to the total concentrations of PM10, SO2, NOx, and CO are approximately 11.6%, 27.5%, 2.8% and 7.3%, respectively, during the winter heating season. In terms of spatial variation, the distributions of the pollutant concentrations are similar to the distribution of the associated primary HAP emissions, which are highly concentrated in the rural-urban fringe zones and rural suburb areas. In addition, emissions of primary pollutants from residential coal combustion are forecast using a scenario analysis. Generally, comprehensive measures must be taken to control residential coal combustion in Beijing. The best way to reduce the associated emissions is to use economic incentives to promote the conversion to clean energy sources for residential heating and cooking. 
In areas with reliable energy supplies, the coal used for residential heating can be replaced with gas-burning wall-heaters, ground-source heat pumps, solar energy and electricity. In areas with inadequate clean energy sources, low-sulfur coal should be used instead of the traditional raw coal with high sulfur and ash content, thereby slightly reducing the emissions of PM, SO2, CO and other toxic pollutants.
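
    The inventory behind these trends is a bottom-up calculation: fuel consumed multiplied by an emission factor per pollutant. The coal tonnage and factors below are illustrative assumptions, not the paper's Beijing values:

    ```python
    # Bottom-up inventory sketch: emissions = activity x emission factor.
    coal_burned_t = 2_500_000               # residential coal use, tonnes/yr (assumed)

    emission_factors = {                    # kg pollutant per tonne of coal (assumed)
        "PM10": 8.0,
        "SO2": 12.6,
        "NOx": 1.9,
        "CO": 90.0,
    }

    emissions_t = {p: coal_burned_t * ef / 1000.0       # kg -> tonnes
                   for p, ef in emission_factors.items()}
    for pollutant, tons in emissions_t.items():
        print(f"{pollutant}: {tons:,.0f} t/yr")
    ```

    Real inventories of this kind further disaggregate by stove type and coal quality, which is how the low-sulfur-coal substitution scenario in the abstract is quantified.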

  19. The Assessment of Climatological Impacts on Agricultural Production and Residential Energy Demand

    NASA Astrophysics Data System (ADS)

    Cooter, Ellen Jean

    The assessment of climatological impacts on selected economic activities is presented as a multi-step, interdisciplinary problem. The assessment process addressed explicitly in this report focuses on (1) user identification, (2) direct impact model selection, (3) methodological development, (4) product development and (5) product communication. Two user groups of major economic importance were selected for study: agriculture and gas utilities. The broad agricultural sector is narrowed to U.S. corn production; the general category of utilities is narrowed to Oklahoma residential gas heating demand. The CERES physiological growth model was selected as the process model for corn production. The statistical analysis for corn production suggests that (1) although this is a statistically complex model, it can yield useful impact information, (2) as a result of output distributional biases, traditional statistical techniques are not adequate analytical tools, (3) the model yield distribution as a whole is probably non-Gaussian, particularly in the tails, and (4) there appear to be identifiable weekly patterns of forecasted yields throughout the growing season. Agricultural products developed include point yield impact estimates and distributional characteristics, geographic corn-weather distributions, return period estimates, decision-making criteria (confidence limits) and time series of indices. These products were communicated in economic terms through a Bayesian decision example and an econometric model. The NBSLD energy load model was selected to represent residential gas heating consumption. A cursory statistical analysis suggests relationships among weather variables across the Oklahoma study sites. No linear trend was detected in "technology-free" modeled energy demand or in the input weather variables that would correspond to the trend contained in observed state-level residential energy use. 
It is suggested that this trend is largely the result of non-weather factors such as population and home usage patterns rather than regional climate change. Year-to-year changes in modeled residential heating demand on the order of 10^6 Btu per household were determined and later related to state-level components of the Oklahoma economy. Products developed include the definition of regional forecast areas, likelihood estimates of extreme seasonal conditions and an energy/climate index. This information is communicated in economic terms through an input/output model used to estimate changes in Gross State Product and household income attributable to weather variability.

  20. Forecasting the impact of transport improvements on commuting and residential choice

    NASA Astrophysics Data System (ADS)

    Elhorst, J. Paul; Oosterhaven, Jan

    2006-03-01

    This paper develops a probabilistic, competing-destinations assignment model that predicts changes in the spatial pattern of the working population as a result of transport improvements. The choice of residence is explained by a new non-parametric model, which represents an alternative to the popular multinomial logit model. Travel times between zones are approximated by a normal distribution function with a different mean and variance for each pair of zones, whereas previous models use only average travel times. The model's forecast error for the spatial distribution of the Dutch working population is 7% when tested on 1998 base-year data. To incorporate endogenous changes in its causal variables, an almost ideal demand system is estimated to explain the choice of transport mode, and a new economic geography inter-industry model (RAEM) is estimated to explain the spatial distribution of employment. In the application, the model is used to forecast the impact of six mutually exclusive Dutch core-periphery railway proposals in the projection year 2020.
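
The distinction the abstract draws between average travel times and a per-pair normal distribution can be illustrated with a toy destination-choice sketch. The deterrence function, its parameter, and the zone data below are all assumptions for illustration, not the paper's specification; the point is that a high-variance zone receives a different choice probability than a same-mean, low-variance zone.

```python
# Illustrative sketch (not the paper's model): destination-choice
# probabilities where inter-zonal travel time is N(mu, sigma) per zone
# pair, combined with an exponential deterrence function exp(-beta * t).
import math
import random

random.seed(0)
beta = 0.08  # deterrence parameter, illustrative

# (mean, std dev) of travel time in minutes for each destination zone
zones = {"A": (20.0, 3.0), "B": (35.0, 10.0), "C": (35.0, 1.0)}

def expected_deterrence(mu, sigma, draws=20000):
    # Monte Carlo mean of exp(-beta * t) with t ~ N(mu, sigma), floored at 0
    total = 0.0
    for _ in range(draws):
        t = max(0.0, random.gauss(mu, sigma))
        total += math.exp(-beta * t)
    return total / draws

weights = {z: expected_deterrence(mu, s) for z, (mu, s) in zones.items()}
denom = sum(weights.values())
probs = {z: w / denom for z, w in weights.items()}
print(probs)
```

Because exp(-beta*t) is convex, zone B (same mean as C but larger variance) ends up with a higher expected deterrence weight than C, which is exactly the effect that collapses to nothing if only average travel times are used.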

  1. Intelligent demand side management of residential building energy systems

    NASA Astrophysics Data System (ADS)

    Sinha, Maruti N.

    The advent of modern sensing technologies and data-processing capabilities, together with the rising cost of energy, is driving the implementation of intelligent systems in buildings and houses, which constitute 41% of total energy consumption. The primary motivation has been to provide a framework for demand-side management and to improve overall reliability. The entire formulation is to be implemented on a NILM (Non-Intrusive Load Monitoring) system, a smart meter, which is going to play a vital role in the future of demand-side management. Utilities throughout the world have started deploying smart meters, which will essentially help establish communication between utility and consumers. This research is focused on investigating a suitable thermal model of a residential house, building up the control system, and developing diagnostic and energy usage forecast tools. The present work follows a measurement-based approach. Identification of building thermal parameters is the very first step towards developing performance measurement and controls. The proposed identification technique is a PEM-based (Prediction Error Method) discrete state-space model. Two different models have been devised. The first model is focused on energy usage forecasting and diagnostics; here a novel idea is investigated that uses the integral of thermal capacity to identify the thermal model of the house. The purpose of the second identification is to build up a model for the control strategy. The controller should be able to take into account weather forecast information, deal with operating-point constraints, and at the same time minimize energy consumption. To design an optimal controller, an MPC (Model Predictive Control) scheme, a receding-horizon approach, has been implemented instead of the present thermostatic/hysteretic control. The capability of the proposed schemes has also been investigated.
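
The identification step described above can be sketched with a toy first-order thermal model. This is not the thesis' PEM formulation: it is a minimal least-squares identification of an assumed discrete model T[k+1] = a*T[k] + b*q[k] + c*Tout[k] from simulated data, with all parameter values invented for illustration.

```python
# Hedged sketch of thermal-parameter identification, assuming a
# first-order discrete model; true parameters below are illustrative.
import numpy as np

rng = np.random.default_rng(1)
a_true, b_true, c_true = 0.90, 0.30, 0.10  # discrete-time parameters

n = 500
q = rng.uniform(0.0, 5.0, n)                     # heater input (kW)
tout = 5.0 + 5.0 * np.sin(np.arange(n) / 24.0)   # outdoor temperature (C)
T = np.empty(n + 1)
T[0] = 20.0
for k in range(n):
    T[k + 1] = (a_true * T[k] + b_true * q[k] + c_true * tout[k]
                + rng.normal(0.0, 0.05))         # small noise term

# Least-squares identification: regressors are [T[k], q[k], tout[k]]
X = np.column_stack([T[:-1], q, tout])
theta, *_ = np.linalg.lstsq(X, T[1:], rcond=None)
a_hat, b_hat, c_hat = theta
print(a_hat, b_hat, c_hat)
```

A PEM estimator generalizes this idea to higher-order state-space models with explicit noise models; the recovered (a, b, c) here are the kind of parameters an MPC controller would then use for prediction over its receding horizon.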

  2. Establishing an Environmental Scanning/Forecasting System to Augment College and University Planning.

    ERIC Educational Resources Information Center

    Morrison, James L.

    1987-01-01

    The major benefit of an environmental scanning/forecasting system is in providing critical information for strategic planning. Such a system allows the institution to detect social, technological, economic, and political trends and potential events. The environmental scanning database developed by United Way of America is described. (MLW)

  3. Major Risks, Uncertain Outcomes: Making Ensemble Forecasts Work for Multiple Audiences

    NASA Astrophysics Data System (ADS)

    Semmens, K. A.; Montz, B.; Carr, R. H.; Maxfield, K.; Ahnert, P.; Shedd, R.; Elliott, J.

    2017-12-01

    When extreme river levels are possible in a community, effective communication of weather and hydrologic forecasts is critical to protect life and property. Residents, emergency personnel, and water resource managers need to make timely decisions about how and when to prepare. Uncertainty in forecasting is a critical component of this decision-making, but often poses a confounding factor for public and professional understanding of forecast products. In 2016 and 2017, building on previous research about the use of uncertainty forecast products, and with funding from NOAA's CSTAR program, East Carolina University and Nurture Nature Center (a non-profit organization with a focus on flooding issues, based in Easton, PA) conducted a research project to understand how various audiences use and interpret ensemble forecasts showing a range of hydrologic forecast possibilities. These audiences include community residents, emergency managers and water resource managers. The research team held focus groups in Jefferson County, WV and Frederick County, MD, to test a new suite of products from the National Weather Service's Hydrologic Ensemble Forecast System (HEFS). HEFS is an ensemble system that provides short- and long-range forecasts, ranging from 6 hours to 1 year, showing uncertainty in hydrologic forecasts. The goal of the study was to assess the utility of the HEFS products, identify the barriers to proper understanding of the products, and suggest modifications to product design that could improve understandability and accessibility for residents, emergency managers, and water resource managers. The research team worked with the Sterling, VA Weather Forecast Office and the Middle Atlantic River Forecast Center to develop a weather scenario as the basis of the focus group discussions, which also included pre- and post-session surveys. 
This presentation shares the findings from those focus group discussions and surveys, including recommendations for revisions to HEFS products to improve accessibility of the forecast tools for various audiences. The presentation will provide a broad perspective on the range of graphic design considerations that affected how the public responded to products and will provide an overview of lessons learned about how product design can influence decision-making by users.

  4. Development of Hydrometeorological Monitoring and Forecasting as AN Essential Component of the Early Flood Warning System:

    NASA Astrophysics Data System (ADS)

    Manukalo, V.

    2012-12-01

    Defining the issue: River inundations are the most common and destructive natural hazards in Ukraine. Among non-structural flood management and protection measures, the creation of an Early Flood Warning System is extremely important for timely recognition of dangerous situations in flood-prone areas. Hydrometeorological information and forecasts are of core importance in this system. The primary factors affecting the reliability and lead time of forecasts include the accuracy, speed and reliability with which real-time data are collected. The previously fragmented conception of monitoring and forecasting created a need to reconsider it as an integrated monitoring and forecasting approach: from "sensors to database and forecasters". Result presentation: The project "Development of Flood Monitoring and Forecasting in the Ukrainian Part of the Dniester River Basin" is presented. The project is developed by the Ukrainian Hydrometeorological Service in conjunction with the Water Management Agency and the energy company "Ukrhydroenergo". Its implementation is funded by the Ukrainian Government and the World Bank. The author is the nominated responsible person for coordinating the activity of the organizations involved in the project. Term of project implementation: 2012-2014. 
The principal objectives of the project are: a) designing an integrated automatic hydrometeorological measurement network (including the use of remote sensing technologies); b) constructing a hydrometeorological GIS database coupled with electronic maps for flood risk assessment; c) building interfaces between the classic numerical database, the GIS, satellite imagery, and radar data collection; d) providing real-time data dissemination from observation points to forecasting centers; e) developing hydrometeorological forecasting methods; f) providing flood hazard risk assessment at different temporal and spatial scales; g) providing automatic dissemination of current information, forecasts and warnings to consumers. Besides scientific and technical issues, the implementation of these objectives requires the solution of a number of organizational issues. Thus, as a result of the increased complexity of the types of hydrometeorological data, and in order to develop forecasting methods, the meteorological and hydrological measurement networks should be reconsidered. An "optimal density of measuring networks" is proposed based on two principal criteria: a) minimizing the uncertainty in characterizing the spatial distribution of hydrometeorological parameters; b) minimizing the total life-cycle cost of creating and maintaining the measurement networks. Much attention will be given to training Ukrainian disaster management authorities from the Ministry of Emergencies and the Water Management Agency to identify the flood hazard risk level and to indicate the best protection measures on the basis of continuous monitoring and forecasts of the evolution of meteorological and hydrological conditions in the river basin.

  5. Short-Term Energy Outlook Model Documentation: Natural Gas Consumption and Prices

    EIA Publications

    2015-01-01

    The natural gas consumption and price modules of the Short-Term Energy Outlook (STEO) model are designed to provide consumption and end-use retail price forecasts for the residential, commercial, and industrial sectors in the nine Census districts and natural gas working inventories in three regions. Natural gas consumption shares and prices in each Census district are used to calculate an average U.S. retail price for each end-use sector.
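
The aggregation step the record describes (district shares rolled up to a U.S. average) is a consumption-weighted mean. The district figures below are invented for illustration; only the weighting scheme reflects the description above.

```python
# Minimal sketch of a consumption-weighted average retail price.
# District consumption and price values are illustrative placeholders.
districts = {
    # district: (residential consumption in Bcf, price in $/Mcf)
    "New England":        (80.0, 14.2),
    "Middle Atlantic":    (300.0, 11.8),
    "East North Central": (520.0, 9.1),
}

total_cons = sum(c for c, _ in districts.values())
us_price = sum(c * p for c, p in districts.values()) / total_cons
print(f"U.S. average residential price: ${us_price:.2f}/Mcf")
```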

  6. The Impact of Residential Combustion Emissions on Air Quality and Human Health in China

    NASA Astrophysics Data System (ADS)

    Archer-Nicholls, S.; Wiedinmyer, C.; Baumgartner, J.; Brauer, M.; Cohen, A.; Carter, E.; Frostad, J.; Forouzanfar, M.; Xiao, Q.; Liu, Y.; Yang, X.; Hongjiang, N.; Kun, N.

    2015-12-01

    Solid fuel cookstoves are used heavily in rural China for both residential cooking and heating purposes. Their use contributes significantly to regional emissions of several key pollutants, including carbon monoxide, volatile organic compounds, oxides of nitrogen, and aerosol particles. The residential sector was responsible for approximately 36%, 46% and 81% of China's total primary PM2.5, BC and OC emissions respectively in 2005 (Lei et al., 2011). These emissions have serious consequences for household air pollution, ambient air quality, tropospheric ozone formation, and the resulting population health and climate impacts. This paper presents initial findings from the modeling component of a multi-disciplinary energy intervention study currently being conducted in Sichuan, China. The purpose of this effort is to quantify the impact of residential cooking and heating emissions on regional air quality and human health. Simulations with varying levels of residential emissions have been carried out for the whole of 2014 using the Weather Research and Forecasting model with Chemistry (WRF-Chem), a fully-coupled, "online" regional chemical transport model. Model output is evaluated against surface air quality measurements across China and compared with seasonal (winter and summer) ambient air pollution measurements conducted at the Sichuan study site in 2014. The model output is applied to available exposure—response relationships between PM2.5 and cardiopulmonary health outcomes. The sensitivity in different regions across China to the different cookstove emission scenarios and seasonality of impacts are presented. By estimating the mortality and disease burden risk attributable to residential emissions we demonstrate the potential benefits from large-scale energy interventions. Lei Y, Zhang Q, He KB, Streets DG. 2011. Primary anthropogenic aerosol emission trends for China, 1990-2005. Atmos. Chem. Phys. 11:931-954.

  7. Coherent mortality forecasts for a group of populations: An extension of the Lee-Carter method

    PubMed Central

    Li, Nan; Lee, Ronald

    2005-01-01

    Mortality patterns and trajectories in closely related populations are likely to be similar in some respects, and differences are unlikely to increase in the long run. It should therefore be possible to improve the mortality forecasts for individual countries by taking into account the patterns in a larger group. Using the Human Mortality Database, we apply the Lee-Carter model to a group of populations, allowing each its own age pattern and level of mortality but imposing shared rates of change by age. Our forecasts also allow divergent patterns to continue for a while before tapering off. We forecast greater longevity gains for the US and lesser ones for Japan relative to separate forecasts. PMID:16235614
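
The single-population Lee-Carter fit that this record extends can be sketched via a singular value decomposition: log m(x,t) = a_x + b_x * k_t, where a_x is the age-specific mean and (b_x, k_t) come from the first SVD component of the centered log-rate matrix. The data below are synthetic; the coherent extension described in the abstract additionally shares b_x and the drift of k_t across populations.

```python
# Sketch of the standard Lee-Carter fit (synthetic data, illustration only).
import numpy as np

rng = np.random.default_rng(0)
ages, years = 10, 30
# synthetic log death rates: level + age pattern * time trend + noise
a_true = np.linspace(-6.0, -1.0, ages)
b_true = np.full(ages, 1.0 / ages)
k_true = np.linspace(3.0, -3.0, years)
logm = a_true[:, None] + np.outer(b_true, k_true) + rng.normal(0, 0.01, (ages, years))

# Lee-Carter: log m(x,t) = a_x + b_x * k_t
a_x = logm.mean(axis=1)                        # age-specific average level
U, s, Vt = np.linalg.svd(logm - a_x[:, None])  # first component of residual
b_x = U[:, 0] / U[:, 0].sum()                  # normalisation: sum(b_x) = 1
k_t = s[0] * Vt[0] * U[:, 0].sum()             # rescale so b_x * k_t unchanged
k_t -= k_t.mean()                              # identifiability: mean(k_t) = 0
recon = a_x[:, None] + np.outer(b_x, k_t)
print(np.abs(recon - logm).max())
```

In the forecasting step, k_t is typically extrapolated as a random walk with drift; the coherent version forecasts a common k_t for the group so that country-specific differences taper off rather than diverge.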

  8. TRAVEL FORECASTER

    NASA Technical Reports Server (NTRS)

    Mauldin, L. E.

    1994-01-01

    Business travel planning within an organization is often a time-consuming task. Travel Forecaster is a menu-driven, easy-to-use program which plans, forecasts cost, and tracks actual vs. planned cost for business-related travel of a division or branch of an organization and compiles this information into a database to aid the travel planner. The program's ability to handle multiple trip entries makes it a valuable time-saving device. Travel Forecaster takes full advantage of relational database properties so that information that remains constant, such as per diem rates and airline fares (which are unique for each city), needs entering only once. A typical entry would include selection with the mouse of the traveler's name and destination city from pop-up lists, and typed entries for number of travel days and purpose of the trip. Multiple persons can be selected from the pop-up lists, and multiple trips are accommodated by entering the number of days by each appropriate month on the entry form. An estimated travel cost is not required of the user, as it is calculated by a Fourth Dimension formula. With this information, the program can produce output of trips by month with subtotal and total cost for either an organization or a sub-entity of an organization, or produce output of trips by month with subtotal and total cost for international-only travel. It will also provide monthly and cumulative formats of planned vs. actual outputs in data or graph form. Travel Forecaster users can run custom queries to search and sort information in the database, and can create custom reports with the user-friendly report generator. Travel Forecaster 1.1 is a database program for use with Fourth Dimension Runtime 2.1.1. It requires a Macintosh Plus running System 6.0.3 or later, 2Mb of RAM and a hard disk. The standard distribution medium for this package is one 3.5 inch 800K Macintosh format diskette. Travel Forecaster was developed in 1991. 
Macintosh is a registered trademark of Apple Computer, Inc. Fourth Dimension is a registered trademark of Acius, Inc.

  9. A Study of Rapidly Developing Low Cloud Ceilings in a Stable Atmosphere at the Florida Spaceport

    NASA Technical Reports Server (NTRS)

    Wheeler, Mark M.; Case, Jonathan L.; Baggett, G. Wayne

    2006-01-01

    Forecasters at the Space Meteorology Group (SMG) issue 30 to 90 minute forecasts for low cloud ceilings at the Shuttle Landing Facility (KTTS) in Kennedy Space Center, FL for all Space Shuttle missions. Mission verification statistics have shown cloud ceilings to be the biggest forecast challenge. SMG forecasters are especially concerned with rapidly developing cloud ceilings below 8000 ft in a stable, capped thermodynamic environment, because ceilings below 8000 ft restrict Shuttle landing operations and are the most challenging to predict accurately. This project involves the development of a database of these cases over east-central Florida in order to identify the onset, location, and if possible, dissipation times of rapidly developing low cloud ceilings. Another goal is to document the atmospheric regimes favoring this type of cloud development to improve forecast skill for such events during Space Shuttle launch and landing operations. A 10-year database of stable, rapid low cloud development days during the daylight hours was compiled for the Florida cool-season months by examining the Cape Canaveral Air Force Station sounding data and identifying days that had high boundary layer relative humidity associated with a thermally capped environment below 8000 ft. Archived hourly surface observations from KTTS and Melbourne, Orlando, Sanford, and Ocala, FL were then examined for the onset of cloud ceilings below 8000 ft between 1100 and 2000 UTC. Once the database was supplemented with the hourly surface cloud observations, visible satellite imagery was examined in 30-minute intervals to confirm event occurrences. This paper will present results from some of the rapidly developing cloud ceiling cases and the prevailing meteorological conditions associated with these events, focusing on potential precursor information that may help improve their prediction.

  10. Applicability of land use models for the Houston area test site

    NASA Technical Reports Server (NTRS)

    Petersburg, R. K.; Bradford, L. H.

    1973-01-01

    Descriptions of land use models are presented which were considered for their applicability to the Houston Area Test Site. These models are representative both of the prevailing theories of land use dynamics and of basic approaches to simulation. The models considered are: a model of metropolis, a land use simulation model, the EMPIRIC land use forecasting model, a probabilistic model for residential growth, and the regional environmental management allocation process. Sources of environmental/resource information are listed.

  11. Emission Sectoral Contributions of Foreign Emissions to Particulate Matter Concentrations over South Korea

    NASA Astrophysics Data System (ADS)

    Kim, E.; Kim, S.; Kim, H. C.; Kim, B. U.; Cho, J. H.; Woo, J. H.

    2017-12-01

    In this study, we investigated the contributions of major emission source categories located upwind of South Korea to particulate matter (PM) in South Korea. In general, air quality in South Korea is affected by anthropogenic air pollutants emitted from foreign countries, including China. Some studies reported that foreign emissions contributed 50% of annual surface total PM mass concentrations in the Seoul Metropolitan Area, South Korea, in 2014. Previous studies examined the PM contributions of foreign emissions from all sectors considering meteorological variations; however, few studies have assessed the contributions of specific foreign source categories. Therefore, we attempted to estimate the sectoral contributions of foreign emissions from China to South Korea PM using our air quality forecasting system. We used the Model Inter-Comparison Study in Asia 2010 inventory for foreign emissions and the Clean Air Policy Support System 2010 emission inventory for domestic emissions. To quantify the contributions of major emission sectors to South Korea PM, we applied the Community Multi-scale Air Quality system with the brute-force method, perturbing emissions from the industrial, residential, fossil-fuel power plant, transportation, and agriculture sectors in China. We noted that the industrial sector was predominant over the region, except for primary PM during the cold season, when residential emissions increase drastically due to heating demand. This study will benefit ensemble air quality forecasting and refined control strategy design by providing a quantitative assessment of the seasonal contributions of foreign emissions from major source categories.
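
The brute-force (zero-out) attribution named in this record reduces to a simple difference: a sector's contribution is the base-case concentration minus the concentration from a run with that sector's emissions removed. The concentrations below are invented placeholders, not CMAQ output.

```python
# Illustrative sketch of brute-force sectoral attribution.
# Values are placeholders for base and zero-out model runs (ug/m3).
base_pm25 = 60.0
zero_out_runs = {"industry": 42.0, "residential": 48.0, "power": 55.0}

contrib = {s: base_pm25 - c for s, c in zero_out_runs.items()}
share = {s: 100.0 * v / base_pm25 for s, v in contrib.items()}
for sector in sorted(share, key=share.get, reverse=True):
    print(f"{sector}: {contrib[sector]:.1f} ug/m3 ({share[sector]:.0f}%)")
```

One known caveat of the method is that, because atmospheric chemistry is nonlinear, sectoral contributions computed this way need not sum exactly to the base concentration.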

  12. Air pollutant emissions from Chinese households: A major and underappreciated ambient pollution source

    PubMed Central

    Liu, Jun; Mauzerall, Denise L.; Chen, Qi; Zhang, Qiang; Song, Yu; Peng, Wei; Klimont, Zbigniew; Qiu, Xinghua; Zhang, Shiqiu; Hu, Min; Lin, Weili; Smith, Kirk R.; Zhu, Tong

    2016-01-01

    As part of the 12th Five-Year Plan, the Chinese government has developed air pollution prevention and control plans for key regions with a focus on the power, transport, and industrial sectors. Here, we investigate the contribution of residential emissions to regional air pollution in highly polluted eastern China during the heating season, and find that dramatic improvements in air quality would also result from reduction in residential emissions. We use the Weather Research and Forecasting model coupled with Chemistry to evaluate potential residential emission controls in Beijing and in the Beijing, Tianjin, and Hebei (BTH) region. In January and February 2010, relative to the base case, eliminating residential emissions in Beijing reduced daily average surface PM2.5 (particulate matter with aerodynamic diameter equal to or smaller than 2.5 micrometers) concentrations by 14 ± 7 μg⋅m−3 (22 ± 6% of a baseline concentration of 67 ± 41 μg⋅m−3; mean ± SD). Eliminating residential emissions in the BTH region reduced concentrations by 28 ± 19 μg⋅m−3 (40 ± 9% of 67 ± 41 μg⋅m−3), 44 ± 27 μg⋅m−3 (43 ± 10% of 99 ± 54 μg⋅m−3), and 25 ± 14 μg⋅m−3 (35 ± 8% of 70 ± 35 μg⋅m−3) in Beijing, Tianjin, and Hebei provinces, respectively. Annually, elimination of residential sources in the BTH region reduced emissions of primary PM2.5 by 32%, compared with 5%, 6%, and 58% achieved by eliminating emissions from the transportation, power, and industry sectors, respectively. We also find air quality in Beijing would benefit substantially from reductions in residential emissions from regional controls in Tianjin and Hebei, indicating the value of policies at the regional level. PMID:27354524

  13. Tsunami early warning in the Mediterranean: role, structure and tricks of pre-computed tsunami simulation databases and matching/forecasting algorithms

    NASA Astrophysics Data System (ADS)

    Armigliato, Alberto; Pagnoni, Gianluca; Tinti, Stefano

    2014-05-01

    The general idea that pre-computed simulated scenario databases can play a key role in conceiving tsunami early warning systems is by now commonly accepted. But it was only in the last decade that it started to be applied to the Mediterranean region, gaining particular impetus from initiatives like the GDACS and from recently concluded EU-funded projects such as TRIDEC and NearToWarn. With reference to these two projects, and with the possibility of further developing this research line in the frame of the FP7 ASTARTE project, we discuss results we obtained on two major topics, namely the strategies applicable to building the tsunami scenario database and the design and performance assessment of a timely and "reliable" elementary-scenario combination algorithm to be run in real time. On the first theme, we take advantage of the experience gained in the test areas of Western Iberia, Rhodes (Greece) and Cyprus to illustrate the criteria with which a "Matching Scenario Database" (MSDB) can be built. These involve (1) the choice of the main tectonic tsunamigenic sources (or areas), (2) their tessellation with matrices of elementary faults, whose dimensions depend heavily on the particular area studied and must be a compromise between the need to represent the tsunamigenic area in sufficient detail and the need to limit the number of scenarios to be simulated, (3) the computation of the scenarios themselves, and (4) the choice of the relevant simulation outputs and the standardisation of their formats. Regarding the matching/forecast algorithm, we want it to select and combine the MSDB elements based on the initial earthquake magnitude and location estimate, and to produce a forecast of (at least) the tsunami arrival time, amplitude and period at the closest tide-level sensors and at all needed forecast points. We discuss the performance of the algorithm in terms of the time needed to produce the forecast after the earthquake is detected. 
In particular, we analyse the contributions of a number of factors, such as efficient code development and the availability of cutting-edge hardware to run the code, the wise selection of the MSDB outputs to be combined, the choice of the forecast points where water elevation time series must be taken into account, and a few others.
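
The matching step can be sketched as a linear combination of pre-computed unit-slip elementary scenarios. Everything below is an assumption for illustration: the unit amplitudes, the slip-magnitude scaling law, and the rupture selection are invented, not the project's actual MSDB contents or algorithm; only the combine-and-scale structure reflects the description above.

```python
# Hedged sketch of elementary-scenario combination for a forecast point.
# Peak amplitudes (m) at one forecast point for 1 m of slip on each
# elementary fault -- invented numbers, for illustration only.
unit_peak = {"F1": 0.10, "F2": 0.25, "F3": 0.18, "F4": 0.05}

def slip_from_magnitude(mw):
    # crude illustrative scaling, not a published law:
    # log10(slip in m) ~ 0.5 * Mw - 3.3
    return 10 ** (0.5 * mw - 3.3)

def forecast_peak(mw, ruptured_faults):
    # scale the summed unit-slip responses by the event slip
    s = slip_from_magnitude(mw)
    return s * sum(unit_peak[f] for f in ruptured_faults)

print(f"{forecast_peak(7.2, ['F2', 'F3']):.2f} m")
```

Because the combination is a lookup plus a weighted sum, it runs in milliseconds once the earthquake parameters arrive, which is what makes the pre-computed database approach suitable for early warning.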

  14. Solar Market Research and Analysis Publications | Solar Research | NREL

    Science.gov Websites

    …lifespan, and saving costs. The report is an expanded edition of an interim report published in 2015. Achieving the SETO 2030 residential PV cost target of $0.05/kWh by identifying and quantifying cost reduction opportunities. Distribution Grid Integration Unit Cost Database: this database contains unit cost data.

  15. Loss estimation and damage forecast using database provided

    NASA Astrophysics Data System (ADS)

    Pyrchenko, V.; Byrova, V.; Petrasov, A.

    2009-04-01

    A wide spectrum of natural hazards is observed across the territory of Russia. This necessitates the investigation of numerous occurrences of dangerous natural processes and research into the mechanisms of their development and interaction with each other (synergetic amplification or the emergence of new hazards) for the purpose of forecasting possible losses. Employees of the Laboratory for the Analysis of Geological Risk, IEG RAS, have created a database of natural hazard occurrences in the territory of Russia, which contains information on 1310 cases recorded during 1991-2008. The wide range of sources used created certain difficulties in constructing the database and required the development of a special new technique for unifying information collected at different times. One element of this technique is a classification of the negative consequences of natural hazards that considers the death toll, the injured, other victims, and direct economic damage. The database has made it possible to track the dynamics of natural hazards and the emergency situations (ES) caused by them over the period considered, and to identify regularities in their development across the territory of Russia in time and space. This provides the opportunity to create theoretical, methodological and methodical bases for forecasting possible losses, with a certain degree of probability, for the territory of Russia and for its separate regions, which in the future will support adequate, timely and efficient pre-emptive decision-making.

  16. Delivering culturally appropriate residential rehabilitation for urban Indigenous Australians: a review of the challenges and opportunities.

    PubMed

    Taylor, Kate; Thompson, Sandra; Davis, Robyn

    2010-07-01

    To review the challenges facing Indigenous and mainstream services in delivering residential rehabilitation services to Indigenous Australians, and explore opportunities to enhance outcomes. A literature review was conducted using keyword searches of databases, on-line journals, articles, national papers, conference proceedings and reports from different organisations, with snowball follow-up of relevant citations. Each article was assessed for quality using recognised criteria. Despite debate about the effectiveness of mainstream residential alcohol rehabilitation treatment, most Indigenous Australians with harmful alcohol consumption who seek help have a strong preference for residential treatment. While there is a significant gap in the cultural appropriateness of mainstream services for Indigenous clients, Indigenous-controlled residential organisations also face issues in service delivery. Limitations and inherent difficulties in rigorous evaluation processes further plague both areas of service provision. With inadequate evidence surrounding what constitutes 'best practice' for Indigenous clients in residential settings, more research is needed to investigate, evaluate and contribute to the further development of culturally appropriate models of best practice. In urban settings, a key area for innovation involves improving the capacity and quality of service delivery through effective inter-agency partnerships between Indigenous and mainstream service providers.

  17. Multimodel hydrological ensemble forecasts for the Baskatong catchment in Canada using the TIGGE database.

    NASA Astrophysics Data System (ADS)

    Tito Arandia Martinez, Fabian

    2014-05-01

    Adequate uncertainty assessment is an important issue in hydrological modelling. A key concern for hydropower producers is to obtain ensemble forecasts that truly grasp the uncertainty linked to upcoming streamflows. If properly assessed, this uncertainty can lead to optimal reservoir management and energy production (e.g., [1]). The meteorological inputs to the hydrological model account for an important part of the total uncertainty in streamflow forecasting. Since the creation of the THORPEX initiative and the TIGGE database, meteorological ensemble forecasts from nine agencies throughout the world have been made available. This allows for hydrological ensemble forecasts based on multiple meteorological ensemble forecasts. Consequently, both the uncertainty linked to the architecture of the meteorological model and the uncertainty linked to the initial condition of the atmosphere can be accounted for. The main objective of this work is to show that a weighted combination of meteorological ensemble forecasts based on different atmospheric models can lead to improved hydrological ensemble forecasts for horizons from one to ten days. This experiment is performed for the Baskatong watershed, a head subcatchment of the Gatineau watershed in the province of Quebec, Canada. The Baskatong watershed is of great importance for hydropower production, as it comprises the main reservoir for the Gatineau watershed, on which there are six hydropower plants managed by Hydro-Québec. Since the 1970s, forecasters there have used pseudo-ensemble forecasts based on deterministic meteorological forecasts, to which variability derived from past forecasting errors is added. We use a combination of meteorological ensemble forecasts (precipitation and temperature) from different models as the main inputs for the hydrological model HSAMI ([2]). 
The meteorological ensembles from eight of the nine agencies available through TIGGE are weighted according to their individual performance and combined to form a grand ensemble. Results show that the hydrological forecasts derived from the grand ensemble perform better than the pseudo-ensemble forecasts currently used operationally at Hydro-Québec. References: [1] M. Verbunt, A. Walser, J. Gurtz et al., "Probabilistic flood forecasting with a limited-area ensemble prediction system: Selected case studies," Journal of Hydrometeorology, vol. 8, no. 4, pp. 897-909, Aug. 2007. [2] N. Evora, Valorisation des prévisions météorologiques d'ensemble, Institut de recherche d'Hydro-Québec, 2005. [3] V. Fortin, Le modèle météo-apport HSAMI : historique, théorie et application, Institut de recherche d'Hydro-Québec, 2000.
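
    The record describes weighting each agency's ensemble by its individual performance before combining them into a grand ensemble. The sketch below is a minimal illustration of one such scheme, assuming weights inversely proportional to each agency's past mean absolute error; the agency names, error values, and member counts are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch: weight each agency's ensemble by inverse past error,
# then let the weight decide how many of its members enter a grand ensemble.

def grand_ensemble(ensembles, past_mae, total_members=50):
    """ensembles: {agency: [member forecasts]}, past_mae: {agency: float}."""
    inv = {a: 1.0 / past_mae[a] for a in ensembles}
    norm = sum(inv.values())
    weights = {a: inv[a] / norm for a in inv}
    combined = []
    for agency, members in ensembles.items():
        n = max(1, round(weights[agency] * total_members))
        # cycle through the agency's members to fill its allotted share
        combined.extend(members[i % len(members)] for i in range(n))
    return weights, combined

# illustrative streamflow-forcing members (arbitrary units)
ensembles = {"ECMWF": [2.1, 2.4, 1.9], "NCEP": [3.0, 2.8], "CMC": [2.2, 2.5]}
past_mae = {"ECMWF": 0.5, "NCEP": 1.0, "CMC": 1.0}
weights, grand = grand_ensemble(ensembles, past_mae, total_members=8)
```

    Here the agency with the smallest past error contributes half of the grand ensemble; the actual study's weighting rule may differ.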

  18. Severe Weather Forecast Decision Aid

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Wheeler, Mark M.; Short, David A.

    2005-01-01

    This report presents a 15-year climatological study of severe weather events and related severe weather atmospheric parameters. Data sources included local forecast rules, archived sounding data, Cloud-to-Ground Lightning Surveillance System (CGLSS) data, surface and upper air maps, and two severe weather event databases covering east-central Florida. The local forecast rules were used to set threat assessment thresholds for stability parameters that were derived from the sounding data. The severe weather event databases were used to identify days with reported severe weather, and the CGLSS data were used to differentiate between lightning and non-lightning days. These data sets provided the foundation for analyzing the stability parameters and synoptic patterns that were used to develop an objective tool to aid in forecasting severe weather events. The period of record for the analysis was May - September, 1989 - 2003. The results indicate that certain synoptic patterns are more prevalent on days with severe weather and that some of the stability parameters are better predictors of severe weather days based on locally tuned threat values. The results also revealed the stability parameters that did not display any skill related to severe weather days. An interactive web-based Severe Weather Decision Aid was developed to assist the duty forecaster by providing a level of objective guidance based on the analysis of the stability parameters, CGLSS data, and synoptic-scale dynamics. The tool will be tested and evaluated during the 2005 warm season.

  19. Forecasting of particulate matter time series using wavelet analysis and wavelet-ARMA/ARIMA model in Taiyuan, China.

    PubMed

    Zhang, Hong; Zhang, Sheng; Wang, Ping; Qin, Yuzhe; Wang, Huifeng

    2017-07-01

    Particulate matter with aerodynamic diameter below 10 μm (PM10) is difficult to forecast because of the uncertainties in describing the emission and meteorological fields. This paper proposed a wavelet-ARMA/ARIMA model to forecast short-term series of PM10 concentrations. It was evaluated by experiments using a 10-year data set of daily PM10 concentrations from 4 stations located in Taiyuan, China. The results indicated the following: (1) PM10 concentrations in Taiyuan had a decreasing trend from 2005 to 2012 but increased in 2013. PM10 concentrations had an obvious seasonal fluctuation related to coal-fired heating in winter and early spring. (2) Spatial differences among the four stations showed that PM10 concentrations in industrial and heavily trafficked areas were higher than those in residential and suburban areas. (3) Wavelet analysis revealed that the trend variation and the changes of the PM10 concentration of Taiyuan were complicated. (4) The proposed wavelet-ARMA/ARIMA model could be efficiently and successfully applied to PM10 forecasting. Compared with the traditional ARMA/ARIMA methods, the wavelet-ARMA/ARIMA method could effectively reduce the forecasting error, improve the prediction accuracy, and realize multiple-time-scale prediction. Wavelet analysis can filter noisy signals and identify the variation trend and the fluctuation of the PM10 time-series data; wavelet decomposition and reconstruction reduce the nonstationarity of the PM10 time-series data and thus improve the accuracy of the prediction.
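
    The wavelet step described above decomposes the series into smoother components that are easier to model individually. The sketch below shows a one-level Haar transform, chosen as the simplest basis since the record does not name the wavelet used; the PM10 values are invented. Each component could then be forecast separately (e.g., with ARMA/ARIMA) and the component forecasts recombined.

```python
# Minimal sketch of wavelet decomposition/reconstruction with a one-level
# Haar transform (assumed basis; the paper may use a different wavelet).
import math

def haar_decompose(x):
    """Split an even-length series into approximation and detail parts."""
    s = math.sqrt(2.0)
    approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
    detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Invert haar_decompose exactly."""
    s = math.sqrt(2.0)
    x = []
    for a, d in zip(approx, detail):
        x.extend([(a + d) / s, (a - d) / s])
    return x

pm10 = [80.0, 95.0, 110.0, 70.0, 60.0, 85.0]  # illustrative daily values
approx, detail = haar_decompose(pm10)
restored = haar_reconstruct(approx, detail)
```

    The transform is exactly invertible, so no information is lost by modeling the approximation and detail series separately.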

  20. Post-Inpatient Brain Injury Rehabilitation Outcomes: Report from the National OutcomeInfo Database.

    PubMed

    Malec, James F; Kean, Jacob

    2016-07-15

    This study examined outcomes for intensive residential and outpatient/community-based post-inpatient brain injury rehabilitation (PBIR) programs compared with supported living programs. The goal of supported living programs was stable functioning (no change). Data were obtained for a large cohort of adults with acquired brain injury (ABI) from the OutcomeInfo national database, a web-based database system developed through National Institutes of Health (NIH) Small Business Technology Transfer (STTR) funding for monitoring progress and outcomes in PBIR programs primarily with the Mayo-Portland Adaptability Inventory (MPAI-4). Rasch-derived MPAI-4 measures for cases from 2008 to 2014 from 9 provider organizations offering programs in 23 facilities throughout the United States were examined. Controlling for age at injury, time in program, and time since injury on admission (chronicity), both intensive residential (n = 205) and outpatient/community-based (n = 2781) programs resulted in significant (approximately 1 standard deviation [SD]) functional improvement on the MPAI-4 Total Score compared with supported living (n = 101) programs (F = 18.184, p < 0.001). Intensive outpatient/community-based programs showed greater improvements on MPAI-4 Ability (F = 14.135, p < 0.001), Adjustment (F = 12.939, p < 0.001), and Participation (F = 16.679, p < 0.001) indices than supported living programs, whereas intensive residential programs showed improvement primarily in Adjustment and Participation. Age at injury and time in program had small effects on outcome; the effect of chronicity was small to moderate. Examination of more chronic cases (>1 year post-injury) showed significant, but smaller (approximately 0.5 SD), change on the MPAI-4 relative to supported living programs (F = 17.562, p < 0.001). Results indicate that intensive residential and outpatient/community-based PBIR programs result in substantial positive functional changes moderated by chronicity.

  1. Post-Inpatient Brain Injury Rehabilitation Outcomes: Report from the National OutcomeInfo Database

    PubMed Central

    Kean, Jacob

    2016-01-01

    This study examined outcomes for intensive residential and outpatient/community-based post-inpatient brain injury rehabilitation (PBIR) programs compared with supported living programs. The goal of supported living programs was stable functioning (no change). Data were obtained for a large cohort of adults with acquired brain injury (ABI) from the OutcomeInfo national database, a web-based database system developed through National Institutes of Health (NIH) Small Business Technology Transfer (STTR) funding for monitoring progress and outcomes in PBIR programs primarily with the Mayo-Portland Adaptability Inventory (MPAI-4). Rasch-derived MPAI-4 measures for cases from 2008 to 2014 from 9 provider organizations offering programs in 23 facilities throughout the United States were examined. Controlling for age at injury, time in program, and time since injury on admission (chronicity), both intensive residential (n = 205) and outpatient/community-based (n = 2781) programs resulted in significant (approximately 1 standard deviation [SD]) functional improvement on the MPAI-4 Total Score compared with supported living (n = 101) programs (F = 18.184, p < 0.001). Intensive outpatient/community-based programs showed greater improvements on MPAI-4 Ability (F = 14.135, p < 0.001), Adjustment (F = 12.939, p < 0.001), and Participation (F = 16.679, p < 0.001) indices than supported living programs, whereas intensive residential programs showed improvement primarily in Adjustment and Participation. Age at injury and time in program had small effects on outcome; the effect of chronicity was small to moderate. Examination of more chronic cases (>1 year post-injury) showed significant, but smaller (approximately 0.5 SD), change on the MPAI-4 relative to supported living programs (F = 17.562, p < 0.001). Results indicate that intensive residential and outpatient/community-based PBIR programs result in substantial positive functional changes moderated by chronicity. 
PMID:26414433

  2. The integration of quantitative information with an intelligent decision support system for residential energy retrofits

    NASA Astrophysics Data System (ADS)

    Mo, Yunjeong

    The purpose of this research is to support the development of an intelligent Decision Support System (DSS) by integrating quantitative information with expert knowledge in order to facilitate effective retrofit decision-making. To achieve this goal, the Energy Retrofit Decision Process Framework is analyzed. Expert system shell software, a retrofit measure cost database, and energy simulation software are needed for developing the DSS; Exsys Corvid, the NREM database and BEopt were chosen for implementing an integration model. This integration model demonstrates the holistic function of a residential energy retrofit system for existing homes, by providing a prioritized list of retrofit measures with cost information, energy simulation and expert advice. The users, such as homeowners and energy auditors, can acquire all of the necessary retrofit information from this unified system without having to explore several separate systems. The integration model plays the role of a prototype for the finalized intelligent decision support system. It implements all of the necessary functions for the finalized DSS, including integration of the database, energy simulation and expert knowledge.

  3. Design and Performance of a Xenobiotic Metabolism Database Manager for Building Metabolic Pathway Databases

    EPA Science Inventory

    A major challenge for scientists and regulators is accounting for the metabolic activation of chemicals that may lead to increased toxicity. Reliable forecasting of chemical metabolism is a critical factor in estimating a chemical’s toxic potential. Research is underway to develo...

  4. Verification of National Weather Service spot forecasts using surface observations

    NASA Astrophysics Data System (ADS)

    Lammers, Matthew Robert

    Software has been developed to evaluate National Weather Service spot forecasts issued to support prescribed burns and early-stage wildfires. Fire management officials request spot forecasts from National Weather Service Weather Forecast Offices to provide detailed guidance as to atmospheric conditions in the vicinity of planned prescribed burns as well as wildfires that do not have incident meteorologists on site. This open source software with online display capabilities is used to examine an extensive set of spot forecasts of maximum temperature, minimum relative humidity, and maximum wind speed from April 2009 through November 2013 nationwide. The forecast values are compared to the closest available surface observations at stations installed primarily for fire weather and aviation applications. The accuracy of the spot forecasts is compared to those available from the National Digital Forecast Database (NDFD). Spot forecasts for selected prescribed burns and wildfires are used to illustrate issues associated with the verification procedures. Cumulative statistics for National Weather Service County Warning Areas and for the nation are presented. Basic error and accuracy metrics for all available spot forecasts and the entire nation indicate that the skill of the spot forecasts is higher than that available from the NDFD, with the greatest improvement for maximum temperature and the least improvement for maximum wind speed.

  5. Upgrade Summer Severe Weather Tool

    NASA Technical Reports Server (NTRS)

    Watson, Leela

    2011-01-01

    The goal of this task was to upgrade the existing severe weather database by adding observations from the 2010 warm season, update the verification dataset with results from the 2010 warm season, apply statistical logistic regression analysis to the database, and develop a new forecast tool. The AMU analyzed 7 stability parameters that showed the possibility of providing guidance in forecasting severe weather, calculated verification statistics for the Total Threat Score (TTS), and calculated warm-season verification statistics for the 2010 season. The AMU also performed statistical logistic regression analysis on the 22-year severe weather database. The results indicated that the logistic regression equation did not show an increase in skill over the previously developed TTS. The equation showed less accuracy than TTS at predicting severe weather, little ability to distinguish between severe and non-severe weather days, and worse standard categorical accuracy measures and skill scores than TTS.
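
    The logistic regression step described above fits a probability of a severe-weather day from stability parameters. The sketch below is a minimal single-predictor version trained with plain gradient descent; the synthetic "stability index" values are invented for illustration and are not drawn from the 22-year database.

```python
# Illustrative sketch: logistic regression on one stability parameter,
# fitted by batch gradient descent (synthetic, linearly separable data).
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / len(xs)   # average-gradient step
        b -= lr * gb / len(xs)
    return w, b

# synthetic stability index: higher values -> severe-weather day (y = 1)
xs = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
w, b = fit_logistic(xs, ys)
p_high = 1.0 / (1.0 + math.exp(-(w * 2.0 + b)))   # unstable day
p_low = 1.0 / (1.0 + math.exp(-(w * -2.0 + b)))   # stable day
```

    In the AMU's case the fitted equation was then scored against TTS with categorical accuracy measures, where it showed no gain in skill.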

  6. Acceleration, Transport, Forecasting and Impact of solar energetic particles in the framework of the 'HESPERIA' HORIZON 2020 project

    NASA Astrophysics Data System (ADS)

    Malandraki, Olga; Klein, Karl-Ludwig; Vainio, Rami; Agueda, Neus; Nunez, Marlon; Heber, Bernd; Buetikofer, Rolf; Sarlanis, Christos; Crosby, Norma

    2017-04-01

    High-energy solar energetic particles (SEPs) emitted from the Sun are a major space weather hazard motivating the development of predictive capabilities. In this work, the current state of knowledge on the origin and forecasting of SEP events will be reviewed. Subsequently, we will present the EU HORIZON2020 HESPERIA (High Energy Solar Particle Events foRecastIng and Analysis) project, its structure, its main scientific objectives and forecasting operational tools, as well as the added value to SEP research both from the observational as well as the SEP modelling perspective. The project addresses through multi-frequency observations and simulations the chain of processes from particle acceleration in the corona, particle transport in the magnetically complex corona and interplanetary space to the detection near 1 AU. Furthermore, publicly available software to invert neutron monitor observations of relativistic SEPs to physical parameters that can be compared with space-borne measurements at lower energies is provided for the first time by HESPERIA. In order to achieve these goals, HESPERIA is exploiting already available large datasets stored in databases such as the neutron monitor database (NMDB) and SEPServer that were developed under EU FP7 projects from 2008 to 2013. Forecasting results of the two novel SEP operational forecasting tools published via the consortium server of 'HESPERIA' will be presented, as well as some scientific key results on the acceleration, transport and impact on Earth of high-energy particles. Acknowledgement: This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 637324.

  7. MAG4 versus alternative techniques for forecasting active region flare productivity.

    PubMed

    Falconer, David A; Moore, Ronald L; Barghouty, Abdulnasser F; Khazanov, Igor

    2014-05-01

    MAG4 is a technique of forecasting an active region's rate of production of major flares in the coming few days from a free magnetic energy proxy. We present a statistical method of measuring the difference in performance between MAG4 and comparable alternative techniques that forecast an active region's major-flare productivity from alternative observed aspects of the active region. We demonstrate the method by measuring the difference in performance between the "Present MAG4" technique and each of three alternative techniques, called "McIntosh Active-Region Class," "Total Magnetic Flux," and "Next MAG4." We do this by using (1) the MAG4 database of magnetograms and major flare histories of sunspot active regions, (2) the NOAA table of the major-flare productivity of each of 60 McIntosh active-region classes of sunspot active regions, and (3) five technique performance metrics (Heidke Skill Score, True Skill Score, Percent Correct, Probability of Detection, and False Alarm Rate) evaluated from 2000 random two-by-two contingency tables obtained from the databases. We find that (1) Present MAG4 far outperforms both McIntosh Active-Region Class and Total Magnetic Flux, (2) Next MAG4 significantly outperforms Present MAG4, (3) the performance of Next MAG4 is insensitive to the forward and backward temporal windows used, in the range of one to a few days, and (4) forecasting from the free-energy proxy in combination with either any broad category of McIntosh active-region classes or any Mount Wilson active-region class gives no significant performance improvement over forecasting from the free-energy proxy alone (Present MAG4). Key Points: quantitative comparison of the performance of pairs of forecasting techniques; Next MAG4 forecasts major flares more accurately than Present MAG4; Present MAG4 outperforms McIntosh AR Class and total magnetic flux.
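
    The five performance metrics named in the record are all derived from a two-by-two contingency table of forecasts versus outcomes. The sketch below computes them from hit/false-alarm/miss/correct-null counts; the example counts are illustrative, not drawn from the MAG4 database. Note that here "False Alarm Rate" is taken as b/(b+d) (probability of false detection), the quantity subtracted in the True Skill Score; the distinct "false alarm ratio" b/(a+b) is sometimes meant instead.

```python
# Sketch of the five skill metrics from a 2x2 contingency table:
#   a = hits, b = false alarms, c = misses, d = correct nulls.

def skill_metrics(a, b, c, d):
    n = a + b + c + d
    pc = (a + d) / n                  # Percent Correct
    pod = a / (a + c)                 # Probability of Detection
    far = b / (b + d)                 # False Alarm Rate (prob. of false detection)
    tss = pod - far                   # True Skill Score
    # chance-expected number correct, for the Heidke Skill Score
    expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n
    hss = (a + d - expected) / (n - expected)
    return {"PC": pc, "POD": pod, "FAR": far, "TSS": tss, "HSS": hss}

m = skill_metrics(a=40, b=10, c=20, d=130)  # illustrative counts
```

    Evaluating such metrics over many randomly drawn contingency tables, as the study does, yields a distribution from which performance differences between two techniques can be judged for significance.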

  8. MAG4 versus alternative techniques for forecasting active region flare productivity

    PubMed Central

    Falconer, David A; Moore, Ronald L; Barghouty, Abdulnasser F; Khazanov, Igor

    2014-01-01

    MAG4 is a technique of forecasting an active region's rate of production of major flares in the coming few days from a free magnetic energy proxy. We present a statistical method of measuring the difference in performance between MAG4 and comparable alternative techniques that forecast an active region's major-flare productivity from alternative observed aspects of the active region. We demonstrate the method by measuring the difference in performance between the “Present MAG4” technique and each of three alternative techniques, called “McIntosh Active-Region Class,” “Total Magnetic Flux,” and “Next MAG4.” We do this by using (1) the MAG4 database of magnetograms and major flare histories of sunspot active regions, (2) the NOAA table of the major-flare productivity of each of 60 McIntosh active-region classes of sunspot active regions, and (3) five technique performance metrics (Heidke Skill Score, True Skill Score, Percent Correct, Probability of Detection, and False Alarm Rate) evaluated from 2000 random two-by-two contingency tables obtained from the databases. We find that (1) Present MAG4 far outperforms both McIntosh Active-Region Class and Total Magnetic Flux, (2) Next MAG4 significantly outperforms Present MAG4, (3) the performance of Next MAG4 is insensitive to the forward and backward temporal windows used, in the range of one to a few days, and (4) forecasting from the free-energy proxy in combination with either any broad category of McIntosh active-region classes or any Mount Wilson active-region class gives no significant performance improvement over forecasting from the free-energy proxy alone (Present MAG4). Key Points: quantitative comparison of the performance of pairs of forecasting techniques; Next MAG4 forecasts major flares more accurately than Present MAG4; Present MAG4 outperforms McIntosh AR Class and total magnetic flux. PMID:26213517

  9. Short term load forecasting using a self-supervised adaptive neural network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, H.; Pimmel, R.L.

    The authors developed a self-supervised adaptive neural network to perform short term load forecasts (STLF) for a large power system covering a wide service area with several heavy load centers. They used the self-supervised network to extract correlational features from temperature and load data. Using data from the calendar year 1993 as a test case, they found a 0.90 percent error for hour-ahead forecasting and a 1.92 percent error for day-ahead forecasting. These levels of error compare favorably with those obtained by other techniques. The algorithm ran in a couple of minutes on a PC containing an Intel Pentium 120 MHz CPU. Since the algorithm included searching the historical database, training the network, and actually performing the forecasts, this approach provides a real-time, portable, and adaptable STLF.
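
    The percent errors reported in this record can be scored as a mean absolute percentage error (MAPE) between forecast and actual loads. The sketch below shows the computation on invented load values; it is an illustration of the metric, not of the network itself.

```python
# Sketch of mean absolute percentage error (MAPE), the usual form of the
# "percent error" reported for short-term load forecasts. Values are invented.

def mape(actual, forecast):
    """Average of |actual - forecast| / actual, expressed as a percentage."""
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

actual = [100.0, 110.0, 105.0, 95.0]      # observed hourly loads (MW)
forecast = [101.0, 108.0, 106.0, 94.0]    # hour-ahead forecasts (MW)
err = mape(actual, forecast)
```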

  10. WOVOdat as a worldwide resource to improve eruption forecasts

    NASA Astrophysics Data System (ADS)

    Widiwijayanti, Christina; Costa, Fidel; Zar Win Nang, Thin; Tan, Karine; Newhall, Chris; Ratdomopurbo, Antonius

    2015-04-01

    During periods of volcanic unrest, volcanologists need to interpret signs of unrest to be able to forecast whether an eruption is likely to occur. Some volcanic eruptions display signs of impending eruption such as seismic activity, surface deformation, or gas emissions; but not all volcanoes give signs, and not all signs are necessarily followed by an eruption. Volcanoes behave differently. Precursory signs of an eruption sometimes span a very short period, less than an hour, but can also span weeks, months, or even years. Some volcanoes are regularly active and closely monitored, while others aren't. Often, the record of precursors to historical eruptions of a volcano isn't enough to allow a forecast of its future activity. Therefore, volcanologists must refer to monitoring data of unrest and eruptions at similar volcanoes. WOVOdat is the World Organization of Volcano Observatories' Database of volcanic unrest - an international effort to develop common standards for compiling and storing data on volcanic unrest in a centralized database that is freely web-accessible for reference during volcanic crises, comparative studies, and basic research on pre-eruption processes. WOVOdat will be to volcanology as an epidemiological database is to medicine. We have so far incorporated about 15% of worldwide unrest data into WOVOdat, covering more than 100 eruption episodes, including: volcanic background data, eruptive histories, monitoring data (seismic, deformation, gas, hydrology, thermal, fields, and meteorology), monitoring metadata, and supporting data such as reports, images, maps and videos. Nearly all data in WOVOdat are time-stamped and geo-referenced. Along with creating a database on volcanic unrest, WOVOdat is also developing web tools to help users query, visualize, and compare data, which can further be used for probabilistic eruption forecasting. 
Reference to WOVOdat will be especially helpful at volcanoes that have not erupted in historical or 'instrumental' time and thus for which no previous data exist. The more data in WOVOdat, the more useful it will be. We actively solicit relevant data contributions from volcano observatories, other institutions, and individual researchers. Detailed information and documentation about the database and how to use it can be found at www.wovodat.org.

  11. Statistical models and time series forecasting of sulfur dioxide: a case study Tehran.

    PubMed

    Hassanzadeh, S; Hosseinibalam, F; Alizadeh, R

    2009-08-01

    This study performed a time-series analysis, frequency distribution and prediction of SO2 levels for five stations (Pardisan, Vila, Azadi, Gholhak and Bahman) in Tehran for the period 2000-2005. Most sites show quite similar characteristics, with the highest pollution in autumn-winter and the least pollution in spring-summer. The frequency distributions show higher peaks at two residential sites. The potential for SO2 problems is high because of high emissions and the close geographical proximity of the major industrial and urban centers. The ACF and PACF are nonzero for several lags, indicating a mixed (ARMA) model; accordingly, an ARMA model was used at the Bahman station for forecasting SO2. The partial autocorrelations become close to 0 after about 5 lags, while the autocorrelations remain strong through all the lags shown. The results showed that the ARMA(2,2) model provides reliable, satisfactory predictions for the time series.
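
    The ACF behaviour described above is the standard diagnostic for choosing ARMA orders. The sketch below computes a sample autocorrelation function in plain Python on a synthetic persistent series; the values are invented, not Tehran SO2 data.

```python
# Sketch of the sample autocorrelation function (ACF) used for ARMA model
# identification: strong, slowly decaying ACF suggests a mixed model.

def acf(x, max_lag):
    """Sample autocorrelations r[0..max_lag] of a series x."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    out = []
    for k in range(max_lag + 1):
        cov = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k))
        out.append(cov / var)
    return out

# synthetic persistent series: each value stays close to its predecessor
series = [10, 11, 12, 12, 11, 10, 9, 9, 10, 11, 12, 13, 12, 11, 10, 9]
r = acf(series, max_lag=5)
```

    r[0] is always 1, and a high r[1] with a gradual decay, as here, is the pattern that points toward an autoregressive or mixed (ARMA) structure.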

  12. A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions

    NASA Astrophysics Data System (ADS)

    Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.

    2017-12-01

    The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in their decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post-processes both the AHPS and the HEFS forecasts. These ensemble forecasts produce large volumes of complex data, and drawing conclusions from them is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real-time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. This tool compiles and stores data into HTML pages that allow operators to readily analyze the data with built-in user-interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.

  13. [Development of an analyzing system for soil parameters based on NIR spectroscopy].

    PubMed

    Zheng, Li-Hua; Li, Min-Zan; Sun, Hong

    2009-10-01

    A rapid estimation system for soil parameters based on spectral analysis was developed using object-oriented (OO) technology. A SOIL class was designed; an instance of the SOIL class is a soil-sample object with a particular type, specific physical properties, and spectral characteristics. By extracting the effective information from the modeling spectral data of a soil object, a mapping model was established between the soil parameters and the spectral data, and the mapping-model parameters can be saved in the model database. When forecasting the content of any soil parameter, the corresponding prediction model can be selected for objects of the same soil type and similar soil physical properties. After the target soil-sample object is passed into the prediction model and processed by the system, an accurate forecast of the target sample's content can be obtained. The system includes modules for file operations, spectra pretreatment, sample analysis, calibration and validation, and sample-content forecasting. The system was designed to run independently of the measurement equipment. The parameters and spectral data files (*.xls) of known soil samples can be input into the system. Because various data pretreatments can be selected according to the concrete conditions, the predicted content appears in the terminal and the forecasting model can be stored in the model database. The system reads the prediction models and their parameters, saved in the model database, through the module interface, and the data of the tested samples are then transferred into the selected model. Finally, the content of soil parameters can be predicted by the developed system. The system was programmed with Visual C++ 6.0 and Matlab 7.0, and Access XP was used to create and manage the model database.

  14. Progress towards Managing Residential Electricity Demand: Impacts of Standards and Labeling for Refrigerators and Air Conditioners in India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeil, Michael A.; Iyer, Maithili

    The development of Energy Efficiency Standards and Labeling (EES&L) began in earnest in India in 2001 with the Energy Conservation Act and the establishment of the Indian Bureau of Energy Efficiency (BEE). The first main residential appliance to be targeted was refrigerators, soon followed by room air conditioners. Both of these appliances are of critical importance to India's residential electricity demand. About 15 percent of Indian households own a refrigerator, and sales total about 4 million per year but are growing. At the same time, the Indian refrigerator market has seen a strong trend towards larger and more consumptive frost-free units. Room air conditioners in India have traditionally been sold to commercial sector customers, but an increasing number are going to the residential sector. Room air conditioner sales growth in India peaked in the last few years at 20 percent per year. In this paper, we perform an engineering-based analysis using data specific to Indian appliances. We evaluate costs and benefits to residential and commercial sector consumers from increased equipment costs and utility bill savings. The analysis finds that, while the BEE scheme presents net benefits to consumers, there remain opportunities for efficiency improvement that would optimize consumer benefits according to life-cycle cost analysis. Due to the large and growing market for refrigerators and air conditioners in India, we forecast large impacts from the standards and labeling program as scheduled. By 2030, this program, if fully implemented, would reduce Indian residential electricity consumption by 55 TWh; overall savings through 2030 total 385 TWh. Finally, while efficiency levels have been set for several years for refrigerators, labels and MEPS for these products remain voluntary. We therefore consider the negative impact of this implementation delay on the energy and financial savings achievable by 2030.

  15. Short-term forecasting of individual household electricity loads with investigating impact of data resolution and forecast horizon

    NASA Astrophysics Data System (ADS)

    Yildiz, Baran; Bilbao, Jose I.; Dore, Jonathon; Sproul, Alistair B.

    2018-05-01

    Smart grid components such as smart home and battery energy management systems, high penetration of renewable energy systems, and demand response activities require accurate electricity demand forecasts for the successful operation of electricity distribution networks. For example, in order to optimize residential PV generation and electricity consumption and to plan battery charge-discharge regimes by scheduling household appliances, forecasts need to target and be tailored to individual household electricity loads. The recent uptake of smart meters allows easier access to electricity readings at very fine resolutions; hence, it is possible to utilize this source of available data to create forecast models. In this paper, models which predominantly use smart meter data alongside weather variables, or smart-meter-based models (SMBM), are implemented to forecast individual household loads. Well-known machine learning models such as artificial neural networks (ANN), support vector machines (SVM) and least-squares SVM are implemented within the SMBM framework and their performance is compared. The analysed household stock consists of 14 households from the state of New South Wales, Australia, with at least a year's worth of 5-min resolution data. In order for the results to be comparable between different households, our study first investigates household load profiles according to their volatility and reveals the relationship between load standard deviation and forecast performance. The analysis extends previous research by evaluating forecasts over four different data resolutions (5, 15, 30 and 60 min), each resolution analysed for four different horizons (1, 6, 12 and 24 h ahead). Both data resolution and forecast horizon proved to have a significant impact on forecast performance, and the obtained results provide important insights for the operation of various smart grid applications. 
Finally, it is shown that the load profile of some households vary significantly across different days; as a result, providing a single model for the entire period may result in limited performance. By the use of a pre-clustering step, similar daily load profiles are grouped together according to their standard deviation, and instead of applying one SMBM for the entire data-set of a particular household, separate SMBMs are applied to each one of the clusters. This preliminary clustering step increases the complexity of the analysis however it results in significant improvements in forecast performance.
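The pre-clustering step described above can be sketched as follows. This is a minimal illustration, not the authors' code: the two-cluster split, the threshold value, and the trivial mean-profile "model" per cluster are all invented for demonstration.

```python
# Hypothetical sketch: group daily load profiles by their standard
# deviation, then fit a separate (here trivially simple, mean-based)
# forecaster per cluster instead of one model per household.
from statistics import pstdev, mean

def cluster_by_std(daily_profiles, threshold):
    """Split daily profiles into 'calm' and 'volatile' clusters by std."""
    clusters = {"calm": [], "volatile": []}
    for profile in daily_profiles:
        key = "calm" if pstdev(profile) < threshold else "volatile"
        clusters[key].append(profile)
    return clusters

def fit_cluster_means(clusters):
    """Per-cluster 'model': the element-wise mean profile."""
    models = {}
    for key, profiles in clusters.items():
        if profiles:
            models[key] = [mean(vals) for vals in zip(*profiles)]
    return models

profiles = [
    [1.0, 1.1, 0.9, 1.0],   # calm day
    [1.2, 1.0, 1.1, 0.9],   # calm day
    [0.5, 3.0, 0.2, 2.5],   # volatile day
]
clusters = cluster_by_std(profiles, threshold=0.5)
models = fit_cluster_means(clusters)
```

In practice each cluster would receive its own ANN or SVM model rather than a mean profile, but the grouping logic is the same.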

  16. An Ecology of Prestige in New York City: Examining the Relationships Among Population Density, Socio-economic Status, Group Identity, and Residential Canopy Cover

    NASA Astrophysics Data System (ADS)

    Grove, J. Morgan; Locke, Dexter H.; O'Neil-Dunne, Jarlath P. M.

    2014-09-01

    Several social theories have been proposed to explain the uneven distribution of vegetation in urban residential areas: population density, social stratification, luxury effect, and ecology of prestige. We evaluate these theories using a combination of demographic and socio-economic predictors of vegetative cover on all residential lands in New York City. We use diverse data sources including the City's property database, time-series demographic and socio-economic data from the US Census, and land cover data from the University of Vermont's Spatial Analysis Lab (SAL). These data are analyzed using a multi-model inferential, spatial econometrics approach. We also examine the distribution of vegetation within distinct market categories using Claritas' Potential Rating Index for Zipcode Markets (PRIZM™) database. These categories can be disaggregated, corresponding to the four social theories. We compare the econometric and categorical results for validation. Models associated with ecology of prestige theory are more effective for predicting the distribution of vegetation. This suggests that private, residential patterns of vegetation, reflecting the consumption of environmentally relevant goods and services, are associated with different lifestyles and lifestages. Further, our spatial and temporal analyses suggest that there are significant spatial and temporal dependencies that have theoretical and methodological implications for understanding urban ecological systems. These findings may have policy implications. Decision makers may need to consider how to most effectively reach different social groups in terms of messages and messengers in order to advance land management practices and achieve urban sustainability.

  17. An ecology of prestige in New York City: examining the relationships among population density, socio-economic status, group identity, and residential canopy cover.

    PubMed

    Grove, J Morgan; Locke, Dexter H; O'Neil-Dunne, Jarlath P M

    2014-09-01

    Several social theories have been proposed to explain the uneven distribution of vegetation in urban residential areas: population density, social stratification, luxury effect, and ecology of prestige. We evaluate these theories using a combination of demographic and socio-economic predictors of vegetative cover on all residential lands in New York City. We use diverse data sources including the City's property database, time-series demographic and socio-economic data from the US Census, and land cover data from the University of Vermont's Spatial Analysis Lab (SAL). These data are analyzed using a multi-model inferential, spatial econometrics approach. We also examine the distribution of vegetation within distinct market categories using Claritas' Potential Rating Index for Zipcode Markets (PRIZM™) database. These categories can be disaggregated, corresponding to the four social theories. We compare the econometric and categorical results for validation. Models associated with ecology of prestige theory are more effective for predicting the distribution of vegetation. This suggests that private, residential patterns of vegetation, reflecting the consumption of environmentally relevant goods and services, are associated with different lifestyles and lifestages. Further, our spatial and temporal analyses suggest that there are significant spatial and temporal dependencies that have theoretical and methodological implications for understanding urban ecological systems. These findings may have policy implications. Decision makers may need to consider how to most effectively reach different social groups in terms of messages and messengers in order to advance land management practices and achieve urban sustainability.

  18. Neural net forecasting for geomagnetic activity

    NASA Technical Reports Server (NTRS)

    Hernandez, J. V.; Tajima, T.; Horton, W.

    1993-01-01

    We use neural nets to construct nonlinear models to forecast the AL index given solar wind and interplanetary magnetic field (IMF) data. We follow two approaches: (1) the state space reconstruction approach, which is a nonlinear generalization of autoregressive-moving average models (ARMA) and (2) the nonlinear filter approach, which reduces to a moving average model (MA) in the linear limit. The database used here is that of Bargatze et al. (1985).
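The nonlinear-filter idea above can be illustrated with a toy sketch (not the authors' implementation): a one-hidden-layer network maps a sliding window of driver values, such as solar wind or IMF inputs, to the next AL value; with an identity activation it reduces to a plain moving-average filter. All weights and inputs below are illustrative assumptions.

```python
# Minimal nonlinear filter: a fixed one-hidden-layer net over a lag
# window of inputs. In the linear limit (identity activation) this is
# a moving-average (MA) model, as noted in the abstract.
import math

def nonlinear_filter(window, w_hidden, w_out, activation=math.tanh):
    """Apply a one-hidden-layer network to a lag window of inputs."""
    hidden = [activation(sum(w * x for w, x in zip(row, window)))
              for row in w_hidden]
    return sum(w * h for w, h in zip(w_out, hidden))

# Illustrative fixed weights; in practice they are learned from data.
w_hidden = [[0.5, -0.2, 0.1], [0.3, 0.3, -0.4]]
w_out = [1.0, -0.5]
inputs = [0.2, -0.1, 0.4]
prediction = nonlinear_filter(inputs, w_hidden, w_out)

# Linear limit: identity activation yields the MA-model special case.
linear_pred = nonlinear_filter(inputs, w_hidden, w_out,
                               activation=lambda z: z)
```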

  19. Long-Range Atmosphere-Ocean Forecasting in Support of Undersea Warfare Operations in the Western North Pacific

    DTIC Science & Technology

    2009-09-01

    Forecasts ECS East China Sea ESRL Earth Systems Research Laboratory FA False alarm FARate False alarm rate xviii GDEM Generalized Digital...uses a LTM based, global ocean climatology database called Generalized Digital Environment Model ( GDEM ), in tactical decision aid (TDA) software, such...environment for USW planning. GDEM climatology is derived using temperature and salinity profiles from the Modular Ocean Data Assimilation System

  20. Forecasting jobs in the supply chain for investments in residential energy efficiency retrofits in Florida

    NASA Astrophysics Data System (ADS)

    Fobair, Richard C., II

    This research presents a model for forecasting the numbers of jobs created in the energy efficiency retrofit (EER) supply chain resulting from an investment in upgrading residential buildings in Florida. This investigation examined material supply chains stretching from mining to project installation for three product types: insulation, windows/doors, and heating, ventilating, and air conditioning (HVAC) systems. Outputs from the model are provided for the project, sales, manufacturing, and mining level. The model utilizes reverse-estimation to forecast the numbers of jobs that result from an investment. Reverse-estimation is a process that deconstructs a total investment into its constituent parts. In this research, an investment is deconstructed into profit, overhead, and hard costs for each level of the supply chain and over multiple iterations of inter-industry exchanges. The model processes an investment amount, the type of work and method of contracting into a prediction of the number of jobs created. The deconstruction process utilizes data from the U.S. Economic Census. At each supply chain level, the cost of labor is reconfigured into full-time equivalent (FTE) jobs (i.e. equivalent to 40 hours per week for 52 weeks) utilizing loaded labor rates and a typical employee mix. The model is sensitive to adjustable variables, such as percentage of work performed per type of product, allocation of worker time per skill level, annual hours for FTE calculations, wage rate, and benefits. This research provides several new insights into job creation. First, it provides definitions that can be used for future research on jobs in supply chains related to energy efficiency. Second, it provides a methodology for future investigators to calculate jobs in a supply chain resulting from an investment in energy efficiency upgrades to a building. 
The methodology used in this research is unique because it examines gross employment at the sub-industry level for specific commodities. Most research on employment examines the net employment change (job creation less job destruction) at levels for regions, industries, and the aggregate economy. Third, it provides a forecast of the numbers of jobs for an investment in energy efficiency over the entire supply chain for the selected industries and the job factors for major levels of the supply chain.

  1. Analysis of a Meteorological Database for London Heathrow in the Context of Wake Vortex Hazards

    NASA Astrophysics Data System (ADS)

    Agnew, P.; Ogden, D. J.; Hoad, D. J.

    2003-04-01

    A database of meteorological parameters collected by aircraft arriving at LHR has recently been compiled. We have used the recorded variation of temperature and wind with height to deduce the 'wake vortex behaviour class' (WVBC) along the glide slope, as experienced by each flight. The integrated state of the glide slope has been investigated, allowing us to estimate the proportion of time for which the wake vortex threat is reduced, due to either rapid decay or transport off the glide slope. A numerical weather prediction model was used to forecast the meteorological parameters for periods coinciding with the aircraft data. This allowed us to perform a comparison of forecast WVBC with those deduced from the aircraft measurements.

  2. Exploiting teleconnection indices for probabilistic forecasting of drought class transitions in Sicily region (Italy)

    NASA Astrophysics Data System (ADS)

    Bonaccorso, Brunella; Cancelliere, Antonino

    2015-04-01

    In the present study, two probabilistic models for short- to medium-term drought forecasting, able to include information provided by teleconnection indices, are proposed and applied to the Sicily region (Italy). Drought conditions are expressed in terms of the Standardized Precipitation-Evapotranspiration Index (SPEI) at different aggregation time scales. More specifically, a multivariate approach based on the normal distribution is developed in order to estimate: 1) transition probabilities to future SPEI drought classes and 2) SPEI forecasts at a generic time horizon M, both as functions of past values of SPEI and the selected teleconnection index. To this end, SPEI series at the 3, 4 and 6 aggregation time scales for the Sicily region are extracted from the Global SPEI database, SPEIbase, available at the Web repository of the Spanish National Research Council (http://sac.csic.es/spei/database.html), and averaged over the study area. In particular, SPEIbase v2.3 with spatial resolution of 0.5° lat/lon and temporal coverage between January 1901 and December 2013 is used. A preliminary correlation analysis is carried out to investigate the link between the drought index and different teleconnection patterns, namely: the North Atlantic Oscillation (NAO), the Scandinavian (SCA) and the East Atlantic-West Russia (EA-WR) patterns. Results of this analysis indicate a stronger influence of NAO on drought conditions in Sicily than of the other teleconnection indices. Then, the proposed forecasting methodology is applied, and the forecasting skill of the proposed models is quantitatively assessed through the application of a simple score approach and of performance indices. Results indicate that inclusion of the NAO index generally enhances model performance, thus confirming the suitability of the models for short- to medium-term forecasting of drought conditions.
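The multivariate-normal forecasting idea can be sketched as a conditional-mean computation: the future SPEI value is forecast as the conditional expectation of a joint normal in (future SPEI, current SPEI, teleconnection index). The covariance values below are illustrative assumptions, not parameters fitted to the SPEIbase data.

```python
# Conditional mean of a joint normal: E[y | x] = mu_y +
# Sigma_yx Sigma_xx^{-1} (x - mu_x). Here y is the future SPEI and
# x collects (current SPEI, NAO index), all standardized (zero mean).
import numpy as np

def conditional_normal_forecast(mu, cov, x_obs):
    """E[y | x] for a joint normal with y first, predictors after."""
    mu_y, mu_x = mu[0], mu[1:]
    cov_yx = cov[0, 1:]
    cov_xx = cov[1:, 1:]
    return mu_y + cov_yx @ np.linalg.solve(cov_xx, x_obs - mu_x)

mu = np.zeros(3)                     # standardized variables
cov = np.array([[1.0, 0.6, 0.3],     # toy correlations, for illustration
                [0.6, 1.0, 0.2],
                [0.3, 0.2, 1.0]])
# Observed: current SPEI = -1.2 (dry), NAO index = 0.8.
forecast = conditional_normal_forecast(mu, cov, np.array([-1.2, 0.8]))
```

Transition probabilities to drought classes would follow from the same conditional distribution by integrating it over the class thresholds.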

  3. Development of Parallel Code for the Alaska Tsunami Forecast Model

    NASA Astrophysics Data System (ADS)

    Bahng, B.; Knight, W. R.; Whitmore, P.

    2014-12-01

    The Alaska Tsunami Forecast Model (ATFM) is a numerical model used to forecast propagation and inundation of tsunamis generated by earthquakes and other means in both the Pacific and Atlantic Oceans. At the U.S. National Tsunami Warning Center (NTWC), the model is mainly used in a pre-computed fashion. That is, results for hundreds of hypothetical events are computed in advance of alerts, then accessed and calibrated against observations during tsunamis to produce forecasts immediately. ATFM uses the non-linear, depth-averaged, shallow-water equations of motion with multiply nested grids in two-way communication between the domains of each parent-child pair as waves approach coastal waters. Even with pre-computation, the task becomes non-trivial as sub-grid resolution gets finer. Currently, the finest-resolution Digital Elevation Models (DEMs) used by ATFM are at 1/3 arc-second. With a serial code, large or multiple areas of very high resolution can produce run-times that are unrealistic even in a pre-computed approach. One way to increase the model performance is code parallelization used in conjunction with a multi-processor computing environment. NTWC developers have undertaken an ATFM code-parallelization effort to streamline the creation of the pre-computed database of results, with the long-term aim of tsunami forecasts from source to high-resolution shoreline grids in real time. Parallelization will also permit timely regeneration of the forecast model database with new DEMs and will make possible future inclusion of new physics such as the non-hydrostatic treatment of tsunami propagation. The purpose of our presentation is to elaborate on the parallelization approach and to show the compute speed increase on various multi-processor systems.
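Because each hypothetical event in the pre-computed database is independent of the others, the database build is naturally parallel. The sketch below illustrates only that idea; the `run_scenario` body is a placeholder computation, not ATFM, and the worker counts are arbitrary.

```python
# Farm independent pre-computed scenario runs out across workers.
# 'run_scenario' stands in for one tsunami propagation run; the real
# model would also parallelize within each run's nested grids.
from concurrent.futures import ThreadPoolExecutor

def run_scenario(event_id):
    """Placeholder for one hypothetical-event propagation run."""
    return event_id, sum(i * i for i in range(1000))

event_ids = list(range(8))
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(run_scenario, event_ids))
```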

  4. Integrating Remote Sensing and Disease Surveillance to Forecast Malaria Epidemics

    NASA Astrophysics Data System (ADS)

    Wimberly, M. C.; Beyane, B.; DeVos, M.; Liu, Y.; Merkord, C. L.; Mihretie, A.

    2015-12-01

    Advance information about the timing and locations of malaria epidemics can facilitate the targeting of resources for prevention and emergency response. Early detection methods can detect incipient outbreaks by identifying deviations from expected seasonal patterns, whereas early warning approaches typically forecast future malaria risk based on lagged responses to meteorological factors. A critical limiting factor for implementing either of these approaches is the need for timely and consistent acquisition, processing and analysis of both environmental and epidemiological data. To address this need, we have developed EPIDEMIA - an integrated system for surveillance and forecasting of malaria epidemics. The EPIDEMIA system includes a public health interface for uploading and querying weekly surveillance reports as well as algorithms for automatically validating incoming data and updating the epidemiological surveillance database. The newly released EASTWeb 2.0 software application automatically downloads, processes, and summarizes remotely-sensed environmental data from multiple earth science data archives. EASTWeb was implemented as a component of the EPIDEMIA system, which combines the environmental monitoring data and epidemiological surveillance data into a unified database that supports both early detection and early warning models. Dynamic linear models implemented with Kalman filtering were used to carry out forecasting and model updating. Preliminary forecasts have been disseminated to public health partners in the Amhara Region of Ethiopia and will be validated and refined as the EPIDEMIA system ingests new data. In addition to continued model development and testing, future work will involve updating the public health interface to provide a broader suite of outbreak alerts and data visualization tools that are useful to our public health partners.
The EPIDEMIA system demonstrates a feasible approach to synthesizing the information from epidemiological surveillance systems and remotely-sensed environmental monitoring systems to improve malaria epidemic detection and forecasting.
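The "dynamic linear models implemented with Kalman filtering" can be illustrated with the simplest member of that family, a local-level model. This is a minimal sketch under assumed variances; the EPIDEMIA models are more elaborate, and the weekly case counts below are invented.

```python
# Local-level dynamic linear model: the latent level follows a random
# walk (variance q) and weekly case counts observe it with noise
# (variance r). One-step-ahead forecasts come from the predict step;
# the Kalman gain blends each new observation into the level estimate.
def kalman_local_level(observations, q, r, m0=0.0, c0=1.0):
    """Return one-step-ahead forecasts for a local-level DLM."""
    m, c = m0, c0
    forecasts = []
    for y in observations:
        c_pred = c + q              # predict step
        forecasts.append(m)         # one-step-ahead forecast
        gain = c_pred / (c_pred + r)
        m = m + gain * (y - m)      # update with the new observation
        c = (1.0 - gain) * c_pred
    return forecasts

weekly_cases = [10.0, 12.0, 15.0, 30.0, 28.0]   # toy surveillance data
preds = kalman_local_level(weekly_cases, q=1.0, r=4.0)
```

The same recursion supports the abstract's model updating as new surveillance reports are ingested: each week's upload simply extends the filter by one step.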

  5. Simulating Glacial Outburst Lake Releases for Suicide Basin, Mendenhall Glacier, Juneau, Alaska

    NASA Astrophysics Data System (ADS)

    Jacobs, A. B.; Moran, T.; Hood, E. W.

    2017-12-01

    Glacial lake outbursts from Suicide Basin are a recent phenomenon, first characterized in 2011. The 2014 event resulted in record river stage and moderate flooding on the Mendenhall River in Juneau. Recognizing that these events can adversely impact residential areas of Juneau's Mendenhall Valley, the Alaska-Pacific River Forecast Center developed a real-time modeling technique capable of forecasting the timing and magnitude of the flood-wave crest due to releases from Suicide Basin. The 2014 event was estimated at about 37,000 acre-feet, with water levels cresting within 36 hours from the time the flood wave reached Mendenhall Lake. Given the magnitude of possible impacts to the public, accurate hydrological forecasting is essential for public safety and emergency managers. However, the data needed to effectively forecast magnitudes of specific jökulhlaup events are limited. Estimating this event as related to river stage depended upon three variables: 1) the timing of the lag between Suicide Basin water level declines and the related rise of Mendenhall Lake, 2) continuous monitoring of Mendenhall Lake water levels, and 3) estimating the total water volume stored in Suicide Basin. Real-time modeling of the event utilized a Time of Concentration hydrograph with independent power equations representing the rising and falling limbs of the hydrograph. The model, as forecast about 24 hours prior to crest, estimated the crest within 0.5 feet of the observed value, with a timing error of about six hours later than the actual crest.
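The hydrograph construction described above, with independent power equations for the rising and falling limbs, can be sketched as follows. The exponents, stages, and crest time are illustrative assumptions, not the forecast center's calibrated values.

```python
# Toy stage hydrograph: one power law on the rising limb (t <= crest
# time), another on the falling limb, joined at the crest stage.
def stage_hydrograph(t, t_crest, crest_stage, base_stage,
                     rise_exp=2.0, fall_exp=0.5):
    """Stage (ft) at time t (h) from power-law rising/falling limbs."""
    rise = crest_stage - base_stage
    if t <= t_crest:
        return base_stage + rise * (t / t_crest) ** rise_exp
    recession = ((t - t_crest) / t_crest) ** fall_exp
    return base_stage + rise * max(0.0, 1.0 - recession)

# Stages every 12 h over a 72 h window; crest assumed at t = 36 h.
series = [stage_hydrograph(t, t_crest=36.0, crest_stage=12.0,
                           base_stage=6.0)
          for t in range(0, 73, 12)]
```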

  6. Uncertainty forecasts improve weather-related decisions and attenuate the effects of forecast error.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2012-03-01

    Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather warning system is used. The work reported here tested the relative benefits of several forecast formats, comparing decisions made with and without uncertainty forecasts. In three experiments, participants assumed the role of a manager of a road maintenance company in charge of deciding whether to pay to salt the roads and avoid a potential penalty associated with icy conditions. Participants used overnight low temperature forecasts accompanied in some conditions by uncertainty estimates and in others by decision advice comparable to categorical warnings. Results suggested that uncertainty information improved decision quality overall and increased trust in the forecast. Participants with uncertainty forecasts took appropriate precautionary action and withheld unnecessary action more often than did participants using deterministic forecasts. When error in the forecast increased, participants with conventional forecasts were reluctant to act. However, this effect was attenuated by uncertainty forecasts. Providing categorical decision advice alone did not improve decisions. However, combining decision advice with uncertainty estimates resulted in the best performance overall. The results reported here have important implications for the development of forecast formats to increase compliance with severe weather warnings as well as other domains in which one must act in the face of uncertainty. PsycINFO Database Record (c) 2012 APA, all rights reserved.

  7. WaveNet: A Web-Based Metocean Data Access, Processing and Analysis Tool; Part 5 - WW3 Database

    DTIC Science & Technology

    2015-02-01

    Program ( CDIP ); and Part 4 for the Great Lakes Observing System/Coastal Forecasting System (GLOS/GLCFS). Using step-by-step instructions, this Part 5...Demirbilek, Z., L. Lin, and D. Wilson. 2014a. WaveNet: A web-based metocean data access, processing, and analysis tool; part 3– CDIP database

  8. Medium- and long-term electric power demand forecasting based on the big data of smart city

    NASA Astrophysics Data System (ADS)

    Wei, Zhanmeng; Li, Xiyuan; Li, Xizhong; Hu, Qinghe; Zhang, Haiyang; Cui, Pengjie

    2017-08-01

    Based on the smart city, this paper proposes a new electric power demand forecasting model, which integrates external data such as meteorological, geographic, population, enterprise and economic information into a big-data repository, and uses an improved algorithm to analyse electric power demand and provide decision support for decision makers. Data mining techniques are used to synthesize these kinds of information and to analyse electric power customer data. Forecasts are made based on the trend of electricity demand, with a smart city in north-eastern China taken as a case study.

  9. Disruption Event Characterization and Forecasting in Tokamaks

    NASA Astrophysics Data System (ADS)

    Berkery, J. W.; Sabbagh, S. A.; Park, Y. S.; Ahn, J. H.; Jiang, Y.; Riquezes, J. D.; Gerhardt, S. P.; Myers, C. E.

    2017-10-01

    The Disruption Event Characterization and Forecasting (DECAF) code, being developed to meet the challenging goal of high reliability disruption prediction in tokamaks, automates data analysis to determine chains of events that lead to disruptions and to forecast their evolution. The relative timing of magnetohydrodynamic modes and other events including plasma vertical displacement, loss of boundary control, proximity to density limits, reduction of safety factor, and mismatch of the measured and desired plasma current are considered. NSTX/-U databases are examined with analysis expanding to DIII-D, KSTAR, and TCV. Characterization of tearing modes has determined mode bifurcation frequency and locking points. In an NSTX database exhibiting unstable resistive wall modes (RWM), the RWM event and loss of boundary control event were found in 100%, and the vertical displacement event in over 90% of cases. A reduced kinetic RWM stability physics model is evaluated to determine the proximity of discharges to marginal stability. The model shows high success as a disruption predictor (greater than 85%) with relatively low false positive rate. Supported by US DOE Contracts DE-FG02-99ER54524, DE-AC02-09CH11466, and DE-SC0016614.

  10. Forecasting database for the tsunami warning regional center for the western Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Gailler, A.; Hebert, H.; Loevenbruck, A.; Hernandez, B.

    2010-12-01

    Improvements in the availability of sea-level observations and advances in numerical modeling techniques are increasing the potential for tsunami warnings to be based on numerical model forecasts. Numerical tsunami propagation and inundation models are well developed, but they present a challenge to run in real-time, partly due to computational limitations and also to a lack of detailed knowledge on the earthquake rupture parameters. Through the establishment of the tsunami warning regional center for the NE Atlantic and western Mediterranean Sea, the CEA is in particular in charge of rapidly providing a map, with uncertainties, showing zones in the main axis of energy at the Mediterranean scale. The strategy is based initially on a pre-computed tsunami scenarios database, as source parameters available a short time after an earthquake occurs are preliminary and may be somewhat inaccurate. Existing numerical models are good enough to provide useful guidance for warning structures to be disseminated quickly. When an event occurs, an appropriate variety of offshore tsunami propagation scenarios may be recalled through an automatic interface by combining pre-computed propagation solutions (single or multiple sources). This approach would provide quick estimates of tsunami offshore propagation, and aid hazard assessment and evacuation decision-making. As numerical model accuracy is inherently limited by errors in bathymetry and topography, and as inundation map calculation is more complex and expensive in terms of computational time, only tsunami offshore propagation modeling will be included in the forecasting database, using a single sparse bathymetric computation grid for the numerical modeling. Because of the large variability in the mechanisms of tsunamigenic earthquakes, not all possible magnitudes can be represented in the scenarios database.
In principle, an infinite number of tsunami propagation scenarios can be constructed by linear combinations of a finite number of pre-computed unit scenarios. The whole notion of a pre-computed forecasting database also requires a historical earthquake and tsunami database, as well as an up-to-date seismotectonic database including faults geometry and a zonation based on seismotectonic synthesis of source zones and tsunamigenic faults. Our forecast strategy is thus based on a unit source function methodology, whereby the model runs are combined and scaled linearly to produce any composite tsunamis propagation solution. Each unit source function is equivalent to a tsunami generated by a Mo 1.75E+19 N.m earthquake (Mw ~6.8) with a rectangular fault 25 km by 20 km in size and 1 m in slip. The faults of the unit functions are placed adjacent to each other, following the discretization of the main seismogenic faults bounding the western Mediterranean basin. The number of unit functions involved varies with the magnitude of the wanted composite solution and the combined waveheights are multiplied by a given scaling factor to produce the new arbitrary scenario. Some test-cases examples are presented (e.g., Boumerdès 2003 [Algeria, Mw 6.8], Djijel 1856 [Algeria, Mw 7.2], Ligure 1887 [Italia, Mw 6.5-6.7]).
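The unit-source composition described above reduces to summing pre-computed waveforms and applying a scaling factor. The sketch below illustrates that linearity; the waveform values and the scaling factor are toy numbers, not outputs of the CEA database.

```python
# Compose an arbitrary scenario from pre-computed unit-source time
# series: sum the adjacent unit sources covering the rupture, then
# scale linearly to the desired slip/moment.
def compose_scenario(unit_waveforms, scale):
    """Scaled linear combination of pre-computed unit-source series."""
    combined = [sum(vals) for vals in zip(*unit_waveforms)]
    return [scale * v for v in combined]

# Two adjacent 25 km x 20 km unit sources (toy offshore wave heights, m,
# each corresponding to a 1 m slip, Mw ~6.8 rupture).
unit_a = [0.00, 0.05, 0.12, 0.08, 0.02]
unit_b = [0.00, 0.03, 0.10, 0.09, 0.04]
# Scale factor 2.0 mimics a composite event with twice the unit slip.
scenario = compose_scenario([unit_a, unit_b], scale=2.0)
```

Linearity of offshore propagation is what makes this valid: doubling the slip doubles the wave heights, and adjacent ruptures superpose, so a finite set of unit functions spans a continuum of scenarios.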

  11. 42 CFR 436.407 - Types of acceptable documentary evidence of citizenship.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... (including general education and high school equivalency diplomas), marriage certificates, divorce decrees... perjury by a residential care facility director or administrator on behalf of an institutionalized... State database subsequent changes in eligibility should not require repeating the documentation of...

  12. 42 CFR 436.407 - Types of acceptable documentary evidence of citizenship.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... (including general education and high school equivalency diplomas), marriage certificates, divorce decrees... perjury by a residential care facility director or administrator on behalf of an institutionalized... State database subsequent changes in eligibility should not require repeating the documentation of...

  13. Multiple Solutions of Real-time Tsunami Forecasting Using Short-term Inundation Forecasting for Tsunamis Tool

    NASA Astrophysics Data System (ADS)

    Gica, E.

    2016-12-01

    The Short-term Inundation Forecasting for Tsunamis (SIFT) tool, developed by NOAA Center for Tsunami Research (NCTR) at the Pacific Marine Environmental Laboratory (PMEL), is used in forecast operations at the Tsunami Warning Centers in Alaska and Hawaii. The SIFT tool relies on a pre-computed tsunami propagation database, real-time DART buoy data, and an inversion algorithm to define the tsunami source. The tsunami propagation database is composed of 50 km × 100 km unit sources, simulated basin-wide for at least 24 hours. Different combinations of unit sources, DART buoys, and length of real-time DART buoy data can generate a wide range of results within the defined tsunami source. For an inexperienced SIFT user, the primary challenge is to determine which solution, among multiple solutions for a single tsunami event, would provide the best forecast in real time. This study investigates how the use of different tsunami sources affects simulated tsunamis at tide gauge locations. Using the tide gauge at Hilo, Hawaii, a total of 50 possible solutions for the 2011 Tohoku tsunami are considered. Maximum tsunami wave amplitude and root mean square error results are used to compare tide gauge data and the simulated tsunami time series. Results of this study will facilitate SIFT users' efforts to determine if the simulated tide gauge tsunami time series from a specific tsunami source solution would be within the range of possible solutions. This study will serve as the basis for investigating more historical tsunami events and tide gauge locations.
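The comparison step above, scoring each candidate source solution against tide-gauge observations by root mean square error, can be sketched simply. The gauge and candidate series below are toy values, not SIFT output or Hilo data.

```python
# Rank candidate source solutions by RMSE between their simulated
# tide-gauge series and the observed series; lowest RMSE ranks first.
import math

def rmse(observed, simulated):
    """Root mean square error between two equal-length series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated))
                     / len(observed))

def rank_solutions(observed, candidates):
    """Candidate names ordered from best (lowest RMSE) to worst."""
    return sorted(candidates, key=lambda name: rmse(observed, candidates[name]))

gauge = [0.0, 0.4, 1.2, 0.7, 0.1]          # toy observed amplitudes (m)
candidates = {
    "solution_a": [0.0, 0.5, 1.1, 0.6, 0.2],
    "solution_b": [0.0, 0.1, 0.5, 0.3, 0.0],
}
ranking = rank_solutions(gauge, candidates)
```

A maximum-amplitude error could be added as a second criterion, mirroring the two metrics the study compares.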

  14. Communicating weather forecast uncertainty: Do individual differences matter?

    PubMed

    Grounds, Margaret A; Joslyn, Susan L

    2018-03-01

    Research suggests that people make better weather-related decisions when they are given numeric probabilities for critical outcomes (Joslyn & Leclerc, 2012, 2013). However, it is unclear whether all users can take advantage of probabilistic forecasts to the same extent. The research reported here assessed key cognitive and demographic factors to determine their relationship to the use of probabilistic forecasts to improve decision quality. In two studies, participants decided between spending resources to prevent icy conditions on roadways or risk a larger penalty when freezing temperatures occurred. Several forecast formats were tested, including a control condition with the night-time low temperature alone and experimental conditions that also included the probability of freezing and advice based on expected value. All but those with extremely low numeracy scores made better decisions with probabilistic forecasts. Importantly, no groups made worse decisions when probabilities were included. Moreover, numeracy was the best predictor of decision quality, regardless of forecast format, suggesting that the advantage may extend beyond understanding the forecast to general decision strategy issues. This research adds to a growing body of evidence that numerical uncertainty estimates may be an effective way to communicate weather danger to general public end users. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  15. Comparative assessment of several post-processing methods for correcting evapotranspiration forecasts derived from TIGGE datasets.

    NASA Astrophysics Data System (ADS)

    Tian, D.; Medina, H.

    2017-12-01

    Post-processing of medium range reference evapotranspiration (ETo) forecasts based on numerical weather prediction (NWP) models has the potential of improving the quality and utility of these forecasts. This work compares the performance of several post-processing methods for correcting ETo forecasts over the continental U.S. generated from The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) database using data from Europe (EC), the United Kingdom (MO), and the United States (NCEP). The post-processing techniques considered are simple bias correction, the use of multimodels, Ensemble Model Output Statistics (EMOS; Gneiting et al., 2005) and Bayesian Model Averaging (BMA; Raftery et al., 2005). ETo estimates based on quality-controlled U.S. Regional Climate Reference Network measurements, and computed with the FAO 56 Penman Monteith equation, are adopted as baseline. EMOS and BMA are generally the most efficient post-processing techniques of the ETo forecasts. Nevertheless, the simple bias correction of the best model is commonly much more rewarding than using multimodel raw forecasts. Our results demonstrate the potential of different forecasting and post-processing frameworks in operational evapotranspiration and irrigation advisory systems at national scale.
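The "simple bias correction" baseline mentioned above can be sketched as removing the mean forecast-minus-observation bias estimated on a training window. The ETo values below are toy numbers (mm/day), not the study's data, and this stands in for neither its EMOS nor its BMA implementations.

```python
# Additive bias correction: estimate the mean bias on a training
# period, then subtract it from new raw forecasts.
def bias_correct(train_forecasts, train_observations, new_forecasts):
    """Remove the mean additive bias learned from a training period."""
    bias = (sum(f - o for f, o in zip(train_forecasts, train_observations))
            / len(train_forecasts))
    return [f - bias for f in new_forecasts]

raw = [5.0, 5.4, 4.8]   # training-period ETo forecasts (mm/day)
obs = [4.5, 4.9, 4.4]   # matching reference ETo estimates (mm/day)
corrected = bias_correct(raw, obs, new_forecasts=[5.2, 6.0])
```

EMOS and BMA go further by also calibrating the forecast spread, which is why they generally outperform this additive adjustment.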

  16. 42 CFR 435.407 - Types of acceptable documentary evidence of citizenship.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... accredited institutions (including general education and high school equivalency diplomas), marriage... under penalty of perjury by a residential care facility director or administrator on behalf of an... documented and recorded in a State database subsequent changes in eligibility should not require repeating...

  17. 42 CFR 435.407 - Types of acceptable documentary evidence of citizenship.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... accredited institutions (including general education and high school equivalency diplomas), marriage... under penalty of perjury by a residential care facility director or administrator on behalf of an... documented and recorded in a State database subsequent changes in eligibility should not require repeating...

  18. An evaluation of the impact of flooring types on exposures to fine and coarse particles within the residential micro-environment using CONTAM.

    PubMed

    Bramwell, Lisa; Qian, Jing; Howard-Reed, Cynthia; Mondal, Sumona; Ferro, Andrea R

    2016-01-01

    Typical resuspension activities within the home, such as walking, have been estimated to contribute up to 25% of personal exposures to PM10. Chamber studies have shown that for moderate walking intensities, flooring type can impact the rate at which particles are re-entrained into the air. For this study, the impact of residential flooring type on incremental average daily (24 h) time-averaged exposure was investigated. Distributions of incremental time-averaged daily exposures to fine and coarse PM while walking within the residential micro-environment were predicted using CONTAM, the multizone airflow and contaminant transport program of the National Institute of Standards and Technology. Knowledge of when and where a person was walking was determined by randomly selecting 490 daily diaries from the EPA's consolidated human activity database (CHAD). On the basis of the results of this study, residential flooring type can significantly impact incremental time-averaged daily exposures to coarse and fine particles (α=0.05, P<0.05, N=490, Kruskal-Wallis test) with high-density cut pile carpeting resulting in the highest exposures. From this study, resuspension from walking within the residential micro-environment contributed 6-72% of time-averaged daily exposures to PM10.
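    The significance test reported here (Kruskal-Wallis across flooring types) can be sketched as follows; the exposure values and group sizes are hypothetical, and the H statistic is computed without tie correction:

```python
import numpy as np

def kruskal_h(*groups):
    """Kruskal-Wallis H statistic (no tie correction), computed from ranks."""
    data = np.concatenate(groups)
    ranks = np.empty(len(data))
    ranks[np.argsort(data)] = np.arange(1, len(data) + 1)
    n = len(data)
    h, start = 0.0, 0
    for g in groups:
        r = ranks[start:start + len(g)]
        h += r.sum() ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# Hypothetical incremental daily PM10 exposures (µg/m³) by flooring type
carpet_hd = [12.1, 14.3, 13.8, 15.2, 12.9]  # high-density cut pile
carpet_lp = [8.4, 9.1, 7.8, 9.9, 8.7]       # low pile
hard      = [5.2, 4.8, 6.1, 5.5, 4.9]       # hard flooring

h = kruskal_h(carpet_hd, carpet_lp, hard)
significant = h > 5.991  # χ² critical value, df = 2, α = 0.05
```

    The study's version of this test ran over N=490 simulated daily exposures rather than toy samples like these.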

  19. Identifying water price and population criteria for meeting future urban water demand targets

    NASA Astrophysics Data System (ADS)

    Ashoori, Negin; Dzombak, David A.; Small, Mitchell J.

    2017-12-01

    Predictive models for urban water demand can help identify the set of factors that must be satisfied in order to meet future targets for water demand. Some of the explanatory variables used in such models, such as service area population and changing temperature and rainfall rates, are outside the immediate control of water planners and managers. Others, such as water pricing and the intensity of voluntary water conservation efforts, are subject to decisions and programs implemented by the water utility. In order to understand this relationship, a multiple regression model fit to 44 years of monthly demand data (1970-2014) for Los Angeles, California was applied to predict possible future demand through 2050 under alternative scenarios for the explanatory variables: population, price, voluntary conservation efforts, and temperature and precipitation outcomes predicted by four global climate models with two CO2 emission scenarios. Future residential water demand in Los Angeles is projected to be largely driven by price and population rather than climate change and conservation. A median projection for the year 2050 indicates that residential water demand in Los Angeles will increase by approximately 36 percent, to a level of 620 million m3 per year. Monte Carlo simulations of the fitted model for water demand were then used to find the set of future conditions under which water demand is predicted to be above or below the Los Angeles Department of Water and Power 2035 goal to reduce residential water demand by 25%. Results indicate that price increases alone cannot ensure that the 2035 water demand target is met when population increases. Los Angeles must rely on furthering its conservation initiatives, increasing its use of stormwater capture and recycled water, and expanding its groundwater storage. The forecasting approach developed in this study can be utilized by other cities to understand the future of water demand in water-stressed areas. Improving water demand forecasts will help planners understand and optimize future investments in water supply infrastructure and related programs.
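    A minimal sketch of this approach, fitting a demand regression and running Monte Carlo scenarios over it; the predictors, coefficients, and target below are synthetic stand-ins, not the Los Angeles model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly model: demand = b0 + b1*population + b2*log(price)
n = 120
pop = rng.uniform(3.0, 4.0, n)            # millions of residents
price = rng.uniform(2.0, 6.0, n)          # $/m³
demand = 50 + 40 * pop - 15 * np.log(price) + rng.normal(0, 1, n)

# Ordinary least squares fit of the regression coefficients
X = np.column_stack([np.ones(n), pop, np.log(price)])
beta, *_ = np.linalg.lstsq(X, demand, rcond=None)

# Monte Carlo over future scenarios: higher population, higher prices
pop_f = rng.uniform(4.2, 4.8, 10_000)
price_f = rng.uniform(5.0, 9.0, 10_000)
sims = beta[0] + beta[1] * pop_f + beta[2] * np.log(price_f)

# Share of scenarios meeting a hypothetical 25%-reduction target
p_below_target = np.mean(sims < 0.75 * np.median(demand))
```

    With these invented numbers the population effect dominates the price effect, so almost no scenario meets the reduction target, mirroring the paper's qualitative finding.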

  20. Multi-Scale Enviro-HIRLAM Forecasting of Weather and Atmospheric Composition over China and its Megacities

    NASA Astrophysics Data System (ADS)

    Mahura, Alexander; Amstrup, Bjarne; Nuterman, Roman; Yang, Xiaohua; Baklanov, Alexander

    2017-04-01

    Air pollution is a serious problem in different regions of China and its continuously growing megacities. Information on air quality, especially in urbanized areas, is important for decision making, emergency response, and the public. In particular, the metropolitan areas of Shanghai, Beijing, and the Pearl River Delta are well known as the main regions with serious air pollution problems. The on-line integrated meteorology-chemistry-aerosols Enviro-HIRLAM (Environment - HIgh Resolution Limited Area Model) model, adapted for China and selected megacities, is applied for forecasting of weather and atmospheric composition (with a focus on aerosols). The model system runs in a downscaling chain from regional to urban scales at subsequent horizontal resolutions of 15-5-2.5 km. The model setup also includes the urban Building Effects Parameterization module, describing different types of urban districts (industrial, commercial, city center, high density, and residential), each with its own morphological and aerodynamical characteristics. The effects of urbanization are important for atmospheric transport, dispersion, deposition, and chemical transformations, in addition to better-quality emission inventories for China and selected urban areas. The Enviro-HIRLAM system provides meteorology and air quality forecasts at regional-subregional-urban scales (China - East China - selected megacities). In particular, such forecasting is important for metropolitan areas, where the formation and development of meteorological and chemical/aerosol patterns are especially complex. It also provides information for evaluating the impact on selected megacities of China as well as for investigating the relationship between air pollution and meteorology.

  1. Systems modeling and analysis for Saudi Arabian electric power requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mohawes, N.A.

    This thesis addresses the long-range generation planning problem in Saudi Arabia up to the year 2000. The first part presents various models for electric energy consumption in the residential and industrial sectors. These models can be used by decision makers for the purposes of policy analysis, evaluation, and forecasting. Forecasts of energy consumption in each sector are obtained from two different models per sector, based on two forecasting techniques: (1) Hybrid econometric/time series model. The idea of adaptive smoothing was utilized to produce forecasts under several scenarios. (2) Box-Jenkins time series technique. Box-Jenkins models and forecasts are developed for the monthly number of electric consumers and the monthly energy consumption per consumer. The results obtained indicate that high energy consumption is expected during the coming two decades, which necessitates serious energy assessment and optimization. Optimization of a mix of energy sources was considered using the group multiattribute utility (MAU) function. The results of MAU for three classes of decision makers (managerial, technical, and consumers) are developed through personal interactions. The computer package WASP was also used to develop a tentative optimum plan. According to this plan, four heavy-water nuclear power plants (800 MW) and four light-water nuclear power plants (1200 MW) have to be introduced by the year 2000 in addition to sixteen oil-fired power plants (400 MW) and nine gas turbines (100 MW).
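    The Box-Jenkins step can be illustrated with a reduced ARIMA(1,1,0)-style fit on a synthetic monthly series; this is a sketch of the idea, not the thesis's fitted models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly energy-per-consumer series with a linear trend plus noise
y = 100 + 0.5 * np.arange(200) + rng.normal(0, 1, 200)

# Difference once to remove the trend (the "I" in ARIMA(1,1,0))
dy = np.diff(y)

# Fit dy_t = c + phi * dy_{t-1} by least squares (the AR(1) part)
X = np.column_stack([np.ones(len(dy) - 1), dy[:-1]])
c, phi = np.linalg.lstsq(X, dy[1:], rcond=None)[0]

# Iterate the fitted model forward 12 months and re-integrate to levels
last, level = dy[-1], y[-1]
forecast = []
for _ in range(12):
    last = c + phi * last
    level += last
    forecast.append(level)
```

    A full Box-Jenkins analysis would also use the ACF/PACF to choose the model order and would diagnose the residuals, steps omitted in this sketch.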

  2. An Exploration of the Relationship between Improvements in Energy Efficiency and Life-Cycle Energy and Carbon Emissions using the BIRDS Low-Energy Residential Database.

    PubMed

    Kneifel, Joshua; O'Rear, Eric; Webb, David; O'Fallon, Cheyney

    2018-02-01

    To conduct a more complete analysis of low-energy and net-zero energy buildings that considers both the operating and embodied energy/emissions, members of the building community look to life-cycle assessment (LCA) methods. This paper examines differences in the relative impacts of cost-optimal energy efficiency measure combinations depicting residential buildings up to and beyond net-zero energy consumption on operating and embodied flows using data from the Building Industry Reporting and Design for Sustainability (BIRDS) Low-Energy Residential Database. Results indicate that net-zero performance leads to a large increase in embodied flows (over 40%) that offsets some of the reductions in operational flows, but overall life-cycle flows are still reduced by over 60% relative to the state energy code. Overall, building designs beyond net-zero performance can partially offset embodied flows with negative operational flows by replacing traditional electricity generation with solar production, but would require an additional 8.34 kW (18.54 kW in total) of due south facing solar PV to reach net-zero total life-cycle flows. Such a system would meet over 239% of operational consumption of the most energy efficient design considered in this study and over 116% of a state code-compliant building design in its initial year of operation.

  3. Assessing development pressure in the Chesapeake Bay watershed: An evaluation of two land-use change models

    USGS Publications Warehouse

    Claggett, Peter; Jantz, Claire A.; Goetz, S.J.; Bisland, C.

    2004-01-01

    Natural resource lands in the Chesapeake Bay watershed are increasingly susceptible to conversion into developed land uses, particularly as the demand for residential development grows. We assessed development pressure in the Baltimore-Washington, DC region, one of the major urban and suburban centers in the watershed. We explored the utility of two modeling approaches for forecasting future development trends and patterns by comparing results from a cellular automata model, SLEUTH (slope, land use, excluded land, urban extent, transportation), and a supply/demand/allocation model, the Western Futures Model. SLEUTH can be classified as a land-cover change model and produces projections on the basis of historic trends of changes in the extent and patterns of developed land and future land protection scenarios. The Western Futures Model derives forecasts from historic trends in housing units, a U.S. Census variable, and exogenously supplied future population projections. Each approach has strengths and weaknesses, and combining the two has advantages and limitations. © 2004 Kluwer Academic Publishers.

  4. A statistical inference for concentrations of benzo[a]pyrene partially measured in the ambient air of an industrial city in Korea

    NASA Astrophysics Data System (ADS)

    Kim, Yongku; Seo, Young-Kyo; Baek, Sung-Ok

    2013-12-01

    Although large quantities of air pollutants are released into the atmosphere, they are partially monitored and routinely assessed for their health implications. This paper proposes a statistical model describing the temporal behavior of hazardous air pollutants (HAPs), which can have negative effects on human health. Benzo[a]pyrene (BaP) is selected for statistical modeling. The proposed model incorporates the linkage between BaP and meteorology and is specifically formulated to identify meteorological effects and allow for seasonal trends. The model is used to estimate and forecast temporal fields of BaP conditional on observed (or forecasted) meteorological conditions, including temperature, precipitation, wind speed, and air quality. The effects of BaP on human health are examined by characterizing health indicators, namely the cancer risk and the hazard quotient. The model provides useful information for the optimal monitoring period and projection of future BaP concentrations for both industrial and residential areas in Korea.

  5. Assessing development pressure in the Chesapeake Bay watershed: an evaluation of two land-use change models.

    PubMed

    Claggett, Peter R; Jantz, Claire A; Goetz, Scott J; Bisland, Carin

    2004-06-01

    Natural resource lands in the Chesapeake Bay watershed are increasingly susceptible to conversion into developed land uses, particularly as the demand for residential development grows. We assessed development pressure in the Baltimore-Washington, DC region, one of the major urban and suburban centers in the watershed. We explored the utility of two modeling approaches for forecasting future development trends and patterns by comparing results from a cellular automata model, SLEUTH (slope, land use, excluded land, urban extent, transportation), and a supply/demand/allocation model, the Western Futures Model. SLEUTH can be classified as a land-cover change model and produces projections on the basis of historic trends of changes in the extent and patterns of developed land and future land protection scenarios. The Western Futures Model derives forecasts from historic trends in housing units, a U.S. Census variable, and exogenously supplied future population projections. Each approach has strengths and weaknesses, and combining the two has advantages and limitations.

  6. The mineral content of tap water in United States households

    USDA-ARS?s Scientific Manuscript database

    The composition of tap water contributes to dietary intake of minerals. The USDA’s Nutrient Data Laboratory (NDL) conducted a study of the mineral content of residential tap water, to generate current data for the USDA National Nutrient Database. Sodium, potassium, calcium, magnesium, iron, copper...

  7. Energy supply and demand modeling. (Latest citations from the NTIS data base). Published Search

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1992-10-01

    The bibliography contains citations concerning the use of mathematical models in trend analysis and forecasting of energy supply and demand factors. Models are presented for the industrial, transportation, and residential sectors. Aspects of long term energy strategies and markets are discussed at the global, national, state, and regional levels. Energy demand and pricing, and econometrics of energy, are explored for electric utilities and natural resources, such as coal, oil, and natural gas. Energy resources are modeled both for fuel usage and for reserves. (Contains 250 citations and includes a subject term index and title list.)

  8. Global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

    We develop a global database of building inventories using taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

  9. Providing the Fire Risk Map in Forest Area Using a Geographically Weighted Regression Model with Gaussian Kernel and MODIS Images, a Case Study: Golestan Province

    NASA Astrophysics Data System (ADS)

    Shah-Heydari pour, A.; Pahlavani, P.; Bigdeli, B.

    2017-09-01

    With the industrialization of cities and the marked increase in pollutants and greenhouse gases, the importance of forests as the natural lungs of the earth in cleaning these pollutants is felt more than ever. Annually, a large part of the forests is destroyed due to the lack of timely action during fires. Knowing which areas are at high risk of fire, and equipping those areas by constructing access routes and allocating fire-fighting equipment, can help prevent the destruction of the forest. In this research, the fire risk of the region was forecast and a risk map was produced from MODIS images by applying a geographically weighted regression model with a Gaussian kernel, and ordinary least squares, to the parameters affecting forest fire: distance from residential areas, distance from the river, distance from the road, height, slope, aspect, soil type, land use, average temperature, wind speed, and rainfall. After evaluation, the geographically weighted regression model with a Gaussian kernel correctly forecast 93.4% of all fire points, whereas the ordinary least squares method correctly forecast only 66% of the fire points.
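    The contrast between the two estimators can be sketched with a toy geographically weighted regression: each location gets its own weighted least-squares fit under a Gaussian kernel. The data and the single predictor (distance to residential areas) below are synthetic:

```python
import numpy as np

def gaussian_weights(coords, point, bandwidth):
    """Gaussian kernel weights w_i = exp(-0.5 * (d_i / bandwidth)^2),
    where d_i is the distance from observation i to the regression point."""
    d = np.linalg.norm(coords - point, axis=1)
    return np.exp(-0.5 * (d / bandwidth) ** 2)

def gwr_fit(coords, X, y, point, bandwidth):
    """Weighted least squares at one location (one row of the GWR surface)."""
    w = np.sqrt(gaussian_weights(coords, point, bandwidth))[:, None]
    beta, *_ = np.linalg.lstsq(w * X, w[:, 0] * y, rcond=None)
    return beta

# Synthetic fire-risk data: the effect of distance-to-residential-areas on risk
# varies across space, which OLS (a single global fit) cannot capture.
rng = np.random.default_rng(2)
coords = rng.uniform(0, 10, size=(200, 2))
dist_res = rng.uniform(0, 5, 200)
slope = -1.0 - 0.2 * coords[:, 0]          # spatially varying true effect
risk = 10 + slope * dist_res + rng.normal(0, 0.2, 200)

X = np.column_stack([np.ones(200), dist_res])
b_west = gwr_fit(coords, X, risk, np.array([1.0, 5.0]), bandwidth=2.0)
b_east = gwr_fit(coords, X, risk, np.array([9.0, 5.0]), bandwidth=2.0)
```

    The two local slopes differ, recovering the spatial variation; a global OLS fit would average them into one coefficient, which is the kind of misfit the paper's accuracy gap (93.4% vs. 66%) reflects.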

  10. A scoping review of malaria forecasting: past work and future directions

    PubMed Central

    Zinszer, Kate; Verma, Aman D; Charland, Katia; Brewer, Timothy F; Brownstein, John S; Sun, Zhuoyu; Buckeridge, David L

    2012-01-01

    Objectives There is a growing body of literature on malaria forecasting methods and the objective of our review is to identify and assess methods, including predictors, used to forecast malaria. Design Scoping review. Two independent reviewers searched information sources, assessed studies for inclusion and extracted data from each study. Information sources Search strategies were developed and the following databases were searched: CAB Abstracts, EMBASE, Global Health, MEDLINE, ProQuest Dissertations & Theses and Web of Science. Key journals and websites were also manually searched. Eligibility criteria for included studies We included studies that forecasted incidence, prevalence or epidemics of malaria over time. A description of the forecasting model and an assessment of the forecast accuracy of the model were requirements for inclusion. Studies were restricted to human populations and to autochthonous transmission settings. Results We identified 29 different studies that met our inclusion criteria for this review. The forecasting approaches included statistical modelling, mathematical modelling and machine learning methods. Climate-related predictors were used consistently in forecasting models, with the most common predictors being rainfall, relative humidity, temperature and the normalised difference vegetation index. Model evaluation was typically based on a reserved portion of data and accuracy was measured in a variety of ways including mean-squared error and correlation coefficients. We could not compare the forecast accuracy of models from the different studies as the evaluation measures differed across the studies. 
Conclusions Applying different forecasting methods to the same data, exploring the predictive ability of non-environmental variables, including transmission reducing interventions and using common forecast accuracy measures will allow malaria researchers to compare and improve models and methods, which should improve the quality of malaria forecasting. PMID:23180505

  11. Atlantic Hurricane Activity: 1851-1900

    NASA Astrophysics Data System (ADS)

    Landsea, C. W.

    2001-12-01

    This presentation reports on the second year's work of a three-year project to re-analyze the North Atlantic hurricane database (or HURDAT). The original database of six-hourly positions and intensities was put together in the 1960s in support of the Apollo space program to help provide statistical track forecast guidance. In the intervening years, this database - which is now freely and easily accessible on the Internet from the National Hurricane Center's (NHC's) Webpage - has been utilized for a wide variety of uses: climatic change studies, seasonal forecasting, risk assessment for county emergency managers, analysis of potential losses for insurance and business interests, intensity forecasting techniques and verification of official and various model predictions of track and intensity. Unfortunately, HURDAT was not designed with all of these uses in mind when it was first put together and not all of them may be appropriate given its original motivation. One problem with HURDAT is that there are numerous systematic as well as some random errors in the database which need correction. Additionally, analysis techniques have changed over the years at NHC as our understanding of tropical cyclones has developed, leading to biases in the historical database that have not been addressed. Another difficulty in applying the hurricane database to studies concerned with landfalling events is the lack of exact location, time and intensity at hurricane landfall. Finally, recent efforts into uncovering undocumented historical hurricanes in the late 1800s and early 1900s led by Jose Fernandez-Partagas have greatly increased our knowledge of these past events, which are not yet incorporated into the HURDAT database. Because of all of these issues, a re-analysis of the Atlantic hurricane database is being attempted that will be completed in three years.
    As part of the re-analyses, three files will be made available: (1) the revised Atlantic HURDAT (with six-hourly intensities and positions); (2) a HURDAT meta-file, a text file with detailed information about each suggested change proposed in the revised HURDAT; and (3) a "center fix" file, composed of actual observations of tropical cyclone positions and intensity estimates from the following platforms: aircraft, satellite, radar, and synoptic. All changes made to HURDAT will be approved by an NHC committee, as this database is officially maintained by them. At the conference, results will be shown, including a revised climatology of U.S. hurricane strikes back to 1851. http://www.aoml.noaa.gov/hrd/hurdat/index.html

  12. Design of online monitoring and forecasting system for electrical equipment temperature of prefabricated substation based on WSN

    NASA Astrophysics Data System (ADS)

    Qi, Weiran; Miao, Hongxia; Miao, Xuejiao; Xiao, Xuanxuan; Yan, Kuo

    2016-10-01

    In order to ensure the safe and stable operation of prefabricated substations, a temperature sensing subsystem, a remote temperature monitoring and management subsystem, and a forecast subsystem are designed in this paper. The wireless temperature sensing subsystem, which consists of a temperature sensor and an MCU, sends the electrical equipment temperature to the remote monitoring center over a wireless sensor network. The remote monitoring center realizes remote monitoring and prediction through the monitoring and management subsystem and the forecast subsystem. Real-time monitoring of power equipment temperature, historical data queries, user management, password settings, etc., are provided by the monitoring and management subsystem. In the temperature forecast subsystem, the chaotic character of the temperature data is first verified and the phase space is reconstructed. Then Support Vector Machine - Particle Swarm Optimization (SVM-PSO) is used to predict the temperature of the power equipment in prefabricated substations. Simulation results show that, compared with traditional methods, SVM-PSO has higher prediction accuracy.
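    The phase-space reconstruction step can be sketched as a time-delay embedding; a nearest-neighbour analogue predictor stands in below for the paper's SVM-PSO regressor, and the temperature series is synthetic:

```python
import numpy as np

def embed(series, dim, tau):
    """Time-delay (Takens) phase-space reconstruction of a scalar series."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

# Hypothetical equipment-temperature series (°C): periodic load cycle plus noise
t = np.arange(500)
temp = 40 + 5 * np.sin(2 * np.pi * t / 50)
temp = temp + 0.1 * np.random.default_rng(3).normal(size=500)

dim, tau = 3, 5
states_all = embed(temp, dim, tau)                 # reconstructed state vectors
X_hist = states_all[:-1]                           # states with a known successor
y_hist = temp[(dim - 1) * tau + 1 :]               # the successor of each state

# Analogue forecast: find the most similar past state, predict its successor
query = states_all[-1]                             # latest state; successor unknown
nearest = np.argmin(np.linalg.norm(X_hist - query, axis=1))
prediction = y_hist[nearest]
```

    In the paper, an SVM regressor (with PSO tuning its hyperparameters) is trained on the (state, successor) pairs instead of this simple nearest-neighbour lookup.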

  13. A comparison of observed and forecast energetics over North America

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Brin, Y.

    1985-01-01

    The observed kinetic energy balance is calculated over North America and compared with that computed from forecast fields for the 13-15 January 1979 cyclone. The FGGE upper-air rawinsonde network serves as the observational database while the forecast energetics are derived from a numerical integration with the GLAS fourth-order general circulation model initialized at 00 GMT 13 January. Maps of the observed and predicted kinetic energy and eddy conversion are in good qualitative agreement, although the model eddy conversion tends to be 2 to 3 times stronger than the observed values. Both the forecast and observations exhibit the lower and upper tropospheric maxima in vertical profiles of kinetic energy generation and dissipation typically found in cyclonic disturbances. An interesting time lag is noted in the observational analysis with the maximum observed kinetic energy occurring 12 h later than the maximum eddy conversion over the same region.

  14. MAG4 Versus Alternative Techniques for Forecasting Active-Region Flare Productivity

    NASA Technical Reports Server (NTRS)

    Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor

    2014-01-01

    MAG4 is a technique of forecasting an active region's rate of production of major flares in the coming few days from a free-magnetic-energy proxy. We present a statistical method of measuring the difference in performance between MAG4 and comparable alternative techniques that forecast an active region's major-flare productivity from alternative observed aspects of the active region. We demonstrate the method by measuring the difference in performance between the "Present MAG4" technique and each of three alternative techniques, called "McIntosh Active-Region Class," "Total Magnetic Flux," and "Next MAG4." We do this by using (1) the MAG4 database of magnetograms and major-flare histories of sunspot active regions, (2) the NOAA table of the major-flare productivity of each of 60 McIntosh active-region classes of sunspot active regions, and (3) five technique-performance metrics (Heidke Skill Score, True Skill Score, Percent Correct, Probability of Detection, and False Alarm Rate) evaluated from 2000 random two-by-two contingency tables obtained from the databases. We find that (1) Present MAG4 far outperforms both McIntosh Active-Region Class and Total Magnetic Flux, (2) Next MAG4 significantly outperforms Present MAG4, (3) the performance of Next MAG4 is insensitive to the forward and backward temporal windows used, in the range of one to a few days, and (4) forecasting from the free-energy proxy in combination with either any broad category of McIntosh active-region classes or any Mount Wilson active-region class gives no significant performance improvement over forecasting from the free-energy proxy alone (Present MAG4).
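    The performance metrics named here are all functions of a two-by-two contingency table and can be computed directly; the table counts below are hypothetical, and "False Alarm Rate" is taken as the false-alarm ratio b/(a+b):

```python
def skill_scores(hits, misses, false_alarms, correct_nulls):
    """Forecast verification metrics from a two-by-two contingency table."""
    a, b, c, d = hits, false_alarms, misses, correct_nulls
    n = a + b + c + d
    pod = a / (a + c)                     # Probability of Detection
    far = b / (a + b)                     # False Alarm Rate (as a ratio)
    pc = (a + d) / n                      # Percent Correct
    tss = a / (a + c) - b / (b + d)       # True Skill Score (POD - POFD)
    expected = ((a + b) * (a + c) + (b + d) * (c + d)) / n
    hss = (a + d - expected) / (n - expected)  # Heidke Skill Score
    return {"POD": pod, "FAR": far, "PC": pc, "TSS": tss, "HSS": hss}

# Hypothetical table: 30 hits, 10 misses, 20 false alarms, 140 correct nulls
scores = skill_scores(30, 10, 20, 140)
```

    The study evaluates these five metrics over 2000 random contingency tables drawn from its databases; this sketch shows only the per-table arithmetic.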

  15. HESI EXPOSURE FACTORS DATABASE FOR AGGREGATE AND CUMULATIVE RISK ASSESSMENT

    EPA Science Inventory

    In recent years, the risk analysis community has broadened its use of complex aggregate and cumulative residential exposure models (e.g., to meet the requirements of the 1996 Food Quality Protection Act). The value of these models is their ability to incorporate a range of inp...

  16. An experimental system for flood risk forecasting at global scale

    NASA Astrophysics Data System (ADS)

    Alfieri, L.; Dottori, F.; Kalas, M.; Lorini, V.; Bianchi, A.; Hirpa, F. A.; Feyen, L.; Salamon, P.

    2016-12-01

    Global flood forecasting and monitoring systems are nowadays a reality and are being applied by an increasing range of users and practitioners in disaster risk management. Furthermore, there is an increasing demand from users to integrate flood early warning systems with risk based forecasts, combining streamflow estimations with expected inundated areas and flood impacts. To this end, we have developed an experimental procedure for near-real time flood mapping and impact assessment based on the daily forecasts issued by the Global Flood Awareness System (GloFAS). The methodology translates GloFAS streamflow forecasts into event-based flood hazard maps based on the predicted flow magnitude and the forecast lead time and a database of flood hazard maps with global coverage. Flood hazard maps are then combined with exposure and vulnerability information to derive flood risk. Impacts of the forecasted flood events are evaluated in terms of flood prone areas, potential economic damage, and affected population, infrastructures and cities. To further increase the reliability of the proposed methodology we integrated model-based estimations with an innovative methodology for social media monitoring, which allows for real-time verification of impact forecasts. The preliminary tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management. In particular, the link with social media is crucial for improving the accuracy of impact predictions.

  17. WOVOdat Design Document: The Schema, Table Descriptions, and Create Table Statements for the Database of Worldwide Volcanic Unrest (WOVOdat Version 1.0)

    USGS Publications Warehouse

    Venezky, Dina Y.; Newhall, Christopher G.

    2007-01-01

    WOVOdat Overview During periods of volcanic unrest, the ability to forecast near future activity has been a primary concern for human populations living near volcanoes. Our ability to forecast future activity and mitigate hazards is based on knowledge of previous activity at the volcano exhibiting unrest and knowledge of previous activity at similar volcanoes. A small set of experts with past experience are often involved in forecasting. We need to both preserve the knowledge the experts use and continue to investigate volcanic data to make better forecasts. Advances in instrumentation, networking, and data storage technologies have greatly increased our ability to collect volcanic data and share observations with our colleagues. The wealth of data creates numerous opportunities for gaining a better understanding of magmatic conditions and processes, if the data can be easily accessed for comparison. To allow for comparison of volcanic unrest data, we are creating a central database called WOVOdat. WOVOdat will contain a subset of time-series and geo-referenced data from each WOVO observatory in common and easily accessible formats. WOVOdat is being created for volcano experts in charge of forecasting volcanic activity, scientists investigating volcanic processes, and the public. The types of queries each of these groups might ask range from, 'What volcanoes were active in November of 2002?' and 'What are the relationships between tectonic earthquakes and volcanic processes?' to complex analyses of volcanic unrest to determine what future activity might occur. A new structure for storing and accessing our data was needed to examine processes across a wide range of volcanologic conditions. WOVOdat provides this new structure using relationships to connect the data parameters such that searches can be created for analogs of unrest. 
The subset of data that will fill WOVOdat will continue to be collected by the observatories, who will remain the primary archives of raw and detailed data on individual episodes of unrest. MySQL, an Open Source database, was chosen as the WOVOdat database for its integration with common web languages. The question of where the data will be stored and how the disparate data sets will be integrated will not be discussed in detail here. The focus of this document is to explain the data types, formats, and table organization chosen for WOVOdat 1.0. It was written for database administrators, data loaders, query writers, and anyone who monitors volcanoes. We begin with an overview of several challenges faced and solutions used in creating the WOVOdat schema. Specifics are then given for the parameters and table organization. After each table organization section, basic create table statements are included for viewing the database field formats. In the next stage of the project, scripts will be needed for data conversion, entry, and cleansing. Views will also need to be created once the data have been loaded and the basic queries are better known. Many questions and opportunities remain. We look forward to the growth and continual improvement in efficiency of the system. We hope WOVOdat will improve our understanding of magmatic systems and help mitigate future volcanic hazards.
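    A relational layout of this kind can be sketched with a toy schema; the table and column names below are illustrative and are not WOVOdat 1.0's published schema:

```python
import sqlite3

# In-memory toy database: one reference table of volcanoes and one
# time-series table of unrest observations linked to it by a foreign key.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE volcano (
    volcano_id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    region TEXT
);
CREATE TABLE seismic_event (
    event_id INTEGER PRIMARY KEY,
    volcano_id INTEGER REFERENCES volcano(volcano_id),
    event_time TEXT NOT NULL,       -- ISO 8601 timestamp
    magnitude REAL
);
""")
con.execute("INSERT INTO volcano VALUES (1, 'Pinatubo', 'Luzon')")
con.execute("INSERT INTO seismic_event VALUES (1, 1, '1991-06-10T03:00:00', 2.4)")

# The relational structure supports cross-volcano comparison queries, e.g.
# counting unrest events above a magnitude threshold per volcano:
rows = con.execute("""
    SELECT v.name, COUNT(*) FROM seismic_event e
    JOIN volcano v ON v.volcano_id = e.volcano_id
    WHERE e.magnitude >= 2.0 GROUP BY v.name
""").fetchall()
```

    WOVOdat itself uses MySQL, as noted above; sqlite3 is used here only so the sketch is self-contained.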

  18. Approaches in Health Human Resource Forecasting: A Roadmap for Improvement.

    PubMed

    Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh

    2016-09-01

    Forecasting the demand and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently propose the features that improve the effectiveness of this important process of health manpower planning. A literature review was conducted for studies published in English from 1990-2014 using PubMed, ScienceDirect, ProQuest, and Google Scholar databases. Review articles, qualitative studies, retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed an extraction data sheet based on study questions to collect data on studies' references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Forty studies were included in the review. As a result, two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most of them were faced with some challenges, such as being static and unable to capture dynamic variables in manpower forecasting and causal relationships. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. An effective forecasting approach is supposed to resolve the deficits that exist in current approaches and meet the key features found in the literature in order to develop an open system and a dynamic and comprehensive method necessary for today's complex health care systems.

  19. The NRL relocatable ocean/acoustic ensemble forecast system

    NASA Astrophysics Data System (ADS)

    Rowley, C.; Martin, P.; Cummings, J.; Jacobs, G.; Coelho, E.; Bishop, C.; Hong, X.; Peggion, G.; Fabre, J.

    2009-04-01

    A globally relocatable regional ocean nowcast/forecast system has been developed to support rapid implementation of new regional forecast domains. The system is in operational use at the Naval Oceanographic Office for a growing number of regional and coastal implementations. The new system is the basis for an ocean acoustic ensemble forecast and adaptive sampling capability. We present an overview of the forecast system and the ocean ensemble and adaptive sampling methods. The forecast system consists of core ocean data analysis and forecast modules, software for domain configuration, surface and boundary condition forcing processing, and job control, and global databases for ocean climatology, bathymetry, tides, and river locations and transports. The analysis component is the Navy Coupled Ocean Data Assimilation (NCODA) system, a 3D multivariate optimum interpolation system that produces simultaneous analyses of temperature, salinity, geopotential, and vector velocity using remotely-sensed SST, SSH, and sea ice concentration, plus in situ observations of temperature, salinity, and currents from ships, buoys, XBTs, CTDs, profiling floats, and autonomous gliders. The forecast component is the Navy Coastal Ocean Model (NCOM). The system supports one-way nesting and multiple assimilation methods. The ensemble system uses the ensemble transform technique with error variance estimates from the NCODA analysis to represent initial condition error. Perturbed surface forcing or an atmospheric ensemble is used to represent errors in surface forcing. The ensemble transform Kalman filter is used to assess the impact of adaptive observations on future analysis and forecast uncertainty for both ocean and acoustic properties.
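
    As a minimal illustration of the optimum-interpolation (OI) update at the heart of analyses like NCODA, here is the textbook scalar form (an illustration of the general technique, not NCODA code; the numbers are made up):

```python
# Scalar optimum interpolation: blend a background estimate with an observation,
# weighting each by its error variance. This is the 1D case of
# x_a = x_b + K (y - H x_b), with K = B H^T (H B H^T + R)^-1 and H = 1.
def oi_update(background, obs, var_b, var_o):
    gain = var_b / (var_b + var_o)      # Kalman-style gain
    analysis = background + gain * (obs - background)
    var_a = (1 - gain) * var_b          # analysis error variance shrinks
    return analysis, var_a

# Toy SST example: background 18.0 degC (variance 0.5), obs 18.6 (variance 0.25).
analysis, var_a = oi_update(18.0, 18.6, 0.5, 0.25)
print(round(analysis, 2), round(var_a, 3))
```

    The more accurate of the two sources (here the observation) pulls the analysis toward itself, and the analysis variance is smaller than either input variance, which is why assimilating observations reduces forecast uncertainty.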

  20. Bibliography of global change, 1992

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This bibliography lists 585 reports, articles, and other documents introduced in the NASA Scientific and Technical Information Database in 1992. The areas covered include global change, decision making, earth observation (from space), forecasting, global warming, policies, and trends.

  1. 2000-2001 California statewide household travel survey. Final report

    DOT National Transportation Integrated Search

    2002-06-01

    The California Department of Transportation (Caltrans) maintains a statewide database of household socioeconomic and travel information, which is used in regional and statewide travel demand forecasting. The 2000-2001 California Statewide Household T...

  2. Early-life residential exposure to soil components in rural areas and childhood respiratory health and allergy.

    PubMed

    Devereux, Graham; Tagiyeva, Nara; Turner, Stephen W; Ayres, Jon G; Seaton, Anthony; Hudson, Gordon; Hough, Rupert L; Campbell, Colin D; Shand, Charles A

    2014-01-01

    The increase in asthma and allergies has been attributed to declining exposure to environmental microorganisms. The main source of these is soil, the composition of which varies geographically and which is a major component (40-45%) of household dust. Our hypothesis-generating study aimed to investigate associations between soil components, respiratory health and allergy in a Scottish birth cohort. The cohort was recruited in utero in 1997/8, and followed up at one, two and five years for the development of wheezing, asthma and eczema. Lung function, exhaled nitric oxide and allergic sensitization were measured at age five in a subset. The Scottish Soils Database held at The James Hutton Institute was linked to the birth cohort data by the residential postcode at birth and five years. The soil database contained information on size separates, organic matter concentration, pH and a range of inorganic elements. Soil and clinical outcome data were available for 869, 790 and 727 children at one, two and five years. Three hundred and fifty-nine (35%) of children had the same address at birth and five years. No associations were found between childhood outcomes and soil content in the residential area at age five. The soil silt content (2-20 μm particle size) of the residential area at birth was associated with childhood wheeze (adjusted OR 1.20, 95% CI [1.05; 1.37]), wheeze without a cold (1.41 [1.18; 1.69]), doctor-diagnosed asthma (1.54 [1.04; 2.28]), lung function (FEV1: beta -0.025 [-0.047;-0.001]) and airway inflammation (FENO: beta 0.15 [0.03; 0.27]) at age five, but not with allergic status or eczema. Whilst residual confounding is the most likely explanation for the associations reported, the results of this study lead us to hypothesise that early life exposure to residential soil silt may adversely influence childhood respiratory health, possibly because of the organic components of silt. © 2013 Elsevier B.V. All rights reserved.

  3. Residential aged care in Auckland, New Zealand 1988-2008: do real trends over time match predictions?

    PubMed

    Broad, Joanna B; Boyd, Michal; Kerse, Ngaire; Whitehead, Noeline; Chelimo, Carol; Lay-Yee, Roy; von Randow, Martin; Foster, Susan; Connolly, Martin J

    2011-07-01

    In Auckland, New Zealand in 1988, 7.7% of those aged over 65 years lived in licenced residential aged care. Age-specific rates approximately doubled for each 5-year age group after the age of 65 years. Even with changes in policies and market forces since 1988, population increases are forecast to drive large growth in demand. This study shows previously unrecognised 20-year trends in rates of care in a geographically defined population. Four cross-sectional surveys of all facilities (rest homes and hospitals) licenced for long-term care of older people were conducted in Auckland, New Zealand in 1988, 1993, 1998 and 2008. Facility staff completed survey forms for each resident. Numbers of licenced and occupied beds and trends in age-specific and age-standardised rates in residential aged care are reported. Over the 20-year period, Auckland's population aged over 65 years increased by 43% (from 91,000 to 130,000) but actual numbers in care reduced slightly. Among those aged over 65 years, the proportion living in care facilities reduced from 1 in 13 to 1 in 18. Age-standardised rates in rest-home level care reduced from 65 to 33 per thousand, and in hospital level care, from 29 to 23 per thousand. Had rates remained stable, over 13,200 people, 74% more than observed, would have been in care in 2008. Growth predicted in the residential aged care sector is not yet evident. The introduction of standardised needs assessments before entry, increased availability of home-based services, and growth in retirement villages may have led to reduced utilisation.

  4. Online learning algorithm for time series forecasting suitable for low cost wireless sensor networks nodes.

    PubMed

    Pardo, Juan; Zamora-Martínez, Francisco; Botella-Rocamora, Paloma

    2015-04-21

    Time series forecasting is an important predictive methodology that can be applied to a wide range of problems. In particular, forecasting the indoor temperature permits improved utilization of the HVAC (Heating, Ventilating and Air Conditioning) systems in a home and thus better energy efficiency. To this end, the paper describes how to implement an Artificial Neural Network (ANN) algorithm on a low-cost system-on-chip to develop an autonomous intelligent wireless sensor network. The paper uses a Wireless Sensor Network (WSN) to monitor and forecast the indoor temperature in a smart home, based on low-resource, low-cost microcontroller technology such as the 8051 MCU. An on-line learning approach, based on the Back-Propagation (BP) algorithm for ANNs, has been developed for real-time time series learning. It trains the model with each new data point that arrives, without saving large quantities of data to build the usual historical database, i.e., without prior knowledge. To validate the approach, a simulation study against a Bayesian baseline model was compared with a database from a real application to assess performance and accuracy. The core of the paper is a new algorithm, based on BP, which is described in detail; the challenge was how to implement a computationally demanding algorithm on a simple architecture with very few hardware resources.
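
    The on-line learning idea, updating the model with each arriving sample while storing no historical database, can be sketched as follows. The network size, learning rate, and toy streaming data are assumptions for illustration, not the paper's 8051 implementation:

```python
import math
import random

# Tiny one-hidden-layer network trained by on-line back-propagation:
# each sample updates the weights once and is then discarded.
random.seed(0)
H, LR = 4, 0.05
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]  # input + bias -> hidden
w2 = [random.uniform(-0.5, 0.5) for _ in range(H + 1)]                  # hidden + bias -> output

def forward(x):
    h = [math.tanh(w[0] * x + w[1]) for w in w1]
    y = sum(wi * hi for wi, hi in zip(w2, h)) + w2[H]
    return h, y

def online_step(x, target):
    """One BP update from a single (x, target) pair; nothing is stored."""
    h, y = forward(x)
    err = y - target
    for i in range(H):
        g = err * w2[i] * (1 - h[i] ** 2)   # d tanh(u)/du = 1 - tanh(u)^2
        w1[i][0] -= LR * g * x
        w1[i][1] -= LR * g
        w2[i] -= LR * err * h[i]
    w2[H] -= LR * err
    return abs(err)

# Learn a toy temperature-like mapping y = 0.5*x from streaming samples.
errs = [online_step(x, 0.5 * x) for x in [0.2, -0.4, 0.6, -0.1] * 200]
print(round(errs[-1], 4))
```

    The error on incoming samples shrinks as updates accumulate, which is the property that lets a memory-constrained microcontroller learn without a stored training set.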

  5. Online Learning Algorithm for Time Series Forecasting Suitable for Low Cost Wireless Sensor Networks Nodes

    PubMed Central

    Pardo, Juan; Zamora-Martínez, Francisco; Botella-Rocamora, Paloma

    2015-01-01

    Time series forecasting is an important predictive methodology that can be applied to a wide range of problems. In particular, forecasting the indoor temperature permits improved utilization of the HVAC (Heating, Ventilating and Air Conditioning) systems in a home and thus better energy efficiency. To this end, the paper describes how to implement an Artificial Neural Network (ANN) algorithm on a low-cost system-on-chip to develop an autonomous intelligent wireless sensor network. The paper uses a Wireless Sensor Network (WSN) to monitor and forecast the indoor temperature in a smart home, based on low-resource, low-cost microcontroller technology such as the 8051 MCU. An on-line learning approach, based on the Back-Propagation (BP) algorithm for ANNs, has been developed for real-time time series learning. It trains the model with each new data point that arrives, without saving large quantities of data to build the usual historical database, i.e., without prior knowledge. To validate the approach, a simulation study against a Bayesian baseline model was compared with a database from a real application to assess performance and accuracy. The core of the paper is a new algorithm, based on BP, which is described in detail; the challenge was how to implement a computationally demanding algorithm on a simple architecture with very few hardware resources. PMID:25905698

  6. Agroclimate.Org: Tools and Information for a Climate Resilient Agriculture in the Southeast USA

    NASA Astrophysics Data System (ADS)

    Fraisse, C.

    2014-12-01

    AgroClimate (http://agroclimate.org) is a web-based system developed to help the agricultural industry in the southeastern USA reduce risks associated with climate variability and change. It includes climate-related information and dynamic application tools that interact with a climate and crop database system. Information available includes climate monitoring and forecasts combined with information about crop management practices that help increase the resiliency of the agricultural industry in the region. Recently we have included smartphone apps in the AgroClimate suite of tools, including irrigation management and crop disease alert systems. Decision support tools available in AgroClimate include: (a) Climate risk: expected (probabilistic) and historical climate information and freeze risk; (b) Crop yield risk: expected yield based on soil type, planting date, and basic management practices for selected commodities and historical county yield databases; (c) Crop diseases: disease risk monitoring and forecasting for strawberry and citrus; (d) Crop development: monitoring and forecasting of growing degree-days and chill accumulation; (e) Drought: monitoring and forecasting of selected drought indices; (f) Footprints: carbon and water footprint calculators. The system also provides background information about the main drivers of climate variability and basic information about climate change in the Southeast USA. AgroClimate has been widely used as an educational tool by the Cooperative Extension Services in the region and also by producers. It is now being replicated internationally with versions implemented in Mozambique and Paraguay.

  7. Needs of Aboriginal and Torres Strait Islander clients residing in Australian residential aged-care facilities.

    PubMed

    Brooke, Nicole J

    2011-08-01

    This review was undertaken to identify evidence-based practice guidelines to support the care needs of Aboriginal and Torres Strait Islander clients residing in residential aged-care facilities. A systematic literature review was undertaken. An electronic search of online databases and a subsequent manual retrieval process were undertaken to identify relevant reports and studies that explored interventions for care of an Aboriginal and Torres Strait Islander person. Very little published material identified strategies necessary within residential aged care. Sixty-seven articles were considered for inclusion, and a subsequent review resulted in 34 being included due to direct alignment with the study aim. Strategies recommended within the review cover areas such as care, communication, palliative care, activities and the environment. Care for an Aboriginal and Torres Strait Islander person in an Australian residential aged-care facility requires a collaborative and individual approach. Cultural safety principles should be maintained across a culturally competent workforce. Caring for an Aboriginal and Torres Strait Islander person is a significant experience that should not be considered 'routine', as there is much to consider in the care of this person and their community. © 2011 The Author. Australian Journal of Rural Health © National Rural Health Alliance Inc.

  8. A hybrid spatiotemporal drought forecasting model for operational use

    NASA Astrophysics Data System (ADS)

    Vasiliades, L.; Loukas, A.

    2010-09-01

    Drought forecasting plays an important role in the planning and management of natural resources and water resource systems in a river basin. Early and timely forecasting of a drought event can help in taking proactive measures and setting out drought mitigation strategies to alleviate the impacts of drought. Spatiotemporal data mining is the extraction of unknown and implicit knowledge, structures, spatiotemporal relationships, or patterns not explicitly stored in spatiotemporal databases. As one of the data mining techniques, forecasting is widely used to predict the unknown future based upon the patterns hidden in the current and past data. This study develops a hybrid spatiotemporal scheme for integrated spatial and temporal forecasting. Temporal forecasting is achieved using feed-forward neural networks, and the temporal forecasts are extended to the spatial dimension using a spatial recurrent neural network model. The methodology is demonstrated for an operational meteorological drought index, the Standardized Precipitation Index (SPI), calculated at multiple timescales. 48 precipitation stations and 18 independent precipitation stations, located in the Pinios river basin in the Thessaly region, Greece, were used for the development and spatiotemporal validation of the hybrid spatiotemporal scheme. Several quantitative temporal and spatial statistical indices were considered for the performance evaluation of the models. Furthermore, qualitative statistical criteria based on contingency tables between observed and forecasted drought episodes were calculated. The results show that the lead time of forecasting for operational use depends on the SPI timescale. The hybrid spatiotemporal drought forecasting model could be operationally used for forecasting up to three months ahead for short SPI timescales (e.g. 3-6 months) and up to six months ahead for long SPI timescales (e.g. 24 months). The above findings could be useful in developing a drought preparedness plan in the region.
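
    As a rough illustration of how a standardized precipitation index flags dry periods: the operational SPI fits a gamma distribution to the aggregated precipitation series and transforms it to a standard normal, whereas the sketch below uses a plain z-score of the running sum as a simplification, with made-up monthly data:

```python
import statistics

def spi_like(precip, scale=3):
    """Simplified SPI: z-score of the `scale`-month running precipitation sum.
    (The operational SPI fits a gamma distribution first; this is a sketch.)"""
    sums = [sum(precip[i - scale + 1:i + 1]) for i in range(scale - 1, len(precip))]
    mu, sd = statistics.mean(sums), statistics.pstdev(sums)
    return [(s - mu) / sd for s in sums]

monthly = [80, 75, 90, 20, 15, 10, 85, 95, 70, 5, 8, 12]   # toy monthly totals
index = spi_like(monthly, scale=3)
drought = [i for i, v in enumerate(index) if v < -1.0]      # candidate drought periods
print(len(index), drought)
```

    Choosing a longer `scale` smooths the index, which is why forecast lead time in the study depends on the SPI timescale: long-timescale indices evolve slowly and are easier to forecast further ahead.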

  9. Impact bias or underestimation? Outcome specifications predict the direction of affective forecasting errors.

    PubMed

    Buechel, Eva C; Zhang, Jiao; Morewedge, Carey K

    2017-05-01

    Affective forecasts are used to anticipate the hedonic impact of future events and decide which events to pursue or avoid. We propose that because affective forecasters are more sensitive to outcome specifications of events than experiencers, the outcome specification values of an event, such as its duration, magnitude, probability, and psychological distance, can be used to predict the direction of affective forecasting errors: whether affective forecasters will overestimate or underestimate its hedonic impact. When specifications are positively correlated with the hedonic impact of an event, forecasters will overestimate the extent to which high specification values will intensify and low specification values will discount its impact. When outcome specifications are negatively correlated with its hedonic impact, forecasters will overestimate the extent to which low specification values will intensify and high specification values will discount its impact. These affective forecasting errors compound additively when multiple specifications are aligned in their impact: In Experiment 1, affective forecasters underestimated the hedonic impact of winning a smaller prize that they expected to win, and they overestimated the hedonic impact of winning a larger prize that they did not expect to win. In Experiment 2, affective forecasters underestimated the hedonic impact of a short unpleasant video about a temporally distant event, and they overestimated the hedonic impact of a long unpleasant video about a temporally near event. Experiments 3A and 3B showed that differences in the affect-richness of forecasted and experienced events underlie these differences in sensitivity to outcome specifications, therefore accounting for both the impact bias and its reversal. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Development of demand forecasting tool for natural resources recouping from municipal solid waste.

    PubMed

    Zaman, Atiq Uz; Lehmann, Steffen

    2013-10-01

    Sustainable waste management requires an integrated planning and design strategy for reliable forecasting of waste generation, collection, recycling, treatment and disposal for the successful development of future residential precincts. The success of the future development and management of waste relies to a high extent on the accuracy of the prediction and on a comprehensive understanding of the overall waste management systems. This study defies the traditional concepts of waste, in which waste was considered as the last phase of production and services, by putting forward the new concept of waste as an intermediate phase of production and services. The study aims to develop a demand forecasting tool called 'zero waste index' (ZWI) for measuring the natural resources recouped from municipal solid waste. The ZWI (ZWI demand forecasting tool) quantifies the amount of virgin materials recovered from solid waste and subsequently reduces extraction of natural resources. In addition, the tool estimates the potential amount of energy, water and emissions avoided or saved by the improved waste management system. The ZWI is tested in a case study of waste management systems in two developed cities: Adelaide (Australia) and Stockholm (Sweden). The ZWI of waste management systems in Adelaide and Stockholm is 0.33 and 0.17 respectively. The study also enumerates per capita energy savings of 2.9 GJ and 2.83 GJ, greenhouse gas emissions reductions of 0.39 tonnes (CO2e) and 0.33 tonnes (CO2e), as well as water savings of 2.8 kL and 0.92 kL in Adelaide and Stockholm respectively.
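
    The index idea, the fraction of managed waste actually returned as substitutes for virgin material, can be sketched as follows. The waste streams and substitution factors below are made-up placeholders for illustration, not the paper's calibrated values:

```python
# Sketch of a zero-waste-index style calculation.
def zero_waste_index(streams):
    """streams: {material: (tonnes_managed, virgin_substitution_factor in 0..1)}"""
    recovered = sum(t * f for t, f in streams.values())   # virgin material substituted
    total = sum(t for t, _ in streams.values())           # total waste managed
    return recovered / total

city = {
    "paper":    (100.0, 0.84),
    "plastics": (40.0,  0.75),
    "organics": (120.0, 0.10),
    "landfill": (140.0, 0.0),   # disposal recovers nothing
}
print(round(zero_waste_index(city), 2))
```

    A city recovering everything at factor 1.0 would score 1.0 (zero waste), while pure landfilling scores 0, which is the sense in which the index forecasts demand for recouped natural resources.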

  11. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    NASA Astrophysics Data System (ADS)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measure to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty, a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a 'three-dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. The communication also needs to be adapted to the audience: the majority of the larger public is not interested in in-depth information on the uncertainty of the predicted water levels, but only in the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.), each with its advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is probabilistic flood mapping.
These maps give a representation of the probability of flooding of a certain area, based on the uncertainty assessment of the flood forecasts. By using this type of maps, water managers can focus their attention on the areas with the highest flood probability. Also the larger public can consult these maps for information on the probability of flooding for their specific location, such that they can take pro-active measures to reduce the personal damage. The method of quantifying the uncertainty was implemented in the operational flood forecasting system for the navigable rivers in the Flanders region of Belgium. The method has shown clear benefits during the floods of the last two years.
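
    The error-matrix construction described above can be sketched as follows. The class boundaries, percentiles, and residual values are illustrative, and the operational system additionally interpolates in the 3D matrix rather than doing a plain cell lookup:

```python
import bisect

# Bin historical forecast residuals by forecasted water level and lead time,
# store percentiles per class, and look them up for a new forecast.
LEVEL_EDGES = [1.0, 2.0, 3.0]   # m, class boundaries for forecasted level
LEAD_EDGES = [6, 12, 24]        # h, class boundaries for lead time

def classify(level, lead):
    return (bisect.bisect(LEVEL_EDGES, level), bisect.bisect(LEAD_EDGES, lead))

def build_matrix(history, qs=(0.05, 0.5, 0.95)):
    """history: [(forecast_level, lead_time, residual), ...]"""
    cells = {}
    for lvl, lead, r in history:
        cells.setdefault(classify(lvl, lead), []).append(r)
    matrix = {}
    for key, res in cells.items():
        res.sort()
        # crude empirical percentile: index into the sorted residuals
        matrix[key] = {q: res[min(int(q * len(res)), len(res) - 1)] for q in qs}
    return matrix

history = [(1.5, 8, r) for r in (-0.30, -0.10, 0.00, 0.05, 0.20)]
m = build_matrix(history)
band = m[classify(1.4, 10)]   # uncertainty band for a new forecast in that class
print(band)
```

    Because no error distribution is assumed, skewed or heavy-tailed residual behaviour in a given level/lead-time class is carried directly into the uncertainty band.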

  12. A global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  13. Toxicity ForeCaster (ToxCast™) Data

    EPA Pesticide Factsheets

    Data is organized into different data sets and includes descriptions of ToxCast chemicals and assays and files summarizing the screening results, a MySQL database, chemicals screened through Tox21, and available data generated from animal toxicity studies.

  14. COMET Multimedia modules and objects in the digital library system

    NASA Astrophysics Data System (ADS)

    Spangler, T. C.; Lamos, J. P.

    2003-12-01

    Over the past ten years of developing Web- and CD-ROM-based training materials, the Cooperative Program for Operational Meteorology, Education and Training (COMET) has created a unique archive of almost 10,000 multimedia objects and some 50 web-based interactive multimedia modules on various aspects of weather and weather forecasting. These objects and modules, containing illustrations, photographs, animations, video sequences, and audio files, are potentially a valuable resource for university faculty and students, forecasters, emergency managers, public school educators, and other individuals and groups needing such materials for educational use. The COMET modules are available on the COMET educational web site http://www.meted.ucar.edu, and the COMET Multimedia Database (MMDB) makes a collection of the multimedia objects available in a searchable online database for viewing and download over the Internet. Some 3200 objects are already available at the MMDB Website: http://archive.comet.ucar.edu/moria/

  15. HEALTH AND ENVIRONMENTAL SCIENCES INSTITUTE'S EXPOSURE FACTORS DATABASE FOR AGGREGATE AND CUMULATIVE RISK ASSESSMENT

    EPA Science Inventory

    In recent years, the risk analysis community has broadened its use of complex aggregate and cumulative residential exposure models (e.g., to meet the requirements of the 1996 Food Quality Protection Act). The value of these models is their ability to incorporate a range of input...

  16. Utility residential new construction programs: Going beyond the code. A report from the Database on Energy Efficiency Programs (DEEP) Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vine, E.

    Based on an evaluation of 10 residential new construction programs, primarily sponsored by investor-owned utilities in the United States, we find that many of these programs are in dire straits and are in danger of being discontinued because current inclusion of only direct program effects leads to the conclusion that they are not cost-effective. We believe that the cost-effectiveness of residential new construction programs can be improved by: (1) promoting technologies and advanced building design practices that significantly exceed state and federal standards; (2) reducing program marketing costs and developing more effective marketing strategies; (3) recognizing the role of these programs in increasing compliance with existing state building codes; and (4) allowing utilities to obtain an 'energy-savings credit' from utility regulators for program spillover (market transformation) impacts. Utilities can also leverage their resources in seizing these opportunities by forming strong and trusting partnerships with the building community and with local and state government.

  17. Instability versus quality: residential mobility, neighborhood poverty, and children's self-regulation.

    PubMed

    Roy, Amanda L; McCoy, Dana Charles; Raver, C Cybele

    2014-07-01

    Prior research has found that higher residential mobility is associated with increased risk for children's academic and behavioral difficulty. In contrast, evaluations of experimental housing mobility interventions have shown moving from high poverty to low poverty neighborhoods to be beneficial for children's outcomes. This study merges these disparate bodies of work by considering how poverty levels in origin and destination neighborhoods moderate the influence of residential mobility on 5th graders' self-regulation. Using inverse probability weighting with propensity scores to minimize observable selection bias, this work found that experiencing a move during early or middle childhood was related to negative child outcomes (as indicated by increased behavioral and cognitive dysregulation measured via direct assessment and teacher-report) in 5th grade. However, these relationships were moderated by neighborhood poverty; moves out of low poverty and moves into high poverty neighborhoods were detrimental, while moves out of high poverty and moves into low poverty neighborhoods were beneficial. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
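
    The weighting scheme named above is a standard technique and can be sketched as follows; the records, outcomes, and propensity scores below are fabricated for illustration only, not the study's data or estimates:

```python
# Sketch of inverse-probability weighting (IPW) with propensity scores:
# reweight movers and non-movers by the inverse of their probability of the
# group they are in, so that observed covariates are balanced across groups.
def ipw_mean_difference(records):
    """records: (moved: 0/1, outcome, propensity p = P(moved | covariates))."""
    wt_t = [(y / p, 1 / p) for t, y, p in records if t == 1]
    wt_c = [(y / (1 - p), 1 / (1 - p)) for t, y, p in records if t == 0]
    mean_t = sum(v for v, _ in wt_t) / sum(w for _, w in wt_t)
    mean_c = sum(v for v, _ in wt_c) / sum(w for _, w in wt_c)
    return mean_t - mean_c   # weighted mover vs. non-mover difference

data = [
    (1, 0.9, 0.8), (1, 1.1, 0.6), (1, 1.0, 0.5),
    (0, 0.4, 0.7), (0, 0.6, 0.3), (0, 0.5, 0.2),
]
print(round(ipw_mean_difference(data), 3))
```

    Children who were unlikely to move but did (low propensity, `t == 1`) receive large weights, which is how the method reduces observable selection bias when comparing outcomes.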

  18. Approaches in Health Human Resource Forecasting: A Roadmap for Improvement

    PubMed Central

    Rafiei, Sima; Mohebbifar, Rafat; Hashemi, Fariba; Ezzatabadi, Mohammad Ranjbar; Farzianpour, Fereshteh

    2016-01-01

    Introduction Forecasting the demand and supply of health manpower in an accurate manner makes appropriate planning possible. The aim of this paper was to review approaches and methods for health manpower forecasting and consequently propose the features that improve the effectiveness of this important process of health manpower planning. Methods A literature review was conducted for studies published in English from 1990–2014 using PubMed, ScienceDirect, ProQuest, and Google Scholar databases. Review articles, qualitative studies, retrospective and prospective studies describing or applying various types of forecasting approaches and methods in health manpower forecasting were included in the review. The authors designed an extraction data sheet based on study questions to collect data on studies’ references, designs, and types of forecasting approaches, whether discussed or applied, with their strengths and weaknesses. Results Forty studies were included in the review. As a result, two main categories of approaches (conceptual and analytical) for health manpower forecasting were identified. Each approach had several strengths and weaknesses. As a whole, most of them were faced with some challenges, such as being static and unable to capture dynamic variables in manpower forecasting and causal relationships. They also lacked the capacity to benefit from scenario making to assist policy makers in effective decision making. Conclusions An effective forecasting approach is supposed to resolve the deficits that exist in current approaches and meet the key features found in the literature in order to develop an open system and a dynamic and comprehensive method necessary for today's complex health care systems. PMID:27790343

  19. Seasonal forecasting of groundwater levels in natural aquifers in the United Kingdom

    NASA Astrophysics Data System (ADS)

    Mackay, Jonathan; Jackson, Christopher; Pachocka, Magdalena; Brookshaw, Anca; Scaife, Adam

    2014-05-01

    Groundwater aquifers comprise the world's largest freshwater resource and provide resilience to climate extremes, which could become more frequent under future climate change. Prolonged dry conditions can induce groundwater drought, often characterised by significantly low groundwater levels that may persist for months to years. In contrast, lasting wet conditions can result in anomalously high groundwater levels that cause flooding, potentially at large economic cost. Using computational models to produce groundwater level forecasts allows appropriate management strategies to be considered in advance of extreme events. The majority of groundwater level forecasting studies to date use data-based models, which exploit the long response time of groundwater levels to meteorological drivers and make forecasts based only on the current state of the system. Alternatively, seasonal meteorological forecasts can be used to drive hydrological models and simulate groundwater levels months into the future. Such approaches have not been used in the past due to a lack of skill in these long-range forecast products. However, systems such as the latest version of the Met Office Global Seasonal Forecast System (GloSea5) are now showing increased skill up to a 3-month lead time. We demonstrate the first groundwater level ensemble forecasting system, using a multi-member ensemble of hindcasts from GloSea5 between 1996 and 2009 to force 21 simple lumped conceptual groundwater models covering most of the UK's major aquifers. We present the results from this hindcasting study and demonstrate that the system can be used to forecast groundwater levels with some skill up to three months into the future.
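
    An ensemble forecast of the kind described above is typically summarised by its mean (the best estimate) and its spread (the forecast uncertainty). A minimal sketch with hypothetical groundwater-level members, not GloSea5 output:

```python
from statistics import mean, stdev

# Hypothetical ensemble of 3-month-lead groundwater-level forecasts (m above datum);
# each member comes from one hindcast realisation driving the groundwater model.
ensemble = [52.1, 51.8, 52.6, 51.5, 52.3, 52.0]

# Best estimate and spread of the ensemble forecast.
print(round(mean(ensemble), 2), round(stdev(ensemble), 2))
```

A larger spread signals lower confidence in the forecast, which is the information early-warning users need alongside the central estimate.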

  20. Statewide crash analysis and forecasting.

    DOT National Transportation Integrated Search

    2008-11-20

    There is a need for the development of safety analysis tools to allow PennDOT to better assess the safety performance of road segments in the Commonwealth. The project utilized a safety management system database at PennDOT that integrates crash,...

  1. Magnetogram Forecast: An All-Clear Space Weather Forecasting System

    NASA Technical Reports Server (NTRS)

    Barghouty, Nasser; Falconer, David

    2015-01-01

    Solar flares and coronal mass ejections (CMEs) are the drivers of severe space weather. Forecasting the probability of their occurrence is critical to improving space weather forecasts. The National Oceanic and Atmospheric Administration (NOAA) currently uses the McIntosh active region category system, in which each active region on the disk is assigned to one of 60 categories, and uses the historical flare rates of that category to make an initial forecast that can then be adjusted by the NOAA forecaster. Flares and CMEs are caused by the sudden release of energy from the coronal magnetic field by magnetic reconnection. It is believed that the rate of flare and CME occurrence in an active region is correlated with the free energy of that active region. While the free energy cannot be measured directly with present observations, proxies of the free energy can instead be used to characterize the relative free energy of an active region. The Magnetogram Forecast (MAG4) (output is available at the Community Coordinated Modeling Center) was conceived and designed to be a data-based, all-clear forecasting system to support the operational goals of NASA's Space Radiation Analysis Group. The MAG4 system automatically downloads near-real-time line-of-sight magnetograms from the Helioseismic and Magnetic Imager (HMI) on the Solar Dynamics Observatory (SDO) satellite, identifies active regions on the solar disk, measures a free-energy proxy, and then applies forecasting curves to convert the free-energy proxy into predicted event rates for X-class flares, M- and X-class flares, CMEs, fast CMEs, and solar energetic particle events (SPEs). The forecast curves themselves are derived from a sample of 40,000 magnetograms from 1,300 active region samples, observed by the Solar and Heliospheric Observatory Michelson Doppler Imager. Figure 1 is an example of MAG4 visual output.
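
    The conversion step the record describes (a free-energy proxy mapped through a fitted forecasting curve to a predicted event rate, then to an event probability) can be sketched generically. The curve form and parameters below are hypothetical stand-ins, not MAG4's fitted forecasting curves:

```python
import math

def event_rate(free_energy_proxy, a=-14.0, b=1.5):
    """Predicted events/day as a power-law-like function of a free-energy proxy.

    The coefficients a and b are illustrative placeholders for a fitted curve.
    """
    return math.exp(a + b * math.log(free_energy_proxy))

def event_probability(rate_per_day, window_days=1.0):
    """Poisson chance of at least one event within the forecast window."""
    return 1.0 - math.exp(-rate_per_day * window_days)

rate = event_rate(2.0e3)  # proxy value in arbitrary units
print(round(event_probability(rate), 3))
```

An "all-clear" forecast corresponds to the probability falling below some operational threshold; the threshold choice is a policy decision outside this sketch.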

  2. Influenza forecasting in human populations: a scoping review.

    PubMed

    Chretien, Jean-Paul; George, Dylan; Shaman, Jeffrey; Chitale, Rohit A; McKenzie, F Ellis

    2014-01-01

    Forecasts of influenza activity in human populations could help guide key preparedness tasks. We conducted a scoping review to characterize the methodological approaches used and identify research gaps. Adapting the PRISMA methodology for systematic reviews, we searched PubMed, CINAHL, Project Euclid, and the Cochrane Database of Systematic Reviews for publications in English since January 1, 2000 using the terms "influenza AND (forecast* OR predict*)", excluding studies that did not validate forecasts against independent data or incorporate influenza-related surveillance data from the season or pandemic for which the forecasts were applied. We included 35 publications describing population-based (N = 27), medical facility-based (N = 4), and regional or global pandemic spread (N = 4) forecasts. They covered areas of North America (N = 15), Europe (N = 14), and/or the Asia-Pacific region (N = 4), or had global scope (N = 3). Forecasting models were statistical (N = 18) or epidemiological (N = 17). Five studies used data assimilation methods to update forecasts with new surveillance data. Models used virological (N = 14), syndromic (N = 13), meteorological (N = 6), internet search query (N = 4), and/or other surveillance data as inputs. Forecasting outcomes and validation metrics varied widely. Two studies compared distinct modeling approaches using common data, 2 assessed model calibration, and 1 systematically incorporated expert input. Of the 17 studies using epidemiological models, 8 included sensitivity analysis. This review suggests a need for good practices in influenza forecasting (e.g., sensitivity analysis); direct comparisons of diverse approaches; assessment of model calibration; integration of subjective expert input; operational research in pilot, real-world applications; and improved mutual understanding among modelers and public health officials.

  4. Movers and stayers: The geography of residential mobility and CVD hospitalisations in Auckland, New Zealand.

    PubMed

    Exeter, Daniel J; Sabel, Clive E; Hanham, Grant; Lee, Arier C; Wells, Susan

    2015-05-01

    The association between area-level disadvantage and health and social outcomes is unequivocal. However, less is known about the health impact of residential mobility, particularly at intra-urban scales. We used an encrypted National Health Index (eNHI) number to link individual-level data recorded in routine national health databases to construct a cohort of 641,532 participants aged 30+ years to investigate the association between moving and CVD hospitalisations in Auckland, New Zealand. Residential mobility was measured for participants according to changes in the census Meshblock of usual residence, obtained from the Primary Health Organisation (PHO) database for every calendar quarter between 1/1/2006 and 31/12/2012. The NZDep2006 area deprivation score at the start and end of a participant's inclusion in the study was used to measure deprivation mobility. We investigated the relative risk of movers being hospitalised for CVD relative to stayers using multivariable binomial regression models, controlling for age, gender, deprivation and ethnicity. Considered together, movers were 1.22 (95% CI: 1.19-1.26) times more likely than stayers to be hospitalised for CVD. Using the 5×5 deprivation origin-destination matrix to model a patient's risk of CVD based on upward, downward or sideways deprivation mobility, movers within the least deprived (NZDep2006 Quintile 1) areas were 10% less likely than stayers to be hospitalised for CVD, while movers within the most deprived (NZDep2006 Q5) areas were 45% more likely than stayers to have had their first CVD hospitalisation in 2006-2012 (RR: 1.45 [1.35-1.55]). Participants who moved upward also had higher relative risks of a CVD event, although their risk was lower than that observed for participants experiencing downward deprivation mobility. This research suggests that residential mobility is an important determinant of CVD in Auckland. Further investigation is required to determine the impact of moving on CVD risk by ethnicity. Copyright © 2014 Elsevier Ltd. All rights reserved.
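
    The movers-versus-stayers comparison above is a standard risk ratio (incidence in the exposed group divided by incidence in the unexposed group). A sketch with hypothetical counts chosen to reproduce the headline RR of 1.22; the study's actual cell counts are not given in the abstract:

```python
def relative_risk(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Risk ratio: incidence among exposed (movers) over incidence among unexposed (stayers)."""
    return (events_exposed / n_exposed) / (events_unexposed / n_unexposed)

# Hypothetical counts, not the study's data:
rr = relative_risk(610, 50_000, 5_000, 500_000)
print(round(rr, 2))  # 1.22
```

The published estimate additionally adjusts for age, gender, deprivation and ethnicity via regression, which a crude ratio like this does not capture.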

  5. Medium range forecasting of Hurricane Harvey flash flooding using ECMWF and social vulnerability data

    NASA Astrophysics Data System (ADS)

    Pillosu, F. M.; Jurlina, T.; Baugh, C.; Tsonevsky, I.; Hewson, T.; Prates, F.; Pappenberger, F.; Prudhomme, C.

    2017-12-01

    During Hurricane Harvey the greater east Texas area was affected by extensive flash flooding. The localised nature of these floods meant they were too small for conventional large-scale flood forecasting systems to capture. We are testing the use of two real-time forecast products from the European Centre for Medium-Range Weather Forecasts (ECMWF) in combination with local vulnerability information to provide flash flood forecasting tools at the medium range (up to 7 days ahead). The meteorological forecasts are the total precipitation extreme forecast index (EFI), a measure of how the ensemble forecast probability distribution differs from the model-climate distribution for the chosen location, time of year and forecast lead time; and the shift of tails (SOT), which complements the EFI by quantifying how extreme an event could potentially be. Both products give the likelihood of flash-flood-generating precipitation. For Hurricane Harvey, 3-day EFI and SOT products for the period 26th-29th August 2017 were used, generated from the twice-daily, 18 km, 51-ensemble-member ECMWF Integrated Forecast System. After regridding to 1 km resolution the forecasts were combined with vulnerable-area data to produce a flash flood hazard risk area. The vulnerability data were floodplains (EU Joint Research Centre), road networks (Texas Department of Transport) and urban areas (Census Bureau geographic database), together reflecting the landscape's susceptibility to flash floods. The flash flood hazard risk area forecasts were verified using a traditional approach against observed National Weather Service flash flood reports; a total of 153 flash floods were reported in that period. Forecasts performed best for SOT = 5 (hit ratio = 65%, false alarm ratio = 44%) and EFI = 0.7 (hit ratio = 74%, false alarm ratio = 45%) at 72 h lead time. 
    By including the vulnerable-area data, our verification results improved by 5-15%, demonstrating the value of vulnerability information within natural hazard forecasts. This research shows that flash flooding from Hurricane Harvey was predictable up to 4 days ahead and that filtering the forecasts to vulnerable areas provides more focused guidance to civil protection agencies planning their emergency response.
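
    The verification scores quoted above are standard contingency-table ratios. A sketch with hypothetical hit/miss/false-alarm counts chosen to be consistent with the EFI = 0.7 figures (74% and 45%); the study's actual counts are not reported here:

```python
def hit_ratio(hits, misses):
    """Fraction of observed events that were forecast: H / (H + M)."""
    return hits / (hits + misses)

def false_alarm_ratio(hits, false_alarms):
    """Fraction of forecasts that did not verify: FA / (H + FA)."""
    return false_alarms / (hits + false_alarms)

# Hypothetical counts against the 153 reported flash floods:
hits, misses, false_alarms = 113, 40, 92
print(round(hit_ratio(hits, misses), 2))            # ≈ 0.74
print(round(false_alarm_ratio(hits, false_alarms), 2))  # ≈ 0.45
```

Raising the SOT/EFI threshold trades hits for fewer false alarms, which is why the abstract reports both scores at each threshold.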

  6. Flood forecasting using non-stationarity in a river with tidal influence - a feasibility study

    NASA Astrophysics Data System (ADS)

    Killick, Rebecca; Kretzschmar, Ann; Ilic, Suzi; Tych, Wlodek

    2017-04-01

    Flooding is the most common natural hazard causing damage, disruption and loss of life worldwide. Despite improvements in modelling and forecasting of water levels and flood inundation (Kretzschmar et al., 2014; Hoitink and Jay, 2016), there are still large discrepancies between predictions and observations particularly during storm events when accurate predictions are most important. Many models exist for forecasting river levels (Smith et al., 2013; Leedal et al., 2013) however they commonly assume that the errors in the data are independent, stationary and normally distributed. This is generally not the case especially during storm events suggesting that existing models are not describing the drivers of river level in an appropriate fashion. Further challenges exist in the lower sections of a river influenced by both river and tidal flows and their interaction and there is scope for improvement in prediction. This paper investigates the use of a powerful statistical technique to adaptively forecast river levels by modelling the process as locally stationary. The proposed methodology takes information on both upstream and downstream river levels and incorporates meteorological information (rainfall forecasts) and tidal levels when required to forecast river levels at a specified location. Using this approach, a single model will be capable of predicting water levels in both tidal and non-tidal river reaches. In this pilot project, the methodology of Smith et al. (2013) using harmonic tidal analysis and data based mechanistic modelling is compared with the methodology developed by Killick et al. (2016) utilising data-driven wavelet decomposition to account for the information contained in the upstream and downstream river data to forecast a non-stationary time-series. Preliminary modelling has been carried out using the tidal stretch of the River Lune in North-west England and initial results are presented here. 
    Future work includes expanding the methodology to forecast river levels at a network of locations simultaneously. References: Hoitink, A. J. F., and Jay, D. A. (2016). Tidal river dynamics: implications for deltas. Rev. Geophys., 54, 240-272. Killick, R., Knight, M., Nason, G. P., and Eckley, I. A. (2016). The local partial autocorrelation function and its application to the forecasting of locally stationary time series. Submitted. Kretzschmar, A., Tych, W., and Chappell, N. A. (2014). Reversing hydrology: estimation of sub-hourly rainfall time-series from streamflow. Environ. Modell. Softw., 60, 290-301. Leedal, D., Weerts, A. H., Smith, P. J., and Beven, K. J. (2013). Application of data-based mechanistic modelling for flood forecasting at multiple locations in the Eden catchment in the National Flood Forecasting System (England and Wales). HESS, 17(1), 177-185. Smith, P., Beven, K., Horsburgh, K., Hardaker, P., and Collier, C. (2013). Data-based mechanistic modelling of tidally affected river reaches for flood warning purposes: an example on the River Dee, UK. Q. J. R. Meteorol. Soc., 139(671), 340-349.

  7. Integral assessment of floodplains as a basis for spatially-explicit flood loss forecasts

    NASA Astrophysics Data System (ADS)

    Zischg, Andreas Paul; Mosimann, Markus; Weingartner, Rolf

    2016-04-01

    A key aspect of disaster prevention is flood discharge forecasting, which is used for early warning and therefore as decision support for intervention forces. The interval between the issued forecast and the time when the expected flood occurs is crucial for optimal planning of the intervention. Typically, river discharge forecasts cover the regional level only, i.e. larger catchments. However, these forecasts are not directly usable by specific target groups at the local level, because they say nothing about the consequences of the predicted flood in terms of affected areas or the number of exposed residents and houses. Estimating those consequences requires both simulations of the flooding processes and data on vulnerable objects. Furthermore, robust flood loss estimation requires flood modelling at high spatial and temporal resolution, which is computationally expensive; 2D flood modelling is therefore not suited to real-time applications. Forecasting flood losses in the short term (6h-24h in advance) thus requires a different approach. Here, we propose a method to downscale the river discharge forecast to a spatially-explicit flood loss forecast. The principal procedure is to generate in advance as many flood scenarios as needed to represent the flooded areas for all possible flood hydrographs, e.g. very high peak discharges of short duration vs. high peak discharges with high volumes. For this, synthetic flood hydrographs were derived from the hydrologic time series. Then, the flooded areas of each scenario were modelled with a 2D flood simulation model. All scenarios were intersected with the dataset of vulnerable objects, in our case residential, agricultural and industrial buildings with information about the number of residents, the object-specific vulnerability, and the monetary value of the objects. This dataset was prepared by a data-mining approach. 
    For each flood scenario, the resulting number of affected residents and houses, and therefore the losses, are computed. This integral assessment leads to a hydro-economic characterisation of each floodplain. On that basis, a transfer function between discharge forecast and damages can be elaborated. This transfer function describes the relationship between predicted peak discharge and flood volume on one side and the number of exposed houses and residents and the related losses on the other. It can also be used to downscale the regional discharge forecast to a local-level loss forecast. In addition, a dynamic map delimiting the probable flooded areas on the basis of the forecasted discharge can be prepared. The predicted losses and the delimited flooded areas provide complementary information for assessing the need for preventive measures, both on the long-term timescale and 6h-24h in advance of a predicted flood. To conclude, the transfer function offers the possibility of an integral assessment of floodplains as a basis for spatially-explicit flood loss forecasts. The procedure has been developed and tested in the alpine and pre-alpine environment of the Aare river catchment upstream of Bern, Switzerland.
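
    The transfer function from forecast peak discharge to losses can be sketched as interpolation over a pre-computed scenario library, which is what makes the real-time step cheap after the expensive 2D runs are done offline. The scenario values below are hypothetical, not the Aare catchment results:

```python
import bisect

# Hypothetical pre-computed scenario library:
# (peak discharge in m3/s, affected residents, loss in million CHF)
scenarios = [(300, 0, 0.0), (450, 120, 4.5), (550, 800, 32.0), (650, 2400, 110.0)]

def loss_forecast(peak_q):
    """Linearly interpolate the loss between the two nearest pre-computed scenarios."""
    qs = [q for q, _, _ in scenarios]
    if peak_q <= qs[0]:
        return scenarios[0][2]
    if peak_q >= qs[-1]:
        return scenarios[-1][2]
    i = bisect.bisect_right(qs, peak_q)
    (q0, _, l0), (q1, _, l1) = scenarios[i - 1], scenarios[i]
    w = (peak_q - q0) / (q1 - q0)
    return l0 + w * (l1 - l0)

print(loss_forecast(500))  # midway between the 450 and 550 m3/s scenarios -> 18.25
```

A fuller implementation would index scenarios by flood volume as well as peak discharge, as the record describes, but the lookup principle is the same.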

  8. Graphic comparison of reserve-growth models for conventional oil and accumulation

    USGS Publications Warehouse

    Klett, T.R.

    2003-01-01

    The U.S. Geological Survey (USGS) periodically assesses crude oil, natural gas, and natural gas liquids resources of the world. The assessment procedure requires estimated recoverable oil and natural gas volumes (field size, cumulative production plus remaining reserves) in discovered fields. Because initial reserves are typically conservative, subsequent estimates increase through time as these fields are developed and produced. The USGS assessment of petroleum resources makes estimates, or forecasts, of the potential additions to reserves in discovered oil and gas fields resulting from field development, and it also estimates the potential fully developed sizes of undiscovered fields. The term "reserve growth" refers to the commonly observed upward adjustment of reserve estimates. Because such additions are related to increases in the total size of a field, the USGS uses field sizes to model reserve growth. Future reserve growth in existing fields is a major component of remaining U.S. oil and natural gas resources and has therefore become a necessary element of U.S. petroleum resource assessments. Past and currently proposed reserve-growth models compared herein aid in the selection of a suitable set of forecast functions to provide an estimate of potential additions to reserves from reserve growth in the ongoing National Oil and Gas Assessment Project (NOGA). Reserve growth is modeled by construction of a curve that represents annual fractional changes of recoverable oil and natural gas volumes (for fields and reservoirs), which provides growth factors. Growth factors are used to calculate forecast functions, which are sets of field- or reservoir-size multipliers. Comparisons of forecast functions were made based on datasets used to construct the models, field type, modeling method, and length of forecast span. 
    Comparisons were also made between forecast functions based on field-level and reservoir-level growth, and between forecast functions based on older and newer data. The reserve-growth model used in the 1995 USGS National Assessment and the model currently used in the NOGA project provide forecast functions that yield similar estimates of potential additions to reserves. Both models are based on the Oil and Gas Integrated Field File from the Energy Information Administration (EIA), but different vintages of data (from 1977 through 1991 and 1977 through 1996, respectively). The model based on newer data can be used in place of the previous model, providing similar estimates of potential additions to reserves. Forecast functions for oil fields vary little from those for gas fields in these models; therefore, a single function may be used for both oil and gas fields, like that used in the USGS World Petroleum Assessment 2000. Forecast functions based on the field-level reserve-growth model derived from the NRG Associates databases (from 1982 through 1998) differ from those derived from EIA databases (from 1977 through 1996). However, the difference may not be enough to preclude the use of the forecast functions derived from NRG data in place of the forecast functions derived from EIA data. Should the model derived from NRG data be used, separate forecast functions for oil fields and gas fields must be employed. The forecast function for oil fields from the model derived from NRG data varies significantly from that for gas fields, and a single function for both oil and gas fields may not be appropriate.
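
    The relationship the record describes between growth factors and forecast functions is simple compounding: annual fractional-change factors are multiplied cumulatively into field-size multipliers. A sketch with hypothetical factors, not the EIA- or NRG-derived values:

```python
from itertools import accumulate
from operator import mul

# Hypothetical annual growth factors indexed by years since discovery:
annual_growth = [1.30, 1.15, 1.08, 1.05, 1.03]

# Forecast function: cumulative field-size multipliers from compounding the factors.
# multipliers[k] scales an initial reserve estimate to its size after k+1 years of growth.
multipliers = list(accumulate(annual_growth, mul))
print([round(m, 3) for m in multipliers])
```

Applying `multipliers[-1]` to an initial field-size estimate gives its forecast fully-grown size over the modeled span, which is how potential additions to reserves are estimated.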

  9. A regressive storm model for extreme space weather

    NASA Astrophysics Data System (ADS)

    Terkildsen, Michael; Steward, Graham; Neudegg, Dave; Marshall, Richard

    2012-07-01

    Extreme space weather events, while rare, pose significant risk to society in the form of impacts on critical infrastructure such as power grids, and the disruption of high end technological systems such as satellites and precision navigation and timing systems. There has been an increased focus on modelling the effects of extreme space weather, as well as improving the ability of space weather forecast centres to identify, with sufficient lead time, solar activity with the potential to produce extreme events. This paper describes the development of a data-based model for predicting the occurrence of extreme space weather events from solar observation. The motivation for this work was to develop a tool to assist space weather forecasters in early identification of solar activity conditions with the potential to produce extreme space weather, and with sufficient lead time to notify relevant customer groups. Data-based modelling techniques were used to construct the model, and an extensive archive of solar observation data used to train, optimise and test the model. The optimisation of the base model aimed to eliminate false negatives (missed events) at the expense of a tolerable increase in false positives, under the assumption of an iterative improvement in forecast accuracy during progression of the solar disturbance, as subsequent data becomes available.

  10. A Comparison Study of Two Numerical Tsunami Forecasting Systems

    NASA Astrophysics Data System (ADS)

    Greenslade, Diana J. M.; Titov, Vasily V.

    2008-12-01

    This paper presents a comparison of two tsunami forecasting systems: the NOAA/PMEL system (SIFT) and the Australian Bureau of Meteorology system (T1). Both of these systems are based on a tsunami scenario database and both use the same numerical model. However, there are some major differences in the way in which the scenarios are constructed and in the implementation of the systems. Two tsunami events are considered here: Tonga 2006 and Sumatra 2007. The results show that there are some differences in the distribution of maximum wave amplitude, particularly for the Tonga event, however both systems compare well to the available tsunameter observations. To assess differences in the forecasts for coastal amplitude predictions, the offshore forecast results from both systems were used as boundary conditions for a high-resolution model for Hilo, Hawaii. The minor differences seen between the two systems in deep water become considerably smaller at the tide gauge and both systems compare very well with the observations.

  11. Provincial Variation of Cochlear Implantation Surgical Volumes and Cost in Canada.

    PubMed

    Crowson, Matthew G; Chen, Joseph M; Tucci, Debara

    2017-01-01

    Objectives: To investigate provincial cochlear implantation (CI) annual volume and cost trends. Study Design: Database analysis. Setting: National surgical volume and cost database. Subjects and Methods: Aggregate-level provincial CI volumes and cost data for adult and pediatric CI surgery from 2005 to 2014 were obtained from the Canadian Institute for Health Information. Population-level aging forecast estimates were obtained from the Ontario Ministry of Finance and Statistics Canada. Linear fit, analysis of variance, and Tukey's analyses were utilized to compare variances and means. Results: The national volume of annual CI procedures is forecasted to increase by <30 per year (R² = 0.88). Ontario has the highest mean annual CI volume (282; 95% confidence interval, 258-308), followed by Alberta (92.0; 95% confidence interval, 66.3-118), both significantly higher than all other provinces (P < .05 for each). Ontario's annual CI procedure volume is forecasted to increase by <11 per year (R² = 0.62). Newfoundland and Nova Scotia have the highest number of CI procedures per 100,000 residents compared with all other provinces (P < .05). Alberta, Newfoundland, and Manitoba have the highest estimated implantation cost of all provinces (P < .05). Conclusions: Historical trends forecast modest national growth in CI volume. Potential bottlenecks include provincial funding and access to surgical expertise. The proportion of older adult patients who may benefit from a CI will rise, and there may be insufficient capacity to meet this need. Delayed access to CI for pediatric patients is also a concern, given recent reports of long wait times for CI surgery.
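
    The linear-fit forecasting reported above amounts to ordinary least squares on an annual series, with the slope giving procedures added per year. A sketch with hypothetical volumes, not CIHI data:

```python
def linear_trend(years, volumes):
    """Ordinary least-squares slope and intercept for an annual volume series."""
    n = len(years)
    mx, my = sum(years) / n, sum(volumes) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(years, volumes)) / \
            sum((x - mx) ** 2 for x in years)
    return slope, my - slope * mx

# Hypothetical annual national CI volumes for 2005-2014:
years = list(range(2005, 2015))
volumes = [420, 455, 470, 510, 530, 555, 590, 600, 640, 665]
slope, intercept = linear_trend(years, volumes)
print(round(slope, 1))  # estimated growth in procedures per year
```

Extrapolating the fitted line (`intercept + slope * year`) yields the volume forecast; the quoted R² values measure how well such a line explains the historical series.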

  12. 7 CFR 1710.208 - RUS criteria for approval of all load forecasts by power supply borrowers and by distribution...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... borrower developed an adequate supporting database and analyzed a reasonable range of relevant assumptions and alternative futures; (d) The borrower adopted methods and procedures in general use by the...

  13. 47 CFR 52.15 - Central office code administration.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... assignment databases; (3) Conducting the Numbering Resource Utilization and Forecast (NRUF) data collection... telecommunications carrier that receives numbering resources from the NANPA, a Pooling Administrator or another... Administrator. (2) State commissions may investigate and determine whether service providers have activated...

  14. 47 CFR 52.15 - Central office code administration.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... assignment databases; (3) Conducting the Numbering Resource Utilization and Forecast (NRUF) data collection... telecommunications carrier that receives numbering resources from the NANPA, a Pooling Administrator or another... Administrator. (2) State commissions may investigate and determine whether service providers have activated...

  15. 47 CFR 52.15 - Central office code administration.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... assignment databases; (3) Conducting the Numbering Resource Utilization and Forecast (NRUF) data collection... telecommunications carrier that receives numbering resources from the NANPA, a Pooling Administrator or another... Administrator. (2) State commissions may investigate and determine whether service providers have activated...

  16. The national database of wildfire mitigation programs: state, county and local efforts reduce wildfire risk

    Treesearch

    Terry Haines; Cheryl Renner; Margaret Reams; James Granskog

    2005-01-01

    The growth of residential communities within forested areas has increased the danger to life and property from uncontrolled wildfire. In response, states, counties and local governments in the United States have dramatically increased their wildfire mitigation efforts. Policymakers and fire officials are employing a wide range of regulatory and voluntary wildfire risk...

  17. Deaf and Hard of Hearing Students' Perspectives on Bullying and School Climate

    ERIC Educational Resources Information Center

    Weiner, Mary T.; Day, Stefanie J.; Galvan, Dennis

    2013-01-01

    Student perspectives reflect school climate. The study examined perspectives among deaf and hard of hearing students in residential and large day schools regarding bullying, and compared these perspectives with those of a national database of hearing students. The participants were 812 deaf and hard of hearing students in 11 U.S. schools. Data…

  18. Residential Radon and Brain Tumour Incidence in a Danish Cohort

    PubMed Central

    Bräuner, Elvira V.; Andersen, Zorana J.; Andersen, Claus E.; Pedersen, Camilla; Gravesen, Peter; Ulbak, Kaare; Hertel, Ole; Loft, Steffen; Raaschou-Nielsen, Ole

    2013-01-01

    Background: Increased brain tumour incidence over recent decades may reflect improved diagnostic methods and clinical practice, but remains unexplained. Although estimated doses are low, a relationship between radon and brain tumours may exist. Objective: To investigate the long-term effect of exposure to residential radon on the risk of primary brain tumour in a prospective Danish cohort. Methods: During 1993–1997 we recruited 57,053 persons. We followed each cohort member for cancer occurrence from enrolment until 31 December 2009, identifying 121 primary brain tumour cases. We traced residential addresses from 1 January 1971 until 31 December 2009 and calculated radon concentrations at each address using information from central databases regarding geology and house construction. Cox proportional hazards models were used to estimate incidence rate ratios (IRR) and 95% confidence intervals (CI) for the risk of primary brain tumours associated with residential radon exposure, with adjustment for age, sex, occupation, fruit and vegetable consumption and traffic-related air pollution. Effect modification by air pollution was assessed. Results: Median estimated radon was 40.5 Bq/m3. The adjusted IRR for primary brain tumour associated with each 100 Bq/m3 increment in average residential radon levels was 1.96 (95% CI: 1.07; 3.58), and the association increased exposure-dependently over the four radon exposure quartiles. The association was not modified by air pollution. Conclusions: We found significant associations and exposure-response patterns between long-term residential radon exposure in a general population and the risk of primary brain tumours, adding new knowledge to this field. This finding could be due to chance and needs to be challenged in future studies. PMID:24066143
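
    A rate ratio reported per fixed exposure increment rescales multiplicatively on the log scale, because a Cox model is log-linear in the exposure. A sketch using the reported IRR of 1.96 per 100 Bq/m³; the rescaling itself is a generic illustration, not a calculation from the paper:

```python
import math

# Reported adjusted IRR per 100 Bq/m^3 increment in average residential radon.
irr_per_100 = 1.96
beta = math.log(irr_per_100) / 100  # implied log-rate-ratio per 1 Bq/m^3

def irr(increment_bq_m3):
    """Rescale the reported rate ratio to an arbitrary exposure increment."""
    return math.exp(beta * increment_bq_m3)

print(round(irr(40.5), 2))  # IRR implied at the cohort's median radon level
```

This log-linear rescaling assumes the exposure-response is linear on the log-rate scale across the range considered, which the quartile analysis in the paper probes separately.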

  19. Residential Pesticides and Childhood Leukemia: A Systematic Review and Meta-Analysis

    PubMed Central

    Turner, Michelle C.; Wigle, Donald T.; Krewski, Daniel

    2010-01-01

    Objective We conducted a systematic review and meta-analysis of previous observational epidemiologic studies examining the relationship between residential pesticide exposures during critical exposure time windows (preconception, pregnancy, and childhood) and childhood leukemia. Data sources Searches of MEDLINE and other electronic databases were performed (1950–2009). Reports were included if they were original epidemiologic studies of childhood leukemia, followed a case–control or cohort design, and assessed at least one index of residential/household pesticide exposure/use. No language criteria were applied. Data extraction Study selection, data abstraction, and quality assessment were performed by two independent reviewers. Random effects models were used to obtain summary odds ratios (ORs) and 95% confidence intervals (CIs). Data synthesis Of the 17 identified studies, 15 were included in the meta-analysis. Exposures during pregnancy to unspecified residential pesticides (summary OR = 1.54; 95% CI, 1.13–2.11; I2 = 66%), insecticides (OR = 2.05; 95% CI, 1.80–2.32; I2 = 0%), and herbicides (OR = 1.61; 95% CI, 1.20–2.16; I2 = 0%) were positively associated with childhood leukemia. Exposures during childhood to unspecified residential pesticides (OR = 1.38; 95% CI, 1.12–1.70; I2 = 4%) and insecticides (OR = 1.61; 95% CI, 1.33–1.95; I2 = 0%) were also positively associated with childhood leukemia, but there was no association with herbicides. Conclusions Positive associations were observed between childhood leukemia and residential pesticide exposures. Further work is needed to confirm previous findings based on self-report, to examine potential exposure–response relationships, and to assess specific pesticides and toxicologically related subgroups of pesticides in more detail. PMID:20056585
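The random-effects pooling used in this record can be sketched with the standard DerSimonian-Laird estimator in pure Python. The odds ratios and confidence intervals below are invented placeholders, not the study's data.

```python
import math

def dl_pool(ors, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of odds ratios with 95% CIs."""
    y = [math.log(o) for o in ors]  # log odds ratios
    # back out each study's standard error from its 95% CI width
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96)
          for lo, hi in zip(ci_los, ci_his)]
    w = [1 / s**2 for s in se]  # fixed-effect (inverse-variance) weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)  # between-study variance
    wstar = [1 / (s**2 + tau2) for s in se]  # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(wstar, y)) / sum(wstar)
    se_pooled = math.sqrt(1 / sum(wstar))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# hypothetical ORs (with 95% CI bounds) from three studies:
or_, lo, hi = dl_pool([1.5, 2.1, 1.2], [1.1, 1.4, 0.8], [2.0, 3.1, 1.8])
```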

  20. Flash flood forecasting using simplified hydrological models, radar rainfall forecasts and data assimilation

    NASA Astrophysics Data System (ADS)

    Smith, P. J.; Beven, K.; Panziera, L.

    2012-04-01

    The issuing of timely flood alerts may be dependent upon the ability to predict future values of water level or discharge at locations where observations are available. Catchments at risk of flash flooding often have a rapid natural response time, typically less than the forecast lead time desired for issuing alerts. This work focuses on the provision of short-range (up to 6 hours lead time) predictions of discharge in small catchments based on utilising radar forecasts to drive a hydrological model. An example analysis based upon the Verzasca catchment (Ticino, Switzerland) is presented. Parsimonious time series models with a mechanistic interpretation (so-called Data-Based Mechanistic models) have been shown to provide reliable, accurate forecasts in many hydrological situations. In this study such a model is developed to predict the discharge at an observed location from observed precipitation data. The model is shown to capture the snow melt response at this site. Observed discharge data are assimilated to improve the forecasts, of up to two hours lead time, that can be generated from observed precipitation. To generate forecasts with greater lead time, ensemble precipitation forecasts are utilised. In this study the Nowcasting ORographic precipitation in the Alps (NORA) product, outlined in more detail elsewhere (Panziera et al. Q. J. R. Meteorol. Soc. 2011; DOI:10.1002/qj.878), is utilised. NORA precipitation forecasts are derived from historical analogues based on the radar field and upper atmospheric conditions. As such, they avoid the need to explicitly model the evolution of the rainfall field through, for example, Lagrangian diffusion. The uncertainty in the forecasts is represented by characterisation of the joint distribution of the observed discharge, the discharge forecast using the (in operational conditions unknown) future observed precipitation, and the forecast utilising the NORA ensembles.
Constructing the joint distribution in this way allows the full historic record of data at the site to inform the predictive distribution. It is shown that, in part due to the limited availability of forecasts, the uncertainty in the relationship between the NORA based forecasts and other variates dominated the resulting predictive uncertainty.

  1. Job Satisfaction among Care Aides in Residential Long-Term Care: A Systematic Review of Contributing Factors, Both Individual and Organizational

    PubMed Central

    Squires, Janet E.; Hoben, Matthias; Linklater, Stefanie; Carleton, Heather L.; Graham, Nicole; Estabrooks, Carole A.

    2015-01-01

    Despite an increasing literature on professional nurses' job satisfaction, job satisfaction among nonprofessional nursing care providers, in particular in residential long-term care facilities, is sparsely described. The purpose of this study was to systematically review the evidence on which factors (individual and organizational) are associated with job satisfaction among care aides, nurse aides, and nursing assistants, who provide the majority of direct resident care, in residential long-term care facilities. Nine online databases were searched. Two authors independently screened records, extracted data, and assessed the included publications for methodological quality. Decision rules were developed a priori to draw conclusions on which factors are important to care aide job satisfaction. Forty-two publications were included. Individual factors found to be important were empowerment and autonomy. Six additional individual factors were found to be not important: age, ethnicity, gender, education level, attending specialized training, and years of experience. Organizational factors found to be important were facility resources and workload. Two additional factors were found to be not important: satisfaction with salary/benefits and job performance. Factors important to care aide job satisfaction differ from those reported among hospital nurses, supporting the need for different strategies to improve care aide job satisfaction in residential long-term care. PMID:26345545

  2. You are where you shop: grocery store locations, weight, and neighborhoods.

    PubMed

    Inagami, Sanae; Cohen, Deborah A; Finch, Brian Karl; Asch, Steven M

    2006-07-01

    Residents in poor neighborhoods have higher body mass index (BMI) and eat less healthfully. One possible reason might be the quality of available foods in their area. The locations of grocery stores where individuals shop, and their association with BMI, were examined. The 2000 U.S. Census data were linked with the Los Angeles Family and Neighborhood Study (L.A.FANS) database, which consists of 2620 adults sampled from 65 neighborhoods in Los Angeles County between 2000 and 2002. In 2005, multilevel linear regressions were used to estimate the associations between BMI and socioeconomic characteristics of grocery store locations after adjustment for individual-level factors and socioeconomic characteristics of residential neighborhoods. Individuals have higher BMI if they reside in disadvantaged areas and in areas where the average person frequents grocery stores located in more disadvantaged neighborhoods. Those who own cars and travel farther to their grocery stores also have higher BMI. When controlling for grocery store census tract socioeconomic status (SES), the association between residential census tract SES and BMI becomes stronger. Where people shop for groceries and distance traveled to grocery stores are independently associated with BMI. Exposure to grocery stores mediates and suppresses the association of residential neighborhoods with BMI and could explain why previous studies may not have found robust associations between residential neighborhood predictors and BMI.

  3. Leveraging organismal biology to forecast the effects of climate change.

    PubMed

    Buckley, Lauren B; Cannistra, Anthony F; John, Aji

    2018-04-26

    Despite the pressing need for accurate forecasts of ecological and evolutionary responses to environmental change, commonly used modelling approaches exhibit mixed performance because they omit many important aspects of how organisms respond to spatially and temporally variable environments. Integrating models based on organismal phenotypes at the physiological, performance and fitness levels can improve model performance. We summarize current limitations of environmental data and models and discuss potential remedies. The paper reviews emerging techniques for sensing environments at fine spatial and temporal scales, accounting for environmental extremes, and capturing how organisms experience the environment. Intertidal mussel data illustrate biologically important aspects of environmental variability. We then discuss key challenges in translating environmental conditions into organismal performance including accounting for the varied timescales of physiological processes, for responses to environmental fluctuations including the onset of stress and other thresholds, and for how environmental sensitivities vary across lifecycles. We call for the creation of phenotypic databases to parameterize forecasting models and advocate for improved sharing of model code and data for model testing. We conclude with challenges in organismal biology that must be solved to improve forecasts over the next decade. Keywords: acclimation, biophysical models, ecological forecasting, extremes, microclimate, spatial and temporal variability.

  4. The psychology of intelligence analysis: drivers of prediction accuracy in world politics.

    PubMed

    Mellers, Barbara; Stone, Eric; Atanasov, Pavel; Rohrbaugh, Nick; Metz, S Emlen; Ungar, Lyle; Bishop, Michael M; Horowitz, Michael; Merkle, Ed; Tetlock, Philip

    2015-03-01

    This article extends psychological methods and concepts into a domain that is as profoundly consequential as it is poorly understood: intelligence analysis. We report findings from a geopolitical forecasting tournament that assessed the accuracy of more than 150,000 forecasts of 743 participants on 199 events occurring over 2 years. Participants were above average in intelligence and political knowledge relative to the general population. Individual differences in performance emerged, and forecasting skills were surprisingly consistent over time. Key predictors were (a) dispositional variables of cognitive ability, political knowledge, and open-mindedness; (b) situational variables of training in probabilistic reasoning and participation in collaborative teams that shared information and discussed rationales (Mellers, Ungar, et al., 2014); and (c) behavioral variables of deliberation time and frequency of belief updating. We developed a profile of the best forecasters; they were better at inductive reasoning, pattern detection, cognitive flexibility, and open-mindedness. They had greater understanding of geopolitics, training in probabilistic reasoning, and opportunities to succeed in cognitively enriched team environments. Last but not least, they viewed forecasting as a skill that required deliberate practice, sustained effort, and constant monitoring of current affairs. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  5. Forecasting ESKAPE infections through a time-varying auto-adaptive algorithm using laboratory-based surveillance data.

    PubMed

    Ballarin, Antonio; Posteraro, Brunella; Demartis, Giuseppe; Gervasi, Simona; Panzarella, Fabrizio; Torelli, Riccardo; Paroni Sterbini, Francesco; Morandotti, Grazia; Posteraro, Patrizia; Ricciardi, Walter; Gervasi Vidal, Kristian A; Sanguinetti, Maurizio

    2014-12-06

    Mathematical and statistical tools can provide valuable help in improving surveillance systems for healthcare- and non-healthcare-associated bacterial infections. The aim of this work is to evaluate the time-varying auto-adaptive (TVA) algorithm-based use of a clinical microbiology laboratory database to forecast medically important drug-resistant bacterial infections. Using the TVA algorithm, six distinct time series were modelled, each one representing the number of episodes per single 'ESKAPE' (Enterococcus faecium, Staphylococcus aureus, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa and Enterobacter species) infecting pathogen, occurring monthly between the 2002 and 2011 calendar years at the Università Cattolica del Sacro Cuore general hospital. Monthly moving-averaged numbers of observed and forecasted ESKAPE infectious episodes showed complete overlap of their respective smoothed time-series curves. Overall good forecast accuracy was observed, with percentages ranging from 82.14% for E. faecium infections to 90.36% for S. aureus infections. Our approach may regularly provide physicians with forecasted bacterial infection rates to alert them about the spread of antibiotic-resistant bacterial species, especially when clinical microbiological results of patients' specimens are delayed.
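The record does not describe the TVA algorithm in enough detail to reproduce it, but the general idea of forecasting monthly episode counts and scoring one-step-ahead accuracy can be illustrated with a simplified stand-in: simple exponential smoothing over invented monthly counts (both the counts and the smoothing constant are placeholders).

```python
def one_step_forecasts(counts, alpha=0.4):
    """Exponentially smoothed one-step-ahead forecasts of monthly counts."""
    level = counts[0]
    preds = []
    for c in counts[1:]:
        preds.append(level)  # forecast the next month before observing it
        level = alpha * c + (1 - alpha) * level  # then update the level
    return preds

monthly = [12, 14, 13, 17, 16, 15, 19, 18]  # invented infection episodes/month
preds = one_step_forecasts(monthly)

# score as an accuracy percentage, 100 * (1 - mean absolute percentage error)
errors = [abs(p - a) for p, a in zip(preds, monthly[1:])]
mape = sum(e / a for e, a in zip(errors, monthly[1:])) / len(errors)
accuracy_pct = 100 * (1 - mape)
```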

  6. Long-range seasonal streamflow forecasting over the Iberian Peninsula using large-scale atmospheric and oceanic information

    NASA Astrophysics Data System (ADS)

    Hidalgo-Muñoz, J. M.; Gámiz-Fortis, S. R.; Castro-Díez, Y.; Argüeso, D.; Esteban-Parra, M. J.

    2015-05-01

    Identifying the relationship between large-scale climate signals and seasonal streamflow may provide a valuable tool for long-range seasonal forecasting in regions under water stress, such as the Iberian Peninsula (IP). The skill of the main teleconnection indices as predictors of seasonal streamflow in the IP was evaluated. The streamflow database used was composed of 382 stations, covering the period 1975-2008. Predictions were made using a leave-one-out cross-validation approach based on multiple linear regression, combining the Variance Inflation Factor and stepwise backward selection to avoid multicollinearity and select the best subset of predictors. Predictions were made for four forecasting scenarios, from one to four seasons in advance. The correlation coefficient (RHO), Root Mean Square Error Skill Score (RMSESS), and the Gerrity Skill Score (GSS) were used to evaluate the forecasting skill. For autumn streamflow, good forecasting skill (RHO>0.5, RMSESS>20%, GSS>0.4) was found for a third of the stations located in the Mediterranean Andalusian Basin, the North Atlantic Oscillation of the previous winter being the main predictor. Also, fair forecasting skill (RHO>0.44, RMSESS>10%, GSS>0.2) was found in stations in the northwestern IP (16 of these located in the Douro and Tagus Basins) with two seasons in advance. For winter streamflow, fair forecasting skill was found for one season in advance in 168 stations, with the Snow Advance Index as the main predictor. Finally, forecasting was poorer for spring streamflow than for autumn and winter, since only 16 stations showed fair forecasting skill with one season in advance, particularly in the north-western IP.
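The leave-one-out cross-validation scheme described in this record can be sketched in a few lines; for brevity this uses a single hypothetical NAO-like predictor and made-up streamflow values, whereas the study combines multiple predictors with VIF screening and stepwise selection.

```python
def fit_simple(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def loo_predictions(xs, ys):
    """Predict each point from a model fit on all the other points."""
    preds = []
    for i in range(len(xs)):
        a, b = fit_simple(xs[:i] + xs[i+1:], ys[:i] + ys[i+1:])
        preds.append(a + b * xs[i])
    return preds

def pearson(a, b):
    """Pearson correlation, i.e. the RHO skill score used in the record."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

nao = [-1.2, 0.3, 0.8, -0.5, 1.1, -0.9, 0.2, 0.6]   # hypothetical winter index
flow = [95, 60, 48, 80, 40, 88, 65, 52]             # hypothetical streamflow
preds = loo_predictions(nao, flow)
rho = pearson(preds, flow)
```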

  7. Development of an Adaptable Display and Diagnostic System for the Evaluation of Tropical Cyclone Forecasts

    NASA Astrophysics Data System (ADS)

    Kucera, P. A.; Burek, T.; Halley-Gotway, J.

    2015-12-01

    NCAR's Joint Numerical Testbed Program (JNTP) focuses on the evaluation of experimental forecasts of tropical cyclones (TCs) with the goal of developing new research tools and diagnostic evaluation methods that can be transitioned to operations. Recent activities include the development of new TC forecast verification methods and the development of an adaptable TC display and diagnostic system. The next-generation display and diagnostic system is being developed to support the evaluation needs of the U.S. National Hurricane Center (NHC) and the broader TC research community. The new hurricane display and diagnostic capabilities allow forecasters and research scientists to more deeply examine the performance of operational and experimental models. The system is built upon modern and flexible technology that includes OpenLayers mapping tools that are platform independent. The forecast track and intensity, along with associated observed track information, are stored in an efficient MySQL database. The system provides an easy-to-use interactive display and diagnostic tools to examine forecast tracks stratified by intensity. Consensus forecasts can be computed and displayed interactively. The system is designed to display information for both real-time and historical TCs. The display configurations are easily adaptable to meet end-user preferences. Ongoing enhancements include improving capabilities for stratification and evaluation of historical best tracks, development and implementation of additional methods to stratify and compute consensus hurricane track and intensity forecasts, and improved graphical display tools. The display is also being enhanced to incorporate gridded forecast, satellite, and sea surface temperature fields. The presentation will provide an overview of the display and diagnostic system development and a demonstration of the current capabilities.

  8. Evaluation of weather forecast systems for storm surge modeling in the Chesapeake Bay

    NASA Astrophysics Data System (ADS)

    Garzon, Juan L.; Ferreira, Celso M.; Padilla-Hernandez, Roberto

    2018-01-01

    Accurate forecast of sea-level heights in coastal areas depends, among other factors, upon a reliable coupling of a meteorological forecast system to a hydrodynamic and wave system. This study evaluates the predictive skills of the coupled circulation and wind-wave model system (ADCIRC+SWAN) for simulating storm tides in the Chesapeake Bay, forced by six different products: (1) Global Forecast System (GFS), (2) Climate Forecast System (CFS) version 2, (3) North American Mesoscale Forecast System (NAM), (4) Rapid Refresh (RAP), (5) European Center for Medium-Range Weather Forecasts (ECMWF), and (6) the Atlantic hurricane database (HURDAT2). This evaluation is based on the hindcasting of four events: Irene (2011), Sandy (2012), Joaquin (2015), and Jonas (2016). By comparing the simulated water levels to observations at 13 monitoring stations, we have found that the ADCIRC+SWAN system forced by the following: (1) the HURDAT2-based system exhibited the weakest statistical skills owing to a noteworthy overprediction of the simulated wind speed; (2) the ECMWF, RAP, and NAM products captured the moment of the peak and moderately its magnitude during all storms, with a correlation coefficient ranging between 0.98 and 0.77; (3) the CFS system exhibited the worst averaged root-mean-square difference (excepting HURDAT2); (4) the GFS system (the lowest horizontal resolution product tested) resulted in a clear underprediction of the maximum water elevation. Overall, the simulations forced by the NAM and ECMWF systems yielded the most accurate results to support water-level forecasting in the Chesapeake Bay during both tropical and extra-tropical storms.

  9. [Medical human resources planning in Europe: A literature review of the forecasting models].

    PubMed

    Benahmed, N; Deliège, D; De Wever, A; Pirson, M

    2018-02-01

    Healthcare is a labor-intensive sector in which half of the expenses are dedicated to human resources. Therefore, policy makers, at national and international levels, attend to the number of practicing professionals and the skill mix. This paper aims to analyze the European forecasting models for the supply of and demand for physicians. To describe the forecasting tools used for physician planning in Europe, a grey literature search was done in the OECD, WHO, and European Union libraries. Electronic databases such as PubMed, Medline, Embase and EconLit were also searched. Quantitative methods for forecasting medical supply rely mainly on stock-and-flow simulations and less often on system dynamics. Parameters included in forecasting models vary widely in data availability and quality. The forecasting of physician needs is limited to healthcare consumption and rarely considers overall needs and service targets. Besides quantitative methods, horizon scanning enables an evaluation of the changes in supply and demand in an uncertain future based on qualitative techniques such as semi-structured interviews, Delphi panels, or focus groups. Finally, supply and demand forecasting models should be regularly updated. Moreover, post-hoc analysis is also needed but too rarely implemented. Medical human resource planning in Europe is inconsistent. Political implementation of the results of forecasting projections is essential to ensure efficient planning. However, crucial elements such as mobility data between Member States are poorly understood, impairing medical supply regulation policies. These policies are commonly limited to training regulations, while horizontal and vertical substitution is less frequently taken into consideration. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
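A stock-and-flow simulation of physician supply, the quantitative method this review found most common, can be sketched minimally: each year graduates flow in and retirements/attrition flow out of the stock. All numbers below are invented for illustration.

```python
def project_supply(stock, graduates_per_year, attrition_rate, years):
    """Project the physician stock forward year by year."""
    trajectory = [stock]
    for _ in range(years):
        # inflow: new graduates; outflow: a fixed fraction retiring/leaving
        stock = stock + graduates_per_year - attrition_rate * stock
        trajectory.append(stock)
    return trajectory

# hypothetical country: 10,000 physicians, 350 graduates/year, 3% attrition
traj = project_supply(stock=10_000, graduates_per_year=350,
                      attrition_rate=0.03, years=5)
```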

  10. Forecast Method of Solar Irradiance with Just-In-Time Modeling

    NASA Astrophysics Data System (ADS)

    Suzuki, Takanobu; Goto, Yusuke; Terazono, Takahiro; Wakao, Shinji; Oozeki, Takashi

    PV power output depends mainly on solar irradiance, which is affected by various meteorological factors, so forecasting future solar irradiance is required for the efficient operation of PV systems. In this paper, we develop a novel approach to solar irradiance forecasting that combines a black-box model (Just-In-Time modeling) with a physical model (GPV data). We investigate the predictive accuracy of solar irradiance over the wide control areas of the electric power companies by utilizing measured data from the 44 observation points throughout Japan offered by the JMA and the 64 points around Kanto offered by NEDO. Finally, we propose a method for forecasting solar irradiance at points for which compiling a database is difficult, and we consider the influence of different GPV initialization times on solar irradiance prediction.
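Just-In-Time (lazy-learning) modeling builds a local model on demand from stored cases near the query rather than fitting one global model. A toy nearest-neighbour sketch with invented cloud-cover/irradiance pairs illustrates the idea; it is not the authors' exact formulation.

```python
def jit_predict(database, query, k=3):
    """Local model built on demand: average of the k nearest stored cases."""
    ranked = sorted(database, key=lambda rec: abs(rec[0] - query))
    neighbours = ranked[:k]
    return sum(irr for _, irr in neighbours) / k

# stored cases: (forecast cloud-cover fraction, measured irradiance in W/m2),
# all values invented for illustration
history = [(0.1, 820), (0.2, 760), (0.4, 560), (0.5, 480),
           (0.7, 300), (0.8, 240), (0.9, 160)]

estimate = jit_predict(history, 0.15)  # query: mostly clear sky
```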

  11. Waste Information Management System-2012 - 12114

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, H.; Quintero, W.; Shoffner, P.

    2012-07-01

    The Waste Information Management System (WIMS) -2012 was updated to support the Department of Energy (DOE) accelerated cleanup program. The schedule compression required close coordination and a comprehensive review and prioritization of the barriers that impeded treatment and disposition of the waste streams at each site. Many issues related to waste treatment and disposal were potential critical path issues under the accelerated schedule. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE Headquarters in Washington, D.C., needed timely waste forecast and transportation information regarding the volumes and types of radioactive waste that would be generated by DOE sites over the next 40 years. Each local DOE site historically collected, organized, and displayed waste forecast information in separate and unique systems. In order for interested parties to understand and view the complete DOE complex-wide picture, the radioactive waste and shipment information of each DOE site needed to be entered into a common application. The WIMS application was therefore created to serve as a common application to improve stakeholder comprehension and improve DOE radioactive waste treatment and disposal planning and scheduling. WIMS allows identification of total forecasted waste volumes, material classes, disposition sites, choke points, technological or regulatory barriers to treatment and disposal, along with forecasted waste transportation information by rail, truck and inter-modal shipments. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, developed and deployed the web-based forecast and transportation system and is responsible for updating the radioactive waste forecast and transportation data on a regular basis to ensure the long-term viability and value of this system.
WIMS continues to successfully accomplish the goals and objectives set forth by DOE for this project. It has replaced the historic process of each DOE site gathering, organizing, and reporting their waste forecast information utilizing different databases and display technologies. In addition, WIMS meets DOE's objective to have the complex-wide waste forecast and transportation information available to all stakeholders and the public in one easy-to-navigate system. The enhancements to WIMS made since its initial deployment include the addition of new DOE sites and facilities, updated waste and transportation information, and the ability to easily display and print customized waste forecasts, disposition maps, GIS maps, and transportation information. The system also allows users to customize and generate reports over the web. These reports can be exported to various formats, such as Adobe® PDF, Microsoft Excel®, and Microsoft Word®, and downloaded to the user's computer. Future enhancements will include database/application migration to the next level. A new data import interface will be developed to integrate 2012-13 forecast waste streams. In addition, the application is updated on a continuous basis based on DOE feedback. (authors)

  12. Upgrade Summer Severe Weather Tool in MIDDS

    NASA Technical Reports Server (NTRS)

    Wheeler, Mark M.

    2010-01-01

    The goal of this task was to upgrade the severe weather database from the previous phase by adding weather observations from the years 2004 - 2009, re-analyze the data to determine the important parameters, make adjustments to the index weights depending on the analysis results, and update the MIDDS GUI. The added data increased the period of record from 15 to 21 years. Data sources included local forecast rules, archived sounding data, surface and upper air maps, and two severe weather event databases covering east-central Florida. Four of the stability indices showed improved severe weather prediction skill. The Total Threat Score (TTS) of the previous work was verified for the warm season of 2009 with very good skill: the TTS Probability of Detection (POD) was 88%, with a False Alarm Ratio (FAR) of 8%. Based on the results of the analyses, the MIDDS Severe Weather Worksheet GUI was updated to assist the duty forecaster by providing a level of objective guidance based on the analysis of the stability parameters and synoptic-scale dynamics.
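The POD and FAR quoted in this record are standard scores from a 2x2 forecast contingency table. A minimal sketch, with invented hit/miss/false-alarm counts chosen so that the quoted 88%/8% values come out:

```python
def pod_far(hits, misses, false_alarms):
    """Probability of Detection and False Alarm Ratio from a 2x2 table."""
    pod = hits / (hits + misses)                 # fraction of events detected
    far = false_alarms / (hits + false_alarms)   # fraction of alerts that fail
    return pod, far

# invented verification counts for one warm season
pod, far = pod_far(hits=22, misses=3, false_alarms=2)
```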

  13. Measurement of the local food environment: a comparison of existing data sources.

    PubMed

    Bader, Michael D M; Ailshire, Jennifer A; Morenoff, Jeffrey D; House, James S

    2010-03-01

    Studying the relation between the residential environment and health requires valid, reliable, and cost-effective methods to collect data on residential environments. This 2002 study compared the level of agreement between measures of the presence of neighborhood businesses drawn from 2 common sources of data used for research on the built environment and health: listings of businesses from commercial databases and direct observations of city blocks by raters. Kappa statistics were calculated for 6 types of businesses (drugstores, liquor stores, bars, convenience stores, restaurants, and grocers) located on 1,663 city blocks in Chicago, Illinois. Logistic regressions estimated whether disagreement between measurement methods was systematically correlated with the socioeconomic and demographic characteristics of neighborhoods. Levels of agreement between the 2 sources were relatively high, with significant (P < 0.001) kappa statistics for each business type ranging from 0.32 to 0.70. Most business types were more likely to be reported by direct observations than in the commercial database listings. Disagreement between the 2 sources was not significantly correlated with the socioeconomic and demographic characteristics of neighborhoods. Results suggest that researchers should have reasonable confidence using whichever method (or combination of methods) is most cost-effective and theoretically appropriate for their research design.
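The kappa statistics in this record measure chance-corrected agreement between the two data sources. Cohen's kappa for a single business type can be computed from a 2x2 presence/absence table over blocks; the counts below are invented (only the 1,663-block total matches the abstract).

```python
def cohens_kappa(both_yes, db_only, obs_only, both_no):
    """Cohen's kappa from a 2x2 agreement table (database vs. observation)."""
    n = both_yes + db_only + obs_only + both_no
    p_obs = (both_yes + both_no) / n               # observed agreement
    p_db_yes = (both_yes + db_only) / n            # marginal: database says yes
    p_ob_yes = (both_yes + obs_only) / n           # marginal: observer says yes
    # agreement expected by chance from the marginals
    p_exp = p_db_yes * p_ob_yes + (1 - p_db_yes) * (1 - p_ob_yes)
    return (p_obs - p_exp) / (1 - p_exp)

# invented counts for one business type across 1,663 blocks
kappa = cohens_kappa(both_yes=120, db_only=40, obs_only=80, both_no=1423)
```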

  14. Understanding heterogeneity in metropolitan India: The added value of remote sensing data for analyzing sub-standard residential areas

    NASA Astrophysics Data System (ADS)

    Baud, Isa; Kuffer, Monika; Pfeffer, Karin; Sliuzas, Richard; Karuppannan, Sadasivam

    2010-10-01

    Analyzing the heterogeneity in metropolitan areas of India utilizing remote sensing data can help to identify more precise patterns of sub-standard residential areas. Earlier work analyzing inequalities in Indian cities employed a constructed index of multiple deprivations (IMDs) utilizing data from the Census of India 2001 ( http://censusindia.gov.in). While that index, described in an earlier paper, provided a first approach to identify heterogeneity at the citywide scale, it provided neither information on spatial variations within the geographical boundaries of the Census database nor information about physical characteristics, such as green spaces and the variation in housing density and quality. In this article, we analyze whether different types of sub-standard residential areas can be identified through remote sensing data, combined, where relevant, with ground-truthing and local knowledge. The specific questions address: (1) the extent to which types of residential sub-standard areas can be drawn from remote sensing data, based on patterns of green space, structure of layout, density of built-up areas, size of buildings and other site characteristics; (2) the spatial diversity of these residential types for selected electoral wards; and (3) the correlation between different types of sub-standard residential areas and the results of the index of multiple deprivations utilized at electoral ward level found previously. The results of a limited number of test wards in Delhi showed that it was possible to extract different residential types matching existing settlement categories using the physical indicators structure of layout, built-up density, building size and other site characteristics. However, the indicator 'amount of green spaces' was not useful to identify informal areas.
The analysis of heterogeneity showed that wards with higher IMD scores displayed more or less the full range of residential types, implying that visual image interpretation is able to zoom in on clusters of deprivation of varying size. Finally, the visual interpretation of the diversity of residential types matched the results of the IMD analysis quite well, although the limited number of test wards would need to be expanded to strengthen this statement. Visual image analysis strengthens the robustness of the IMD, and in addition, gives a better idea of the degree of heterogeneity in deprivations within a ward.

  15. Meter reading for smaller customers in an open-access environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-05-01

    As the ability to buy gas and transportation services directly from suppliers becomes available to small commercial and residential customers, local distribution companies (LDCs) are evaluating how to measure and track their consumption. The LDCs often measure the gas use of large commercial and industrial customers with remote, automated meter-reading (AMR) devices, many of which provide real-time data. The utility can justify the expense of installing these devices because of the customers' considerable gas consumption. But for customers who already contribute very little to margins, AMR investments by LDCs are more problematic. The paper discusses some options for remote metering and forecasts future trends in the industry.

  16. Demand Forecasting and Revenue Requirements, with Implications for Consideration in British Columbia,

    DTIC Science & Technology

    1983-05-01

    future financing obligations will be largely determined by the conditions of the financial markets, the amount of construction activity that is... "Residential Demand for Electricity," Quarterly Review of Economics and Business, Vol. 11, Spring 1971, pp. 7-22. Yang, Yung Y., "Temporal Stability

  17. Energy supply and demand modeling. February 1985-March 1988 (A Bibliography from the NTIS data base). Report for February 1985-March 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1990-06-01

    This bibliography contains citations concerning the use of mathematical models in trend analysis and forecasting of energy supply and demand factors. Models are presented for the industrial, transportation, and residential sectors. Aspects of long term energy strategies and markets are discussed at the global, national, state, and regional levels. Energy demand and pricing, and econometrics of energy, are explored for electric utilities and natural resources, such as coal, oil, and natural gas. Energy resources are modeled both for fuel usage and for reserves. (This updated bibliography contains 201 citations, none of which are new entries to the previous edition.)

  18. Energy supply and demand modeling. February 1985-March 1988 (Citations from the NTIS data base). Report for February 1985-March 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1988-04-01

    This bibliography contains citations concerning the utilization of mathematical models in trend analysis and forecasting of energy supply and demand factors. Models are presented for the industrial, transportation, and residential sectors. Aspects of long-term energy strategies and markets are discussed at the global, national, state, and regional levels. Energy demand and pricing, and econometrics of energy, are explored for electric utilities and natural resources, such as coal, oil, and natural gas. Energy resources are modeled both for fuel usage and for reserves. (This updated bibliography contains 201 citations, 129 of which are new entries to the previous edition.)

  19. Energy supply and demand modeling. April 1988-June 1990 (A Bibliography from the NTIS data base). Report for April 1988-June 1990

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1990-06-01

    This bibliography contains citations concerning the use of mathematical models in trend analysis and forecasting of energy supply and demand factors. Models are presented for the industrial, transportation, and residential sectors. Aspects of long term energy strategies and markets are discussed at the global, national, state, and regional levels. Energy demand and pricing, and econometrics of energy, are explored for electric utilities and natural resources, such as coal, oil, and natural gas. Energy resources are modeled both for fuel usage and for reserves. (This updated bibliography contains 200 citations, all of which are new entries to the previous edition.)

  20. Assessing the potential for improving S2S forecast skill through multimodel ensembling

    NASA Astrophysics Data System (ADS)

    Vigaud, N.; Robertson, A. W.; Tippett, M. K.; Wang, L.; Bell, M. J.

    2016-12-01

    Non-linear logistic regression is well suited to probability forecasting and has been successfully applied in the past to ensemble weather and climate predictions, providing access to the full probability distribution without any Gaussian assumption. However, little work has been done at sub-monthly lead times, where relatively small re-forecast ensemble sizes and record lengths present new challenges for which post-processing avenues have yet to be investigated. A promising approach consists of extending the definition of non-linear logistic regression by including the quantile of the forecast distribution as one of the predictors. So-called Extended Logistic Regression (ELR), which enables mutually consistent individual threshold probabilities, is here applied to ECMWF, CFSv2 and CMA re-forecasts from the S2S database in order to produce rainfall probabilities at weekly resolution. The ELR model is trained on seasonally-varying tercile categories computed for lead times of 1 to 4 weeks. It is then tested in a cross-validated manner, i.e. allowing real-time predictability applications, to produce rainfall tercile probabilities from individual weekly hindcasts that are finally combined by equal pooling. Results will be discussed over a broader North American region, where individual and MME forecasts generated out to 4 weeks lead are characterized by good probabilistic reliability but low sharpness, exhibiting systematically more skill in winter than in summer.
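    The key ELR idea described above, using a function of the threshold itself as a predictor so that cumulative probabilities for different thresholds cannot cross, can be sketched as follows. The coefficients and values here are invented for illustration; in practice they are fit by maximum likelihood to the re-forecast data.

    ```python
    import numpy as np

    def elr_probability(ens_mean, threshold, a, b, c):
        # Extended logistic regression (Wilks 2009 form):
        #   P(y <= q) = 1 / (1 + exp(-(a + b*ens_mean + c*sqrt(q))))
        # Including the threshold q as a predictor (with c > 0) keeps the
        # cumulative probabilities for different thresholds mutually consistent.
        z = a + b * ens_mean + c * np.sqrt(threshold)
        return 1.0 / (1.0 + np.exp(-z))

    # Invented coefficients; a, b, c would be fit by maximum likelihood
    a, b, c = 0.5, -0.8, 1.2
    ens_mean = 3.0                                     # ensemble-mean rainfall, mm/day
    p_below = elr_probability(ens_mean, 1.0, a, b, c)  # P(y <= lower tercile bound)
    p_upper = elr_probability(ens_mean, 4.0, a, b, c)  # P(y <= upper tercile bound)
    tercile_probs = (p_below, p_upper - p_below, 1.0 - p_upper)  # below/near/above
    ```

    Because the threshold term enters monotonically, the three tercile probabilities are guaranteed non-negative and sum to one, which is the "mutual consistency" property the abstract refers to.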

  1. Process-based modeling of species' responses to climate change - a proof of concept using western North American trees

    NASA Astrophysics Data System (ADS)

    Evans, M. E.; Merow, C.; Record, S.; Menlove, J.; Gray, A.; Cundiff, J.; McMahon, S.; Enquist, B. J.

    2013-12-01

    Current attempts to forecast how species' distributions will change in response to climate change suffer under a fundamental trade-off: between modeling many species superficially vs. few species in detail (between correlative vs. mechanistic models). The goals of this talk are two-fold: first, we present a Bayesian multilevel modeling framework, dynamic range modeling (DRM), for building process-based forecasts of many species' distributions at a time, designed to address the trade-off between detail and number of distribution forecasts. In contrast to 'species distribution modeling' or 'niche modeling', which uses only species' occurrence data and environmental data, DRMs draw upon demographic data, abundance data, trait data, occurrence data, and GIS layers of climate in a single framework to account for two processes known to influence range dynamics - demography and dispersal. The vision is to use extensive databases on plant demography, distributions, and traits - in the Botanical Information and Ecology Network, the Forest Inventory and Analysis database (FIA), and the International Tree Ring Data Bank - to develop DRMs for North American trees. Second, we present preliminary results from building the core submodel of a DRM - an integral projection model (IPM) - for a sample of dominant tree species in western North America. IPMs are used to infer demographic niches - i.e., the set of environmental conditions under which population growth rate is positive - and project population dynamics through time. Based on >550,000 data points derived from FIA for nine tree species in western North America, we show IPM-based models of their current and future distributions, and discuss how IPMs can be used to forecast future forest productivity, mortality patterns, and inform efforts at assisted migration.
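    The core IPM submodel described above combines survival, growth and fecundity into a kernel whose dominant eigenvalue gives the asymptotic population growth rate (the basis of the "demographic niche": growth rate above one means viable conditions). A minimal sketch with made-up vital-rate functions, not fitted to FIA data:

    ```python
    import numpy as np

    # Illustrative vital-rate functions (a real DRM fits these to demographic data)
    def survival(x):              # survival probability vs. tree size x
        return 1.0 / (1.0 + np.exp(-(x - 2.0)))

    def growth_kernel(y, x):      # Gaussian growth: size x -> size y in one step
        mu, sd = 0.9 * x + 0.5, 0.3
        return np.exp(-0.5 * ((y - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

    def fecundity(y, x):          # recruits of size y produced by a size-x adult
        sd = 0.2
        return 0.1 * x * np.exp(-0.5 * ((y - 0.5) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

    # Midpoint-rule discretization of n(y, t+1) = integral K(y, x) n(x, t) dx
    m, lo, hi = 100, 0.0, 10.0
    h = (hi - lo) / m
    x = lo + (np.arange(m) + 0.5) * h
    Y, X = np.meshgrid(x, x, indexing="ij")
    K = h * (growth_kernel(Y, X) * survival(X) + fecundity(Y, X))

    # Asymptotic population growth rate = dominant eigenvalue of K;
    # lam > 1 marks environmental conditions inside the demographic niche
    lam = np.max(np.abs(np.linalg.eigvals(K)))
    ```

    Making the vital-rate parameters functions of climate covariates, as the DRM framework does, turns `lam` into a map of where population growth rate is positive.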

  2. PAI-OFF: A new proposal for online flood forecasting in flash flood prone catchments

    NASA Astrophysics Data System (ADS)

    Schmitz, G. H.; Cullmann, J.

    2008-10-01

    The Process Modelling and Artificial Intelligence for Online Flood Forecasting (PAI-OFF) methodology combines the reliability of physically based, hydrologic/hydraulic modelling with the operational advantages of artificial intelligence: extremely low computation times and straightforward operation. The basic principle of the methodology is to portray process models by means of artificial neural networks (ANN). We propose to train ANN flood forecasting models with synthetic data that reflect the possible range of storm events. To this end, establishing PAI-OFF requires first setting up a physically based hydrologic model of the considered catchment and, optionally, if backwater effects have a significant impact on the flow regime, a hydrodynamic flood routing model of the river reach in question. Both models are subsequently used for simulating all meaningful and flood-relevant storm scenarios, which are obtained from a catchment-specific meteorological data analysis. This provides a database of corresponding input/output vectors, which is then completed by generally available hydrological and meteorological data characterizing the catchment state prior to each storm event. This database subsequently serves for training both a polynomial neural network (PoNN), portraying the rainfall-runoff process, and a multilayer neural network (MLFN), which mirrors the hydrodynamic flood wave propagation in the river. These two ANN models replace the hydrological and hydrodynamic models in the operational mode. After presenting the theory, we apply PAI-OFF, essentially consisting of the coupled "hydrologic" PoNN and "hydrodynamic" MLFN, to the Freiberger Mulde catchment in the Erzgebirge (Ore Mountains) in East Germany (3000 km²). Both the demonstrated computational efficiency and the prediction reliability underline the potential of the new PAI-OFF methodology for online flood forecasting.
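    The simulate-offline-then-emulate workflow behind PAI-OFF can be sketched with a toy stand-in for the process model and a simple polynomial surrogate in place of the PoNN. All functions, variables and values here are illustrative, not the paper's models.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy stand-in for the physically based rainfall-runoff simulator
    # (the real PAI-OFF uses a full hydrologic/hydraulic model)
    def process_model(rain, soil_moisture):
        return 0.6 * rain * soil_moisture + 0.05 * rain ** 2

    # Step 1: simulate the meaningful range of storm scenarios offline
    rain = rng.uniform(0.0, 100.0, 500)      # storm rainfall, mm
    sm = rng.uniform(0.1, 1.0, 500)          # antecedent soil moisture, -
    runoff = process_model(rain, sm)

    # Step 2: fit a cheap polynomial surrogate (PoNN-like) to the simulations
    X = np.column_stack([rain, sm, rain * sm, rain ** 2, np.ones_like(rain)])
    coef, *_ = np.linalg.lstsq(X, runoff, rcond=None)

    # Step 3: operational mode: the fast surrogate replaces the slow simulator
    def surrogate(rain, sm):
        return np.dot([rain, sm, rain * sm, rain ** 2, 1.0], coef)
    ```

    The operational gain is that `surrogate` evaluates in microseconds, while the process model it emulates may take hours, which is what makes online flash flood forecasting feasible.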

  3. Forecasting and modelling ice layer formation on the snowpack due to freezing precipitations in the Pyrenees

    NASA Astrophysics Data System (ADS)

    Quéno, Louis; Vionnet, Vincent; Cabot, Frédéric; Vrécourt, Dominique; Dombrowski-Etchevers, Ingrid

    2017-04-01

    In the Pyrenees, freezing precipitation at altitude occurs at least once per winter, leading to the formation of a pure ice layer on the surface of the snowpack. It may lead to accidents and fatalities among mountaineers and skiers, sometimes with a higher human toll than avalanches. Such events are not predicted by the current operational systems for snow and avalanche hazard forecasting. A crowd-sourced database of surface ice layer occurrences is first built up, using reports from Internet mountaineering and ski-touring communities, to mitigate the lack of observations from conventional observation networks. A simple diagnostic of freezing precipitation is then developed, based on the cloud water content and screen temperature forecast by the Numerical Weather Prediction model AROME, operating at 2.5-km resolution. The performance of this diagnostic is assessed for the event of 5-6 January 2012, with a good representation of the altitudinal and spatial distributions of the ice layer. An evaluation of the diagnostic for major events over five winters shows good detection skill compared to the occurrences reported in the observation database. A new modelling of ice formation on the surface of the snowpack due to impinging supercooled water is added to the detailed snowpack model Crocus. It is combined with the atmospheric diagnostic of freezing precipitation, and the resulting snowpack simulations over a winter season capture well the formation of the main ice layers. Their influence on the snowpack stratigraphy is also realistically simulated. These simple methods enable forecasting the occurrence of surface ice layer formation with good confidence and simulating its evolution within the snowpack, even if an accurate estimation of freezing precipitation amounts remains the main challenge.

  4. Characterization of Residential Pesticide Use and Chemical Formulations through Self-Report and Household Inventory: The Northern California Childhood Leukemia Study

    PubMed Central

    Guha, Neela; Ward, Mary H.; Gunier, Robert; Colt, Joanne S.; Lea, C. Suzanne; Buffler, Patricia A.

    2012-01-01

    Background: Home and garden pesticide use has been linked to cancer and other health outcomes in numerous epidemiological studies. Exposure has generally been self-reported, so the assessment is potentially limited by recall bias and lack of information on specific chemicals. Objectives: As part of an integrated assessment of residential pesticide exposure, we identified active ingredients and described patterns of storage and use. Methods: During a home interview of 500 residentially stable households enrolled in the Northern California Childhood Leukemia Study during 2001–2006, trained interviewers inventoried residential pesticide products and queried participants about their storage and use. U.S. Environmental Protection Agency registration numbers, recorded from pesticide product labels, and pesticide chemical codes were matched to public databases to obtain information on active ingredients and chemical class. Poisson regression was used to identify independent predictors of pesticide storage. Analyses were restricted to 259 participating control households. Results: Ninety-five percent (246 of 259) of the control households stored at least one pesticide product (median, 4). Indicators of higher sociodemographic status predicted more products in storage. We identified the most common characteristics: storage areas (garage, 40%; kitchen, 20%), pests treated (ants, 33%; weeds, 20%), pesticide types (insecticides, 46%; herbicides, 24%), chemical classes (pyrethroids, 77%; botanicals, 50%), active ingredients (pyrethrins, 43%) and synergists (piperonyl butoxide, 42%). Products could contain multiple active ingredients. Conclusions: Our data on specific active ingredients and patterns of storage and use will inform future etiologic analyses of residential pesticide exposures from self-reported data, particularly among households with young children. PMID:23110983

  5. Prevalence of inappropriate medication use in residential long-term care facilities for the elderly: A systematic review.

    PubMed

    Storms, Hannelore; Marquet, Kristel; Aertgeerts, Bert; Claes, Neree

    2017-12-01

    Multi-morbidity and polypharmacy of the elderly population enhances the probability of elderly in residential long-term care facilities experiencing inappropriate medication use. The aim is to systematically review literature to assess the prevalence of inappropriate medication use in residential long-term care facilities for the elderly. Databases (MEDLINE, EMBASE) were searched for literature from 2004 to 2016 to identify studies examining inappropriate medication use in residential long-term care facilities for the elderly. Studies were eligible when relying on Beers criteria, STOPP, START, PRISCUS list, ACOVE, BEDNURS or MAI instruments. Inappropriate medication use was defined by the criteria of these seven instruments. Twenty-one studies met inclusion criteria. Seventeen studies relied on a version of Beers criteria with prevalence ranging between 18.5% and 82.6% (median 46.5%) residents experiencing inappropriate medication use. A smaller range, from 21.3% to 63.0% (median 35.1%), was reported when considering solely the 10 studies that used Beers criteria updated in 2003. Prevalence varied from 23.7% to 79.8% (median 61.1%) in seven studies relying on STOPP. START and ACOVE were relied on in respectively four (prevalence: 30.5-74.0%) and two studies (prevalence: 28.9-58.0%); PRISCUS, BEDNURS and MAI were all used in one study each. Beers criteria of 2003 and STOPP were most frequently used to determine inappropriate medication use in residential long-term care facilities. Prevalence of inappropriate medication use strongly varied, despite similarities in research design and assessment with identical instrument(s).

  6. Prevalence of inappropriate medication use in residential long-term care facilities for the elderly: A systematic review

    PubMed Central

    Storms, Hannelore; Marquet, Kristel; Aertgeerts, Bert; Claes, Neree

    2017-01-01

    Abstract Background: Multi-morbidity and polypharmacy of the elderly population enhances the probability of elderly in residential long-term care facilities experiencing inappropriate medication use. Objectives: The aim is to systematically review literature to assess the prevalence of inappropriate medication use in residential long-term care facilities for the elderly. Methods: Databases (MEDLINE, EMBASE) were searched for literature from 2004 to 2016 to identify studies examining inappropriate medication use in residential long-term care facilities for the elderly. Studies were eligible when relying on Beers criteria, STOPP, START, PRISCUS list, ACOVE, BEDNURS or MAI instruments. Inappropriate medication use was defined by the criteria of these seven instruments. Results: Twenty-one studies met inclusion criteria. Seventeen studies relied on a version of Beers criteria with prevalence ranging between 18.5% and 82.6% (median 46.5%) residents experiencing inappropriate medication use. A smaller range, from 21.3% to 63.0% (median 35.1%), was reported when considering solely the 10 studies that used Beers criteria updated in 2003. Prevalence varied from 23.7% to 79.8% (median 61.1%) in seven studies relying on STOPP. START and ACOVE were relied on in respectively four (prevalence: 30.5–74.0%) and two studies (prevalence: 28.9–58.0%); PRISCUS, BEDNURS and MAI were all used in one study each. Conclusions: Beers criteria of 2003 and STOPP were most frequently used to determine inappropriate medication use in residential long-term care facilities. Prevalence of inappropriate medication use strongly varied, despite similarities in research design and assessment with identical instrument(s). PMID:28271916

  7. Interpersonal pathoplasticity and trajectories of change in routine adolescent and young adult residential substance abuse treatment.

    PubMed

    Boswell, James F; Cain, Nicole M; Oswald, Jennifer M; McAleavey, Andrew A; Adelman, Robert

    2017-07-01

    Partnerships between mental health care stakeholders provide a context for generalizable clinical research with implications for quality improvement. In the context of a partnership between an adolescent residential substance abuse disorder (SUD) treatment center and clinical researchers, stakeholders identified knowledge gaps (internal and the field broadly) with regard to patient interpersonal factors that influence working alliance and acute SUD residential treatment outcome trajectories. To (a) examine interpersonal pathoplasticity and identify interpersonal subtypes in a naturalistic sample of adolescent and young-adult patients presenting for routine residential SUD treatment and (b) investigate the association between identified interpersonal subtypes and working alliance and acute treatment outcome trajectories. N = 100 patients (mean age = 17.39 years, 68% male, 84% White) completed self-reports of symptom and functioning outcomes, interpersonal problems, and the working alliance on multiple occasions between admission and discharge. Multiple methods were used to identify interpersonal subtypes and test pathoplasticity. Interpersonal subtype was entered as a predictor in respective multilevel models of working alliance and symptom outcome. Interpersonal subtypes of vindictive and exploitable patients demonstrated pathoplasticity. Subtype did not predict working alliance trajectories; however, a significant interaction between interpersonal subtype and a quadratic effect for time demonstrated that exploitable patients with longer than average treatment lengths experienced attenuated symptom change over the course of treatment whereas vindictive patients appeared to demonstrate steady progress. Interpersonal assessments should be integrated into residential SUD treatment to identify patients with an exploitable interpersonal style who might require additional attention or alternative interventions. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  8. Evaluating the Effectiveness of DART® Buoy Networks Based on Forecast Accuracy

    NASA Astrophysics Data System (ADS)

    Percival, Donald B.; Denbo, Donald W.; Gica, Edison; Huang, Paul Y.; Mofjeld, Harold O.; Spillane, Michael C.; Titov, Vasily V.

    2018-04-01

    A performance measure for a DART® tsunami buoy network has been developed. DART® buoys are used to detect tsunamis, but the full potential of the data they collect is realized through accurate forecasts of inundations caused by the tsunamis. The performance measure assesses how well the network achieves its full potential through a statistical analysis of simulated forecasts of wave amplitudes outside an impact site and a consideration of how much the forecasts are degraded in accuracy when one or more buoys are inoperative. The analysis uses simulated tsunami amplitude time series collected at each buoy from selected source segments in the Short-term Inundation Forecast for Tsunamis database and involves a set of 1000 forecasts for each buoy/segment pair at sites just offshore of selected impact communities. Random error-producing scatter in the time series is induced by uncertainties in the source location, addition of real oceanic noise, and imperfect tidal removal. Comparison with an error-free standard leads to root-mean-square errors (RMSEs) for DART® buoys located near a subduction zone. The RMSEs indicate which buoy provides the best forecast (lowest RMSE) for sections of the zone, under a warning-time constraint for the forecasts of 3 h. The analysis also shows how the forecasts are degraded (larger minimum RMSE among the remaining buoys) when one or more buoys become inoperative. The RMSEs provide a way to assess array augmentation or redesign such as moving buoys to more optimal locations. Examples are shown for buoys off the Aleutian Islands and off the West Coast of South America for impact sites at Hilo HI and along the US West Coast (Crescent City CA and Port San Luis CA, USA). A simple measure (coded green, yellow or red) of the current status of the network's ability to deliver accurate forecasts is proposed to flag the urgency of buoy repair.
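    The per-buoy RMSE comparison and the degradation check when the best buoy goes inoperative can be sketched as follows. The forecast errors here are synthetic placeholders, not values from the Short-term Inundation Forecast for Tsunamis database.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # 1000 simulated forecasts of offshore wave amplitude per buoy, compared
    # against an error-free reference (all values invented for illustration)
    reference = 1.0                      # error-free forecast amplitude, metres
    n_forecasts = 1000
    buoys = {
        "buoy_A": reference + rng.normal(0.0, 0.05, n_forecasts),
        "buoy_B": reference + rng.normal(0.0, 0.15, n_forecasts),
        "buoy_C": reference + rng.normal(0.0, 0.10, n_forecasts),
    }

    # RMSE per buoy; the lowest RMSE identifies the best-forecasting buoy
    rmse = {name: np.sqrt(np.mean((f - reference) ** 2))
            for name, f in buoys.items()}
    best = min(rmse, key=rmse.get)

    # Degradation if the best buoy is inoperative: the forecast falls back to
    # the minimum RMSE among the remaining buoys, which can only be larger
    degraded = min(v for k, v in rmse.items() if k != best)
    ```

    Repeating this for every source segment and every pattern of buoy outages yields the network-level performance measure and the green/yellow/red status flag described in the abstract.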

  9. Near real time wind energy forecasting incorporating wind tunnel modeling

    NASA Astrophysics Data System (ADS)

    Lubitz, William David

    A series of experiments and investigations were carried out to inform the development of a day-ahead wind power forecasting system. An experimental near-real time wind power forecasting system was designed and constructed that operates on a desktop PC and forecasts 12-48 hours in advance. The system uses model output of the Eta regional scale forecast (RSF) to forecast the power production of a wind farm in the Altamont Pass, California, USA from 12 to 48 hours in advance. It is of modular construction and designed to also allow diagnostic forecasting using archived RSF data, thereby allowing different methods of completing each forecasting step to be tested and compared using the same input data. Wind-tunnel investigations of the effect of wind direction and hill geometry on wind speed-up above a hill were conducted. Field data from an Altamont Pass, California site was used to evaluate several speed-up prediction algorithms, both with and without wind direction adjustment. These algorithms were found to be of limited usefulness for the complex terrain case evaluated. Wind-tunnel and numerical simulation-based methods were developed for determining a wind farm power curve (the relation between meteorological conditions at a point in the wind farm and the power production of the wind farm). Both methods, as well as two methods based on fits to historical data, ultimately showed similar levels of accuracy: mean absolute errors predicting power production of 5 to 7 percent of the wind farm power capacity. The downscaling of RSF forecast data to the wind farm was found to be complicated by the presence of complex terrain. Poor results using the geostrophic drag law and regression methods motivated the development of a database search method that is capable of forecasting not only wind speeds but also power production with accuracy better than persistence.

  10. Evaluating the Effectiveness of DART® Buoy Networks Based on Forecast Accuracy

    NASA Astrophysics Data System (ADS)

    Percival, Donald B.; Denbo, Donald W.; Gica, Edison; Huang, Paul Y.; Mofjeld, Harold O.; Spillane, Michael C.; Titov, Vasily V.

    2018-03-01

    A performance measure for a DART® tsunami buoy network has been developed. DART® buoys are used to detect tsunamis, but the full potential of the data they collect is realized through accurate forecasts of inundations caused by the tsunamis. The performance measure assesses how well the network achieves its full potential through a statistical analysis of simulated forecasts of wave amplitudes outside an impact site and a consideration of how much the forecasts are degraded in accuracy when one or more buoys are inoperative. The analysis uses simulated tsunami amplitude time series collected at each buoy from selected source segments in the Short-term Inundation Forecast for Tsunamis database and involves a set of 1000 forecasts for each buoy/segment pair at sites just offshore of selected impact communities. Random error-producing scatter in the time series is induced by uncertainties in the source location, addition of real oceanic noise, and imperfect tidal removal. Comparison with an error-free standard leads to root-mean-square errors (RMSEs) for DART® buoys located near a subduction zone. The RMSEs indicate which buoy provides the best forecast (lowest RMSE) for sections of the zone, under a warning-time constraint for the forecasts of 3 h. The analysis also shows how the forecasts are degraded (larger minimum RMSE among the remaining buoys) when one or more buoys become inoperative. The RMSEs provide a way to assess array augmentation or redesign such as moving buoys to more optimal locations. Examples are shown for buoys off the Aleutian Islands and off the West Coast of South America for impact sites at Hilo HI and along the US West Coast (Crescent City CA and Port San Luis CA, USA). A simple measure (coded green, yellow or red) of the current status of the network's ability to deliver accurate forecasts is proposed to flag the urgency of buoy repair.

  11. Oceanic sources of predictability for MJO propagation across the Maritime Continent in a subset of S2S forecast models

    NASA Astrophysics Data System (ADS)

    DeMott, C. A.; Klingaman, N. P.

    2017-12-01

    Skillful prediction of the Madden-Julian oscillation (MJO) passage across the Maritime Continent (MC) has important implications for global forecasts of high-impact weather events, such as atmospheric rivers and heat waves. The North American teleconnection response to the MJO is strongest when MJO convection is located in the western Pacific Ocean, but many climate and forecast models are deficient in their simulation of MC-crossing MJO events. Compared to atmosphere-only general circulation models (AGCMs), MJO simulation skill generally improves with the addition of ocean feedbacks in coupled GCMs (CGCMs). Using observations, previous studies have noted that the degree of ocean coupling may vary considerably from one MJO event to the next. The coupling mechanisms may be linked to the presence of ocean Equatorial Rossby waves, the sign and amplitude of Equatorial surface currents, and the upper ocean temperature and salinity profiles. In this study, we assess the role of ocean feedbacks to MJO prediction skill using a subset of CGCMs participating in the Subseasonal-to-Seasonal (S2S) Project database. Oceanic observational and reanalysis datasets are used to characterize the upper ocean background state for observed MJO events that do and do not propagate beyond the MC. The ability of forecast models to capture the oceanic influence on the MJO is first assessed by quantifying SST forecast skill. Next, a set of previously developed air-sea interaction diagnostics is applied to model output to measure the role of SST perturbations on the forecast MJO. The "SST effect" in forecast MJO events is compared to that obtained from reanalysis data. Leveraging all ensemble members of a given forecast helps disentangle oceanic model biases from atmospheric model biases, both of which can influence the expression of ocean feedbacks in coupled forecast systems. Results of this study will help identify areas of needed model improvement for improved MJO forecasts.

  12. 76 FR 25409 - Privacy Act of 1974

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-04

    ... Medicare beneficiaries from CMS databases including: health care usage, demographic, enrollment, and survey... and timely assess the current health care usage by the patient population served by VA, to forecast..., and to understand the numerous implications of cross-usage between VA and non-VA health care systems...

  13. Development of a method for comprehensive water quality forecasting and its application in Miyun reservoir of Beijing, China.

    PubMed

    Zhang, Lei; Zou, Zhihong; Shan, Wei

    2017-06-01

    Water quality forecasting is an essential part of water resource management. Spatiotemporal variations of water quality and their inherent constraints make it very complex. This study explored a data-based method for short-term water quality forecasting. Predictions of water quality indicators, including dissolved oxygen, chemical oxygen demand by KMnO4 and ammonia nitrogen, made using a support vector machine were taken as inputs to a particle swarm optimization-based wavelet neural network to forecast the overall water quality status index. The Gubeikou monitoring section of Miyun reservoir in Beijing, China was taken as the study case to examine the effectiveness of this approach. The experimental results also revealed that the proposed model has advantages in stability and time reduction in comparison with other data-driven models, including the traditional BP neural network model, the wavelet neural network model and the Gradient Boosting Decision Tree model. It can be used as an effective approach to perform short-term comprehensive water quality prediction. Copyright © 2016. Published by Elsevier B.V.

  14. A probabilistic approach to the drag-based model

    NASA Astrophysics Data System (ADS)

    Napoletano, Gianluca; Forte, Roberta; Moro, Dario Del; Pietropaolo, Ermanno; Giovannelli, Luca; Berrilli, Francesco

    2018-02-01

    The forecast of the time of arrival (ToA) of a coronal mass ejection (CME) at Earth is of critical importance for our high-technology society and for any future manned exploration of the Solar System. As critical as the forecast accuracy is the knowledge of its precision, i.e. the error associated with the estimate. We propose a statistical approach for the computation of the ToA using the drag-based model by introducing probability distributions, rather than exact values, as input parameters, thus allowing the evaluation of the uncertainty on the forecast. We test this approach using a set of CMEs whose transit times are known, and obtain extremely promising results: the average value of the absolute differences between measurement and forecast is 9.1 h, and half of these residuals are within the estimated errors. These results suggest that this approach deserves further investigation. We are working to realize a real-time implementation which ingests the outputs of automated CME tracking algorithms as inputs to create a database of events useful for further validation of the approach.
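    A minimal Monte Carlo sketch of the probabilistic idea: sample the drag-based model inputs from assumed distributions and propagate each sample to a 1-AU arrival time, so the spread of the resulting ToA distribution quantifies the forecast uncertainty. The distributions below are illustrative, not the paper's inputs; the kinematic formula is the standard analytic drag-based-model solution for an initial speed above the solar wind speed.

    ```python
    import numpy as np

    AU = 1.496e8                 # astronomical unit, km
    RSUN = 6.957e5               # solar radius, km
    rng = np.random.default_rng(42)

    def distance(t, r0, v0, w, gamma):
        # Analytic drag-based-model solution for v0 > w:
        #   r(t) = r0 + w*t + ln(1 + gamma*(v0 - w)*t) / gamma
        return r0 + w * t + np.log(1.0 + gamma * (v0 - w) * t) / gamma

    # Sample the model inputs from assumed probability distributions
    n = 2000
    v0 = rng.normal(800.0, 50.0, n)          # initial CME speed, km/s
    w = rng.normal(400.0, 33.0, n)           # ambient solar wind speed, km/s
    gamma = rng.uniform(0.2e-7, 2.0e-7, n)   # drag parameter, km^-1
    r0 = 20.0 * RSUN                         # starting heliocentric distance, km

    # Vectorized bisection for the arrival time at 1 AU (r(t) is monotone in t)
    lo = np.zeros(n)
    hi = np.full(n, 30.0 * 86400.0)          # search window: 0-30 days, in seconds
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        below = distance(mid, r0, v0, w, gamma) < AU
        lo = np.where(below, mid, lo)
        hi = np.where(below, hi, mid)

    toa_hours = 0.5 * (lo + hi) / 3600.0
    mean_toa = toa_hours.mean()              # central ToA estimate
    spread = toa_hours.std()                 # forecast uncertainty from input spread
    ```

    Reporting `mean_toa` together with `spread` (or percentile intervals of `toa_hours`) is what turns the point forecast into the error-aware estimate the abstract argues for.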

  15. Processing of next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data for the DuPage County streamflow simulation system

    USGS Publications Warehouse

    Bera, Maitreyee; Ortel, Terry W.

    2018-01-12

    The U.S. Geological Survey, in cooperation with DuPage County Stormwater Management Department, is testing a near real-time streamflow simulation system that assists in the management and operation of reservoirs and other flood-control structures in the Salt Creek and West Branch DuPage River drainage basins in DuPage County, Illinois. As part of this effort, the U.S. Geological Survey maintains a database of hourly meteorological and hydrologic data for use in this near real-time streamflow simulation system. Among these data are next generation weather radar-multisensor precipitation estimates and quantitative precipitation forecast data, which are retrieved from the North Central River Forecasting Center of the National Weather Service. The DuPage County streamflow simulation system uses these quantitative precipitation forecast data to create streamflow predictions for the two simulated drainage basins. This report discusses in detail how these data are processed for inclusion in the Watershed Data Management files used in the streamflow simulation system for the Salt Creek and West Branch DuPage River drainage basins.

  16. Factors Affecting Hurricane Evacuation Intentions.

    PubMed

    Lazo, Jeffrey K; Bostrom, Ann; Morss, Rebecca E; Demuth, Julie L; Lazrus, Heather

    2015-10-01

    Protective actions for hurricane threats are a function of the environmental and information context; individual and household characteristics, including cultural worldviews, past hurricane experiences, and risk perceptions; and motivations and barriers to actions. Using survey data from the Miami-Dade and Houston-Galveston areas, we regress individuals' stated evacuation intentions on these factors in two information conditions: (1) seeing a forecast that a hurricane will hit one's area, and (2) receiving an evacuation order. In both information conditions having an evacuation plan, wanting to keep one's family safe, and viewing one's home as vulnerable to wind damage predict increased evacuation intentions. Some predictors of evacuation intentions differ between locations; for example, Florida respondents with more egalitarian worldviews are more likely to evacuate under both information conditions, and Florida respondents with more individualist worldviews are less likely to evacuate under an evacuation order, but worldview was not significantly associated with evacuation intention for Texas respondents. Differences by information condition also emerge, including: (1) evacuation intentions decrease with age in the evacuation order condition but increase with age in the saw forecast condition, and (2) evacuation intention in the evacuation order condition increases among those who rely on public sources of information on hurricane threats, whereas in the saw forecast condition evacuation intention increases among those who rely on personal sources. Results reinforce the value of focusing hurricane information efforts on evacuation plans and residential vulnerability and suggest avenues for future research on how hurricane contexts shape decision making. © 2015 Society for Risk Analysis.

  17. Earthquake forecasting studies using radon time series data in Taiwan

    NASA Astrophysics Data System (ADS)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    Over the past few decades, a growing number of studies have shown the usefulness of seismogeochemical data, interpreted as geochemical precursory signals of impending earthquakes, and radon has been identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study aims to develop an effective earthquake forecasting system by inspecting long-term radon time series data obtained from a network of radon monitoring stations established along different faults of Taiwan. Continuous radon time series data have been recorded for earthquake studies, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors, and the study therefore includes the appraisal and filtering of these environmental parameters in order to create a real-time database that supports our earthquake precursory studies. In recent years, an automated real-time database operating system has been developed, using the open-source programming language R, to improve data processing and to carry out statistical computations on the data. To integrate the data with our working procedure, we use the widely adopted open-source web application stack AMP (Apache, MySQL, and PHP) to create a website that displays and helps us manage the real-time database.

  18. On the frequency-magnitude distribution of converging boundaries

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Laura, S.; Heuret, A.; Funiciello, F.

    2011-12-01

    The occurrence of the last mega-thrust earthquake in Japan has starkly demonstrated the high risk such events pose to society, in terms of social and economic losses, even at large spatial scales. The primary component of a balanced and objective mitigation of the impact of these earthquakes is a correct forecast of where such events may occur in the future. To date, there is a wide range of opinions about where mega-thrust earthquakes can occur. Here we present a detailed statistical analysis of a database of worldwide interplate earthquakes occurring at current subduction zones. The database has recently been published in the framework of the EURYI Project 'Convergent margins and seismogenesis: defining the risk of great earthquakes by using statistical data and modelling', and it provides a unique opportunity to explore in detail the seismogenic process in subducting lithosphere. In particular, the statistical analysis of this database allows us to explore many interesting scientific issues, such as the existence of different frequency-magnitude distributions across trenches, the quantitative characterization of the subduction zones most likely to produce mega-thrust earthquakes, and the prominent features that distinguish converging boundaries with different seismic activity. Beyond their scientific importance, such issues may improve our mega-thrust earthquake forecasting capability.
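    Frequency-magnitude comparisons across trenches are usually phrased in terms of the Gutenberg-Richter b-value. A minimal sketch of the standard Aki/Utsu maximum-likelihood estimator, applied to a synthetic catalogue (the catalogue and completeness magnitude are illustrative, not taken from the EURYI database):

```python
import math
import random

def b_value(mags, m_min, dm=0.1):
    """Aki/Utsu maximum-likelihood Gutenberg-Richter b-value.
    dm is the magnitude binning width; magnitudes below m_min are excluded."""
    m = [x for x in mags if x >= m_min]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2))

# synthetic catalogue mimicking a G-R distribution with b = 1 above M 5.0
random.seed(4)
mags = [5.0 + random.expovariate(math.log(10) * 1.0) for _ in range(5000)]
print(b_value(mags, m_min=5.0, dm=0.0))   # estimate should land close to 1
```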

  19. Design and implementation of ticket price forecasting system

    NASA Astrophysics Data System (ADS)

    Li, Yuling; Li, Zhichao

    2018-05-01

    With the growth of the air travel industry, a large number of data mining technologies have been developed over the past two decades to increase profits for airlines. The implementation of digital optimization strategies leads to price discrimination; for example, similar seats on the same flight are purchased at different prices, depending on the time of purchase, the supplier, and so on. These price fluctuations make the prediction of ticket prices valuable in practice. In this paper, a combination of the ARMA and random forest algorithms is proposed to predict air ticket prices. The experimental results, obtained by comparing the forecasts with the actual prices for each price model, show that the combined model is more reliable. The model helps passengers decide when to buy tickets and save money. Based on the proposed model, we design and implement the ticket price forecasting system using the Python language and a SQL Server database.

  20. Applied Meteorology Unit Quarterly Report, Second Quarter FY-13

    NASA Technical Reports Server (NTRS)

    Bauman, William; Crawford, Winifred; Watson, Leela; Shafer, Jaclyn; Huddleston, Lisa

    2013-01-01

    The AMU team worked on six tasks for their customers: (1) Ms. Crawford continued work on the objective lightning forecast task for airports in east-central Florida, and began work on developing a dual-Doppler analysis with local Doppler radars, (2) Ms. Shafer continued work for Vandenberg Air Force Base on an automated tool to relate pressure gradients to peak winds, (3) Dr. Huddleston continued work to develop a lightning timing forecast tool for the Kennedy Space Center/Cape Canaveral Air Force Station area, (4) Dr. Bauman continued work on a severe weather forecast tool focused on east-central Florida, (5) Mr. Decker began developing a wind pairs database for the Launch Services Program to use when evaluating upper-level winds for launch vehicles, and (6) Dr. Watson began work to assimilate observational data into the high-resolution model configurations she created for Wallops Flight Facility and the Eastern Range.

  1. SCADA-based Operator Support System for Power Plant Equipment Fault Forecasting

    NASA Astrophysics Data System (ADS)

    Mayadevi, N.; Ushakumari, S. S.; Vinodchandra, S. S.

    2014-12-01

    Power plant equipment must be monitored closely to prevent failures from disrupting plant availability. Online monitoring technology integrated with hybrid forecasting techniques can be used to prevent plant equipment faults. A self-learning rule-based expert system is proposed in this paper for fault forecasting in power plants controlled by a supervisory control and data acquisition (SCADA) system. Self-learning applies associative data mining algorithms to the SCADA history database to form new rules that dynamically update the knowledge base of the expert system. In this study, a number of popular associative learning algorithms are considered for rule formation, and the data mining results show that the Tertius algorithm is best suited for developing a learning engine for power plants. For real-time monitoring of the plant condition, graphical models are constructed by K-means clustering, and a multilayer perceptron (MLP) is used to build a time-series forecasting model. Once created, the models are updated in the model library to provide an adaptive environment for the proposed system. A graphical user interface (GUI) illustrates the variation of all sensor values affecting a particular alarm/fault, as well as the step-by-step procedure for avoiding critical situations and consequent plant shutdown. The forecasting performance is evaluated by computing the mean absolute error and root mean square error of the predictions.
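    The two evaluation metrics named above are straightforward to compute; a minimal sketch with made-up actual and predicted sensor values:

```python
import math

def mae(y_true, y_pred):
    """Mean absolute error of the predictions."""
    return sum(abs(a - b) for a, b in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean square error of the predictions."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true))

actual   = [101.2, 99.8, 100.5, 102.0]   # observed sensor values (illustrative)
forecast = [100.0, 100.0, 101.0, 101.0]  # one-step-ahead predictions (illustrative)
print(mae(actual, forecast), rmse(actual, forecast))
```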

  2. A hybrid ARIMA and neural network model applied to forecast catch volumes of Selar crumenophthalmus

    NASA Astrophysics Data System (ADS)

    Aquino, Ronald L.; Alcantara, Nialle Loui Mar T.; Addawe, Rizavel C.

    2017-11-01

    Selar crumenophthalmus, known in English as the big-eyed scad and locally as matang-baka, is one of the fishes commonly caught along the waters of La Union, Philippines. The study deals with forecasting catch volumes of big-eyed scad for commercial consumption. The data used are quarterly catch volumes from 2002 to the first quarter of 2017, available from the OpenSTAT database published by the Philippine Statistics Authority (PSA), which collects, compiles, analyzes, and publishes information on different aspects of the Philippine setting. Autoregressive Integrated Moving Average (ARIMA) models, an Artificial Neural Network (ANN) model, and a hybrid model combining ARIMA and ANN were developed to forecast catch volumes of big-eyed scad. Statistical errors such as the Mean Absolute Error (MAE) and Root Mean Square Error (RMSE) were computed and compared to choose the most suitable model for forecasting the catch volume over the next few quarters. A comparison of the results of each model and the corresponding statistical errors reveals that the hybrid ARIMA-ANN model, (2,1,2)(6:3:1), is the most suitable model to forecast the catch volumes of the big-eyed scad for the next few quarters.
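    The hybrid idea is to let a linear model capture the linear structure of the series and a nonlinear learner model its residuals, with the two forecasts summed. A minimal sketch on synthetic quarterly data, using a least-squares AR fit as a stand-in for ARIMA and a k-nearest-neighbour residual predictor as a stand-in for the ANN (both substitutions are simplifications, not the paper's method):

```python
import numpy as np

def fit_ar(y, p):
    """Least-squares AR(p) fit (linear stand-in for the ARIMA stage)."""
    X = np.column_stack([y[p - k - 1 : len(y) - k - 1] for k in range(p)])
    X = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return coef

def ar_predict(y, coef, p):
    """One-step-ahead AR prediction from the last p values of y."""
    return coef[0] + sum(coef[k + 1] * y[-1 - k] for k in range(p))

def knn_residual_forecast(res, p, k=3):
    """k-NN on residual lag patterns (nonlinear stand-in for the ANN stage)."""
    pats = np.array([res[i : i + p] for i in range(len(res) - p)])
    targets = res[p:]
    dist = np.linalg.norm(pats - res[-p:], axis=1)
    return targets[np.argsort(dist)[:k]].mean()

rng = np.random.default_rng(1)
t = np.arange(60)
y = 100 + 10 * np.sin(2 * np.pi * t / 4) + 0.3 * t + rng.normal(0, 2, 60)  # synthetic quarterly series

p = 4
coef = fit_ar(y, p)
fitted = np.array([ar_predict(y[:i], coef, p) for i in range(p, len(y))])
res = y[p:] - fitted                       # residuals left over by the linear stage
forecast = ar_predict(y, coef, p) + knn_residual_forecast(res, p)
print(round(float(forecast), 2))
```

Summing the linear forecast and the residual forecast is the same decomposition the ARIMA-ANN hybrid uses, just with simpler stand-in components.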

  3. National Operational Hydrologic Remote Sensing Center - The ultimate source

    Science.gov Websites

    The National Operational Hydrologic Remote Sensing Center provides snow observations, analyses, forecasts, and a data archive, including airborne snow surveys, satellite snow cover mapping, and snow modeling and data assimilation analyses based on polar-orbiting and geostationary satellite imagery. Maps are provided for the U.S. and the northern hemisphere.

  4. A pan-African medium-range ensemble flood forecast system

    NASA Astrophysics Data System (ADS)

    Thiemig, Vera; Bisselink, Bernard; Pappenberger, Florian; Thielen, Jutta

    2015-04-01

    The African Flood Forecasting System (AFFS) is a probabilistic flood forecast system for medium- to large-scale African river basins, with lead times of up to 15 days. The key components are the hydrological model LISFLOOD, the African GIS database, the meteorological ensemble predictions of the ECMWF, and critical hydrological thresholds. In this study the predictive capability is investigated to estimate AFFS' potential as an operational flood forecasting system for the whole of Africa. This is done in hindcast mode, by reproducing pan-African hydrological predictions for the whole of 2003, a year in which important flood events were observed. Results were analysed in two ways, each with its own objective. The first part of the analysis is of paramount importance for the assessment of AFFS as a flood forecasting system, as it focuses on the detection and prediction of flood events. Here, results were verified against reports from various flood archives such as the Dartmouth Flood Observatory, the Emergency Event Database, the NASA Earth Observatory and Reliefweb. The number of hits, false alerts and missed alerts, as well as the Probability of Detection, False Alarm Rate and Critical Success Index, were determined for various conditions (different regions, flood durations, average annual precipitation, size of affected areas and mean annual discharge). The second part of the analysis complements the first by giving a basic insight into the prediction skill for general streamflow. For this, hydrological predictions were compared against observations at 36 key locations across Africa, and the Continuous Ranked Probability Skill Score (CRPSS), the limit of predictability and the reliability were calculated. Results showed that AFFS detected around 70 % of the reported flood events correctly. 
In particular, the system showed good performance in predicting riverine flood events of long duration (> 1 week) and large affected areas (> 10 000 km2) well in advance, whereas AFFS showed limitations for small-scale and short-duration flood events. The forecasts also showed good reliability on average, and the CRPSS helped identify regions to focus on for future improvements. The case study of the flood event in March 2003 in the Sabi Basin (Zimbabwe and Mozambique) illustrated the good performance of AFFS in forecasting the timing and severity of the floods, gave an example of the clear and concise output products, and showed that the system is capable of producing flood warnings even in ungauged river basins. Hence, from a technical perspective, AFFS shows good prospects as an operational system, having demonstrated significant potential to contribute to the reduction of flood-related losses in Africa by providing national and international aid organizations with timely medium-range flood forecast information. However, issues related to the practical implementation still need to be investigated.
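    The categorical scores mentioned above follow directly from the counts of hits, false alerts and missed alerts. A minimal sketch with illustrative counts, roughly consistent with the reported ~70 % detection rate (the false alarm ratio is shown; a false alarm rate would additionally require correct negatives):

```python
def verification_scores(hits, false_alarms, misses):
    """Categorical verification scores from event counts."""
    pod = hits / (hits + misses)                 # Probability of Detection
    far = false_alarms / (hits + false_alarms)   # False Alarm Ratio
    csi = hits / (hits + false_alarms + misses)  # Critical Success Index
    return pod, far, csi

# illustrative counts, not the study's actual tallies
pod, far, csi = verification_scores(hits=35, false_alarms=15, misses=15)
print(f"POD={pod:.2f} FAR={far:.2f} CSI={csi:.2f}")
```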

  5. The quality and value of seasonal precipitation forecasts for an early warning of large-scale droughts and floods in West Africa

    NASA Astrophysics Data System (ADS)

    Bliefernicht, Jan; Seidel, Jochen; Salack, Seyni; Waongo, Moussa; Laux, Patrick; Kunstmann, Harald

    2017-04-01

    Seasonal precipitation forecasts are a crucial source of information for an early warning of hydro-meteorological extremes in West Africa. However, the current seasonal forecasting system used by the West African weather services in the framework of the West African Climate Outlook Forum (PRESAO) is limited to probabilistic precipitation forecasts with a 1-month lead time. To improve this provision, we use an ensemble-based quantile-quantile transformation for bias correction of precipitation forecasts provided by a global seasonal ensemble prediction system, the Climate Forecast System Version 2 (CFS2). The statistical technique eliminates systematic differences between global forecasts and observations while preserving the signal from the model. The technique also has the advantage that it can be implemented easily at national weather services with limited capacity. It is used to generate probabilistic forecasts of monthly and seasonal precipitation amounts and other precipitation indices useful for an early warning of large-scale droughts and floods in West Africa. The evaluation of the statistical technique is done using CFS hindcasts (1982 to 2009) in a cross-validation mode to determine the performance of the precipitation forecasts for several lead times, focusing on drought and flood events over the Volta and Niger basins. In addition, operational forecasts provided by PRESAO are analyzed from 1998 to 2015. The precipitation forecasts are compared to low-skill reference forecasts generated from gridded observations (i.e. GPCC, CHIRPS) and a novel in-situ gauge database from national observation networks (see Poster EGU2017-10271). The forecasts are evaluated using state-of-the-art verification techniques to determine specific quality attributes of probabilistic forecasts such as reliability, accuracy and skill. 
In addition, cost-loss approaches are used to determine the value of probabilistic forecasts for multiple users in warning situations. The outcomes of the hindcast experiment for the Volta basin illustrate that the statistical technique can clearly improve the CFS precipitation forecasts, with the potential to provide skillful and valuable early precipitation warnings for large-scale drought and flood situations several months ahead. In this presentation we give a detailed overview of the ensemble-based quantile-quantile transformation, its validation and verification, and its potential to complement PRESAO. We also highlight the performance of this technique for extremes such as the Sahel droughts of the 1980s, in comparison to the various reference data sets (e.g. CFS2, PRESAO, observational data sets) used in this study.
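    The core of a quantile-quantile transformation is mapping a raw forecast through the model climatology's CDF into the observed climatology. A minimal empirical sketch with synthetic gamma-distributed climatologies (illustrative stand-ins for the CFS2 hindcasts and gauge observations):

```python
import numpy as np

def quantile_map(x, model_clim, obs_clim):
    """Map a raw forecast x through the model climatology's empirical CDF
    into the observed climatology: F_obs^-1(F_model(x))."""
    q = np.searchsorted(np.sort(model_clim), x) / len(model_clim)
    return np.quantile(obs_clim, min(q, 1.0))

rng = np.random.default_rng(2)
model_clim = rng.gamma(2.0, 40.0, 1000)   # hindcast monthly precipitation [mm] (illustrative)
obs_clim = rng.gamma(2.0, 60.0, 1000)     # observed monthly precipitation [mm] (illustrative)

raw = 120.0
corrected = quantile_map(raw, model_clim, obs_clim)
print(round(float(corrected), 1))         # dry-biased model -> corrected value above the raw one
```

Applying the map ensemble member by ensemble member preserves the ranked forecast signal while removing the systematic bias.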

  6. Design and Prototype Implementation of non-Triggered Database-driven Real-time Tsunami Forecast System using Multi-index Method

    NASA Astrophysics Data System (ADS)

    Yamamoto, N.; Aoi, S.; Suzuki, W.; Hirata, K.; Takahashi, N.; Kunugi, T.; Nakamura, H.

    2016-12-01

    We have launched a new project to develop a real-time tsunami inundation forecast system for the Pacific coast of Chiba prefecture (Kujukuri-Sotobo region), Japan (Aoi et al., 2015, AGU). In this study, we design a database-driven real-time tsunami forecast system using the multi-index method (Yamamoto et al., 2016, EPS) and implement a prototype system. In the previous study (Yamamoto et al., 2015, AGU), we assumed that the origin time of the tsunami was known before making a forecast based on comparing observed and calculated ocean-bottom pressure waveforms stored in the Tsunami Scenario Bank (TSB). As shown in the figure, we assume the scenario origin times by defining the scenario elapsed time τp to compare observed and calculated waveforms. In this design, once several appropriate tsunami scenarios are selected by multiple indices (two variance reductions and a correlation coefficient), the system can make a tsunami forecast for the target coastal region using the selected scenarios, without any trigger information derived from observed seismic and/or tsunami data. In addition, we define the time range Tq shown in the figure for masking portions of the observed pressure records contaminated by ocean-acoustic and seismic waves (Saito, 2015, JpGU). Following the proposed design, we implement a prototype real-time tsunami inundation forecast system dedicated to the target coastal region, using ocean-bottom pressure data from the Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench (S-net) (Kanazawa et al., 2012, JpGU; Uehira et al., 2015, IUGG), which was constructed by the National Research Institute for Earth Science and Disaster Resilience (NIED). 
For the prototype system, we construct a prototype TSB using interplate earthquake fault models located along the Japan Trench (Mw 7.6-9.8), the Sagami Trough (Mw 7.6-8.6), and the Nankai Trough (Mw 7.6-8.6) as well as intraplate earthquake fault models (Mw 7.6-8.6) within the subducting Pacific plate, which could affect the target coastal region. This work was partially supported by the Council for Science, Technology and Innovation (CSTI) through the Cross-ministerial Strategic Innovation Promotion Program (SIP), titled "Enhancement of societal resiliency against natural disasters" (Funding agency: JST).
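    The multi-index selection can be sketched as thresholding waveform-misfit indices over the scenario bank. A simplified sketch with one variance reduction and the correlation coefficient (the method uses two differently normalized variance reductions; the waveforms and thresholds below are synthetic stand-ins):

```python
import numpy as np

def variance_reduction(obs, syn):
    """VR = 1 - sum((obs-syn)^2)/sum(obs^2); 1 means a perfect match."""
    return 1.0 - np.sum((obs - syn) ** 2) / np.sum(obs ** 2)

def correlation(obs, syn):
    return np.corrcoef(obs, syn)[0, 1]

def select_scenarios(obs, bank, vr_min=0.5, cc_min=0.8):
    """Return indices of bank waveforms that pass both index thresholds."""
    return [i for i, syn in enumerate(bank)
            if variance_reduction(obs, syn) >= vr_min and correlation(obs, syn) >= cc_min]

t = np.linspace(0, 600, 601)                 # 10 min of 1-Hz pressure samples
obs = np.sin(2 * np.pi * t / 300)            # observed tsunami waveform (synthetic stand-in)
bank = [0.9 * np.sin(2 * np.pi * t / 300),   # close scenario, slightly low amplitude
        np.sin(2 * np.pi * t / 150),         # wrong period
        np.cos(2 * np.pi * t / 300)]         # phase-shifted
print(select_scenarios(obs, bank))           # only the first scenario passes both indices
```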

  7. A Real-time Irrigation Forecasting System in Jiefangzha Irrigation District, China

    NASA Astrophysics Data System (ADS)

    Cong, Z.

    2015-12-01

    In order to improve irrigation efficiency, we need to know when and how much to irrigate in real time. Given the soil moisture content at the current time, we can forecast the soil moisture content over the following days from rainfall and crop evapotranspiration forecasts; irrigation should then be considered when the forecast soil moisture content reaches a threshold. Jiefangzha Irrigation District, a part of Hetao Irrigation District, is located in Inner Mongolia, China. Its irrigated area is about 140,000 ha, planted mainly with wheat, maize and sunflower. The annual precipitation is below 200 mm, so irrigation is necessary, with water drawn from the Yellow River. We set up 10 sites, each with 4 TDR sensors (at 20 cm, 40 cm, 60 cm and 80 cm depth), to monitor the soil moisture content. The weather forecast data are downloaded from the website of the European Centre for Medium-Range Weather Forecasts (ECMWF). The reference evapotranspiration is estimated with the FAO Blaney-Criddle equation, using only the air temperature from ECMWF, and the crop water requirement is then forecast by multiplying the reference evapotranspiration by the crop coefficient. Finally, the soil moisture content is forecast from the soil water balance, with the initial condition set to the monitored soil moisture content. When the forecast soil moisture content reaches a threshold, an irrigation warning is issued. The irrigation amount can be estimated in three ways: (1) restoring the soil moisture content to field capacity; (2) saturating the soil; or (3) following the irrigation quota. The forecast period is 10 days. The system is developed on the B2C model in the Java language, and all the databases and data analysis reside on the server. 
Users can log in to the website with their own username and password to obtain the irrigation forecast and other irrigation-related information. The system can be extended to other irrigation districts and could in the future be upgraded for mobile users.
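    The forecasting step described above amounts to a daily bucket-style soil water balance over the 10-day window. A minimal sketch with hypothetical parameter values (root-zone depth, thresholds, and the rain/evapotranspiration forecasts are all illustrative):

```python
def forecast_soil_moisture(theta0, rain_fc, et_fc, theta_fc, theta_thr, root_mm=400.0):
    """Daily soil water balance over the forecast window.
    Returns (warning_day, irrigation_mm), or (None, 0.0) if no warning is needed."""
    theta = theta0
    for day, (p, et) in enumerate(zip(rain_fc, et_fc), start=1):
        theta += (p - et) / root_mm       # volumetric change over the root zone
        theta = min(theta, theta_fc)      # excess above field capacity drains away
        if theta <= theta_thr:
            # option (1) from the text: refill to field capacity
            return day, (theta_fc - theta) * root_mm
    return None, 0.0

# hypothetical inputs: TDR-measured initial moisture, forecast rain and crop water need
rain = [0, 0, 2, 0, 0, 0, 1, 0, 0, 0]     # mm/day
etc = [5, 5, 6, 6, 5, 5, 6, 6, 5, 5]      # mm/day
day, amount = forecast_soil_moisture(theta0=0.26, rain_fc=rain, et_fc=etc,
                                     theta_fc=0.32, theta_thr=0.18)
print(day, round(amount, 1))              # day on which the warning fires, mm to apply
```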

  8. Data-driven agent-based modeling, with application to rooftop solar adoption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Haifeng; Vorobeychik, Yevgeniy; Letchford, Joshua

    Agent-based modeling is commonly used for studying complex system properties emergent from interactions among many agents. We present a novel data-driven agent-based modeling framework applied to forecasting individual and aggregate residential rooftop solar adoption in San Diego county. Our first step is to learn a model of individual agent behavior from combined data of individual adoption characteristics and property assessment. We then construct an agent-based simulation with the learned model embedded in artificial agents, and proceed to validate it using a holdout sequence of collective adoption decisions. We demonstrate that the resulting agent-based model successfully forecasts solar adoption trends and provides a meaningful quantification of uncertainty about its predictions. We utilize our model to optimize two classes of policies aimed at spurring solar adoption: one that subsidizes the cost of adoption, and another that gives away free systems to low-income households. We find that the optimal policies derived for the latter class are significantly more efficacious, whereas the policies similar to the current California Solar Initiative incentive scheme appear to have a limited impact on overall adoption trends.
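    The framework can be caricatured as a learned adoption model embedded in simulated agents with a peer-effect feedback. A minimal sketch with a hypothetical logistic model whose coefficients stand in for those learned from the adoption and assessment data (every number below is illustrative):

```python
import math
import random

def adoption_prob(income, peer_frac, subsidy, b=(-6.0, 0.04, 3.0, 2.0)):
    """Hypothetical logistic model of annual adoption probability; the
    coefficients stand in for ones learned from real adoption data."""
    z = b[0] + b[1] * income + b[2] * peer_frac + b[3] * subsidy
    return 1.0 / (1.0 + math.exp(-z))

def simulate(n=500, years=10, subsidy=0.0, seed=7):
    """Run the agent population forward; returns the final adoption fraction."""
    rng = random.Random(seed)
    income = [rng.uniform(30, 150) for _ in range(n)]   # household income [$k], illustrative
    adopted = [False] * n
    for _ in range(years):
        frac = sum(adopted) / n                         # peer effect: current adoption level
        for i in range(n):
            if not adopted[i] and rng.random() < adoption_prob(income[i], frac, subsidy):
                adopted[i] = True
    return sum(adopted) / n

base = simulate(subsidy=0.0)
subsidized = simulate(subsidy=0.5)
print(f"baseline {base:.2f}, subsidized {subsidized:.2f}")
```

Policy comparison then reduces to re-running the simulation under different subsidy settings, which mirrors how the paper evaluates its two policy classes.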

  9. Data-driven agent-based modeling, with application to rooftop solar adoption

    DOE PAGES

    Zhang, Haifeng; Vorobeychik, Yevgeniy; Letchford, Joshua; ...

    2016-01-25

    Agent-based modeling is commonly used for studying complex system properties emergent from interactions among many agents. We present a novel data-driven agent-based modeling framework applied to forecasting individual and aggregate residential rooftop solar adoption in San Diego county. Our first step is to learn a model of individual agent behavior from combined data of individual adoption characteristics and property assessment. We then construct an agent-based simulation with the learned model embedded in artificial agents, and proceed to validate it using a holdout sequence of collective adoption decisions. We demonstrate that the resulting agent-based model successfully forecasts solar adoption trends and provides a meaningful quantification of uncertainty about its predictions. We utilize our model to optimize two classes of policies aimed at spurring solar adoption: one that subsidizes the cost of adoption, and another that gives away free systems to low-income households. We find that the optimal policies derived for the latter class are significantly more efficacious, whereas the policies similar to the current California Solar Initiative incentive scheme appear to have a limited impact on overall adoption trends.

  10. How accurate are the weather forecasts for Bierun (southern Poland)?

    NASA Astrophysics Data System (ADS)

    Gawor, J.

    2012-04-01

    Weather forecast accuracy has increased in recent times, mainly thanks to the significant development of numerical weather prediction models. Despite the improvements, forecasts should be verified to control their quality. The evaluation of forecast accuracy can also be an interesting learning activity for students, as it joins natural curiosity about everyday weather with scientific process skills: problem solving, database technologies, graph construction and graphical analysis. The examination of the weather forecasts has been undertaken by a group of 14-year-old students from Bierun (southern Poland). They participate in the GLOBE program to develop inquiry-based investigations of the local environment, using an automatic weather station for the atmospheric research. The observed data were compared with corresponding forecasts produced by two numerical weather prediction models: COAMPS (Coupled Ocean/Atmosphere Mesoscale Prediction System), developed by the Naval Research Laboratory, Monterey, USA, and run operationally at the Interdisciplinary Centre for Mathematical and Computational Modelling in Warsaw, Poland; and COSMO (Consortium for Small-scale Modelling), used by the Polish Institute of Meteorology and Water Management. The analysed data included air temperature, precipitation, wind speed, wind chill and sea level pressure. The prediction periods from 0 to 24 hours (Day 1) and from 24 to 48 hours (Day 2) were considered. The verification statistics commonly used in meteorology have been applied: the mean error, also known as bias, for continuous data, and a 2x2 contingency table giving the hit rate and false alarm ratio for a few precipitation thresholds. The results of this activity became an interesting basis for discussion. The most important topics are: 1) To what extent can we rely on the weather forecasts? 2) How accurate are the forecasts for the two considered time ranges? 3) Which precipitation threshold is the most predictable? 
4) Why are some weather elements easier to verify than others? 5) What factors may contribute to the quality of the weather forecast?
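    The verification statistics named above can be computed directly from paired observed/forecast values; a minimal sketch with made-up daily precipitation pairs and a single threshold:

```python
def verify(obs, fcst, threshold=1.0):
    """Mean error (bias) plus hit rate and false alarm ratio for a
    precipitation threshold, from paired observed/forecast values."""
    bias = sum(f - o for o, f in zip(obs, fcst)) / len(obs)
    hits = sum(1 for o, f in zip(obs, fcst) if o >= threshold and f >= threshold)
    false_alarms = sum(1 for o, f in zip(obs, fcst) if o < threshold and f >= threshold)
    misses = sum(1 for o, f in zip(obs, fcst) if o >= threshold and f < threshold)
    hit_rate = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return bias, hit_rate, far

# illustrative daily precipitation pairs [mm]: station observation vs Day-1 forecast
obs = [0.0, 2.5, 0.0, 5.1, 0.2, 0.0, 1.3, 8.4]
fcst = [0.4, 1.8, 0.0, 6.0, 1.5, 0.0, 0.8, 7.4]
bias, hit_rate, far = verify(obs, fcst, threshold=1.0)
print(round(bias, 2), hit_rate, far)
```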

  11. Spatiotemporal drought forecasting using nonlinear models

    NASA Astrophysics Data System (ADS)

    Vasiliades, Lampros; Loukas, Athanasios

    2010-05-01

    Spatiotemporal data mining is the extraction of unknown and implicit knowledge, structures, spatiotemporal relationships, or patterns not explicitly stored in spatiotemporal databases. As one of the data mining techniques, forecasting is widely used to predict the unknown future based upon the patterns hidden in the current and past data. In order to achieve spatiotemporal forecasting, some mature analysis tools, e.g. time series analysis and spatial statistics, are extended to the spatial and temporal dimensions, respectively. Drought forecasting plays an important role in the planning and management of natural resources and water resource systems in a river basin. Early and timely forecasting of a drought event helps in taking proactive measures and setting out drought mitigation strategies to alleviate the impacts of drought. Despite the widespread application of nonlinear mathematical models, comparative studies on spatiotemporal drought forecasting using different models remain a demanding task for modellers. This study uses a promising approach, the Gamma Test (GT), to select the input variables and the training data length, so that the trial-and-error workload can be greatly reduced. The GT makes it possible to quickly estimate, prior to model construction, the best mean squared error achievable by a smooth model on any unseen data for a given selection of inputs. The GT is applied to forecast droughts using monthly Standardized Precipitation Index (SPI) time series at multiple timescales at several precipitation stations in the Pinios river basin in the Thessaly region, Greece. Several nonlinear models have been developed efficiently, with the aid of the GT, for 1-month up to 12-month ahead forecasting. Several temporal and spatial statistical indices were considered for the performance evaluation of the models. 
The predicted results show reasonably good agreement with the actual data for short lead times, whereas the forecasting accuracy decreases with increasing lead time. Finally, the developed nonlinear models could be used in an early warning system for risk and decision analyses in the study area.
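    The Gamma Test itself is compact: for k = 1..K nearest neighbours, regress half the mean squared output difference, gamma(k), on the mean squared input distance, delta(k); the intercept estimates the output noise variance. A minimal sketch on synthetic data (the inputs merely stand in for lagged SPI values):

```python
import numpy as np

def gamma_test(X, y, K=10):
    """Gamma Test: the intercept of the regression of gamma(k) on delta(k)
    over k = 1..K nearest neighbours estimates the output noise variance."""
    n = len(y)
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)   # squared input distances
    np.fill_diagonal(d2, np.inf)
    order = np.argsort(d2, axis=1)[:, :K]                        # k-th nearest neighbours
    delta = np.array([d2[np.arange(n), order[:, k]].mean() for k in range(K)])
    gamma = np.array([0.5 * ((y[order[:, k]] - y) ** 2).mean() for k in range(K)])
    slope, intercept = np.polyfit(delta, gamma, 1)
    return intercept

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, (400, 2))                                 # synthetic inputs
y = np.sin(3 * X[:, 0]) * X[:, 1] + rng.normal(0, 0.1, 400)      # smooth signal + noise (var 0.01)
print(gamma_test(X, y))   # expected to land near the true noise variance, 0.01
```

Comparing this intercept across candidate input sets is what lets the GT rank input selections before any model is trained.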

  12. Evaluating sub-seasonal skill in probabilistic forecasts of Atmospheric Rivers and associated extreme events

    NASA Astrophysics Data System (ADS)

    Subramanian, A. C.; Lavers, D.; Matsueda, M.; Shukla, S.; Cayan, D. R.; Ralph, M.

    2017-12-01

    Atmospheric rivers (ARs) - elongated plumes of intense moisture transport - are a primary source of hydrological extremes, water resources and impactful weather along the West Coast of North America and Europe. There is strong demand in the water management, societal infrastructure and humanitarian sectors for reliable sub-seasonal forecasts, particularly of extreme events, such as floods and droughts so that actions to mitigate disastrous impacts can be taken with sufficient lead-time. Many recent studies have shown that ARs in the Pacific and the Atlantic are modulated by large-scale modes of climate variability. Leveraging the improved understanding of how these large-scale climate modes modulate the ARs in these two basins, we use the state-of-the-art multi-model forecast systems such as the North American Multi-Model Ensemble (NMME) and the Subseasonal-to-Seasonal (S2S) database to help inform and assess the probabilistic prediction of ARs and related extreme weather events over the North American and European West Coasts. We will present results from evaluating probabilistic forecasts of extreme precipitation and AR activity at the sub-seasonal scale. In particular, results from the comparison of two winters (2015-16 and 2016-17) will be shown, winters which defied canonical El Niño teleconnection patterns over North America and Europe. We further extend this study to analyze probabilistic forecast skill of AR events in these two basins and the variability in forecast skill during certain regimes of large-scale climate modes.
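    Sample-based CRPS is a standard building block for evaluating such probabilistic forecasts. A minimal sketch using the ensemble estimator E|X - y| - ½E|X - X'| with made-up ensemble values:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS for one ensemble and one observation;
    lower is better, 0 is a perfect deterministic hit."""
    x = np.asarray(members, dtype=float)
    return np.mean(np.abs(x - obs)) - 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))

ens = [3.0, 5.0, 4.0, 6.0]    # hypothetical ensemble values for an AR-related quantity
print(crps_ensemble(ens, 4.5))
```

A skill score against a reference forecast (CRPSS = 1 - CRPS/CRPS_ref) then follows directly, which is how skill comparisons across winters or models are usually summarized.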

  13. Rapid wave and storm surge warning system for tropical cyclones in Mexico

    NASA Astrophysics Data System (ADS)

    Appendini, C. M.; Rosengaus, M.; Meza, R.; Camacho, V.

    2015-12-01

The National Hurricane Center (NHC) in Miami is responsible for forecasting tropical cyclones in the North Atlantic and Eastern North Pacific basins. As such, Mexico, Central America and Caribbean countries depend on the information issued by the NHC on the characteristics of a particular tropical cyclone and the associated watch and warning areas. Although waves and storm surge are important hazards for marine operations and coastal dwellings, their forecast is not part of the NHC's responsibilities. This work presents a rapid wave and storm surge warning system based on 3100 synthetic tropical cyclones making landfall in Mexico. Hydrodynamic and wave models were driven by the synthetic events to create a robust database composed of maximum envelopes of wind speed, significant wave height and storm surge for each event. The results were incorporated into a forecast system that uses the NHC advisory to locate the synthetic events passing within specified radii of the present and forecast positions of the real event. Using limited computer resources, the system displays the information meeting the search criteria, and the forecaster can select specific events to generate the desired hazard map (i.e. wind, waves, and storm surge) based on the maximum envelope maps. This system was developed in a limited time frame to be operational in 2015 by the National Hurricane and Severe Storms Unit of the Mexican National Weather Service, and represents a pilot project for other countries in the region not covered by detailed storm surge and wave forecasts.
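The core lookup described here, finding synthetic events whose tracks pass within a given radius of the advisory's present and forecast positions, can be sketched as a great-circle radius filter. The catalog entries and advisory fixes below are hypothetical, not data from the paper:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def select_synthetic_events(advisory_positions, catalog, radius_km):
    """Return synthetic events whose track passes within radius_km of every advisory fix."""
    hits = []
    for event_id, track in catalog.items():
        if all(any(haversine_km(alat, alon, tlat, tlon) <= radius_km
                   for tlat, tlon in track)
               for alat, alon in advisory_positions):
            hits.append(event_id)
    return hits

# Hypothetical NHC advisory fixes and a toy synthetic-event catalog
advisory = [(18.0, -103.0), (19.0, -104.0)]
catalog = {
    "syn-001": [(17.9, -102.9), (19.1, -104.1)],  # passes close to both fixes
    "syn-002": [(25.0, -110.0), (26.0, -111.0)],  # far from the advisory
}
matches = select_synthetic_events(advisory, catalog, radius_km=100.0)
```

The forecaster would then pull the pre-computed maximum-envelope maps for the matched events rather than running any model in real time, which is what keeps the system fast on limited hardware.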

  14. The state of the residential fire fatality problem in Sweden: Epidemiology, risk factors, and event typologies.

    PubMed

    Jonsson, Anders; Bonander, Carl; Nilson, Finn; Huss, Fredrik

    2017-09-01

Residential fires represent the largest category of fatal fires in Sweden. The purpose of this study was to describe the epidemiology of fatal residential fires in Sweden and to identify clusters of events. Data were collected from a database that combines information on fatal fires with data from forensic examinations and the Swedish Cause of Death register. Mortality rates were calculated for different strata using population statistics and rescue service turnout reports. Cluster analysis was performed using multiple correspondence analysis with agglomerative hierarchical clustering. Male sex, old age, smoking, and alcohol were identified as risk factors, and the most common primary injury diagnosis was exposure to toxic gases. Compared to non-fatal fires, fatal residential fires more often originated in the bedroom, were more often caused by smoking, and were more likely to occur at night. Six clusters were identified. The first two clusters were both smoking-related, but were separated into (1) fatalities that often involved elderly people, usually female, whose clothes were ignited (17% of the sample), and (2) middle-aged (45-64 years old), often intoxicated men, where the fire usually originated in furniture (30%). Other clusters identified in the analysis were related to (3) fires caused by technical faults that started in electrical installations in single houses (13%), (4) cooking appliances left on (8%), (5) events with unknown cause, room and object of origin (25%), and (6) deliberately set fires (7%). Fatal residential fires were unevenly distributed in the Swedish population. To further reduce the incidence of fire mortality, specialized prevention efforts that focus on the different needs of each cluster are required. Cooperation between various societal functions, e.g. rescue services, elderly care, psychiatric clinics and other social services, with an application of both human and technological interventions, should reduce residential fire mortality in Sweden. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
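The clustering step pairs multiple correspondence analysis (which maps categorical case records to numeric factor scores) with agglomerative hierarchical clustering. As an illustration of the agglomeration stage only, here is a minimal average-linkage merge loop over hypothetical 2-D factor scores; it is not the authors' code, and a real analysis would use a library routine:

```python
def average_linkage(points, k):
    """Merge clusters with the smallest mean pairwise distance until k remain."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    clusters = [[p] for p in points]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # Average distance over all cross-cluster pairs
                d = sum(dist(a, b) for a in clusters[i] for b in clusters[j])
                d /= len(clusters[i]) * len(clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Two well-separated hypothetical groups in MCA factor space
pts = [(0.0, 0.1), (0.1, 0.0), (0.05, 0.05), (2.0, 2.1), (2.1, 2.0)]
groups = average_linkage(pts, k=2)
```

In practice the number of clusters (six in the study) is chosen by inspecting the dendrogram rather than fixed in advance.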

  15. Where's the evidence? A systematic review of economic analyses of residential aged care infrastructure.

    PubMed

    Easton, Tiffany; Milte, Rachel; Crotty, Maria; Ratcliffe, Julie

    2017-03-21

Residential care infrastructure, in terms of the characteristics of the organisation (such as proprietary status, size, and location) and the physical environment, has been found to directly influence resident outcomes. This review aimed to summarise the existing literature on economic evaluations of residential care infrastructure. A systematic review of English language articles was conducted using AgeLine, CINAHL, Econlit, Informit (databases in Health; Business and Law; Social Sciences), Medline, ProQuest, Scopus, and Web of Science, with retrieval up to 14 December 2015. The search strategy combined terms relating to nursing homes, economics, and older people. Full economic evaluations, partial economic evaluations, and randomised trials reporting more limited economic information, such as estimates of resource use or costs of interventions, were included. Data were extracted using predefined data fields and synthesised in a narrative summary to address the stated review objective. Fourteen studies containing an economic component were identified. None of the identified studies attempted to systematically link costs and outcomes in the form of a cost-benefit, cost-effectiveness, or cost-utility analysis. There was wide variation in the approaches taken for valuing the outcomes associated with different residential care infrastructures: 8 studies utilized various clinical outcomes as proxies for the quality of care provided, and 2 focused on resident outcomes including agitation, quality of life, and the quality of care interactions. Only 2 studies included residents living with dementia. Robust economic evidence is needed to inform aged care facility design. Future research should focus on identifying appropriate and meaningful outcome measures that can be used at a service planning level, as well as the broader health benefits and cost-saving potential of different organisational and environmental characteristics in residential care. International Prospective Register of Systematic Reviews (PROSPERO) registration number: CRD42015015977.

  16. Statistical and Biophysical Models for Predicting Total and Outdoor Water Use in Los Angeles

    NASA Astrophysics Data System (ADS)

    Mini, C.; Hogue, T. S.; Pincetl, S.

    2012-04-01

Modeling water demand is a complex exercise in the choice of the functional form, techniques and variables to integrate in the model. The goal of the current research is to identify the determinants that control total and outdoor residential water use in semi-arid cities and to utilize that information in the development of statistical and biophysical models that can forecast spatial and temporal urban water use. The City of Los Angeles is unique in its highly diverse socio-demographic, economic and cultural characteristics across neighborhoods, which introduces significant challenges in modeling water use. Increasing climate variability also contributes to uncertainties in water use predictions in urban areas. Monthly individual water use records were acquired from the Los Angeles Department of Water and Power (LADWP) for the 2000 to 2010 period. Study predictors of residential water use include socio-demographic, economic, climate and landscaping variables at the zip code level collected from the US Census database. Climate variables are estimated from ground-based observations and calculated at the centroid of each zip code using an inverse-distance weighting method. Remotely sensed products of vegetation biomass and landscape land cover are also utilized. Two linear regression models were developed based on the panel data and variables described: a pooled-OLS regression model and a linear mixed effects model. Both models show income per capita and the percentage of landscaped area in each zip code to be statistically significant predictors. The pooled-OLS model tends to over-estimate higher water use zip codes, and both models provide similar RMSE values. Outdoor water use was estimated at the census tract level as the residual between total water use and indoor use. This residual is being compared with the output from a biophysical model that includes tree and grass cover areas, climate variables and estimates of evapotranspiration at very high spatial resolution. 
A genetic algorithm based model (Shuffled Complex Evolution-UA; SCE-UA) is also being developed to provide estimates of prediction and parameter uncertainties and to compare against the linear regression models. Ultimately, models will be selected to undertake predictions for a range of climate change and landscape scenarios. Finally, project results will contribute to a better understanding of water demand to help predict future water use and implement targeted landscaping conservation programs to maintain sustainable water needs for a growing population under uncertain climate variability.
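The interpolation of station observations to zip-code centroids uses inverse-distance weighting, which can be sketched in a few lines. The station coordinates and temperatures below are hypothetical:

```python
def idw(sample_points, target, power=2.0):
    """Inverse-distance-weighted estimate at `target` from (x, y, value) samples."""
    num = den = 0.0
    for x, y, v in sample_points:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0:
            return v  # target coincides with a station
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical station temperatures interpolated to a zip-code centroid
stations = [(0.0, 0.0, 20.0), (2.0, 0.0, 24.0)]
estimate = idw(stations, target=(1.0, 0.0))  # equidistant -> simple average, 22.0
```

The `power` exponent controls how quickly a station's influence decays with distance; 2 is the conventional default.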

  17. Seasonal forecasting of discharge for the Raccoon River, Iowa

    NASA Astrophysics Data System (ADS)

    Slater, Louise; Villarini, Gabriele; Bradley, Allen; Vecchi, Gabriel

    2016-04-01

The state of Iowa (central United States) is regularly afflicted by severe natural hazards such as the 2008/2013 floods and the 2012 drought. To improve preparedness for these catastrophic events and allow Iowans to make more informed decisions about the most suitable water management strategies, we have developed a framework for medium- to long-range probabilistic seasonal streamflow forecasting for the Raccoon River at Van Meter, an 8900-km2 catchment located in central-western Iowa. Our flow forecasts use statistical models to predict seasonal discharge for low to high flows, with forecast lead times ranging from one to ten months. Historical measurements of daily discharge are obtained from the U.S. Geological Survey (USGS) at the Van Meter stream gage, and used to compute quantile time series from minimum to maximum seasonal flow. The model is forced with basin-averaged total seasonal precipitation records from the PRISM Climate Group and annual row crop production acreage from the U.S. Department of Agriculture's National Agricultural Statistics Service database. For the forecasts, we use corn and soybean production from the previous year (persistence forecast) as a proxy for the impacts of agricultural practices on streamflow. The monthly precipitation forecasts are provided by eight Global Climate Models (GCMs) from the North American Multi-Model Ensemble (NMME), with lead times ranging from 0.5 to 11.5 months, and a resolution of 1 decimal degree. Additionally, precipitation from the month preceding each season is used to characterize antecedent soil moisture conditions. The accuracy of our modelled (1927-2015) and forecasted (2001-2015) discharge values is assessed by comparison with the observed USGS data. We explore the sensitivity of forecast skill over the full range of lead times, flow quantiles, forecast seasons, and with each GCM. 
Forecast skill is also examined using different formulations of the statistical models, as well as NMME forecast weighting procedures based on the computed potential skill (historical forecast accuracy) of the different GCMs. We find that the models describe the year-to-year variability in streamflow accurately, as well as the overall tendency towards increasing (and more variable) discharge over time. Surprisingly, forecast skill does not decrease markedly with lead time, and high flows tend to be well predicted, suggesting that these forecasts may have considerable practical applications. Further, the seasonal flow forecast accuracy is substantially improved by weighting the contribution of individual GCMs to the forecasts, and also by the inclusion of antecedent precipitation. Our results can provide critical information for adaptation strategies aiming to mitigate the costs and disruptions arising from flood and drought conditions, and allow us to determine how far in advance skillful forecasts can be issued. The availability of these discharge forecasts would have major societal and economic benefits for hydrology and water resources management, agriculture, disaster forecasts and prevention, energy, finance and insurance, food security, policy-making and public authorities, and transportation.
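The GCM-weighting step, combining the eight NMME models in proportion to their historical skill, amounts to a normalised weighted average. A minimal sketch with hypothetical model names, forecasts, and skill scores:

```python
def skill_weighted_forecast(model_forecasts, model_skill):
    """Combine GCM forecasts weighted by historical skill.

    `model_skill` holds non-negative scores (e.g. historical anomaly
    correlations); weights are the scores normalised to sum to one.
    """
    total = sum(model_skill[m] for m in model_forecasts)
    return sum(model_forecasts[m] * model_skill[m] / total for m in model_forecasts)

# Hypothetical seasonal precipitation forecasts (mm) from three NMME members
forecasts = {"modelA": 300.0, "modelB": 340.0, "modelC": 280.0}
skill = {"modelA": 0.6, "modelB": 0.3, "modelC": 0.1}
weighted = skill_weighted_forecast(forecasts, skill)
```

Equal skill scores reduce this to the plain ensemble mean, so the weighting can only help when the models genuinely differ in historical accuracy.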

  18. Using Enabling Technologies to Facilitate the Comparison of Satellite Observations with the Model Forecasts for Hurricane Study

    NASA Astrophysics Data System (ADS)

    Li, P.; Knosp, B.; Hristova-Veleva, S. M.; Niamsuwan, N.; Johnson, M. P.; Shen, T. P. J.; Tanelli, S.; Turk, J.; Vu, Q. A.

    2014-12-01

Due to their complexity and volume, satellite data are underutilized in today's hurricane research and operations. To better utilize these data, we developed the JPL Tropical Cyclone Information System (TCIS) - an Interactive Data Portal providing fusion between Near-Real-Time satellite observations and model forecasts to facilitate model evaluation and improvement. We have collected satellite observations and model forecasts in the Atlantic Basin and the East Pacific for the hurricane seasons since 2010 and supported the NASA Airborne Campaigns for Hurricane Study such as the Genesis and Rapid Intensification Processes (GRIP) in 2010 and the Hurricane and Severe Storm Sentinel (HS3) from 2012 to 2014. To enable direct inter-comparisons of the satellite observations and the model forecasts, the TCIS was integrated with the NASA Earth Observing System Simulator Suite (NEOS3) to produce synthetic observations (e.g. simulated passive microwave brightness temperatures) from a number of operational hurricane forecast models (HWRF and GFS). An automated process was developed to trigger NEOS3 simulations via web services given the location and time of satellite observations, monitor the progress of the NEOS3 simulations, and display the synthetic observations and ingest them into the TCIS database when they are complete. In addition, three analysis tools - joint PDF analysis of brightness temperatures, ARCHER for locating the storm center and characterizing storm organization, and the Wave Number Analysis tool for storm asymmetry and morphology analysis - were integrated into the TCIS to provide statistical and structural analysis on both observed and synthetic data. Interactive tools were built in the TCIS visualization system to allow the spatial and temporal selection of datasets, the invocation of the tools with user-specified parameters, and the display and delivery of the results. 
In this presentation, we will describe the key enabling technologies behind the design of the TCIS interactive data portal and analysis tools, including the spatial database technology for the representation and query of the level 2 satellite data, the automatic process flow using web services, the interactive user interface using the Google Earth API, and a common and expandable Python wrapper to invoke the analysis tools.

  19. Residential radon and environmental burden of disease among Non-smokers.

    PubMed

    Noh, Juhwan; Sohn, Jungwoo; Cho, Jaelim; Kang, Dae Ryong; Joo, Sowon; Kim, Changsoo; Shin, Dong Chun

    2016-01-01

Lung cancer had the second-highest absolute incidence of any cancer globally and was the leading cause of cancer mortality in 2014. Indoor radon is the second leading risk factor for lung cancer after cigarette smoking among ever-smokers, and the first among non-smokers. The environmental burden of disease (EBD) attributable to residential radon among non-smokers is critical for identifying threats to population health and planning health policy. To identify and retrieve literature describing the environmental burden of lung cancer attributable to residential radon, we searched databases including Ovid-MEDLINE and Ovid-EMBASE from 1980 to 2016. Search terms included patient keywords ('lung', 'neoplasm'), exposure keywords ('residential', 'radon'), and outcome keywords ('years of life lost', 'years of life lost due to disability', 'burden'). The literature search identified 261 documents; a further 9 documents were identified by manual searching. Two researchers independently assessed the 271 abstracts eligible for inclusion at the abstract level. Full-text reviews were conducted for selected publications after the first assessment. Ten studies were included in the final evaluation. Global disability-adjusted life years (DALYs) (95% uncertainty interval) for lung cancer increased by 35.9%, from 23,850,000 (18,835,000-29,845,000) in 1990 to 32,405,000 (24,400,000-38,334,000) in 2000. DALYs attributable to residential radon were 2,114,000 (273,000-4,660,000) in 2010. Lung cancer caused 34,732,900 (33,042,600-36,328,100) DALYs in 2013, of which 1,979,000 (1,331,000-2,768,000) DALYs were attributable to residential radon. The number of attributable lung cancer cases was 70-900 and the EBD for radon was 1,000-14,000 DALYs in the Netherlands. The years of life lost were 0.066 years among never-smokers and 0.198 years in the ever-smoker population in Canada. In summary, the estimated global EBD attributable to residential radon was 1,979,000 DALYs for both sexes in 2013. 
In the Netherlands, the EBD for radon was 1,000-14,000 DALYs. The smoking population lost three times more years than never-smokers in Canada. There was no study estimating the EBD of residential radon among never-smokers in Korea or other Asian countries. In addition, few studies accounted for the age of buildings, even though residential radon exposure levels depend on building age. Further EBD studies reflecting Korean disability weights and building age are required to estimate the EBD precisely.
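Attributable-burden figures of this kind are typically obtained by scaling total disease DALYs by a population attributable fraction (PAF). As an illustration of the arithmetic only - the prevalence and relative risk below are hypothetical, not values from the reviewed studies - a Levin's-formula sketch:

```python
def population_attributable_fraction(prevalence, relative_risk):
    """Levin's formula: PAF = p(RR - 1) / (1 + p(RR - 1))."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

def attributable_dalys(total_dalys, prevalence, relative_risk):
    """DALYs attributable to an exposure, e.g. residential radon."""
    return total_dalys * population_attributable_fraction(prevalence, relative_risk)

# Hypothetical inputs: 10% of homes above the reference radon level, RR = 1.6
paf = population_attributable_fraction(0.10, 1.6)
burden = attributable_dalys(34_732_900, 0.10, 1.6)  # scales the 2013 total
```

Real EBD studies stratify this calculation by exposure category, age, sex, and smoking status before summing, which is why the published intervals are so wide.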

  20. Rapid Tsunami Inundation Forecast from Near-field or Far-field Earthquakes using Pre-computed Tsunami Database: Pelabuhan Ratu, Indonesia

    NASA Astrophysics Data System (ADS)

    Gusman, A. R.; Setiyono, U.; Satake, K.; Fujii, Y.

    2017-12-01

We built a pre-computed tsunami inundation database for Pelabuhan Ratu, one of the tsunami-prone areas on the southern coast of Java, Indonesia. The tsunami database can be employed for rapid estimation of tsunami inundation during an event. The pre-computed tsunami waveforms and inundations are from a total of 340 scenarios ranging from 7.5 to 9.2 in moment magnitude (Mw), including simple fault models of 208 thrust faults and 44 tsunami earthquakes on the plate interface, as well as 44 normal faults and 44 reverse faults in the outer-rise region. Using our tsunami inundation forecasting algorithm (NearTIF), we could rapidly estimate the tsunami inundation in Pelabuhan Ratu for three different hypothetical earthquakes. The first hypothetical earthquake is a megathrust earthquake (Mw 9.0) offshore Sumatra, about 600 km from Pelabuhan Ratu, representing a worst-case event in the far field. The second hypothetical earthquake (Mw 8.5) is based on a slip-deficit rate estimated from geodetic measurements and represents the most likely large event near Pelabuhan Ratu. The third hypothetical earthquake is a tsunami-earthquake type (Mw 8.1), a type that often occurs south of Java. We compared the tsunami inundation maps produced by the NearTIF algorithm with the results of direct forward inundation modeling for the hypothetical earthquakes. The tsunami inundation maps produced by both methods are similar for the three cases. However, the tsunami inundation map from the inundation database can be obtained in a much shorter time (1 min) than one from forward inundation modeling (40 min). These results indicate that the NearTIF algorithm based on a pre-computed inundation database is reliable and useful for tsunami warning purposes. This study also demonstrates that the NearTIF algorithm can work well even when the earthquake source is located outside the area of the fault model database, because it uses a time-shifting procedure to search for the best-fit scenario.
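The best-fit scenario search with time shifting can be sketched as follows: slide each pre-computed waveform against the observed (or forecast) waveform over a range of integer time shifts, and keep the scenario with the smallest residual. The waveform database and amplitudes below are toy values, not NearTIF data:

```python
def rms_misfit(observed, scenario, max_shift):
    """Smallest RMS misfit between two waveforms over integer time shifts."""
    best = float("inf")
    n = len(observed)
    for s in range(-max_shift, max_shift + 1):
        pairs = [(observed[i], scenario[i + s])
                 for i in range(n) if 0 <= i + s < len(scenario)]
        if not pairs:
            continue
        rms = (sum((o - m) ** 2 for o, m in pairs) / len(pairs)) ** 0.5
        best = min(best, rms)
    return best

def best_fit_scenario(observed, database, max_shift=2):
    """Pick the pre-computed scenario whose (shifted) waveform fits best."""
    return min(database, key=lambda k: rms_misfit(observed, database[k], max_shift))

# Toy pre-computed tsunami waveform database (hypothetical amplitudes, m)
db = {
    "thrust_Mw8.5": [0.0, 0.5, 1.8, 0.9, 0.2],
    "outer_rise_Mw7.8": [0.0, 0.1, 0.4, 0.2, 0.0],
}
# Observed waveform resembling the Mw 8.5 scenario arriving one step late
obs = [0.0, 0.0, 0.5, 1.8, 0.9]
pick = best_fit_scenario(obs, db)
```

The time shift is what lets a scenario whose source sits slightly off the real epicenter still match the arrival, which is why the method tolerates sources outside the fault-model grid.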

  1. A Look Under the Hood: How the JPL Tropical Cyclone Information System Uses Database Technologies to Present Big Data to Users

    NASA Astrophysics Data System (ADS)

    Knosp, B.; Gangl, M.; Hristova-Veleva, S. M.; Kim, R. M.; Li, P.; Turk, J.; Vu, Q. A.

    2015-12-01

The JPL Tropical Cyclone Information System (TCIS) brings together satellite, aircraft, and model forecast data from several NASA, NOAA, and other data centers to assist researchers in comparing and analyzing data and model forecasts related to tropical cyclones. The TCIS has run a near-real-time (NRT) data portal during the North Atlantic hurricane season (typically June through October) every year since 2010. Data collected by the TCIS vary by type, format, contents, and frequency and are served to the user in two ways: (1) as image overlays on a virtual globe and (2) as derived output from a suite of analysis tools. In order to support these two functions, the data must be collected and then made searchable by criteria such as date, mission, product, pressure level, and geospatial region. Creating a database architecture that is flexible enough to manage, intelligently interrogate, and ultimately present this disparate data to the user in a meaningful way has been the primary challenge. The database solution for the TCIS has been a hybrid MySQL + Solr implementation. After testing other relational database and NoSQL solutions, such as PostgreSQL and MongoDB respectively, this solution has given the TCIS the best offerings in terms of query speed and result reliability. This database solution also supports the challenging (and memory-intensive) geospatial queries that are necessary to support the analysis tools requested by users. Though hardly new technologies on their own, our implementation of MySQL + Solr had to be customized and tuned to accurately store, index, and search the TCIS data holdings. In this presentation, we will discuss how we arrived at our MySQL + Solr database architecture, why it offers us the most consistently fast and reliable results, and how it supports our front end so that we can offer users a look into our "big data" holdings.
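The searchable-by-date-and-region requirement boils down to queries of the kind below. The real system uses MySQL + Solr; this stand-in uses Python's built-in sqlite3 purely to illustrate the shape of a date plus bounding-box query, with hypothetical granule rows:

```python
import sqlite3

# Stand-in for the TCIS metadata store (the production system is MySQL + Solr)
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE granules (
    mission TEXT, product TEXT, obs_date TEXT,
    min_lat REAL, max_lat REAL, min_lon REAL, max_lon REAL)""")
rows = [
    ("GPM", "2A-GPROF", "2015-08-20", 10.0, 25.0, -60.0, -40.0),
    ("GPM", "2A-GPROF", "2015-08-20", 40.0, 55.0, -10.0, 10.0),
    ("SMAP", "L2-SSS", "2015-08-21", 12.0, 22.0, -55.0, -45.0),
]
conn.executemany("INSERT INTO granules VALUES (?,?,?,?,?,?,?)", rows)

def query_region(conn, date, lat0, lat1, lon0, lon1):
    """Granules on `date` whose bounding boxes overlap the search region."""
    cur = conn.execute(
        """SELECT mission, product FROM granules
           WHERE obs_date = ? AND max_lat >= ? AND min_lat <= ?
             AND max_lon >= ? AND min_lon <= ?""",
        (date, lat0, lat1, lon0, lon1))
    return cur.fetchall()

hits = query_region(conn, "2015-08-20", 15.0, 20.0, -55.0, -50.0)
```

In the TCIS architecture the relational side handles this kind of structured metadata filter while Solr handles indexed text and facet search; the overlap predicate shown here (interval intersection on each axis) is the standard bounding-box test, ignoring antimeridian wrap-around.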

  2. A hydrophysical database to develop pedotransfer functions for Brazilian soils: challenges and perspectives

    USDA-ARS?s Scientific Manuscript database

    Access to soil hydrological data is vital for hydrology projects and for supporting decision-making in issues related to the availability of food and water and the forecasting of phenomena related to soil surface stability. Brazil is a country of continental dimensions and has accumulated a signific...

  3. Developing the U.S. Wildland Fire Decision Support System

    Treesearch

    Erin Noonan-Wright; Tonja S. Opperman; Mark A. Finney; Tom Zimmerman; Robert C. Seli; Lisa M. Elenz; David E. Calkin; John R. Fiedler

    2011-01-01

    A new decision support tool, the Wildland Fire Decision Support System (WFDSS) has been developed to support risk-informed decision-making for individual fires in the United States. WFDSS accesses national weather data and forecasts, fire behavior prediction, economic assessment, smoke management assessment, and landscape databases to efficiently formulate and apply...

  4. Measurement of the Local Food Environment: A Comparison of Existing Data Sources

    PubMed Central

    Bader, Michael D. M.; Ailshire, Jennifer A.; Morenoff, Jeffrey D.; House, James S.

    2010-01-01

    Studying the relation between the residential environment and health requires valid, reliable, and cost-effective methods to collect data on residential environments. This 2002 study compared the level of agreement between measures of the presence of neighborhood businesses drawn from 2 common sources of data used for research on the built environment and health: listings of businesses from commercial databases and direct observations of city blocks by raters. Kappa statistics were calculated for 6 types of businesses—drugstores, liquor stores, bars, convenience stores, restaurants, and grocers—located on 1,663 city blocks in Chicago, Illinois. Logistic regressions estimated whether disagreement between measurement methods was systematically correlated with the socioeconomic and demographic characteristics of neighborhoods. Levels of agreement between the 2 sources were relatively high, with significant (P < 0.001) kappa statistics for each business type ranging from 0.32 to 0.70. Most business types were more likely to be reported by direct observations than in the commercial database listings. Disagreement between the 2 sources was not significantly correlated with the socioeconomic and demographic characteristics of neighborhoods. Results suggest that researchers should have reasonable confidence using whichever method (or combination of methods) is most cost-effective and theoretically appropriate for their research design. PMID:20123688
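The agreement statistic reported here, kappa, corrects raw percent agreement for the agreement expected by chance. A minimal Cohen's kappa for two binary presence/absence codings, with hypothetical block-level data:

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two binary presence/absence codings."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_a = sum(rater_a) / n  # proportion coded "present" by each source
    p_b = sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Hypothetical block-level codings: commercial listing vs. direct observation
listing = [1, 1, 0, 0, 1, 0, 1, 0]
observation = [1, 1, 0, 1, 1, 0, 1, 0]
kappa = cohens_kappa(listing, observation)
```

Values near 0 mean agreement no better than chance; the study's range of 0.32 to 0.70 spans what is conventionally called fair to substantial agreement.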

  5. Application of Recent Advances in Forward Modeling of Emissions from Boreal and Temperate Wildfires to Real-time Forecasting of Aerosol and Trace Gas Concentrations

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Reid, J. S.; Kasischke, E. S.; Allen, D. J.

    2005-12-01

The magnitude of trace gas and aerosol emissions from wildfires is a scientific problem with important implications for atmospheric composition, and is also integral to understanding carbon cycling in terrestrial ecosystems. Recent ecological research on modeling wildfire emissions has integrated theoretical advances derived from ecological fieldwork with improved spatial and temporal databases to produce "post facto" estimates of emissions with high spatial and temporal resolution. These advances have been shown to improve agreement with atmospheric observations at coarse scales, but can in principle be applied at finer scales for applications such as forecasting. However, several of the approaches employed in these forward models are incompatible with the requirements of real-time forecasting, requiring modification of data inputs and calculation methods. Because of the differences in data inputs used for real-time and "post facto" emissions modeling, the key uncertainties in the forward problem are not necessarily the same for the two applications. However, adapting these advances in forward modeling to forecasting applications has the potential to improve air quality forecasts, and also to provide a large body of experimental data that can be used to constrain crucial uncertainties in current conceptual models of wildfire emissions. This talk describes a forward modeling method developed at the University of Maryland and its application to the Fire Locating and Modeling of Burning Emissions (FLAMBE) system at the Naval Research Laboratory. Methods for applying the outputs of the NRL aerosol forecasting system to the inverse problem of constraining emissions will also be discussed. The system described can use the feedback supplied by atmospheric observations to improve the emissions source description in the forecasting model, and can also be used for hypothesis testing regarding fire behavior and data inputs.
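Forward fire-emissions models of this family generally follow the classic Seiler-and-Crutzen bookkeeping: emitted mass = burned area x fuel load x combustion completeness x emission factor. A minimal sketch of that arithmetic, with entirely hypothetical input values (not FLAMBE parameters):

```python
def fire_emissions_kg(area_m2, fuel_load_kg_m2, combustion_completeness,
                      emission_factor_g_kg):
    """Seiler-and-Crutzen-style estimate: E = A * B * beta * EF.

    Returns emitted species mass in kg; the emission factor is in
    grams of species per kg of dry fuel burned.
    """
    dry_matter_kg = area_m2 * fuel_load_kg_m2 * combustion_completeness
    return dry_matter_kg * emission_factor_g_kg / 1000.0

# Hypothetical boreal fire: 1 km2 burned, 4 kg/m2 fuel, 30% consumed,
# PM2.5 emission factor of 15 g per kg of dry matter
pm25 = fire_emissions_kg(1.0e6, 4.0, 0.30, 15.0)
```

The terms hardest to pin down in real time are fuel load and combustion completeness, which is exactly where the "post facto" ecological refinements described above conflict with forecasting's data-latency constraints.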

  6. National Utilization and Forecasting of Ototopical Antibiotics: Medicaid Data Versus "Dr. Google".

    PubMed

    Crowson, Matthew G; Schulz, Kristine; Tucci, Debara L

    2016-09-01

To forecast national Medicaid prescription volumes for common ototopical antibiotics, and to correlate prescription volumes with internet user search interest using Google Trends (GT). National United States Medicaid prescription and GT user search database analysis. Quarterly national Medicaid summary drug utilization data and weekly GT search engine data for ciprofloxacin-dexamethasone (CD), ofloxacin (OF), and Cortisporin (CS) ototopicals were obtained from January 2008 to July 2014. Time series analysis was used to assess prescription seasonality, the Holt-Winters method to forecast quarterly prescription volumes, and Pearson correlations to compare GT and Medicaid data. Medicaid prescription volumes demonstrated sinusoidal seasonality for OF (r = 0.91), CS (r = 0.71), and CD (r = 0.62), with annual peaks in July, August, and September. In 2017, OF was forecasted to be the most widely prescribed ototopical, followed by CD. CS was the least prescribed, and its volumes were forecasted to decrease 9.0% by 2017 relative to 2014. GT user search interest demonstrated analogous sinusoidal seasonality and significant correlations with Medicaid prescription data for CD (r = 0.38, p = 0.046), OF (r = 0.74, p < 0.001), and CS (r = 0.49, p = 0.008). We found that OF, CD, and CS ototopicals have sinusoidal seasonal variation with Medicaid prescription volume peaks occurring in the summer. After 2012, OF was the most commonly prescribed ototopical, and this trend was forecasted to continue. CS use was forecasted to decrease. Google user search interest in these ototopical agents demonstrated analogous seasonal variation. Analyses of GT for interest in ototopical antibiotics may be useful for health care providers and administrators as a complementary method for assessing healthcare utilization trends.
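The Holt-Winters method used for the quarterly forecasts maintains three smoothed components: level, trend, and seasonal indices. A compact additive variant, run here on a toy quarterly series with a stable seasonal pattern (hypothetical numbers, not the Medicaid data); library implementations such as statsmodels' `ExponentialSmoothing` would normally be used instead:

```python
def holt_winters_additive(series, period, alpha=0.3, beta=0.1, gamma=0.2,
                          horizon=4):
    """Additive Holt-Winters smoothing with a `horizon`-step-ahead forecast."""
    # Initialise level, trend and seasonal indices from the first two cycles.
    season1, season2 = series[:period], series[period:2 * period]
    level = sum(season1) / period
    trend = (sum(season2) - sum(season1)) / period ** 2
    seasonal = [x - level for x in season1]

    for t in range(period, len(series)):
        x = series[t]
        s = seasonal[t % period]
        last_level = level
        level = alpha * (x - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seasonal[t % period] = gamma * (x - level) + (1 - gamma) * s

    n = len(series)
    return [level + (h + 1) * trend + seasonal[(n + h) % period]
            for h in range(horizon)]

# Toy quarterly prescription volumes with a recurring seasonal peak
quarterly = [120, 95, 80, 110] * 3
forecast = holt_winters_additive(quarterly, period=4)
```

On this perfectly repeating series the forecast simply reproduces the seasonal cycle; on real data the smoothing constants trade responsiveness against noise suppression and are usually chosen by minimising in-sample forecast error.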

  7. Forecasting Hospitalization and Emergency Department Visit Rates for Chronic Obstructive Pulmonary Disease. A Time-Series Analysis.

    PubMed

    Gershon, Andrea; Thiruchelvam, Deva; Moineddin, Rahim; Zhao, Xiu Yan; Hwee, Jeremiah; To, Teresa

    2017-06-01

    Knowing trends in and forecasting hospitalization and emergency department visit rates for chronic obstructive pulmonary disease (COPD) can enable health care providers, hospitals, and health care decision makers to plan for the future. We conducted a time-series analysis using health care administrative data from the Province of Ontario, Canada, to determine previous trends in acute care hospitalization and emergency department visit rates for COPD and then to forecast future rates. Individuals aged 35 years and older with physician-diagnosed COPD were identified using four universal government health administrative databases and a validated case definition. Monthly COPD hospitalization and emergency department visit rates per 1,000 people with COPD were determined from 2003 to 2014 and then forecasted to 2024 using autoregressive integrated moving average models. Between 2003 and 2014, COPD prevalence increased from 8.9 to 11.1%. During that time, there were 274,951 hospitalizations and 290,482 emergency department visits for COPD. After accounting for seasonality, we found that monthly COPD hospitalization and emergency department visit rates per 1,000 individuals with COPD remained stable. COPD prevalence was forecasted to increase to 12.7% (95% confidence interval [CI], 11.4-14.1) by 2024, whereas monthly COPD hospitalization and emergency department visit rates per 1,000 people with COPD were forecasted to remain stable at 2.7 (95% CI, 1.6-4.4) and 3.7 (95% CI, 2.3-5.6), respectively. Forecasted age- and sex-stratified rates were also stable. COPD hospital and emergency department visit rates per 1,000 people with COPD have been stable for more than a decade and are projected to remain stable in the near future. Given increasing COPD prevalence, this means notably more COPD health service use in the future.
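The ARIMA models used here are richer than can be shown briefly, but their autoregressive core is easy to sketch: regress each value on its predecessor and iterate the fitted relation forward. A minimal AR(1) fit by least squares on hypothetical, already-demeaned monthly rate anomalies (an illustration only, not the study's seasonal ARIMA specification):

```python
def fit_ar1(series):
    """Least-squares AR(1) coefficient for a demeaned series."""
    num = sum(series[t - 1] * series[t] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

def forecast_ar1(series, steps):
    """Iterate x[t+1] = phi * x[t] forward from the last observation."""
    phi = fit_ar1(series)
    out, x = [], series[-1]
    for _ in range(steps):
        x = phi * x
        out.append(x)
    return out

# Toy demeaned monthly rate anomalies decaying geometrically
anomalies = [8.0, 4.0, 2.0, 1.0, 0.5]
ahead = forecast_ar1(anomalies, steps=3)
```

A full ARIMA adds differencing for trend and moving-average terms for shock persistence, plus seasonal counterparts of each, which is why such models can produce the stable long-horizon rate projections reported above.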

  8. A global flash flood forecasting system

    NASA Astrophysics Data System (ADS)

    Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin

    2016-04-01

    The sudden and devastating nature of flash flood events means it is imperative to provide early warnings such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist on basin, national and continental scales in Europe, North America and Australia but rely on high resolution NWP forecasts or rainfall-radar nowcasting, neither of which have global coverage. To produce global flash flood forecasts this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting and discuss strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher resolution COSMO-LEPS limited area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output we experiment with a dynamic region-growing algorithm. 
This automatically clusters regions of similar return period exceedance probabilities, thus presenting the at-risk areas at a spatial resolution appropriate to the NWP system. We then demonstrate how these warning areas could eventually complement existing global systems such as the Global Flood Awareness System (GloFAS), to give warnings of flash floods. This work demonstrates the possibility of creating a global flash flood forecasting system based on forecasts from existing global NWP systems. Future developments, in post-processing for example, will need to address an under-prediction bias, for extreme point rainfall, that is innate to current-generation global models.
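    The region-growing step described in this record can be illustrated with a toy sketch: grow connected clusters of grid cells whose exceedance probability sits above a threshold, so that warnings cover contiguous at-risk areas rather than isolated pixels. The 4-neighbour connectivity, grid values, and threshold below are illustrative assumptions, not the ERIC implementation.

```python
# Toy region-growing over a grid of return-period exceedance probabilities.
# Cells at or above `threshold` are grouped into 4-connected clusters by a
# breadth-first flood fill; each cluster is one candidate warning area.
from collections import deque

def grow_regions(grid, threshold):
    """Return connected (4-neighbour) clusters of cells >= threshold."""
    rows, cols = len(grid), len(grid[0])
    seen, clusters = set(), []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or grid[r][c] < threshold:
                continue
            cluster, queue = set(), deque([(r, c)])
            seen.add((r, c))
            while queue:
                cr, cc = queue.popleft()
                cluster.add((cr, cc))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = cr + dr, cc + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and (nr, nc) not in seen
                            and grid[nr][nc] >= threshold):
                        seen.add((nr, nc))
                        queue.append((nr, nc))
            clusters.append(cluster)
    return clusters
```

    A real system would additionally merge clusters into warning polygons at a resolution matched to the NWP grid.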

  9. Healthcare costs of burn patients from homes without fire sprinklers

    PubMed Central

    Banfield, Joanne; Rehou, Sarah; Gomez, Manuel; Redelmeier, Donald A.; Jeschke, Marc G.

    2014-01-01

    The treatment of burn injuries requires high-cost services for healthcare and society. Automatic fire sprinklers are a preventive measure that can decrease fire injuries, deaths, property damage and environmental toxins. This study’s aim was to conduct a cost-analysis of patients with burn or inhalation injuries due to residential fires, and to compare this to the cost of implementing residential automatic fire sprinklers. We conducted a cohort analysis of adult burn patients admitted to our provincial burn center (1995–2012). Patient demographics and injury characteristics were collected from medical records, and clinical and coroner databases. Resource costs included average cost per day at our intensive care and rehabilitation program, transportation, and property loss. During the study period there were 1,557 residential fire-related deaths province-wide and 1,139 patients were admitted to our provincial burn center due to a flame injury occurring at home. At our burn center, the average cost was CAN$84,678 per patient with a total cost of CAN$96,448,194. All resources totaled CAN$3,605,775,200. This study shows the considerable healthcare costs of burn patients from homes without fire sprinklers. PMID:25412056

  10. Real-Time CME Forecasting Using HMI Active-Region Magnetograms and Flare History

    NASA Technical Reports Server (NTRS)

    Falconer, David; Moore, Ron; Barghouty, Abdulnasser F.; Khazanov, Igor

    2011-01-01

    We have recently developed a method of predicting an active region's probability of producing a CME, an X-class flare, an M-class flare, or a Solar Energetic Particle Event from a free-energy proxy measured from SOHO/MDI line-of-sight magnetograms. This year we have added three major improvements to our forecast tool: 1) transition from MDI magnetograms to SDO/HMI magnetograms, 2) automation of acquisition and measurement of HMI magnetograms, giving us near-real-time forecasts (no older than 2 hours), and 3) determination of how to improve forecasts by using an active region's previous flare history in combination with its free-energy proxy. HMI was turned on in May 2010 and MDI was turned off in April 2011. Using the overlap period, we have calibrated HMI to yield what MDI would measure. This is important since the value of the free-energy proxy used for our forecast is resolution dependent, and the forecasts are made from results of a 1996-2004 database of MDI observations. With near-real-time magnetograms from HMI, near-real-time forecasts are now possible. We have augmented the code so that it continually acquires and measures new magnetograms as they become available online, and updates the whole-sun forecast for the coming day. The next planned improvement is to use an active region's previous flare history, in conjunction with its free-energy proxy, to forecast the active region's event rate. It has long been known that active regions that have produced flares in the past are likely to produce flares in the future, and that active regions that are nonpotential (have large free energy) are more likely to produce flares in the future. This year we have determined that persistence of flaring is not just a reflection of an active region's free energy. In other words, after controlling for free energy, we have found that active regions that have flared recently are more likely to flare in the future.

  11. EXPOSURES AND INTERNAL DOSES OF ...

    EPA Pesticide Factsheets

    The National Center for Environmental Assessment (NCEA) has released a final report that presents and applies a method to estimate distributions of internal concentrations of trihalomethanes (THMs) in humans resulting from a residential drinking water exposure. The report presents simulations of oral, dermal and inhalation exposures and demonstrates the feasibility of linking the US EPA's Information Collection Rule database with other databases on external exposure factors and physiologically based pharmacokinetic modeling to refine population-based estimates of exposure. Review Draft - by 2010, develop scientifically sound data and approaches to assess and manage risks to human health posed by exposure to specific regulated waterborne pathogens and chemicals, including those addressed by the Arsenic, M/DBP and Six-Year Review Rules.

  12. On the reliable use of satellite-derived surface water products for global flood monitoring

    NASA Astrophysics Data System (ADS)

    Hirpa, F. A.; Revilla-Romero, B.; Thielen, J.; Salamon, P.; Brakenridge, R.; Pappenberger, F.; de Groeve, T.

    2015-12-01

    Early flood warning and real-time monitoring systems play a key role in flood risk reduction and disaster response management. To this end, real-time flood forecasting and satellite-based detection systems have been developed at global scale. However, due to the limited availability of up-to-date ground observations, the reliability of these systems for real-time applications has not been assessed in large parts of the globe. In this study, we performed comparative evaluations of commonly used satellite-based global flood detection systems and an operational flood forecasting system using 10 major flood cases reported over three years (2012-2014). Specifically, we assessed the flood detection capabilities of the near real-time global flood maps from the Global Flood Detection System (GFDS) and from the Moderate Resolution Imaging Spectroradiometer (MODIS), and the operational forecasts from the Global Flood Awareness System (GloFAS), for the major flood events recorded in global flood databases. We present the evaluation results of the global flood detection and forecasting systems in terms of correctly indicating the reported flood events and highlight the existing limitations of each system. Finally, we propose possible ways forward to improve the reliability of large scale flood monitoring tools.

  13. From multi-disciplinary monitoring observation to probabilistic eruption forecasting: a Bayesian view

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2011-12-01

    Eruption forecasting estimates the probability of eruption in a specific time-space-magnitude window. The use of probabilities to track the evolution of a phase of unrest is unavoidable for two main reasons: first, eruptions are intrinsically unpredictable in a deterministic sense, and, second, probabilities represent a quantitative tool that can be rationally used by decision-makers (as is usually done in many other fields). The primary information for the probability assessment during a phase of unrest comes from monitoring data of different quantities, such as seismic activity, ground deformation, geochemical signatures, and so on. Nevertheless, probabilistic forecasting based on monitoring data presents two main difficulties. First, many high-risk volcanoes do not have databases of pre-eruptive and unrest monitoring, making a probabilistic assessment based on the frequency of past observations impossible. The ongoing project WOVOdat (led by Christopher Newhall) is trying to tackle this limitation by creating a sort of worldwide epidemiological database that may cope with the lack of pre-eruptive and unrest monitoring data for a specific volcano by using observations of 'analog' volcanoes. Second, the quantity and quality of monitoring data are rapidly increasing at many volcanoes, creating strongly inhomogeneous datasets. In these cases, classical statistical analysis can be performed on high quality monitoring observations only for (usually too) short periods of time, or alternatively using only a few specific monitoring data streams that are available for longer times (such as the number of earthquakes), therefore neglecting much of the information carried by the most recent kinds of monitoring. Here, we explore a possible strategy to cope with these limitations. In particular, we present a Bayesian strategy that merges different kinds of information. 
In this approach, all relevant monitoring observations are embedded into a probabilistic scheme through expert opinion, conceptual models, and, possibly, real past data. After discussing the scientific and philosophical aspects of such an approach, we present some applications for Campi Flegrei and Vesuvius.
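    The Bayesian merging sketched in this record can be reduced to its simplest form: a prior eruption probability, updated in odds space by likelihood ratios that could come from expert opinion, conceptual models, or analog-volcano data. The numbers and the conditional-independence assumption below are purely illustrative.

```python
# Minimal Bayesian update of an eruption probability. Each monitoring
# observation contributes a likelihood ratio
#   P(observation | eruption) / P(observation | no eruption),
# and the observations are assumed conditionally independent.
def bayes_update(prior, likelihood_ratios):
    """Posterior P(eruption) after folding in each likelihood ratio."""
    odds = prior / (1.0 - prior)          # convert probability to odds
    for lr in likelihood_ratios:
        odds *= lr                        # Bayes' rule in odds form
    return odds / (1.0 + odds)            # back to probability

# Example with invented numbers: a 10% monthly prior plus one anomaly
# judged nine times more likely under an impending eruption.
posterior = bayes_update(0.1, [9.0])
```

    Working in odds space keeps the update a simple product, which is convenient when anomalies arrive one at a time during a crisis.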

  14. Highly accurate prediction of emotions surrounding the attacks of September 11, 2001 over 1-, 2-, and 7-year prediction intervals.

    PubMed

    Doré, Bruce P; Meksin, Robert; Mather, Mara; Hirst, William; Ochsner, Kevin N

    2016-06-01

    In the aftermath of a national tragedy, important decisions are predicated on judgments of the emotional significance of the tragedy in the present and future. Research in affective forecasting has largely focused on ways in which people fail to make accurate predictions about the nature and duration of feelings experienced in the aftermath of an event. Here we ask a related but understudied question: can people forecast how they will feel in the future about a tragic event that has already occurred? We found that people were strikingly accurate when predicting how they would feel about the September 11 attacks over 1-, 2-, and 7-year prediction intervals. Although people slightly under- or overestimated their future feelings at times, they nonetheless showed high accuracy in forecasting (a) the overall intensity of their future negative emotion, and (b) the relative degree of different types of negative emotion (i.e., sadness, fear, or anger). Using a path model, we found that the relationship between forecasted and actual future emotion was partially mediated by current emotion and remembered emotion. These results extend theories of affective forecasting by showing that emotional responses to an event of ongoing national significance can be predicted with high accuracy, and by identifying current and remembered feelings as independent sources of this accuracy. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  15. A computerized system to measure and predict air quality for emission control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crooks, G.; Ciccone, A.; Frattolillo, P.

    1997-12-31

    A Supplementary Emission Control (SEC) system has been developed on behalf of the Association Industrielle de l'Est de Montreal (AIEM). The objective of the SEC is to avoid exceedances of the Montreal Urban Community (MUC) 24 hour ambient Air Quality Standard (AQS) for sulphur dioxide in the industrial East Montreal area. The SEC system comprises: 3 continuous SO2 monitoring stations with data loggers and remote communications; a meteorological tower with data logger and modem for acquiring local meteorology; communications with Environment Canada to download meteorological forecast data; a polling PC for data retrieval; and Windows NT based software running on the AIEM computer server. The SEC software utilizes relational databases to store and maintain measured SO2 concentration data, emission data, as well as observed and forecast meteorological data. The SEC system automatically executes a numerical dispersion model to forecast SO2 concentrations up to six hours in the future. Based on measured SO2 concentrations at the monitoring stations and the six hour forecast concentrations, the system determines if local sources should reduce their emission levels to avoid potential exceedances of the AQS. The SEC system also includes a Graphical User Interface (GUI) for user access to the system. The SEC system and software are described, and the accuracy of the system at forecasting SO2 concentrations is examined.
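    The decision step described in this record, reducing emissions when measured plus forecast concentrations threaten the 24-hour standard, can be sketched as follows. The standard value and safety margin are invented placeholders, not the MUC regulation.

```python
# Sketch of the SEC decision logic: extend the recent measured SO2 record
# with the 6-hour dispersion-model forecast and flag a curtailment if the
# projected 24-hour mean approaches the ambient standard.
AQS_24H_UG_M3 = 288.0   # placeholder value for the 24-h SO2 standard
SAFETY_MARGIN = 0.9     # act before the standard itself is reached

def running_24h_mean(hourly):
    """Mean of the most recent (up to) 24 hourly concentrations."""
    window = hourly[-24:]
    return sum(window) / len(window)

def emission_reduction_needed(measured_hourly, forecast_next_6h):
    """True if the 24-h mean, extended with the 6-h forecast,
    would approach the ambient standard."""
    projected = measured_hourly[-18:] + list(forecast_next_6h)
    return running_24h_mean(projected) >= SAFETY_MARGIN * AQS_24H_UG_M3
```

    The margin makes the rule conservative: sources are asked to cut back before the projected mean actually reaches the standard.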

  16. Variation of microorganism concentrations in urban stormwater runoff with land use and seasons.

    PubMed

    Selvakumar, Ariamalar; Borst, Michael

    2006-03-01

    Stormwater runoff samples were collected from outfalls draining small municipal separate storm sewer systems. The samples were collected from three different land use areas based on local designation (high-density residential, low-density residential and landscaped commercial). The concentrations of microorganisms in the stormwater runoff were found to be similar in magnitude to, but less variable than, those reported in the stormwater National Pollutant Discharge Elimination System (NPDES) database. Microorganism concentrations from high-density residential areas were higher than those associated with low-density residential and landscaped commercial areas. Since the outfalls were free of sanitary wastewater cross-connections, the major sources of microorganisms to the stormwater runoff were most likely from the feces of domestic animals and wildlife. Concentrations of microorganisms were significantly affected by the season during which the samples were collected. The lowest concentrations were observed during winter except for Staphylococcus aureus. The Pearson correlation coefficients among different indicators showed weak linear relationships and the relationships were statistically significant. However, the relationships between indicators and pathogens were poorly correlated and were not statistically significant, suggesting the use of indicators as evidence of the presence of pathogens is not appropriate. Further, the correlation between the concentration of the traditionally monitored indicators (total coliforms and fecal coliforms) and the suggested substitutes (enterococci and E. coli) is weak, but statistically significant, suggesting that historical time series will be only a qualitative indicator of impaired waters under the revised criteria for recreational water quality by the US EPA.

  17. Analysis of Rapidly Developing Low Cloud Ceilings in a Stable Environment

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Barrett, Joe H., III; Case, Jonathan L.; Wheeler, Mark M.; Baggett, G. Wayne

    2006-01-01

    Forecasters at the Space Meteorology Group (SMG) issue 30 to 90 minute forecasts for low cloud ceilings at the Space Shuttle Landing Facility (TTS) to support Space Shuttle landings. Mission verification statistics have shown ceilings to be the number one forecast challenge for SMG. More specifically, forecasters at SMG are concerned with any rapidly developing clouds/ceilings below 8000 ft in a stable, capped thermodynamic environment. Therefore, the Applied Meteorology Unit (AMU) was tasked to examine archived events of rapid stable cloud formation resulting in ceilings below 8000 ft, and document the atmospheric regimes favoring this type of cloud development. The AMU examined the cool season months of November to March during the years of 1993-2003 for days that had low-level inversions and rapid, stable low cloud formation that resulted in ceilings violating the Space Shuttle Flight Rules. The AMU wrote and modified existing code to identify inversions from the morning (~10 UTC) Cape Canaveral, FL rawinsonde (XMR) during the cool season and output pertinent sounding information. They parsed all days with cloud ceilings below 8000 ft at TTS, forming a database of possible rapidly-developing low ceiling events. Days with precipitation or noticeable fog burn-off situations were excluded from the database. In the first phase of this work, only the daytime hours were examined for possible ceiling development events since low clouds are easier to diagnose with visible satellite imagery. Phase II of this work includes expanding the database to include nighttime cases, and is underway as this abstract is being written. For the nighttime cases, the AMU will analyze both the 00 UTC soundings and the 10 UTC soundings to examine those data for the presence of a low-level inversion. 
The 00 UTC soundings will probably not have a surface-based inversion, but the presence of inversions or "neutral" layers aloft and below 8,000 ft will most likely help define the stable regime, being a thermodynamically "capped" environment. Occurrences of elevated low-level inversions or stable layers will be highlighted in conjunction with nights that experienced a possible development or onset of cloud ceilings below 8,000 ft. Using these criteria to narrow down the database, the AMU will then use archived IR satellite imagery for these possible events. This presentation summarizes the composite meteorological conditions for 20 daytime event days with rapid low cloud ceiling formation and 48 non-events days consisting of advection or widespread low cloud ceilings and describes two sample cases of daytime rapidly-developing low cloud ceilings. The authors will also summarize the work from the nighttime cases and describe a representative sample case from this data set.

  18. Overview of Hydrometeorologic Forecasting Procedures at BC Hydro

    NASA Astrophysics Data System (ADS)

    McCollor, D.

    2004-12-01

    Energy utility companies must balance production from limited sources with increasing demand from industrial, business, and residential consumers. The utility planning process requires a balanced, efficient, and effective distribution of energy from source to consumer. Therefore utility planners must consider the impact of weather on energy production and consumption. Hydro-electric companies should be particularly tuned to weather because their source of energy is water, and water supply depends on precipitation. BC Hydro operates as the largest hydro-electric company in western Canada, managing over 30 reservoirs within the province of British Columbia, and generating electricity for 1.6 million people. BC Hydro relies on weather forecasts of watershed precipitation and temperature to drive hydrologic reservoir inflow models and of urban temperatures to meet energy demand requirements. Operations and planning specialists in the company rely on current, value-added weather forecasts for extreme high-inflow events, daily reservoir operations planning, and long-term water resource management. Weather plays a dominant role for BC Hydro financial planners because earnings are highly sensitive to weather-driven conditions. For example, a two percent change in hydropower generation, due in large part to annual precipitation patterns, results in an annual net change of $50 million in earnings. A five percent change in temperature produces a $5 million change in yearly earnings. On a daily basis, significant precipitation events or temperature extremes involve potential profit/loss decisions in the tens of thousands of dollars worth of power generation. These factors are in addition to environmental and societal costs that must be considered equally as part of a triple bottom line reporting structure. BC Hydro water resource managers require improved meteorological information from recent advancements in numerical weather prediction. 
At BC Hydro, methods of providing meteorological forecast data are changing as new downscaling and ensemble techniques evolve to improve environmental information supplied to water managers.

  19. The Norwegian forecasting and warning service for rainfall- and snowmelt-induced landslides

    NASA Astrophysics Data System (ADS)

    Krøgli, Ingeborg K.; Devoli, Graziella; Colleuille, Hervé; Boje, Søren; Sund, Monica; Engen, Inger Karin

    2018-05-01

    The Norwegian Water Resources and Energy Directorate (NVE) has run a national flood forecasting and warning service since 1989. In 2009, the directorate was given the responsibility of also initiating a national forecasting service for rainfall-induced landslides. Both services are part of a political effort to improve flood and landslide risk prevention. The Landslide Forecasting and Warning Service was officially launched in 2013 and is developed as a joint initiative across public agencies between NVE, the Norwegian Meteorological Institute (MET), the Norwegian Public Road Administration (NPRA) and the Norwegian Rail Administration (Bane NOR). The main goal of the service is to reduce economic and human losses caused by landslides. The service performs a daily national landslide hazard assessment describing the expected awareness level at a regional level (i.e. for a county and/or group of municipalities). The service is operative 7 days a week throughout the year. Assessments and updates are published at the warning portal http://www.varsom.no/ at least twice a day, for the three coming days. The service delivers continuous updates on the current situation and future development to national and regional stakeholders and to the general public. The service is run in close cooperation with the flood forecasting service. Both services are based on five pillars: automatic hydrological and meteorological stations, a historical landslide and flood database, hydro-meteorological forecasting models, thresholds or return periods, and a trained group of forecasters. The main components of the service are herein described. A recent evaluation, conducted on the 4 years of operation, shows a rate of over 95% correct daily assessments. In addition, positive feedback has been received from users through a questionnaire. The capability of the service to forecast landslides by following the hydro-meteorological conditions is illustrated by an example from autumn 2017. 
The case shows how the landslide service has developed into a well-functioning system providing useful information, effectively and on time.

  20. The FASTER Approach: A New Tool for Calculating Real-Time Tsunami Flood Hazards

    NASA Astrophysics Data System (ADS)

    Wilson, R. I.; Cross, A.; Johnson, L.; Miller, K.; Nicolini, T.; Whitmore, P.

    2014-12-01

    In the aftermath of the 2010 Chile and 2011 Japan tsunamis that struck the California coastline, emergency managers requested that the state tsunami program provide more detailed information about the flood potential of distant-source tsunamis well ahead of their arrival time. The main issue is that existing tsunami evacuation plans call for evacuation of the predetermined "worst-case" tsunami evacuation zone (typically at a 30- to 50-foot elevation) during any "Warning" level event; the alternative is to not call an evacuation at all. A solution to provide more detailed information for secondary evacuation zones has been the development of tsunami evacuation "playbooks" to plan for tsunami scenarios of various sizes and source locations. To determine a recommended level of evacuation during a distant-source tsunami, an analytical tool has been developed called the "FASTER" approach, an acronym for factors that influence the tsunami flood hazard for a community: Forecast Amplitude, Storm, Tides, Error in forecast, and the Run-up potential. Within the first couple hours after a tsunami is generated, the National Tsunami Warning Center provides tsunami forecast amplitudes and arrival times for approximately 60 coastal locations in California. At the same time, the regional NOAA Weather Forecast Offices in the state calculate the forecasted coastal storm and tidal conditions that will influence tsunami flooding. Providing added conservatism in calculating tsunami flood potential, we include an error factor of 30% for the forecast amplitude, which is based on observed forecast errors during recent events, and a site specific run-up factor which is calculated from the existing state tsunami modeling database. 
The factors are added together into a cumulative FASTER flood potential value for the first five hours of tsunami activity and used to select the appropriate tsunami phase evacuation "playbook" which is provided to each coastal community shortly after the forecast is provided.
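    Read literally, the FASTER value is a sum of the named factors, with the forecast amplitude inflated by the stated 30% error. The sketch below follows that reading; the component treatment (in particular applying the site-specific run-up factor to the amplitude) and the playbook lookup are illustrative assumptions, not the operational procedure.

```python
# Illustrative additive FASTER calculation: Forecast Amplitude (+30% Error),
# Storm, Tides, and a site-specific Run-up allowance, summed into a single
# flood potential used to pick an evacuation "playbook" phase.
FORECAST_ERROR = 0.30  # stated error factor from observed forecast errors

def faster_flood_potential(forecast_amplitude_m, storm_surge_m,
                           tide_height_m, runup_factor):
    """Cumulative flood potential in metres above a reference datum."""
    amplitude_with_error = forecast_amplitude_m * (1.0 + FORECAST_ERROR)
    runup_allowance = forecast_amplitude_m * runup_factor  # assumed form
    return amplitude_with_error + storm_surge_m + tide_height_m + runup_allowance

def recommended_evacuation(flood_potential_m, playbook_levels):
    """Pick the smallest playbook phase that covers the flood potential.
    `playbook_levels` maps phase name -> maximum flooding it covers (m)."""
    for phase, max_level in sorted(playbook_levels.items(),
                                   key=lambda kv: kv[1]):
        if flood_potential_m <= max_level:
            return phase
    return "worst-case"  # fall back to the full evacuation zone
```

    The fall-through to the worst-case zone mirrors the pre-playbook practice described above, where the only options were full evacuation or none.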

  1. Development, testing, and applications of site-specific tsunami inundation models for real-time forecasting

    NASA Astrophysics Data System (ADS)

    Tang, L.; Titov, V. V.; Chamberlin, C. D.

    2009-12-01

    The study describes the development, testing and applications of site-specific tsunami inundation models (forecast models) for use in NOAA's tsunami forecast and warning system. The model development process includes sensitivity studies of tsunami wave characteristics in the nearshore and inundation, for a range of model grid setups, resolutions and parameters. To demonstrate the process, four forecast models in Hawaii, at Hilo, Kahului, Honolulu, and Nawiliwili, are described. The models were validated with fourteen historical tsunamis and compared with numerical results from reference inundation models of higher resolution. The accuracy of the modeled maximum wave height is greater than 80% when the observation is greater than 0.5 m; when the observation is below 0.5 m the error is less than 0.3 m. The error of the modeled arrival time of the first peak is within 3% of the travel time. The developed forecast models were further applied to hazard assessment from simulated magnitude 7.5, 8.2, 8.7 and 9.3 tsunamis based on subduction zone earthquakes in the Pacific. The tsunami hazard assessment study indicates that use of a seismic magnitude alone for a tsunami source assessment is inadequate to achieve such accuracy for tsunami amplitude forecasts. The forecast models apply local bathymetric and topographic information, and utilize dynamic boundary conditions from the tsunami source function database, to provide site- and event-specific coastal predictions. Only by combining a Deep-ocean Assessment and Reporting of Tsunamis (DART)-constrained tsunami magnitude with site-specific high-resolution models can the forecasts completely cover the evolution of earthquake-generated tsunami waves: generation, deep ocean propagation, and coastal inundation. Wavelet analysis of the tsunami waves suggests the coastal tsunami frequency responses at different sites are dominated by the local bathymetry, yet they can be partially related to the locations of the tsunami sources. 
The study also demonstrates the nonlinearity between offshore and nearshore maximum wave amplitudes.
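    The accuracy criterion quoted in this record can be written down directly: when the observed maximum wave height exceeds 0.5 m, the modeled value must be within 20% of it; otherwise the absolute error must be under 0.3 m. This is one reading of the stated thresholds, not NOAA code.

```python
# Verification rule implied by the abstract: relative error below 20%
# for observations above 0.5 m, absolute error below 0.3 m otherwise.
def forecast_meets_criterion(modeled_m, observed_m):
    """True if the modeled max wave height satisfies the stated accuracy."""
    if observed_m > 0.5:
        return abs(modeled_m - observed_m) / observed_m < 0.2
    return abs(modeled_m - observed_m) < 0.3
```

    Splitting the criterion this way avoids penalizing small-amplitude events, where a relative error is dominated by measurement noise.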

  2. Cognitive methodology for forecasting oil and gas industry using pattern-based neural information technologies

    NASA Astrophysics Data System (ADS)

    Gafurov, O.; Gafurov, D.; Syryamkin, V.

    2018-05-01

    The paper analyses a field of computer science formed at the intersection of such areas of natural science as artificial intelligence, mathematical statistics, and database theory, which is referred to as "Data Mining" (the discovery of knowledge in data). The theory of neural networks is applied along with classical methods of mathematical analysis and numerical simulation. The paper describes the technique protected by the patent of the Russian Federation for the invention "A Method for Determining Location of Production Wells during the Development of Hydrocarbon Fields" [1-3] and implemented using the geoinformation system NeuroInformGeo. There are no analogues in domestic or international practice. The paper gives an example comparing the forecast of oil reservoir quality made by a geophysicist interpreter using standard methods with the forecast made using this technology. The technical result demonstrates increased efficiency, effectiveness, and ecological compatibility in the development of mineral deposits, as well as the discovery of a new oil deposit.

  3. Assessment of WRF Simulated Precipitation by Meteorological Regimes

    NASA Astrophysics Data System (ADS)

    Hagenhoff, Brooke Anne

    This study evaluated warm-season precipitation events in a multi-year (2007-2014) database of Weather Research and Forecasting (WRF) simulations over the Northern Plains and Southern Great Plains. These WRF simulations were run daily in support of the National Oceanic and Atmospheric Administration (NOAA) Hazardous Weather Testbed (HWT) by the National Severe Storms Laboratory (NSSL) for operational forecasts. Evaluating model skill by synoptic pattern allows for an understanding of how model performance varies with particular atmospheric states and will aid forecasters with pattern recognition. To conduct this analysis, a competitive neural network known as the Self-Organizing Map (SOM) was used. SOMs allow the user to represent atmospheric patterns in an array of nodes that represent a continuum of synoptic categorizations. North American Regional Reanalysis (NARR) data during the warm season (April-September) was used to perform the synoptic typing over the study domains. Simulated precipitation was evaluated against observations provided by the National Centers for Environmental Prediction (NCEP) Stage IV precipitation analysis.
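    The competitive learning behind a SOM can be shown in miniature: nodes compete for each input, the winner and its map neighbours move toward that input, and the neighbourhood shrinks as training proceeds. The pure-Python 1-D sketch below is only a toy; the study's synoptic typing used a 2-D map trained on NARR reanalysis fields.

```python
# Minimal 1-D Self-Organizing Map. Each input is assigned to its
# best-matching unit (BMU); the BMU and its map neighbours are nudged
# toward the input with a Gaussian neighbourhood that shrinks over time.
import math
import random

def train_som(data, n_nodes=4, epochs=50, lr=0.5, seed=0):
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for epoch in range(epochs):
        radius = max(1.0, n_nodes / 2 * (1 - epoch / epochs))
        rate = lr * (1 - epoch / epochs)
        for x in data:
            # BMU = node closest to the input in Euclidean distance
            bmu = min(range(n_nodes), key=lambda i: sum(
                (nodes[i][d] - x[d]) ** 2 for d in range(dim)))
            for i in range(n_nodes):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                for d in range(dim):
                    nodes[i][d] += rate * h * (x[d] - nodes[i][d])
    return nodes

def classify(nodes, x):
    """Map a new input to the index of its best-matching node."""
    dim = len(x)
    return min(range(len(nodes)), key=lambda i: sum(
        (nodes[i][d] - x[d]) ** 2 for d in range(dim)))
```

    After training, each node index plays the role of one synoptic category, and forecast skill can be stratified by the node each day maps to.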

  4. Cold Climate Foundation Retrofit Experimental Hygrothermal Performance. Cloquet Residential Research Facility Laboratory Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Louise F.; Harmon, Anna C.

    2015-04-09

    This project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. These data currently span the period from November 10, 2012 through May 31, 2014 and are anticipated to be extended through November 2014. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.

  5. Should we use seasonal meteorological ensemble forecasts for hydrological forecasting? A case study for Nordic watersheds in Canada.

    NASA Astrophysics Data System (ADS)

    Bazile, Rachel; Boucher, Marie-Amélie; Perreault, Luc; Leconte, Robert; Guay, Catherine

    2017-04-01

    Hydro-electricity is a major source of energy for many countries throughout the world, including Canada. Long lead-time streamflow forecasts are all the more valuable as they help decision making and dam management. Different techniques exist for long-term hydrological forecasting. Perhaps the best known is 'Extended Streamflow Prediction' (ESP), which considers past meteorological scenarios as possible, often equiprobable, future scenarios. In the ESP framework, those past-observed meteorological scenarios (climatology) are used in turn as the inputs of a chosen hydrological model to produce ensemble forecasts (one member corresponding to each year in the available database). Many hydropower companies, including Hydro-Québec (province of Quebec, Canada), use variants of the above-described ESP system operationally for long-term operation planning. The ESP system accounts for the hydrological initial conditions and for the natural variability of the meteorological variables. However, it cannot consider the current initial state of the atmosphere. Climate models can help remedy this drawback. In the context of a changing climate, dynamical forecasts issued from climate models seem to be an interesting avenue to improve upon the ESP method and could help hydropower companies adapt their management practices to an evolving climate. Long-range forecasts from climate models can also be helpful for water management at locations where records of past meteorological conditions are short or nonexistent. In this study, we compare 7-month hydrological forecasts obtained from climate model outputs to an ESP system. The ESP system mimics the one used operationally at Hydro-Québec. The dynamical climate forecasts are produced by the European Centre for Medium-Range Weather Forecasts (ECMWF) System4. 
Forecast quality is assessed using numerical scores such as the Continuous Ranked Probability Score (CRPS) and the Ignorance score, as well as graphical tools such as the reliability diagram. This study covers 10 Nordic watersheds. We show that forecast performance according to the CRPS varies with lead time but also with the period of the year. The raw forecasts from the ECMWF System4 display important biases for both temperature and precipitation, which need to be corrected. The linear scaling method is used for this purpose and is found effective. Bias correction improves forecast performance, especially during the summer, when precipitation is overestimated. According to the CRPS, bias-corrected forecasts from System4 show performance comparable to that of the ESP system. However, the Ignorance score, which penalizes the lack of calibration (under-dispersive forecasts in this case) more severely than the CRPS, provides a different outlook for the comparison of the two systems. In fact, according to the Ignorance score, the ESP system outperforms forecasts based on System4 in most cases. This illustrates that the joint use of several metrics is crucial for assessing the quality of a forecast system thoroughly. Globally, ESP provides reliable forecasts, which can be over-dispersed, whereas bias-corrected ECMWF System4 forecasts are sharper but at the risk of missing events.
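    The CRPS used above to compare the two systems can be computed for a single ensemble-observation pair with the standard empirical estimator. A minimal sketch, not necessarily the exact estimator used in the study:

```python
def crps_ensemble(members, obs):
    """Empirical CRPS for one ensemble forecast against one observation:
    CRPS = mean(|x_i - y|) - 0.5 * mean(|x_i - x_j|). Lower is better."""
    m = len(members)
    # Accuracy term: mean absolute distance of members from the observation.
    term_accuracy = sum(abs(x - obs) for x in members) / m
    # Spread term: half the mean absolute distance between member pairs.
    term_spread = sum(abs(a - b) for a in members for b in members) / (2 * m * m)
    return term_accuracy - term_spread
```

    For a single deterministic member the score reduces to the absolute error, which is why the CRPS allows ensemble and deterministic forecasts to be compared on the same scale.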

  6. NED-IIS: An Intelligent Information System for Forest Ecosystem Management

    Treesearch

    W.D. Potter; S. Somasekar; R. Kommineni; H.M. Rauscher

    1999-01-01

    We view an Intelligent Information System (IIS) as composed of a unified knowledge base, database, and model base. The model base includes decision support models, forecasting models, and visualization models, for example. In addition, we feel that the model base should include domain-specific problem-solving modules as well as decision support models. This, then,...

  7. Depth-area-duration characteristics of storm rainfall in Texas using Multi-Sensor Precipitation Estimates

    NASA Astrophysics Data System (ADS)

    McEnery, J. A.; Jitkajornwanich, K.

    2012-12-01

    This presentation will describe the methodology and overall system development by which a benchmark dataset of precipitation information has been used to characterize the depth-area-duration relations in heavy rain storms occurring over regions of Texas. Over the past two years, project investigators, along with the National Weather Service (NWS) West Gulf River Forecast Center (WGRFC), have developed and operated a gateway data system to ingest, store, and disseminate NWS multi-sensor precipitation estimates (MPE). As a pilot project of the Integrated Water Resources Science and Services (IWRSS) initiative, this testbed uses a Structured Query Language (SQL) server to maintain a full archive of current and historic MPE values within the WGRFC service area. These time series values are made available for public access as web services in the standard WaterML format. Having this volume of information maintained in a comprehensive database now allows the use of relational analysis capabilities within SQL to leverage these multi-sensor precipitation values and produce a valuable derivative product. The area of focus for this study is North Texas, utilizing values that originated from the West Gulf River Forecast Center (WGRFC), one of three River Forecast Centers currently represented in the holdings of this data system. Over the past two decades, NEXRAD radar has dramatically improved the ability to record rainfall. The resulting hourly MPE values, distributed over an approximate 4 km by 4 km grid, are considered by the NWS to be the "best estimate" of rainfall. The data server provides an accepted standard interface for internet access to the largest time-series dataset of NEXRAD-based MPE values ever assembled. An automated script has been written to search and extract storms over the 18-year period of record from the contents of this massive historical precipitation database.
Not only can it extract site-specific storms, but also duration-specific storms and storms separated by user-defined inter-event periods. A separate storm database has been created to store the selected output. By storing output within tables in a separate database, we can make use of powerful SQL capabilities to perform flexible pattern analysis. Previous efforts have made use of historic data from limited clusters of irregularly spaced physical gauges. The spatial extent of the observational network has been a limiting factor. The relatively dense distribution of MPE provides a virtual mesh of observations stretched over the landscape. This work combines a unique hydrologic data resource with programming and database analysis to characterize storm depth-area-duration relationships.
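    The inter-event separation rule described above can be sketched in a few lines. The study's extraction runs in SQL over database tables, but the grouping logic reduces to the following (function and threshold names are illustrative):

```python
def extract_storms(hourly, min_dry_hours=6):
    """Group hourly (time_index, depth) records into storms separated by
    dry spells of at least `min_dry_hours` hours. Records are assumed to
    lie on a regular hourly grid; zero depth counts as a dry hour."""
    storms, current, dry = [], [], 0
    for t, depth in hourly:
        if depth > 0:
            current.append((t, depth))
            dry = 0
        else:
            dry += 1
            # Close the current storm once the dry spell is long enough.
            if current and dry >= min_dry_hours:
                storms.append(current)
                current = []
    if current:
        storms.append(current)
    return storms
```

    Changing `min_dry_hours` implements the user-defined inter-event period; filtering the returned storms by length gives duration-specific extraction.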

  8. Combining empirical approaches and error modelling to enhance predictive uncertainty estimation in extrapolation for operational flood forecasting. Tests on flood events on the Loire basin, France.

    NASA Astrophysics Data System (ADS)

    Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles

    2017-04-01

    An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end users' needs (i.e. prove to be useful for efficient crisis management) only if it is reliable: reliability is therefore a key quality for operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models has been selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach based on as few assumptions as possible about the mathematical structure of the forecasting error. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve forecast reliability for major floods, an attempt is made at combining the operational strength of the empirical statistical analysis with simple error modelling. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error modelling is based on the log-sinh transformation, which has been shown to significantly reduce the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012).
Exploratory tests on some operational forecasts issued during the recent floods experienced in France (major spring floods in June 2016 on the Loire river tributaries and flash floods in fall 2016) will be shown and discussed. References: Bourgin, F. (2014). How to assess the predictive uncertainty in hydrological modelling? An exploratory work on a large sample of watersheds, AgroParisTech. Wang, Q. J., Shrestha, D. L., Robertson, D. E. and Pokhrel, P. (2012). A log-sinh transformation for data normalization and variance stabilization. Water Resources Research, W05514, doi:10.1029/2011WR010973
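    The log-sinh transformation of Wang et al. (2012) referenced above has a closed-form inverse, which is what makes it practical for back-transforming predictive distributions to the original flow space. A minimal sketch; in any real application the parameters a and b are fitted, not assumed:

```python
import math

def log_sinh(q, a, b):
    """Wang et al. (2012) transform: z = log(sinh(a + b*q)) / b.
    Compresses large flows so that transformed errors are closer to
    homoscedastic (sinh overflows for very large b*q, so scale inputs)."""
    return math.log(math.sinh(a + b * q)) / b

def inv_log_sinh(z, a, b):
    """Inverse transform: q = (asinh(exp(b*z)) - a) / b."""
    return (math.asinh(math.exp(b * z)) - a) / b
```

    Errors are modelled in the transformed space and quantiles are mapped back through `inv_log_sinh`, which is how the transformation reduces heteroscedasticity without losing interpretability.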

  9. Air Leakage of US Homes: Regression Analysis and Improvements from Retrofit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, Wanyu R.; Joh, Jeffrey; Sherman, Max H.

    2012-08-01

    The LBNL Residential Diagnostics Database (ResDB) contains blower door measurements and other diagnostic test results for homes in the United States. Of these, approximately 134,000 single-family detached homes have sufficient information for the analysis of air leakage in relation to a number of housing characteristics. We performed regression analysis to consider the correlation between normalized leakage and a number of explanatory variables: IECC climate zone, floor area, height, year built, foundation type, duct location, and other characteristics. The regression model explains 68% of the observed variability in normalized leakage. ResDB also contains before and after retrofit air leakage measurements for approximately 23,000 homes that participated in weatherization assistance programs (WAPs) or residential energy efficiency programs. The two types of programs achieve rather similar reductions in normalized leakage: 30% for WAPs and 20% for other energy programs.

  10. Profile of children placed in residential psychiatric program: Association with delinquency, involuntary mental health commitment, and reentry into care.

    PubMed

    Yampolskaya, Svetlana; Mowery, Debra; Dollard, Norín

    2014-05-01

    This study examined characteristics and profiles of youth receiving services in 1 of Florida's Medicaid-funded residential mental health treatment programs--State Inpatient Psychiatric Program (SIPP)--between July 1, 2004, and June 30, 2008 (N=1,432). Latent class analysis (LCA) was used to classify youth, and 3 classes were identified: Children With Multiple Needs, Children With No Caregivers, and Abused Children With Substantial Maltreatment History. The results of LCA showed that Children With Multiple Needs experienced the greatest risk for adverse outcomes. Compared with youth in the other 2 classes, these children were more likely to get readmitted to SIPP, more likely to become involved with the juvenile justice system, and more likely to experience involuntary mental health assessments. Implications of the findings are discussed. PsycINFO Database Record (c) 2014 APA, all rights reserved

  11. The effects of regional insolation differences upon advanced solar thermal electric power plant performance and energy costs

    NASA Technical Reports Server (NTRS)

    Latta, A. F.; Bowyer, J. M.; Fujita, T.; Richter, P. H.

    1980-01-01

    The performance and cost of four 10 MWe advanced solar thermal electric power plants sited in various regions of the continental United States were studied. Each region has different insolation characteristics, which result in varying collector field areas, plant performance, capital costs and energy costs. The regional variation in solar plant performance was assessed in relation to the expected rise in the future cost of residential and commercial electricity supplied by conventional utility power systems in the same regions. A discussion of the regional insolation database is presented along with a description of the solar systems' performance and costs. A range for the forecast cost of conventional electricity by region and nationally over the next several decades is given.

  12. Monitoring of waste disposal in deep geological formations

    NASA Astrophysics Data System (ADS)

    German, V.; Mansurov, V.

    2003-04-01

    This paper applies a kinetic approach to the description of the rock failure process and to microseismic monitoring of waste disposal. On the basis of a two-stage model of the failure process, the capability of forecasting rock fracture is demonstrated. Requirements for the monitoring system, such as real-time registration and processing of data and its precision range, are formulated. A method for delineating failure nuclei in a rock mass is presented. This method is implemented in a software program for forecasting strong seismic events and is based on direct use of the fracture concentration criterion. The method is applied to the database of microseismic events of the North Ural Bauxite Mine. The results of this application, including its efficiency, stability, and ability to forecast rockbursts, are discussed.

  13. Development of web-based services for an ensemble flood forecasting and risk assessment system

    NASA Astrophysics Data System (ADS)

    Yaw Manful, Desmond; He, Yi; Cloke, Hannah; Pappenberger, Florian; Li, Zhijia; Wetterhall, Fredrik; Huang, Yingchun; Hu, Yuzhong

    2010-05-01

    Flooding is a widespread and devastating natural disaster worldwide. Floods that took place in the last decade in China were ranked the worst amongst recorded floods worldwide in terms of the number of human fatalities and economic losses (Munich Re-Insurance). Rapid economic development and population expansion into low-lying flood plains have worsened the situation. Current conventional flood prediction systems in China are neither suited to the perceptible climate variability nor the rapid pace of urbanization sweeping the country. Flood prediction, from short-term (a few hours) to medium-term (a few days), needs to be revisited and adapted to changing socio-economic and hydro-climatic realities. The latest technology requires implementation of multiple numerical weather prediction systems. The availability of twelve global ensemble weather prediction systems through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a good opportunity for an effective state-of-the-art early forecasting system. A prototype of a Novel Flood Early Warning System (NEWS) using the TIGGE database is tested in the Huai River basin in east-central China. It is the first early flood warning system in China that uses the massive TIGGE database cascaded with river catchment models, the Xinanjiang hydrologic model and a 1-D hydraulic model, to predict river discharge and flood inundation. The NEWS algorithm is also designed to provide web-based services to a broad spectrum of end-users. The latter presents challenges as both databases and proprietary codes reside in different locations and converge at dissimilar times. NEWS will thus make use of a ready-to-run grid system that makes distributed computing and data resources available in a seamless and secure way. The ability to run on different operating systems and provide an interface or front end that is accessible to a broad spectrum of end-users is an additional requirement.
The aim is to achieve robust interoperability through strong security and workflow capabilities. A physical network diagram and a work flow scheme of all the models, codes and databases used to achieve the NEWS algorithm are presented. They constitute a first step in the development of a platform for providing real time flood forecasting services on the web to mitigate 21st century weather phenomena.

  14. The longevity of lava dome eruptions

    NASA Astrophysics Data System (ADS)

    Wolpert, Robert L.; Ogburn, Sarah E.; Calder, Eliza S.

    2016-02-01

    Understanding the duration of past, ongoing, and future volcanic eruptions is an important scientific goal and a key societal need. We present a new methodology for forecasting the duration of ongoing and future lava dome eruptions based on a database (DomeHaz) recently compiled by the authors. The database includes duration and composition for 177 such eruptions, with "eruption" defined as the period encompassing individual episodes of dome growth along with associated quiescent periods during which extrusion pauses but unrest continues. In a key finding, we show that probability distributions for dome eruption durations are both heavy tailed and composition dependent. We construct objective Bayesian statistical models featuring heavy-tailed Generalized Pareto distributions with composition-specific parameters to make forecasts about the durations of new and ongoing eruptions that depend on both eruption duration to date and composition. Our Bayesian predictive distributions reflect both uncertainty about model parameter values (epistemic uncertainty) and the natural variability of the geologic processes (aleatoric uncertainty). The results are illustrated by presenting likely trajectories for 14 dome-building eruptions ongoing in 2015. Full representation of the uncertainty is presented for two key eruptions, Soufrière Hills Volcano in Montserrat (10-139 years, median 35 years) and Sinabung, Indonesia (1-17 years, median 4 years). Uncertainties are high but, importantly, quantifiable. This work provides for the first time a quantitative and transferable method and rationale on which to base long-term planning decisions for lava dome-forming volcanoes, with wide potential use and transferability to forecasts of other types of eruptions and other adverse events across the geohazard spectrum.
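    A sketch of the Generalized Pareto calculation that underlies such duration forecasts, using the threshold-stability property of the GPD to condition on the duration observed so far. The shape and scale values used in the test are illustrative, not the composition-specific DomeHaz fits:

```python
def gpd_survival(t, xi, sigma):
    """Survival function of a Generalized Pareto duration model,
    S(t) = (1 + xi*t/sigma)**(-1/xi) for xi != 0 (heavy tail when xi > 0)."""
    return (1.0 + xi * t / sigma) ** (-1.0 / xi)

def median_remaining(d, xi, sigma):
    """Median additional duration of an eruption already running for d,
    via the threshold-stability property: (T - d) | T > d ~ GPD(xi, sigma + xi*d)."""
    scale = sigma + xi * d
    # Solve S(m) = 0.5 for the conditional excess distribution.
    return scale * (2.0 ** xi - 1.0) / xi
```

    For a heavy tail (xi > 0) the conditional scale grows with d, so the longer an eruption has already lasted, the longer its expected remaining duration, which is the qualitative behaviour the forecasts above exhibit.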

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, P.Y.; Wassom, J.S.

    Scientific and technological developments bring unprecedented stress to our environment. Society has to predict the results of potential health risks from technologically based actions that may have serious, far-reaching consequences. The potential for error in making such predictions or assessments is great and multiplies with the increasing size and complexity of the problem being studied. Because of this, the availability and use of reliable data is the key to any successful forecasting effort. Scientific research and development generate new data and information. Much of the scientific data being produced daily is stored in computers for subsequent analysis. This situation provides both an invaluable resource and an enormous challenge. With large amounts of government funds being devoted to health and environmental research programs and with maintenance of our living environment at stake, we must make maximum use of the resulting data to forecast and avert catastrophic effects. The most efficient means of obtaining the data necessary for assessing the health effects of chemicals is to utilize readily available resources, including the toxicology databases and information files developed at ORNL. To make the most efficient use of the data/information that has already been prepared, attention and resources should be directed toward projects that meticulously evaluate the available data/information and create specialized peer-reviewed value-added databases. Such projects include the National Library of Medicine's Hazardous Substances Data Bank and the U.S. Air Force Installation Restoration Toxicology Guide. These and similar value-added toxicology databases were developed at ORNL and are being maintained and updated. These databases and supporting information files, as well as some data evaluation techniques, are discussed in this paper with special focus on how they are used to assess potential health effects of environmental agents.
19 refs., 5 tabs.

  16. Assessing high shares of renewable energies in district heating systems - a case study for the city of Herten

    NASA Astrophysics Data System (ADS)

    Aydemir, Ali; Popovski, Eftim; Bellstädt, Daniel; Fleiter, Tobias; Büchele, Richard

    2017-11-01

    Many earlier studies have assessed the district heating (DH) generation mix without explicitly taking into account future changes in the building stock and heat demand. The approach of this study consists of three steps that combine stock modeling, energy demand forecasting, and simulation of different energy technologies. First, a detailed residential building stock model for Herten is constructed by using remote sensing together with a typology for the German building stock. Second, a bottom-up simulation model is used which calculates the thermal energy demand based on energy-related investments in buildings in order to forecast the thermal demand up to 2050. Third, solar thermal fields in combination with large-scale heat pumps are sized as an alternative to the current coal-fired CHPs. We finally assess the cost of heat and the CO2 reduction for these units for two scenarios which differ with regard to the DH expansion. It can be concluded that up to 2030 and 2050 a substantial reduction in buildings' heat demand due to improved building insulation is expected. The falling heat demand in the DH network substantially reduces the economic feasibility of new RES generation capacity. This reduction might be compensated for by continuously connecting apartment buildings to the DH network until 2050.

  17. Regional demand forecasting and simulation model: user's manual. Task 4, final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parhizgari, A M

    1978-09-25

    The Department of Energy's Regional Demand Forecasting Model (RDFOR) is an econometric and simulation system designed to estimate annual fuel-sector-region specific consumption of energy for the US. Its purposes are to (1) provide the demand side of the Project Independence Evaluation System (PIES), (2) enhance our empirical insights into the structure of US energy demand, and (3) assist policymakers in their decisions on and formulations of various energy policies and/or scenarios. This report provides a self-contained user's manual for interpreting, utilizing, and implementing RDFOR simulation software packages. Chapters I and II present the theoretical structure and the simulation of RDFOR, respectively. Chapter III describes several potential scenarios which are (or have been) utilized in the RDFOR simulations. Chapter IV presents an overview of the complete software package utilized in simulation. Chapter V provides the detailed explanation and documentation of this package. The last chapter describes step-by-step implementation of the simulation package using the two scenarios detailed in Chapter III. The RDFOR model contains 14 fuels: gasoline, electricity, natural gas, distillate and residual fuels, liquid gases, jet fuel, coal, oil, petroleum products, asphalt, petroleum coke, metallurgical coal, and total fuels, spread over residential, commercial, industrial, and transportation sectors.

  18. Decision Support on the Sediments Flushing of Aimorés Dam Using Medium-Range Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Mainardi Fan, Fernando; Schwanenberg, Dirk; Collischonn, Walter; Assis dos Reis, Alberto; Alvarado Montero, Rodolfo; Alencar Siqueira, Vinicius

    2015-04-01

    In the present study we investigate the use of medium-range streamflow forecasts in the Doce River basin (Brazil), at the reservoir of Aimorés Hydro Power Plant (HPP). During daily operations this reservoir acts as a "trap" for the sediments that originate from the upstream basin of the Doce River. This motivates a cleaning process called "pass through" to periodically remove the sediments from the reservoir. The "pass through" or "sediment flushing" process consists of a decrease of the reservoir's water level to a certain flushing level when a determined reservoir inflow threshold is forecasted. Then, the water in the approaching inflow is used to flush the sediments from the reservoir through the spillway and to recover the original reservoir storage. To be triggered, the sediment flushing operation requires an inflow larger than 3000 m³/s in a forecast horizon of 7 days. This lead time of 7 days is far beyond the basin's concentration time (around 2 days), meaning that the forecasts for the pass-through procedure highly depend on Numerical Weather Prediction (NWP) models that generate Quantitative Precipitation Forecasts (QPF). This dependency creates an environment with a high amount of uncertainty for the operator. To support the decision making at Aimorés HPP we developed a fully operational hydrological forecasting system for the basin. The system is capable of generating ensemble streamflow forecast scenarios when driven by QPF data from meteorological Ensemble Prediction Systems (EPS). This approach allows accounting for uncertainties in the NWP at a decision-making level. This system is starting to be used operationally by CEMIG and is the one shown in the present study, including a hindcasting analysis to assess the performance of the system for the specific flushing problem. The QPF data used in the hindcasting study was derived from the TIGGE (THORPEX Interactive Grand Global Ensemble) database.
Among all EPS available on TIGGE, three were selected: ECMWF, GEFS, and CPTEC. As a deterministic reference forecast, we adopt the high-resolution ECMWF forecast for comparison. The experiment consisted of running retrospective forecasts for a full five-year period. To verify the proposed objectives of the study, we use different metrics to evaluate the forecasts: ROC curves, exceedance diagrams, and the Forecast Convergence Score (FCS). The metric results made it possible to understand the benefits of the hydrological ensemble prediction system as a decision-making tool for the HPP operation. The ROC scores indicate that using the lower percentiles of the ensemble scenarios yields a true alarm rate of around 0.5 to 0.8 (depending on the model and on the percentile) for the lead time of seven days, while the false alarm rate is between 0 and 0.3. Those rates were better than the ones resulting from the deterministic reference forecast. Exceedance diagrams and forecast convergence scores indicate that the ensemble scenarios provide an early signal of the threshold crossing. Furthermore, the ensemble forecasts are more consistent between two subsequent forecasts in comparison to the deterministic forecast. The assessment results also give CEMIG more confidence in carrying out and communicating the flushing operation with the stakeholders involved.
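    The ROC rates quoted above come from turning ensemble threshold exceedance into a yes/no alarm and scoring it against observed exceedances over the hindcast period. A minimal sketch; the alarm rule (at least `min_members` members over the threshold) and names are illustrative:

```python
def roc_point(forecast_ensembles, observations, threshold, min_members=1):
    """One ROC point: issue an alarm when at least `min_members` ensemble
    members exceed `threshold`, then score against observed exceedances.
    Returns (hit_rate, false_alarm_rate)."""
    hits = misses = false_alarms = correct_neg = 0
    for members, obs in zip(forecast_ensembles, observations):
        alarm = sum(1 for m in members if m > threshold) >= min_members
        event = obs > threshold
        if alarm and event:
            hits += 1
        elif not alarm and event:
            misses += 1
        elif alarm and not event:
            false_alarms += 1
        else:
            correct_neg += 1
    hit_rate = hits / (hits + misses) if hits + misses else 0.0
    far = false_alarms / (false_alarms + correct_neg) if false_alarms + correct_neg else 0.0
    return hit_rate, far
```

    Sweeping `min_members` from 1 to the ensemble size traces out the ROC curve; low values correspond to the "lower percentiles" alarm rule discussed in the record.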

  19. Improvement of Storm Forecasts Using Gridded Bayesian Linear Regression for Northeast United States

    NASA Astrophysics Data System (ADS)

    Yang, J.; Astitha, M.; Schwartz, C. S.

    2017-12-01

    Bayesian linear regression (BLR) is a post-processing technique in which regression coefficients are derived and used to correct raw forecasts based on pairs of observation-model values. This study presents the development and application of a gridded Bayesian linear regression (GBLR) as a new post-processing technique to improve numerical weather prediction (NWP) of rain and wind storm forecasts over the northeastern United States. Ten controlled variables produced from ten ensemble members of the National Center for Atmospheric Research (NCAR) real-time prediction system are used for a GBLR model. In the GBLR framework, leave-one-storm-out cross-validation is utilized to study the performance of the post-processing technique in a database composed of 92 storms. To estimate the regression coefficients of the GBLR, optimization procedures that minimize the systematic and random error of predicted atmospheric variables (wind speed, precipitation, etc.) are implemented for the modeled-observed pairs of training storms. The regression coefficients calculated for meteorological stations of the National Weather Service are interpolated back to the model domain. An analysis of forecast improvements based on error reductions during the storms will demonstrate the value of the GBLR approach. This presentation will also illustrate how the variances are optimized for the training partition in GBLR and discuss the verification strategy for grid points where no observations are available. The new post-processing technique is successful in improving wind speed and precipitation storm forecasts using past event-based data and has the potential to be implemented in real-time.
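    The conjugate Gaussian case gives a closed-form flavor of the regression step in BLR. The sketch below fits a single coefficient on observation-model pairs and only illustrates the shrinkage idea; it is not the GBLR implementation described above, and the variance values are assumptions:

```python
def blr_coefficient(x, y, noise_var=1.0, prior_var=10.0):
    """Posterior mean and variance of one regression coefficient w in
    y = w*x + noise, under a zero-mean Gaussian prior on w (conjugate
    Bayesian linear regression, known noise variance)."""
    # Posterior precision combines data evidence with the prior.
    precision = sum(xi * xi for xi in x) / noise_var + 1.0 / prior_var
    mean = (sum(xi * yi for xi, yi in zip(x, y)) / noise_var) / precision
    return mean, 1.0 / precision
```

    Unlike ordinary least squares, the posterior mean is shrunk toward the prior mean of zero, and the posterior variance quantifies how much the coefficient can be trusted at sparsely observed grid points.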

  20. Evaluation of TIGGE Ensemble Forecasts of Precipitation in Distinct Climate Regions in Iran

    NASA Astrophysics Data System (ADS)

    Aminyavari, Saleh; Saghafian, Bahram; Delavar, Majid

    2018-04-01

    The application of numerical weather prediction (NWP) products is increasing dramatically. Existing reports indicate that ensemble predictions have better skill than deterministic forecasts. In this study, numerical ensemble precipitation forecasts in the TIGGE database were evaluated using deterministic, dichotomous (yes/no), and probabilistic techniques over Iran for the period 2008-16. Thirteen rain gauges spread over eight homogeneous precipitation regimes were selected for evaluation. The Inverse Distance Weighting and Kriging methods were adopted for interpolation of the prediction values, downscaled to the stations at lead times of one to three days. To enhance the forecast quality, NWP values were post-processed via Bayesian Model Averaging. The results showed that ECMWF had better scores than other products. However, products of all centers underestimated precipitation in high-precipitation regions while overestimating precipitation in other regions. This points to a systematic bias in forecasts and demands application of bias correction techniques. Based on dichotomous evaluation, NCEP did better at most stations, although all centers overpredicted the number of precipitation events. Compared to those of ECMWF and NCEP, UKMO yielded higher scores in mountainous regions, but performed poorly at other selected stations. Furthermore, the evaluations showed that all centers had better skill in wet than in dry seasons. The quality of the post-processed predictions was better than that of the raw predictions. In conclusion, the accuracy of the NWP predictions made by the selected centers could be classified as medium over Iran, while post-processing of predictions is recommended to improve the quality.

  1. USDA Foreign Agricultural Service overview for operational monitoring of current crop conditions and production forecasts.

    NASA Astrophysics Data System (ADS)

    Crutchfield, J.

    2016-12-01

    The presentation will discuss the current status of the International Production Assessment Division of the USDA Foreign Agricultural Service for operational monitoring and forecasting of current crop conditions and anticipated production changes to produce monthly, multi-source consensus reports on global crop conditions, including the use of Earth observations (EO) from satellite and in situ sources. The United States Department of Agriculture (USDA) Foreign Agricultural Service (FAS) International Production Assessment Division (IPAD) deals exclusively with global crop production forecasting and agricultural analysis in support of the USDA World Agricultural Outlook Board (WAOB) lockup process and contributions to the World Agricultural Supply and Demand Estimates (WASDE) report. Analysts are responsible for discrete regions or countries and conduct in-depth long-term research into national agricultural statistics, farming systems, and climatic, environmental, and economic factors affecting crop production. IPAD analysts become highly valued cross-commodity specialists over time, and are routinely sought out for specialized analyses to support governmental studies. IPAD is responsible for grain, oilseed, and cotton analysis on a global basis. IPAD is unique in the tools it uses to analyze crop conditions around the world, including custom weather analysis software and databases, satellite imagery, and value-added image interpretation products. It also incorporates all traditional agricultural intelligence resources into its forecasting program, to make the fullest use of available information in its operational commodity forecasts and analysis. International travel and training play an important role in learning about foreign agricultural production systems and in developing analyst knowledge and capabilities.

  2. High-resolution atmospheric emission inventory of the argentine energy sector. Comparison with edgar global emission database.

    PubMed

    Puliafito, S Enrique; Allende, David G; Castesana, Paula S; Ruggeri, Maria F

    2017-12-01

    This study presents a 2014 high-resolution spatially disaggregated emission inventory (0.025° × 0.025° horizontal resolution) of the main activities in the energy sector in Argentina. The sub-sectors considered are public generation of electricity, oil refineries, cement production, transport (maritime, air, rail and road), residential, and commercial. The following pollutants were included: greenhouse gases (CO2, CH4, N2O), ozone precursors (CO, NOx, VOC) and other specific air quality indicators such as SO2, PM10, and PM2.5. This work could contribute to a better geographical allocation of the pollutant sources through census-based population maps. Considering the sources of greenhouse gas emissions, the total amount is 144 Tg CO2eq, of which the transportation sector emits 57.8 Tg (40%); followed by electricity generation, with 40.9 Tg (28%); residential + commercial, with 31.24 Tg (22%); and cement and refinery production, with 14.3 Tg (10%). This inventory shows that 49% of the total emissions occur in rural areas: 31% in rural areas of medium population density, 13% in intermediate urban areas and 7% in densely populated urban areas. However, if emissions are analyzed by extension (per square km), the largest impact is observed in medium and densely populated urban areas, reaching more than 20.3 Gg/km² of greenhouse gases, 297 Mg/km² of ozone precursor gases and 11.5 Mg/km² of other air quality emissions. A comparison with the EDGAR global emission database shows that, although the total country emissions are similar for several sub-sectors and pollutants, its spatial distribution is not applicable to Argentina. The road transport and residential emissions represented by EDGAR result in an overestimation of emissions in rural areas and an underestimation in urban areas, especially in more densely populated areas.
EDGAR underestimates 60 Gg of methane emissions from road transport sector and fugitive emissions from refining activities.

  3. Residential radon and lung cancer incidence in a Danish cohort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braeuner, Elvira V., E-mail: ole@cancer.dk; Danish Building Research Institute, Aalborg University; Andersen, Claus E.

    High-level occupational radon exposure is an established risk factor for lung cancer. We assessed the long-term association between residential radon and lung cancer risk in a prospective Danish cohort of 57,053 persons recruited during 1993-1997. We followed each cohort member for cancer occurrence until 27 June 2006, identifying 589 lung cancer cases. We traced residential addresses from 1 January 1971 until 27 June 2006 and calculated radon at each of these addresses using information from central databases regarding geology and house construction. Cox proportional hazards models were used to estimate incidence rate ratios (IRR) and 95% confidence intervals (CI) for lung cancer risk associated with residential radon exposure, with and without adjustment for sex, smoking variables, education, socio-economic status, occupation, body mass index, air pollution and consumption of fruit and alcohol. Potential effect modification by sex, traffic-related air pollution and environmental tobacco smoke was assessed. Median estimated radon was 35.8 Bq/m³. The adjusted IRR for lung cancer was 1.04 (95% CI: 0.69-1.56) in association with a 100 Bq/m³ higher radon concentration, and 1.67 (95% CI: 0.69-4.04) among non-smokers. We found no evidence of effect modification. We found a positive association between radon and lung cancer risk consistent with previous studies, but the role of chance cannot be excluded as these associations were not statistically significant. Our results provide valuable information at the low-level radon dose range.
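
    A crude sketch of the headline quantity above: the study fitted Cox proportional hazards models, but an unadjusted incidence rate ratio with a 95% confidence interval can be computed directly from person-time data. All counts below are invented for illustration.

```python
import math

def irr_with_ci(cases_exposed, pyears_exposed, cases_ref, pyears_ref):
    """Incidence rate ratio with a 95% CI (normal approximation on log scale)."""
    irr = (cases_exposed / pyears_exposed) / (cases_ref / pyears_ref)
    se = math.sqrt(1 / cases_exposed + 1 / cases_ref)  # SE of log(IRR)
    lo = irr * math.exp(-1.96 * se)
    hi = irr * math.exp(1.96 * se)
    return irr, lo, hi

irr, lo, hi = irr_with_ci(60, 50_000, 55, 48_000)  # invented counts
print(f"IRR = {irr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    A CI that straddles 1.0, as in the study's adjusted estimates, is why the abstract notes that the role of chance cannot be excluded.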

  4. Oseltamivir in influenza outbreaks in care homes: challenges and benefits of use in the real world.

    PubMed

    Millership, S; Cummins, A

    2015-08-01

    Respiratory virus infections, including influenza, are an important cause of potentially avoidable hospital admissions in the elderly. Although recent reviews have questioned the efficacy of oseltamivir in the prevention of transmission, it has been a central part of the authors' strategy to manage outbreaks in residential homes. The aim was to evaluate the management of respiratory virus infection outbreaks in residential homes, with particular emphasis on the logistics and effectiveness of antiviral prophylaxis with oseltamivir. A descriptive analysis was undertaken from a retrospective survey of records held on a local database for three northern hemisphere influenza seasons from 2010 to 2013. In total, 75 respiratory outbreaks were reported from 590 care homes during the study period; of these, the aetiological agent was confirmed as influenza in 35 outbreaks. Overall attack, hospital admission and death rates for influenza were 29.7%, 5.3% and 3.3%, respectively. A further 10 outbreaks were caused by parainfluenza, human metapneumovirus or respiratory syncytial virus, in combination with each other or rhinovirus, and six outbreaks were caused by rhinovirus alone. No agent was identified for the remaining 24 outbreaks. Early public health involvement can lead to rapid termination of outbreaks of respiratory virus infections in residential homes. Although the use of oseltamivir is expensive, the data suggest that it does have some benefit as prophylaxis in this setting. Trials are needed to determine the most clinically and cost-effective interventions to control outbreaks in residential homes and avoid hospital admissions. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  5. Nutritional status and health care costs for the elderly living in municipal residential homes--an intervention study.

    PubMed

    Lorefält, B; Andersson, A; Wirehn, A B; Wilhelmsson, S

    2011-02-01

    The aim was to study the effect of individualised meals on nutritional status among older people living in municipal residential homes and to compare the results with a control group. An additional aim was to estimate direct health care costs for both groups. The setting comprised six different municipal residential homes in the south-east of Sweden. Older people living in three residential homes constituted the intervention group (n=42) and the rest constituted the control group (n=67). A multifaceted intervention design was used. Based on an interview with staff, a tailored education programme about nutritional care, including both theoretical and practical issues, was delivered to staff in the intervention group. Nutritional status among the elderly was measured by the Mini Nutritional Assessment (MNA), and individualised meals were offered to the residents based on the results of the MNA. Staff in the control group only received education on how to administer the MNA, and the residents followed the usual meal routines. Nutritional status was measured by MNA at baseline and after 3 months. Cost data on health care visits during 2007 were collected from the Cost Per Patient database. Nutritional status improved and body weight increased after 3 months in the intervention group. Primary health care costs constituted about 80% of the total median cost in the intervention group and about 55% in the control group. With improved knowledge the staff could offer the elderly more individualised meals. One of their future challenges is to recognise and assess nutritional status among this group. If malnutrition could be prevented, health care costs could be reduced.

  6. High-order fuzzy time-series based on multi-period adaptation model for forecasting stock markets

    NASA Astrophysics Data System (ADS)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Teoh, Hia-Jong

    2008-02-01

    Stock investors usually make their short-term investment decisions according to recent stock information such as late-breaking market news, technical analysis reports, and price fluctuations. To reflect these short-term factors which impact stock prices, this paper proposes a comprehensive fuzzy time-series model, which factors both linear relationships between recent periods of stock prices and fuzzy logical relationships (nonlinear relationships) mined from the time series into its forecasting processes. In the empirical analysis, the TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) and HSI (Hang Seng Index) are employed as experimental datasets, and four recent fuzzy time-series models, Chen's (1996), Yu's (2005), Cheng's (2006) and Chen's (2007), are used as comparison models. In addition, to compare with a conventional statistical method, the method of least squares is utilized to estimate auto-regressive models over the testing periods within the databases. The performance comparisons indicate that the multi-period adaptation model proposed in this paper can effectively improve the forecasting performance of conventional fuzzy time-series models, which factor only fuzzy logical relationships into their forecasting processes. Both the traditional statistical method and the proposed model reveal that stock price patterns in the Taiwan and Hong Kong stock markets are short-term.
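
    The least-squares baseline mentioned above can be sketched briefly: an AR(1) model y_t = a + b*y_{t-1} fitted by ordinary least squares and used for a one-step-ahead forecast. This is an illustrative simplification, not the paper's exact model order or data.

```python
def fit_ar1(series):
    """Return (a, b) minimizing the sum of (y_t - a - b*y_{t-1})^2."""
    x = series[:-1]          # lagged values y_{t-1}
    y = series[1:]           # current values y_t
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Closed-form simple-regression estimates
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

def forecast_next(series):
    """One-step-ahead AR(1) forecast."""
    a, b = fit_ar1(series)
    return a + b * series[-1]

prices = [100.0, 101.5, 103.0, 102.0, 104.5, 106.0, 105.0, 107.5]  # invented
print(round(forecast_next(prices), 2))
```

    The fuzzy time-series models add to this linear baseline the nonlinear logical relationships mined from the fuzzified series.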

  7. Waterspout Forecasting Method Over the Eastern Adriatic Using a High-Resolution Numerical Weather Model

    NASA Astrophysics Data System (ADS)

    Renko, Tanja; Ivušić, Sarah; Telišman Prtenjak, Maja; Šoljan, Vinko; Horvat, Igor

    2018-03-01

    In this study, a synoptic and mesoscale analysis was performed and Szilagyi's waterspout forecasting method was tested on ten waterspout events in the period 2013-2016. Data regarding waterspout occurrences were collected from weather stations, an online survey at the official website of the National Meteorological and Hydrological Service of Croatia, and eyewitness reports from newspapers and the internet. Synoptic weather conditions were analyzed using surface pressure fields, 500 hPa synoptic charts, SYNOP reports and atmospheric soundings. For all observed waterspout events, a synoptic type was determined using the 500 hPa geopotential height chart. The occurrence of lightning activity was determined from the LINET lightning database, and waterspouts were divided into thunderstorm-related and "fair weather" events. Mesoscale characteristics (with a focus on thermodynamic instability indices) were determined using a high-resolution (500 m grid length) mesoscale numerical weather model, and the model results were compared with the available observations. Because thermodynamic instability indices alone are usually insufficient for forecasting waterspout activity, the performance of the Szilagyi Waterspout Index (SWI) was tested using vertical atmospheric profiles provided by the mesoscale numerical model. The SWI successfully forecasted all waterspout events, even the winter ones. This indicates that Szilagyi's waterspout prognostic method could be used as a valid prognostic tool for the eastern Adriatic.

  8. Development of a mobile app for flash flood alerting and data cataloging

    NASA Astrophysics Data System (ADS)

    Gourley, J. J.; Flamig, Z.; Nguyen, M.

    2016-12-01

    No matter how accurate and specific a flash flood forecast is, there are local nuances within communities, related to the built environment, that often dictate the locations and magnitudes of impacts. These are difficult, if not impossible, to identify, classify, and measure using remote sensing methods. This presentation describes a Thriving Earth Exchange project that is developing a mobile app serving two purposes. First, it will provide detailed forecasts of flash flooding down to the 1-km pixel scale with 10-min updates using the state-of-the-science hydrologic forecasting system called FLASH. The display of model outputs on an app will greatly facilitate their use and can potentially speed first responders' reactions to the specific locations of impending disasters. Second, first responders will have the capability of reporting the geotagged impacts they are witnessing, including those local "trouble spots". Over time, we will catalog the trouble spots for the community so that they can be flagged in future events. If proven effective, the app will then be advertised in other flood-prone communities and the database will be expanded accordingly. In summary, we are engaging local communities to provide information that can inform and improve future forecasts of flash floods, ultimately reducing their impacts and saving lives.

  9. Community-based early warning systems for flood risk mitigation in Nepal

    NASA Astrophysics Data System (ADS)

    Smith, Paul J.; Brown, Sarah; Dugar, Sumit

    2017-03-01

    This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based, physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability, with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal are discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.

  10. He said what? Physiological and cognitive responses to imagining and witnessing outgroup racism.

    PubMed

    Karmali, Francine; Kawakami, Kerry; Page-Gould, Elizabeth

    2017-08-01

    Responses to outgroup racism can have serious implications for the perpetuation of bias, yet research examining this process is rare. The present research investigated self-reported, physiological, and cognitive responses among "experiencers" who witnessed and "forecasters" who imagined a racist comment targeting an outgroup member. Although previous research indicates that experiencers self-reported less distress and chose a racist partner more often than forecasters, the present results explored the possibility that experiencers may actually be distressed in such situations but regulate their initial affective reactions. The results from Experiment 1 demonstrated that participants in both roles showed (a) no activation of the hypothalamic-pituitary-adrenal stress axis (decreased cortisol) and (b) activation of the sympathetic autonomic nervous system (increased skin conductance). However, experiencers but not forecasters displayed a physiological profile indicative of an orienting response (decreased heart rate and increased skin conductance) rather than a defensive response (increased heart rate and increased skin conductance). Furthermore, the results from Experiment 2 provided additional evidence that experiencers are not distressed or regulating their emotional responses. In particular, experiencers showed less cognitive impairment on a Stroop task than forecasters. Together these findings indicate that when people actually encounter outgroup bias, they respond with apathy and do not censure the racist. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  11. Personality and affective forecasting: trait introverts underpredict the hedonic benefits of acting extraverted.

    PubMed

    Zelenski, John M; Whelan, Deanna C; Nealis, Logan J; Besner, Christina M; Santoro, Maya S; Wynn, Jessica E

    2013-06-01

    People report enjoying momentary extraverted behavior, and this does not seem to depend on trait levels of introversion-extraversion. Assuming that introverts desire enjoyment, this finding raises the question of why introverts do not act extraverted more often. This research explored a novel explanation: that trait introverts make an affective forecasting error, underpredicting the hedonic benefits of extraverted behavior. Study 1 (n = 97) found that trait introverts forecast less activated positive and pleasant affect and more negative and self-conscious affect (compared to extraverts) when asked to imagine acting extraverted, but not introverted, across a variety of hypothetical situations. Studies 2-5 (combined n = 495) found similar results using a between-subjects approach and laboratory situations. We replicated findings that people enjoy acting extraverted and that this does not depend on disposition. Accordingly, the personality differences in affective forecasts represent errors. In these studies, introverts tended to be less accurate, particularly by overestimating the negative affect and self-consciousness associated with their extraverted behavior. This may explain why introverts do not act extraverted more often (i.e., they overestimate hedonic costs that do not actually materialize) and has implications for understanding, and potentially trying to change, introverts' characteristically lower levels of happiness. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  12. Decision Model for Forecasting Projected Naval Enlisted Reserve Attainments

    DTIC Science & Technology

    2008-12-01

    Command CM Construction Mechanic CS Culinary Specialist CTA Cryptologic Technician - Administrative CTI Cryptologic Technician - Interpretive...services are utilized to compile databases of active duty and reserve accession and loss Category Arts and Photography Journalist (JO) Photographer’s...MM) Mineman (MN) Torpedoman’s Mate (TM) Food, Restaurant, and Lodging Culinary Specialist (CS) Human Resources Navy Counselor (NC) Personnelman (PN

  13. Initialization of high resolution surface wind simulations using NWS gridded data

    Treesearch

    J. Forthofer; K. Shannon; Bret Butler

    2010-01-01

    WindNinja is a standalone computer model designed to provide the user with simulations of surface wind flow. It is deterministic and steady state. It is currently being modified to allow the user to initialize the flow calculation using National Digital Forecast Database. It essentially allows the user to downscale the coarse scale simulations from meso-scale models to...

  14. Evaluation of the mining techniques in constructing a traditional Chinese-language nursing recording system.

    PubMed

    Liao, Pei-Hung; Chu, William; Chu, Woei-Chyn

    2014-05-01

    In 2009, the Department of Health, part of Taiwan's Executive Yuan, announced the advent of electronic medical records to reduce medical expenses and facilitate the international exchange of medical record information. An information technology platform for nursing records in medical institutions was then quickly established, which improved nursing information systems and electronic databases. The purpose of the present study was to explore the usability of data mining techniques to enhance the completeness and ensure the consistency of nursing records in the database system. First, the study used a Chinese word-segmenting system on common and special terms often used by the nursing staff. We also used text-mining techniques to collect keywords and create a keyword lexicon. We then used association rules and an artificial neural network to measure the correlation and forecasting capability of the keywords. Finally, nursing staff members were provided with an on-screen pop-up menu to use when establishing nursing records. Our study found that by using mining techniques we were able to create a powerful keyword lexicon and establish a forecasting model for nursing diagnoses, ensuring the consistency of nursing terminology and improving the nursing staff's work efficiency and productivity.
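
    The association-rule step can be illustrated with a minimal support/confidence calculation over keyword sets; the record contents below are invented, not taken from the study's lexicon.

```python
def support_confidence(records, a, b):
    """Support and confidence for the rule 'a implies b' over keyword sets."""
    n = len(records)
    both = sum(1 for r in records if a in r and b in r)
    has_a = sum(1 for r in records if a in r)
    support = both / n
    confidence = both / has_a if has_a else 0.0
    return support, confidence

# Each record is the set of keywords extracted from one nursing note (invented).
records = [
    {"pain", "analgesic", "vital signs"},
    {"pain", "analgesic"},
    {"fever", "vital signs"},
    {"pain", "wound care"},
]
s, c = support_confidence(records, "pain", "analgesic")
print(s, c)
```

    High-confidence rules like this are what would drive the pop-up suggestions offered to staff while charting.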

  15. Modeling residential lawn fertilization practices: integrating high resolution remote sensing with socioeconomic data.

    PubMed

    Zhou, Weiqi; Troy, Austin; Grove, Morgan

    2008-05-01

    This article investigates how remotely sensed lawn characteristics, such as parcel lawn area and parcel lawn greenness, combined with household characteristics, can be used to predict household lawn fertilization practices on private residential lands. This study involves two watersheds, Glyndon and Baisman's Run, in Baltimore County, Maryland, USA. Parcel lawn area and lawn greenness were derived from high-resolution aerial imagery using an object-oriented classification approach. Four indicators of household characteristics, including lot size, square footage of the house, housing value, and housing age were obtained from a property database. Residential lawn care survey data combined with remotely sensed parcel lawn area and greenness data were used to estimate two measures of household lawn fertilization practices, household annual fertilizer nitrogen application amount (N_yr) and household annual fertilizer nitrogen application rate (N_ha_yr). Using multiple regression with multi-model inferential procedures, we found that a combination of parcel lawn area and parcel lawn greenness best predicts N_yr, whereas a combination of parcel lawn greenness and lot size best predicts variation in N_ha_yr. Our analyses show that household fertilization practices can be effectively predicted by remotely sensed lawn indices and household characteristics. This has significant implications for urban watershed managers and modelers.
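
    The multi-model comparison described above can be sketched as choosing between candidate predictors by AIC computed from ordinary-least-squares fits. The data values and the AIC penalty convention below are illustrative assumptions, not the study's values.

```python
import math

def ols_rss(x, y):
    """Residual sum of squares for a simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def aic(x, y, k=3):  # k: intercept, slope, error variance
    n = len(y)
    return n * math.log(ols_rss(x, y) / n) + 2 * k

lawn_area = [120, 200, 340, 410, 500]   # m^2, invented
greenness = [0.2, 0.5, 0.3, 0.7, 0.6]   # NDVI-like index, invented
n_applied = [1.1, 1.9, 3.2, 4.0, 4.8]   # kg N/yr, invented

better = "area" if aic(lawn_area, n_applied) < aic(greenness, n_applied) else "greenness"
print(better)
```

    In the study, combinations of predictors were compared rather than single predictors, but the selection principle (lower information criterion wins) is the same.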

  16. Joint physical custody and adolescents' subjective well-being: a personality × environment interaction.

    PubMed

    Sodermans, An Katrien; Matthijs, Koen

    2014-06-01

    Shared residence after divorce is rising in most Western countries and has been recommended by law in Belgium since 2006. Living with both parents after divorce is assumed to increase children's well-being through a better parent-child relationship, but may also be stressful, as children live in 2 different family settings. In this study, we investigate whether the association between the residential arrangement of adolescents and 3 measures of subjective well-being (depressive feelings, life satisfaction, and self-esteem) is moderated by the Big Five personality factors. The sample is selected from the nationally representative Divorce in Flanders study and contains information about 506 children of divorced parents between 14 and 21 years old. Our findings indicated a consistent pattern of interactions between conscientiousness and joint physical custody for 2 of the 3 subjective well-being indicators. The specific demands of this residential arrangement (making frequent transitions, living at 2 places, adjusting to 2 different lifestyles, etc.) may interfere with the nature of conscientious adolescents: being organized, ordered, and planful. Our results showed support for a Person × Environment interaction and demonstrate the need to consider the individual characteristics of the child when settling postdivorce residential arrangements. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  17. Employment and residential characteristics in relation to automated external defibrillator locations

    PubMed Central

    Griffis, Heather M.; Band, Roger A; Ruther, Matthew; Harhay, Michael; Asch, David A.; Hershey, John C.; Hill, Shawndra; Nadkarni, Lindsay; Kilaru, Austin; Branas, Charles C.; Shofer, Frances; Nichol, Graham; Becker, Lance B.; Merchant, Raina M.

    2015-01-01

    Background Survival from out-of-hospital cardiac arrest (OHCA) is generally poor and varies by geography. Variability in automated external defibrillator (AED) locations may be a contributing factor. To inform optimal placement of AEDs, we investigated AED access in a major US city relative to demographic and employment characteristics. Methods and Results This was a retrospective analysis of a Philadelphia AED registry (2,559 total AEDs). The 2010 US Census and the Local Employment Dynamics (LED) database by ZIP code were used. AED access was calculated as the weighted areal percentage of each ZIP code covered by a 400-meter radius around each AED. Of 47 ZIP codes, only 9% (4) were high AED service areas. In 26% (12) of ZIP codes, less than 35% of the area was covered by AED service areas. Higher AED access ZIP codes were more likely to have a moderately populated residential area (p=0.032), higher median household income (p=0.006), and higher paying jobs (p=0.008). Conclusions The locations of AEDs vary across specific ZIP codes; select residential and employment characteristics explain some of the variation. Further work on evaluating OHCA locations, AED use and availability, and OHCA outcomes could inform AED placement policies. Optimizing the placement of AEDs through this work may help to increase survival. PMID:26856232
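
    The "weighted areal percentage" coverage measure can be approximated with a simple Monte Carlo sketch: sample points in an idealized square district and count the fraction falling within 400 m of any AED. The district geometry and AED coordinates are invented; the study used GIS data for real ZIP codes.

```python
import math
import random

def coverage_fraction(aeds, side=2000.0, radius=400.0, n=100_000, seed=1):
    """Fraction of a side x side square within `radius` metres of any AED."""
    rng = random.Random(seed)
    covered = 0
    for _ in range(n):
        x, y = rng.uniform(0, side), rng.uniform(0, side)
        if any(math.hypot(x - ax, y - ay) <= radius for ax, ay in aeds):
            covered += 1
    return covered / n

# Three hypothetical AEDs inside a 2 km x 2 km district
aeds = [(500, 500), (1500, 500), (1000, 1500)]
print(round(coverage_fraction(aeds) * 100, 1), "% covered")
```

    For non-overlapping circles fully inside the square, the estimate converges to the exact value 3·π·400²/2000² ≈ 37.7%.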

  18. Novel approaches for an enhanced geothermal development of residential sites

    NASA Astrophysics Data System (ADS)

    Schelenz, Sophie; Firmbach, Linda; Shao, Haibing; Dietrich, Peter; Vienken, Thomas

    2015-04-01

    An ongoing technological enhancement drives an increasing use of shallow geothermal systems for heating and cooling applications. However, even in areas with intensive shallow geothermal use, planning of geothermal systems is in many cases based solely on geological maps, drilling databases, and literature references. Thus, relevant heat transport parameters are approximated rather than measured for the specific site. To increase planning safety and promote the use of renewable energies in the domestic sector, this study investigates a novel concept for an enhanced geothermal development of residential neighbourhoods. This concept is based on a site-specific characterization of subsurface conditions and the implementation of demand-oriented geothermal usage options. To this end, an investigation approach has been tested that combines non-invasive with minimally invasive exploration methods. While electrical resistivity tomography has been applied to characterize the geological subsurface structure, Direct Push soundings enable a detailed, vertically high-resolution characterization of the subsurface surrounding the borehole heat exchangers. The benefit of this site-specific subsurface investigation is highlighted for 1) a more precise design of shallow geothermal systems and 2) a reliable prediction of induced long-term changes in groundwater temperatures. To guarantee the financial feasibility and practicability of the novel geothermal development, three different options for its implementation in residential neighbourhoods were subsequently derived.

  19. Implicit identification with drug and alcohol use predicts retention in residential rehabilitation programs.

    PubMed

    Wolff, Nathan; von Hippel, Courtney; Brener, Loren; von Hippel, William

    2015-03-01

    Research has identified numerous factors associated with successful treatment in alcohol and drug rehabilitation programs, yet treatment completion rates are often low and subsequent relapse rates very high. We propose that people's implicit identification with drugs and alcohol may be an additional factor that impacts their ability to complete abstinence-based rehabilitation programs. In the current research, we measured implicit identification with drugs and alcohol using the Implicit Association Test (Greenwald, McGhee, & Schwartz, 1998) among 137 members of a residential rehabilitation program for drugs and alcohol (104 men; mean age = 35 years; 47 were court-ordered to attend). Implicit identification with drugs and alcohol was measured within 1 week of arrival and again 3 weeks later, prior to the onset of the treatment phase of the program. Duration in rehabilitation was assessed 1 year later. Consistent with predictions, implicit identification with drugs and alcohol predicted how long people remained in residential rehabilitation, even though a self-report measure of identification with drugs and alcohol did not. These results suggest that implicit identification with drugs and alcohol might be an important predictor of treatment outcomes, even among those with serious drug and alcohol problems. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  20. The potential predictability of fire danger provided by ECMWF forecast

    NASA Astrophysics Data System (ADS)

    Di Giuseppe, Francesca

    2017-04-01

    The European Forest Fire Information System (EFFIS) is currently being developed in the framework of the Copernicus Emergency Management Services to monitor and forecast fire danger in Europe. The system provides timely information to civil protection authorities in 38 nations across Europe and mostly concentrates on flagging regions which might be at high danger of spontaneous ignition due to persistent drought. The daily predictions of fire danger conditions are based on the US Forest Service National Fire Danger Rating System (NFDRS), the Canadian Forest Service Fire Weather Index Rating System (FWI) and the Australian McArthur (MARK-5) rating system. Weather forcings are provided in real time by the European Centre for Medium-range Weather Forecasts (ECMWF) forecasting system. The global system's potential predictability is assessed using re-analysis fields as weather forcings. The Global Fire Emissions Database (GFED4) provides 11 years of observed burned areas from satellite measurements and is used as a validation dataset. The fire indices implemented are good predictors for highlighting dangerous conditions: high values are correlated with observed fires, and low values correspond to unobserved events. A more quantitative skill evaluation was performed using the Extremal Dependency Index, a skill score specifically designed for rare events. It revealed that the three indices were more skilful than a random forecast at detecting large fires on a global scale. The performance peaks in the boreal forests, the Mediterranean, the Amazon rain-forests and southeast Asia. The skill scores were then aggregated at country level to reveal which nations could potentially benefit from the system's information in support of decision making and fire control. Overall we found that fire danger modelling based on weather forecasts can provide reasonable predictability over large parts of the global landmass.
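
    The Extremal Dependency Index mentioned above is computed from a 2x2 contingency table of forecasts versus observed fires; a minimal sketch (with invented counts) follows.

```python
import math

def edi(hits, false_alarms, misses, correct_negatives):
    """Extremal Dependency Index: (ln F - ln H) / (ln F + ln H),
    where H is the hit rate and F the false-alarm rate."""
    h = hits / (hits + misses)                             # hit rate H
    f = false_alarms / (false_alarms + correct_negatives)  # false-alarm rate F
    return (math.log(f) - math.log(h)) / (math.log(f) + math.log(h))

# A skilful rare-event forecast: high hit rate, low false-alarm rate (invented counts).
print(round(edi(40, 20, 10, 930), 3))
```

    EDI approaches 1 for a perfect rare-event forecast and equals 0 when H = F, i.e. when the forecast is no better than random, which is the comparison made in the abstract.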

  1. Using wavelet-feedforward neural networks to improve air pollution forecasting in urban environments.

    PubMed

    Dunea, Daniel; Pohoata, Alin; Iordache, Stefania

    2015-07-01

    The paper presents the screening of various feedforward neural networks (FANN) and wavelet-feedforward neural networks (WFANN) applied to time series of ground-level ozone (O3), nitrogen dioxide (NO2), and particulate matter (PM10 and PM2.5 fractions) recorded at four monitoring stations located in various urban areas of Romania, to identify common configurations with optimal generalization performance. Two distinct model runs were performed: data processing using hourly recorded time series of airborne pollutants during cold months (O3, NO2, and PM10), when residential heating increases local emissions, and data processing using 24-h daily averaged concentrations (PM2.5) recorded between 2009 and 2012. Dataset variability was assessed using statistical analysis. Time series were passed through various FANNs. Each time series was also decomposed into four time-scale components using three-level wavelets; each component was passed through a FANN, and the outputs were recomposed into a single time series. The agreement between observed and modelled output was evaluated based on statistical significance (the r coefficient and the correlation between errors and data). Use of a Daubechies db3 wavelet with an Rprop FANN (6-4-1) gave positive results for the O3 time series, improving on the exclusive use of the FANN for hourly recorded time series. NO2 was difficult to model due to the specificity of its time series, but wavelet integration improved FANN performance. The Daubechies db3 wavelet did not improve the FANN outputs for the PM10 time series. Both models (FANN/WFANN) overestimated the PM2.5 forecasted values in the last quarter of the time series. A potential improvement of the forecasted values could be the integration of a smoothing algorithm to adjust the PM2.5 model outputs.
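
    The decompose-model-recompose idea can be sketched with a simpler transform than the paper's three-level Daubechies db3: a one-level Haar transform, which splits a series into an approximation (smooth) and a detail component and reconstructs the original exactly.

```python
import math

def haar_decompose(x):
    """One-level Haar transform of an even-length series."""
    s = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Invert haar_decompose exactly."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out

series = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0]  # invented sample
a, d = haar_decompose(series)
print(haar_reconstruct(a, d))
```

    In a WFANN, each such component would be forecast by its own network before recomposition; the Haar filter here merely stands in for the db3 filters of the paper.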

  2. Introduction of Drought Monitoring and Forecasting System based on Real-time Water Information Using ICT

    NASA Astrophysics Data System (ADS)

    Lee, Y., II; Kim, H. S.; Chun, G.

    2016-12-01

    There were severe damages, such as restrictions on the water supply, caused by continuous drought from 2014 to 2015 in South Korea. Following this drought event, the government of South Korea decided to establish the National Drought Information Analysis Center at K-water (Korea Water Resources Corporation) and to introduce a national drought monitoring and early warning system to mitigate such damages. Drought indices such as the SPI (Standardized Precipitation Index), PDSI (Palmer Drought Severity Index) and SMI (Soil Moisture Index) have been developed and are widely used to provide drought information in many countries. However, drought indices are not well suited to drought monitoring and early warning in densely populated, highly developed countries such as South Korea because they cannot account for a complicated water supply network. For national drought monitoring and forecasting in South Korea, the 'Drought Information Analysis System' (D.I.A.S), which is based on real-time data (storage, flow rate, water level, etc.), was developed. With this methodology, D.I.A.S is changing the paradigm of drought monitoring and early warning systems: it contains information on the water supply network, from water sources to the people across the nation, and provides drought information that reflects the real-time hydrological conditions of each and every water source. For instance, in case the water level of a specific dam declines to a predetermined level of caution, D.I.A.S will notify the people who use the dam as a source of residential or industrial water. D.I.A.S is thus expected to provide credible drought monitoring and forecasting information that corresponds closely to the conditions actually experienced by water users.
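
    The notification logic described for D.I.A.S might be sketched as a simple threshold rule; the stage names, thresholds, and dam name below are invented for illustration, not taken from the operational system.

```python
# Warning stages ordered from least to most severe; thresholds are fractions
# of reservoir capacity (all values hypothetical).
STAGES = [("normal", 0.50), ("caution", 0.35), ("alert", 0.20), ("severe", 0.0)]

def drought_stage(storage_ratio):
    """Map current storage (fraction of capacity) to a warning stage."""
    for name, threshold in STAGES:
        if storage_ratio >= threshold:
            return name
    return "severe"

def notify(dam, storage_ratio, users):
    """Build notification messages for users supplied by the given dam."""
    stage = drought_stage(storage_ratio)
    if stage != "normal":
        return [f"{dam}: {stage} drought stage - notify {u}" for u in users]
    return []

print(drought_stage(0.42))
print(notify("ExampleDam", 0.18, ["residential", "industrial"]))
```

    The real system applies such checks per water source across the national supply network rather than per dam in isolation.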

  3. Real-time short-term forecast of water inflow into Bureyskaya reservoir

    NASA Astrophysics Data System (ADS)

    Motovilov, Yury

    2017-04-01

    During several recent years, a methodology for operational optimization in hydrosystems, including forecasts of the hydrological situation, has been developed using the Bureya reservoir as an example. Improving the accuracy of forecasts of water inflow into the reservoir for planning the water and energy regime was one of the main goals of the research. The Bureya river is the second largest left tributary of the Amur after the Zeya river, with a 70.7 thousand square kilometer watershed and a 723-km-long river course. A variety of natural conditions, from plains in the southern part to northern mountainous areas, determines a significant spatio-temporal variability in runoff generation patterns and river regime. The Bureyskaya hydropower plant (HPP), with a watershed area of 65.2 thousand square kilometers, is a key station in the Russian Far Eastern energy system, providing its reliable operation. With a spacious reservoir, the Bureyskaya HPP makes a significant contribution to the protection of the Amur region from catastrophic floods. A physically-based distributed model of runoff generation, built on the ECOMAG (ECOlogical Model for Applied Geophysics) hydrological modeling platform, has been developed for the Bureya river basin. The model describes interception of rainfall/snowfall by the canopy, snow accumulation and melt, soil freezing and thawing, water infiltration into unfrozen and frozen soil, evapotranspiration, the thermal and water regime of soil, and overland, subsurface, ground, and river flow. The governing model equations are derived by integrating the basic hydro- and thermodynamic equations of vertical water and heat transfer in snowpack and frozen/unfrozen soil, horizontal water flow under and over catchment slopes, etc. The model setup for the Bureya river basin included watershed and river network schematization with a GIS module by DEM analysis, meteorological time-series preparation, and model calibration and validation against historical observations.
The results showed good model performance compared to observed inflow data into the Bureya reservoir and high diagnostic potential of the data-modeling system of runoff formation. With this system, the following flowchart for short-range forecasting of inflow into the Bureyskaya reservoir, with a forecast correction technique using continuously updated hydrometeorological data, has been developed: (1) daily renewal of the weather observations and forecasts database via the Internet; (2) daily runoff calculation from the beginning of the current year to the current date; (3) generation of a short-range (up to 7 days) forecast based on the weather forecast. The idea underlying the model assimilation of newly obtained hydrometeorological information to adjust short-range hydrological forecasts lies in the assumption that forecast errors are inertial: the difference between calculated and observed streamflow at the forecast release date is "scattered" with specific weights over the calculated streamflow for the forecast lead time. During 2016 this method of forecasting inflow into the Bureyskaya reservoir up to 7 days ahead was tested in online mode, and a satisfactory short-range inflow forecast success rate was obtained. Tests of the developed method have shown strong sensitivity to the results of short-term precipitation forecasts.
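    The error-"scattering" correction described above can be sketched as follows. This is a hedged illustration under stated assumptions: the geometric decay factor, inflow values, and release-date error are all made up, not the operational ECOMAG settings.

```python
# Hedged sketch of the forecast-correction idea: the error between calculated
# and observed streamflow at the release date is spread over the lead time
# with decaying weights, reflecting the assumed inertia of forecast errors.
# The decay factor and all flow values are illustrative.
def correct_forecast(forecast, observed_at_release, calculated_at_release,
                     decay=0.7):
    """Add a geometrically decaying share of the release-date error to each
    lead-time step of the raw model forecast."""
    error = observed_at_release - calculated_at_release
    return [f + error * decay ** lead
            for lead, f in enumerate(forecast, start=1)]

raw_forecast = [520.0, 560.0, 600.0, 640.0, 660.0, 650.0, 630.0]  # m3/s, 7 days
corrected = correct_forecast(raw_forecast,
                             observed_at_release=500.0,
                             calculated_at_release=480.0)
# Day 1 receives 0.7 * 20 = 14 m3/s; the adjustment fades with lead time.
```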

  4. Method of identification of patent trends based on descriptions of technical functions

    NASA Astrophysics Data System (ADS)

    Korobkin, D. M.; Fomenkov, S. A.; Golovanchikov, A. B.

    2018-05-01

    The use of the global patent space to determine the scientific and technological priorities for technical systems development (identifying patent trends) allows one to forecast the direction of technical systems development and, accordingly, to select patents on priority technical subjects as a source for updating the technical functions database and the physical effects database. The authors propose an original method that uses as trend terms not individual unigrams or n-grams (as existing methods and systems usually do), but structured descriptions of technical functions in the form “Subject-Action-Object” (SAO), which in the authors’ opinion are the basis of the invention.
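    The SAO representation can be illustrated with a deliberately tiny toy. This is not the authors' extraction method: real SAO extraction relies on syntactic parsing of patent text, whereas the regex below handles only trivially simple English sentences and exists purely to show the target (subject, action, object) data structure.

```python
# Toy illustration of "Subject-Action-Object" (SAO) triples. A hypothetical
# regex handles only simple "<The/A/An> <subject> <verb>s <object>." sentences;
# real patent-trend systems use full syntactic parsing.
import re

SAO_PATTERN = re.compile(
    r"^(?:The|A|An)\s+(?P<subject>\w+)\s+(?P<action>\w+s)\s+"
    r"(?:the\s+)?(?P<object>[\w\s]+?)\.?$",
    re.IGNORECASE,
)

def extract_sao(sentence):
    """Return (subject, action, object) for a matching sentence, else None."""
    m = SAO_PATTERN.match(sentence.strip())
    return (m.group("subject"), m.group("action"), m.group("object")) if m else None

triples = [extract_sao(s) for s in [
    "The membrane filters the wastewater.",
    "A sensor measures temperature.",
]]
```

    Trend mining then counts and tracks such triples over patent publication dates instead of counting bare n-grams.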

  5. Impact of urban sprawl on United States residential energy use

    NASA Astrophysics Data System (ADS)

    Rong, Fang

    Improving energy efficiency through technological advances has been the focus of U.S. energy policy for decades. However, there is evidence that technology alone will be neither sufficient nor timely enough to solve looming crises associated with fossil fuel dependence and resulting greenhouse gas accumulation. Hence attention is shifting to demand-side measures. While the impact of urban sprawl on transportation energy use has been studied to a degree, the impact of sprawl on non-transport residential energy use represents a new area of inquiry. This dissertation is the first study linking sprawl to residential energy use and provides empirical support for compact land-use developments, which, as a demand-side measure, might play an important role in achieving sustainable residential energy consumption. This dissertation develops an original conceptual framework linking urban sprawl to residential energy use through electricity transmission and distribution losses and two mediators, housing stock and the formation of urban heat islands. These two mediators are the focus of this dissertation.
By tapping multiple databases and performing statistical and geographical spatial analyses, this dissertation finds that (1) big houses consume more energy than small ones, and single-family detached housing consumes more energy than multifamily or single-family attached housing; (2) residents of sprawling metro areas are more likely to live in single-family detached rather than attached or multifamily housing and are also more likely to live in big houses; (3) a compact metro area is expected to have stronger urban heat island effects; (4) nationwide, urban heat island phenomena bring a small energy reward, due to lower demand for space heating, while they impose an energy penalty in states with hot climates like Texas, due to higher demand for cooling; and, taken together, (5) residents of sprawling metro areas are expected to consume more energy at home than residents of compact metro areas. The dissertation concludes with the policy implications that emerged from this study and suggestions for future research.

  6. Disability, Work Absenteeism, Sickness Benefits, and Cancer in Selected European OECD Countries-Forecasts to 2020.

    PubMed

    Jakovljevic, Mihajlo; Malmose-Stapelfeldt, Christina; Milovanovic, Olivera; Rancic, Nemanja; Bokonjic, Dubravko

    2017-01-01

    Disability, whether due to illness, aging, or both, remains an essential contributor shaping European labor markets. The ability of modern-day welfare states to compensate for impaired work ability and absenteeism arising from incapacity varies widely. The aims of this study were to establish and explain intercountry differences among selected European OECD countries and to provide forecasts of future work absenteeism and expenditures on wage replacement benefits. Two major public registries, the European Health for All database and the Organization for Economic Co-operation and Development database (OECD Health Data), were coupled to form a joint database of 12 core indicators related to disability, work absenteeism, and sickness benefits in European OECD countries. The time horizon 1989-2013 was observed. The forecasting analysis was done on the mean values of all data for each variable across all observed countries in a given year. Trends were predicted on a selected time horizon, in our case 7 years up to 2020. For this purpose, an ARIMA prediction model was applied, and its significance was assessed using the Ljung-Box Q test. Our forecasts based on ARIMA modeling indicate that up to 2020, most European countries will experience a decline in absenteeism from work due to illness. The number of citizens receiving social/disability benefits and the number being compensated for health-related absence from work will decline. In contrast to these trends, cancer morbidity may become the top-ranked disability driver as measured by hospital discharge diagnoses. A concerning development is the anticipated strong growth of hospital discharge frequencies due to cancer across the region. This effectively means that part of these savings on social support expenditure will have to be spent to combat rising cancer morbidity as the major driver of disability.
National health systems thus face a clearly growing workload, with clinical oncology acting as the major contributor to disability. At the same time, there are clear signs of declining societal responsibility toward citizens suffering from diverse kinds of incapacity or impaired working ability and independence. Citizens suffering from any of these causes are likely to experience progressively less social support and publicly funded care and work support compared to the golden welfare era of previous decades.
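    The trend-extension step described above can be sketched as follows. This is a hedged, much-simplified stand-in: the study fits ARIMA models with Ljung-Box diagnostics, while this dependency-free sketch fits an AR(1)-with-drift model by ordinary least squares and extends a mean-value series 7 years ahead; the indicator values are synthetic, not from the joint OECD database.

```python
# Hedged sketch of forecasting an annual indicator 7 years ahead. An
# AR(1)-with-drift model fitted by OLS stands in for the study's ARIMA
# models; the declining absenteeism series is synthetic.
def fit_ar1(y):
    """OLS fit of y[t] = c + phi * y[t-1]."""
    x, z = y[:-1], y[1:]
    n = len(x)
    mx, mz = sum(x) / n, sum(z) / n
    phi = sum((xi - mx) * (zi - mz) for xi, zi in zip(x, z)) / \
          sum((xi - mx) ** 2 for xi in x)
    c = mz - phi * mx
    return c, phi

def forecast_ar1(y, steps, c, phi):
    """Iterate the fitted recursion forward from the last observation."""
    out, last = [], y[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# Synthetic absenteeism indicator (days per employee, 2007-2013).
history = [11.8, 11.5, 11.1, 10.8, 10.4, 10.1, 9.7]
c, phi = fit_ar1(history)
forecast = forecast_ar1(history, steps=7, c=c, phi=phi)  # through 2020
```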

  7. Development of Real-time Tsunami Inundation Forecast Using Ocean Bottom Tsunami Networks along the Japan Trench

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Yamamoto, N.; Suzuki, W.; Hirata, K.; Nakamura, H.; Kunugi, T.; Kubo, T.; Maeda, T.

    2015-12-01

    In the 2011 Tohoku earthquake, in which a huge tsunami claimed many lives, the initial tsunami forecast, based on hypocenter information estimated from seismic data on land, was greatly underestimated. Learning from this, NIED is now constructing S-net (Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench), which consists of 150 ocean bottom observatories with seismometers and pressure gauges (tsunamimeters) linked by fiber optic cables. To take full advantage of S-net, we develop a new methodology for real-time tsunami inundation forecasting using ocean bottom observation data and construct a prototype system that implements the developed forecasting method for the Pacific coast of Chiba prefecture (Sotobo area). We employ a database-based approach because inundation is a strongly non-linear phenomenon and its calculation costs are rather heavy. We prepare a tsunami scenario bank in advance by constructing the possible tsunami sources and calculating the tsunami waveforms at S-net stations, coastal tsunami heights, and tsunami inundation on land. To calculate the inundation for the target Sotobo area, we construct a 10-m-mesh precise elevation model with coastal structures. Based on sensitivity analyses, we construct a tsunami scenario bank that efficiently covers the possible tsunami scenarios affecting the Sotobo area. A real-time forecast is carried out by selecting from the scenario bank the several scenarios that best explain the real-time tsunami data observed at S-net. An advantage of our method is that tsunami inundations are estimated directly from the actual tsunami data without any source information, which may have large estimation errors. In addition to the forecast system, we develop Web services, APIs, and smartphone applications, refined through social experiments, to provide real-time tsunami observations and forecast information in an easy-to-understand way that urges people to evacuate.
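    The scenario-selection step of such a database-based forecast can be sketched as follows. This is a hedged toy under stated assumptions: the bank entries, scenario names, station count, and waveforms are all invented stand-ins for the actual S-net scenario bank, and a plain RMS misfit is assumed as the selection criterion.

```python
# Hedged sketch of database-based scenario selection: rank precomputed
# scenarios by RMS misfit between their synthetic waveforms and real-time
# observations. Bank contents and observations are toy values.
import math

def rms_misfit(observed, scenario):
    """RMS difference between observed and scenario waveforms, flattened
    across stations and time samples."""
    pairs = [(o, s) for obs_st, sc_st in zip(observed, scenario)
             for o, s in zip(obs_st, sc_st)]
    return math.sqrt(sum((o - s) ** 2 for o, s in pairs) / len(pairs))

def best_scenarios(observed, bank, k=2):
    """Return the k scenario ids with the smallest misfit."""
    scored = sorted(bank, key=lambda item: rms_misfit(observed, item[1]))
    return [sid for sid, _ in scored[:k]]

# Toy bank: 3 scenarios x 2 stations x 4 time samples (tsunami heights, m).
bank = [
    ("Mw8.0_north", [[0.1, 0.4, 0.9, 0.7], [0.0, 0.2, 0.5, 0.6]]),
    ("Mw8.4_central", [[0.2, 0.8, 1.6, 1.2], [0.1, 0.4, 1.0, 1.1]]),
    ("Mw7.6_south", [[0.0, 0.1, 0.3, 0.2], [0.1, 0.3, 0.4, 0.3]]),
]
observed = [[0.2, 0.7, 1.5, 1.1], [0.1, 0.5, 0.9, 1.0]]
selected = best_scenarios(observed, bank, k=2)
```

    The coastal heights and inundation maps precomputed for the selected scenarios then serve directly as the forecast, with no source inversion in the loop.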

  8. Reconstructing paleoclimate fields using online data assimilation with a linear inverse model

    NASA Astrophysics Data System (ADS)

    Perkins, Walter A.; Hakim, Gregory J.

    2017-05-01

    We examine the skill of a new approach to climate field reconstructions (CFRs) using an online paleoclimate data assimilation (PDA) method. Several recent studies have foregone climate model forecasts during assimilation due to the computational expense of running coupled global climate models (CGCMs) and the relatively low skill of these forecasts on longer timescales. Here we greatly diminish the computational cost by employing an empirical forecast model (linear inverse model, LIM), which has been shown to have skill comparable to CGCMs for forecasting annual-to-decadal surface temperature anomalies. We reconstruct annual-average 2 m air temperature over the instrumental period (1850-2000) using proxy records from the PAGES 2k Consortium Phase 1 database; proxy models for estimating proxy observations are calibrated on GISTEMP surface temperature analyses. We compare results for LIMs calibrated using observational (Berkeley Earth), reanalysis (20th Century Reanalysis), and CMIP5 climate model (CCSM4 and MPI) data relative to a control offline reconstruction method. Generally, we find that the usage of LIM forecasts for online PDA increases reconstruction agreement with the instrumental record for both spatial fields and global mean temperature (GMT). Specifically, the coefficient of efficiency (CE) skill metric for detrended GMT increases by an average of 57 % over the offline benchmark. LIM experiments display a common pattern of skill improvement in the spatial fields over Northern Hemisphere land areas and in the high-latitude North Atlantic-Barents Sea corridor. Experiments for non-CGCM-calibrated LIMs reveal region-specific reductions in spatial skill compared to the offline control, likely due to aspects of the LIM calibration process. Overall, the CGCM-calibrated LIMs have the best performance when considering both spatial fields and GMT. 
A comparison with the persistence forecast experiment suggests that improvements are associated with the linear dynamical constraints of the forecast and not simply persistence of temperature anomalies.
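    The LIM forecast step that makes this online assimilation affordable can be sketched as follows. A LIM estimates a propagator G(tau) = C(tau) C(0)^(-1) from the lag-tau and lag-zero covariances of the anomaly state, then forecasts x(t+tau) = G(tau) x(t). This hedged two-variable toy uses synthetic anomalies generated from a known propagator, not the Berkeley Earth, 20CR, or CMIP5 calibration data named above.

```python
# Hedged two-variable sketch of a linear inverse model (LIM): estimate the
# propagator G = C(1) C(0)^(-1) from synthetic anomaly series driven by a
# known damped matrix, then forecast one step ahead.
import random

def covariance(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n

def lag_cov_matrix(series, lag):
    """2x2 covariance between the state at t+lag (rows) and at t (cols)."""
    lead = [s[lag:] for s in series]
    base = [s[:len(s) - lag] if lag > 0 else s[:] for s in series]
    return [[covariance(lead[i], base[j]) for j in range(2)] for i in range(2)]

def inv2(m):
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det, m[0][0] / det]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

random.seed(0)
true_g = [[0.8, 0.1], [0.0, 0.7]]   # known damped propagator
x = [1.0, -0.5]
series = [[], []]
for _ in range(2000):
    series[0].append(x[0]); series[1].append(x[1])
    x = [true_g[0][0] * x[0] + true_g[0][1] * x[1] + random.gauss(0, 0.1),
         true_g[1][0] * x[0] + true_g[1][1] * x[1] + random.gauss(0, 0.1)]

g_hat = matmul(lag_cov_matrix(series, 1), inv2(lag_cov_matrix(series, 0)))
forecast = [g_hat[0][0] * series[0][-1] + g_hat[0][1] * series[1][-1],
            g_hat[1][0] * series[0][-1] + g_hat[1][1] * series[1][-1]]
```

    With enough samples, `g_hat` approximately recovers `true_g`; in the reconstruction, this cheap matrix multiply replaces a full coupled-model integration at each assimilation step.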

  9. Super Ensemble-based Aviation Turbulence Guidance (SEATG) for Air Traffic Management (ATM)

    NASA Astrophysics Data System (ADS)

    Kim, Jung-Hoon; Chan, William; Sridhar, Banavar; Sharman, Robert

    2014-05-01

    Super Ensemble-based Aviation Turbulence Guidance (SEATG), an ensemble of ten turbulence metrics from time-lagged ensemble members of weather forecast data, is developed using the Weather Research and Forecasting (WRF) model and in-situ eddy dissipation rate (EDR) observations from equipped commercial aircraft over the contiguous United States. SEATG is a sequence of five procedures: weather modeling, calculating turbulence metrics, mapping to the EDR scale, evaluating metrics, and producing the final SEATG forecast. It uses a methodology similar to the operational Graphic Turbulence Guidance (GTG), with three major improvements. First, SEATG uses a higher-resolution (3-km) WRF model to capture cloud-resolving-scale phenomena. Second, SEATG computes turbulence metrics for multiple forecasts that are combined at the same valid time, resulting in a time-lagged ensemble of multiple turbulence metrics. Third, SEATG provides both deterministic and probabilistic turbulence forecasts to take into account weather uncertainties and user demands. The SEATG forecasts match well with observed radar reflectivity along a surface front, as well as with convectively induced turbulence outside the clouds, on 7-8 Sep 2012, and the overall performance skill of deterministic SEATG against the observed EDR data during this period is superior to that of any single turbulence metric. Finally, probabilistic SEATG is used as an example application of turbulence forecasting for air traffic management. In this study, a simple Wind-Optimal Route (WOR) passing through the potential areas of probabilistic SEATG and a Lateral Turbulence Avoidance Route (LTAR) taking the SEATG into account are calculated at z = 35,000 ft (z = 12 km) from Los Angeles to John F. Kennedy international airports.
As a result, the WOR takes a total of 239 minutes, including 16 minutes in SEATG areas of 40% moderate-turbulence potential, while the LTAR takes a total of 252 minutes of travel time, meaning roughly 5% additional fuel would be consumed to entirely avoid the moderate SEATG regions.
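    The probabilistic side of such an ensemble product can be sketched as follows. This is a hedged illustration, not the operational SEATG algorithm: the member values are invented, and the 0.22 threshold is an assumed moderate-turbulence EDR level for illustration only.

```python
# Hedged sketch of a probabilistic ensemble turbulence product: at one grid
# point, the probability of moderate-or-greater turbulence is the fraction of
# ensemble members whose EDR-scaled metric exceeds a threshold. Member values
# and the threshold are illustrative, not operational SEATG settings.
MODERATE_EDR = 0.22  # assumed moderate-turbulence threshold, m^(2/3)/s

def exceedance_probability(members, threshold=MODERATE_EDR):
    """Fraction of ensemble members at or above the threshold."""
    return sum(1 for m in members if m >= threshold) / len(members)

# Ten time-lagged members of an EDR-scaled metric at one grid point.
members = [0.10, 0.18, 0.25, 0.30, 0.12, 0.27, 0.08, 0.23, 0.15, 0.31]
prob_moderate = exceedance_probability(members)

# A deterministic companion product could simply be the ensemble mean.
deterministic = sum(members) / len(members)
```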

  10. Emissions Inventory of Anthropogenic PM2.5 and PM10 in Mega city, Delhi, India for Air Quality Forecasting during CWG- 2010

    NASA Astrophysics Data System (ADS)

    Sahu, S.; Beig, G.; Schultz, M.; Parkhi, N.; Stein, O.

    2012-04-01

    The mega city of Delhi is the second largest urban agglomeration in India, with 16.7 million inhabitants. Delhi has the highest per capita power consumption in India, and the demand has risen by more than 50% during the last decade. Emissions from the commercial, power, domestic, and industrial sectors have strongly increased, causing mounting environmental problems due to air pollution and its adverse impacts on human health. Particulate matter (PM) smaller than 2.5 microns (PM2.5) and 10 microns (PM10) has emerged as the primary pollutant of concern due to its adverse impact on human health. As part of the System of Air quality Forecasting and Research (SAFAR) project, developed for air quality forecasting during the Commonwealth Games (CWG) 2010, a high-resolution Emission Inventory (EI) of PM10 and PM2.5 has been developed for the metropolitan city of Delhi for the year 2010. The comprehensive inventory involves detailed activity data and has been developed for a domain of 70 km x 65 km at a 1.67 km x 1.67 km resolution covering Delhi and its surrounding region (i.e., the National Capital Region (NCR)). In creating this inventory, Geographical Information System (GIS) based techniques were used for the first time in India. The major sectors considered are transport, thermal power plants, industries, and residential and commercial cooking, along with windblown road dust, which is found to play a major role in the megacity environment. Extensive surveys were conducted among Delhi slum dwellers (Jhuggi) in order to obtain more robust estimates of the activity data related to domestic cooking and heating. Total emissions of PM10 and PM2.5, including windblown dust, over the study area are found to be 236 Gg/yr and 94 Gg/yr, respectively. About half of the PM10 emissions stem from windblown road dust.
The new emission inventory has been used for regional air quality forecasts in the Delhi region during the Commonwealth Games (SAFAR project), and it will soon be tested in simulations of global atmospheric composition in the framework of the European MACC project, which provided the chemical boundary conditions for the regional air quality forecasts in 2010.

  11. MesoNAM Verification Phase II

    NASA Technical Reports Server (NTRS)

    Watson, Leela R.

    2011-01-01

    The 45th Weather Squadron Launch Weather Officers use the 12-km resolution North American Mesoscale model (MesoNAM) forecasts to support launch weather operations. In Phase I, the performance of the model at KSC/CCAFS was measured objectively by conducting a detailed statistical analysis of model output compared to observed values. The objective analysis compared the MesoNAM forecast winds, temperature, and dew point to the observed values from the sensors in the KSC/CCAFS wind tower network. In Phase II, the AMU modified the current tool by adding an additional 15 months of model output to the database and recalculating the verification statistics. The bias, standard deviation of the bias, root mean square error (RMSE), and a hypothesis test for bias were calculated to verify the performance of the model. The results indicated that accuracy decreased as the forecast progressed; that there was a diurnal signal in temperature, with a cool bias during the late night and a warm bias during the afternoon; and that there was a diurnal signal in dew point temperature, with a low bias during the afternoon and a high bias during the late night.
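    The point-verification statistics named above can be sketched as follows. This is a hedged illustration: the forecast/observation temperature pairs are made up, not actual MesoNAM output or KSC/CCAFS tower data.

```python
# Hedged sketch of point verification: bias, standard deviation of the bias,
# and RMSE of paired forecasts and observations. All numbers are invented.
import math

def verification_stats(forecasts, observations):
    """Return (bias, sd_bias, rmse) for paired forecast/observation values."""
    errors = [f - o for f, o in zip(forecasts, observations)]
    n = len(errors)
    bias = sum(errors) / n
    sd_bias = math.sqrt(sum((e - bias) ** 2 for e in errors) / (n - 1))
    rmse = math.sqrt(sum(e ** 2 for e in errors) / n)
    return bias, sd_bias, rmse

forecasts = [25.1, 26.4, 27.9, 28.3, 27.0, 25.6]     # deg C
observations = [24.8, 26.0, 27.1, 28.5, 26.2, 25.1]
bias, sd_bias, rmse = verification_stats(forecasts, observations)
# A positive bias on this sample would indicate the model runs warm.
```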

  12. Surface drift prediction in the Adriatic Sea using hyper-ensemble statistics on atmospheric, ocean and wave models: Uncertainties and probability distribution areas

    USGS Publications Warehouse

    Rixen, M.; Ferreira-Coelho, E.; Signell, R.

    2008-01-01

    Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12- to 72-h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
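    The core hyper-ensemble idea, learning an optimal local combination of model predictions from a recent training window of drifter observations, can be sketched as follows. This is a hedged one-dimensional toy: two "models" and a scalar drift component with invented values, whereas the real method combines ocean, atmosphere, and wave fields in two dimensions.

```python
# Hedged sketch of a hyper-ensemble: fit least-squares weights combining two
# model predictions against observed drift over a training window, then apply
# the weights to the next forecast step. All values are invented.
def fit_weights(model_a, model_b, observed):
    """Least-squares weights (wa, wb) minimizing |wa*a + wb*b - obs|^2,
    via the 2x2 normal equations."""
    saa = sum(a * a for a in model_a)
    sbb = sum(b * b for b in model_b)
    sab = sum(a * b for a, b in zip(model_a, model_b))
    sao = sum(a * o for a, o in zip(model_a, observed))
    sbo = sum(b * o for b, o in zip(model_b, observed))
    det = saa * sbb - sab * sab
    return (sbb * sao - sab * sbo) / det, (saa * sbo - sab * sao) / det

# Training window: observed drift tracks model A more closely than model B.
model_a = [0.10, 0.12, 0.15, 0.13, 0.11]   # m/s, eastward drift component
model_b = [0.20, 0.25, 0.28, 0.24, 0.22]
observed = [0.11, 0.13, 0.16, 0.14, 0.12]
wa, wb = fit_weights(model_a, model_b, observed)
combined_next = wa * 0.14 + wb * 0.23  # hyper-ensemble forecast, next step
```

    By construction, the fitted combination can do no worse on the training window than either model alone, which is the local bias-removal property the abstract refers to.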

  13. Nowcasting and forecasting of lightning activity: the Talos project.

    NASA Astrophysics Data System (ADS)

    Lagouvardos, Kostas; Kotroni, Vassiliki; Kazadzis, Stelios; Giannaros, Theodore; Karagiannidis, Athanassios; Galanaki, Elissavet; Proestakis, Emmanouil

    2015-04-01

    The Thunder And Lightning Observing System (TALOS) is a research program funded by the Greek Ministry of Education with the aim of promoting excellence in the field of lightning meteorology. The study focuses on exploiting the real-time observations provided by the ZEUS lightning detection system, operated by the National Observatory of Athens since 2005, as well as the 10-year database of the same system. More precisely, the main research issues explored are: lightning climatology over the Mediterranean, focusing on the spatial and temporal distribution of lightning, its relation to topographic features and instability, and the importance of aerosols in lightning initiation and enhancement; nowcasting of lightning activity over Greece, with emphasis on the operational aspects of this endeavour, using a tool based on lightning data complemented by high-time-resolution METEOSAT imagery; forecasting of lightning activity over Greece based on the WRF numerical weather prediction model; and assimilation of lightning data with the aim of improving the model's precipitation forecast skill. This presentation highlights the main findings for each of these issues.

  14. Assessment of municipal solid waste generation and recyclable materials potential in Kuala Lumpur, Malaysia.

    PubMed

    Saeed, Mohamed Osman; Hassan, Mohd Nasir; Mujeebu, M Abdul

    2009-07-01

    This paper presents a forecasting study of the municipal solid waste generation (MSWG) rate and the potential of its recyclable components in Kuala Lumpur (KL), the capital city of Malaysia. The generation rates and composition of solid wastes of various classes, such as street cleansing, landscape and garden, industrial and constructional, institutional, residential, and commercial, are analyzed. Past and present trends are studied and extrapolated for the coming years using a Microsoft Office 2003 Excel spreadsheet, assuming linear behavior. The study shows that the increase in solid waste generation in KL is alarming. For instance, daily residential SWG is found to be about 1.62 kg/capita, against a national average of 0.8-0.9 kg/capita, and is expected to increase linearly, reaching 2.23 kg/capita by 2024. This figure seems reasonable for an urban developing area like KL. It is also found that food (organic) waste is the major recyclable component, followed by mixed paper and mixed plastics. Alongside the estimated population growth and its business activities, it is observed that the city is still lacking in terms of efficient waste treatment technology, sufficient funding, public awareness, and maintenance of the established norms of industrial waste treatment. Hence it is recommended that the responsible authority (DBKL) address this issue seriously.
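    The linear-extrapolation step described above (done in a spreadsheet in the study) can be sketched as follows. This is a hedged illustration: the yearly per-capita figures below are synthetic, not the surveyed KL data, so the resulting 2024 projection is illustrative only.

```python
# Hedged sketch of linear trend extrapolation, as done in the study's Excel
# workflow: fit a least-squares line to per-capita daily waste generation and
# extend it forward. The yearly values are synthetic.
def fit_line(years, values):
    """Ordinary least squares fit; returns (slope, intercept)."""
    n = len(years)
    mx, my = sum(years) / n, sum(values) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(years, values)) / \
            sum((x - mx) ** 2 for x in years)
    return slope, my - slope * mx

years = [2004, 2005, 2006, 2007, 2008]
kg_per_capita = [1.42, 1.47, 1.51, 1.57, 1.62]   # illustrative daily SWG
slope, intercept = fit_line(years, kg_per_capita)
projected_2024 = slope * 2024 + intercept
# With this synthetic trend (~0.05 kg/capita per year), the 2024 projection
# exceeds 2 kg/capita, mirroring the kind of growth the study reports.
```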

  15. Forecast of Antarctic Sea Ice and Meteorological Fields

    NASA Astrophysics Data System (ADS)

    Barreira, S.; Orquera, F.

    2017-12-01

    Since 2001, we have been forecasting the climatic fields of Antarctic sea ice (SI) and the surface air temperature, surface pressure, and precipitation anomalies for the Southern Hemisphere at the Meteorological Department of the Argentine Naval Hydrographic Service, with techniques that have evolved over the years. The forecast is based on the results of Principal Components Analysis applied, on one hand, to SI series (S-Mode), which yields patterns of temporal series with validity areas (these series determine which areas of Antarctica will have positive or negative SI anomalies based on what happens in the atmosphere) and, on the other hand, to SI fields (T-Mode), which yields the form of the SI field anomalies based on a classification of 16 patterns. Each T-Mode pattern has unique atmospheric fields associated with it. Therefore, it is possible to forecast whichever atmospheric variable we choose for the Southern Hemisphere. When the forecast is produced, each pattern has a probability of occurrence, and sometimes it is necessary to compose more than one of them to obtain the final result. S-Mode and T-Mode are updated monthly with new data; for that reason the forecasts have improved as cases have accumulated since 2001. We used the Monthly Polar Gridded Sea Ice Concentrations database derived from satellite information generated by the NASA Team algorithm, provided monthly by the National Snow and Ice Data Center of the USA, which begins in November 1978. Recently, we have been experimenting with a multilayer perceptron (neural network) with supervised learning and a back-propagation algorithm to improve the forecast. The perceptron is the most common artificial neural network topology dedicated to image pattern recognition. It was implemented using temperature and pressure anomaly field images associated with the different sea ice anomaly patterns.
The variables analyzed included only composites of surface air temperature and pressure anomalies, to reduce the density of the input data and avoid a non-converging solution. The sea ice and atmospheric variable forecasts can be checked every month at our web page http://www.hidro.gob.ar/smara/sb/sb.asp and at the World Meteorological Organization (Global Cryosphere Watch) page http://globalcryospherewatch.org/state_of_cryo/seaice/.
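    The Principal Components Analysis underlying the S-mode/T-mode classification can be sketched as follows. This is a hedged toy: the leading eigenvector (spatial pattern) of an anomaly covariance matrix is found by power iteration on a three-point "grid" of invented sea ice anomalies, not the NASA Team gridded data.

```python
# Hedged sketch of the PCA step: power iteration for the leading eigenvector
# of the sample covariance of anomaly fields. The 3-point grid and anomaly
# samples are toy values.
import math

def leading_eof(samples, iters=200):
    """Power iteration for the first eigenvector of the sample covariance."""
    npts = len(samples[0])
    means = [sum(s[j] for s in samples) / len(samples) for j in range(npts)]
    anoms = [[s[j] - means[j] for j in range(npts)] for s in samples]
    cov = [[sum(a[i] * a[j] for a in anoms) / (len(anoms) - 1)
            for j in range(npts)] for i in range(npts)]
    v = [1.0] * npts
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(npts)) for i in range(npts)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Monthly sea ice concentration anomalies at three grid points; points 0 and 1
# co-vary positively while point 2 opposes them (a dipole-like pattern).
samples = [
    [ 0.8,  0.7, -0.6],
    [-0.5, -0.6,  0.5],
    [ 0.3,  0.4, -0.2],
    [-0.9, -0.8,  0.7],
    [ 0.4,  0.3, -0.4],
]
pattern = leading_eof(samples)
```

    In the real system, such leading patterns (and their projection time series) are what get classified into the 16 T-mode types with associated atmospheric fields.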

  16. Prediction models for CO2 emission in Malaysia using best subsets regression and multi-linear regression

    NASA Astrophysics Data System (ADS)

    Tan, C. H.; Matjafri, M. Z.; Lim, H. S.

    2015-10-01

    This paper presents prediction models that analyze and compute CO2 emissions in Malaysia. Each prediction model for CO2 emissions is analyzed for one of three main groups: transportation; electricity and heat production; and residential buildings together with commercial and public services. The prediction models were generated using data obtained from World Bank Open Data. The best-subsets method is used to remove irrelevant predictors, followed by multiple linear regression to produce the prediction models. From the results, high R-square (prediction) values were obtained, implying that the models can reliably predict CO2 emissions from these data. In addition, the CO2 emissions from the three groups are forecasted using trend analysis plots for observation purposes.
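    The best-subsets plus multiple-linear-regression workflow can be sketched as follows. This is a hedged illustration under stated assumptions: every subset of candidate predictors is fitted by ordinary least squares (normal equations solved by Gaussian elimination) and scored by R-square, and the three predictor series and response are synthetic stand-ins for the World Bank series used in the paper.

```python
# Hedged sketch of best-subsets selection with multiple linear regression:
# fit every predictor subset by OLS and keep the subset with the highest
# R-square. Data are synthetic; the response ignores the "other" series.
from itertools import combinations

def solve(a, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (m[i][n] - sum(m[i][j] * x[j] for j in range(i + 1, n))) / m[i][i]
    return x

def r_squared(cols, y):
    """OLS fit of y on an intercept plus the given columns; returns R^2."""
    rows = [[1.0] + [c[i] for c in cols] for i in range(len(y))]
    k = len(rows[0])
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    atb = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    beta = solve(ata, atb)
    fit = [sum(b * v for b, v in zip(beta, r)) for r in rows]
    my = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, fit))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Synthetic data: the response depends on transport and electricity only.
transport   = [10, 12, 14, 15, 17, 19, 21, 24]
electricity = [30, 31, 33, 36, 38, 41, 43, 46]
other       = [5, 2, 9, 1, 7, 3, 8, 4]
y = [t * 2 + e * 1.5 + 100 for t, e in zip(transport, electricity)]

predictors = {"transport": transport, "electricity": electricity,
              "other": other}
best = max(
    (subset for r in range(1, 4) for subset in combinations(predictors, r)),
    key=lambda subset: r_squared([predictors[n] for n in subset], y),
)
```

    In practice an information criterion or adjusted R-square would be used to penalize larger subsets; plain R-square is kept here for brevity.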

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Youssef, Tarek A.; El Hariri, Mohamad; Elsayed, Ahmed T.

    The smart grid is seen as a power system with real-time communication and control capabilities between the consumer and the utility. This modern platform facilitates the optimization of energy usage based on several factors, including environmental concerns, price preferences, and system technical issues. In this paper a real-time energy management system (EMS) for microgrids or nanogrids was developed. The developed system involves an online optimization scheme that adapts its parameters based on previous, current, and forecasted future system states. The communication requirements for all EMS modules were analyzed, and all modules are integrated over a data distribution service (DDS) Ethernet network with appropriate quality of service (QoS) profiles. In conclusion, the developed EMS was emulated with actual residential energy consumption and irradiance data from Miami, Florida, and proved its effectiveness in reducing consumers’ bills and achieving flat peak load profiles.

  18. Managing time-substitutable electricity usage using dynamic controls

    DOEpatents

    Ghosh, Soumyadip; Hosking, Jonathan R.; Natarajan, Ramesh; Subramaniam, Shivaram; Zhang, Xiaoxuan

    2017-02-07

    A predictive-control approach allows an electricity provider to monitor and proactively manage peak and off-peak residential intra-day electricity usage in an emerging smart energy grid using time-dependent dynamic pricing incentives. The daily load is modeled as time-shifted, but cost-differentiated and substitutable, copies of the continuously-consumed electricity resource, and a consumer-choice prediction model is constructed to forecast the corresponding intra-day shares of total daily load according to this model. This is embedded within an optimization framework for managing the daily electricity usage. A series of transformations are employed, including the reformulation-linearization technique (RLT) to obtain a Mixed-Integer Programming (MIP) model representation of the resulting nonlinear optimization problem. In addition, various regulatory and pricing constraints are incorporated in conjunction with the specified profit and capacity utilization objectives.
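    The consumer-choice share idea described above can be sketched as follows. This is a hedged toy, not the patented model: intra-day load shares are represented by a multinomial-logit response to time-dependent prices, and the utility values and price sensitivity are illustrative assumptions.

```python
# Hedged sketch of a consumer-choice share model: logit shares of daily load
# across time periods respond to time-dependent prices, so a peak surcharge
# shifts predicted load off-peak. All parameters are illustrative.
import math

def predicted_shares(base_utils, prices, price_sensitivity=0.8):
    """Multinomial-logit shares over periods: u_t = base_t - beta * price_t."""
    utils = [u - price_sensitivity * p for u, p in zip(base_utils, prices)]
    exps = [math.exp(u) for u in utils]
    total = sum(exps)
    return [e / total for e in exps]

base_utils = [1.0, 2.0, 1.2]        # morning, evening peak, night preference
flat = predicted_shares(base_utils, prices=[1.0, 1.0, 1.0])
peaky = predicted_shares(base_utils, prices=[1.0, 2.0, 0.8])  # peak surcharge

daily_load = 30.0  # kWh consumed over the day
peak_load_flat = flat[1] * daily_load
peak_load_dynamic = peaky[1] * daily_load
```

    An optimizer in the spirit of the patent would search over the price vector, subject to regulatory and capacity constraints, to shape the resulting share profile.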

  19. Managing time-substitutable electricity usage using dynamic controls

    DOEpatents

    Ghosh, Soumyadip; Hosking, Jonathan R.; Natarajan, Ramesh; Subramaniam, Shivaram; Zhang, Xiaoxuan

    2017-02-21

    A predictive-control approach allows an electricity provider to monitor and proactively manage peak and off-peak residential intra-day electricity usage in an emerging smart energy grid using time-dependent dynamic pricing incentives. The daily load is modeled as time-shifted, but cost-differentiated and substitutable, copies of the continuously-consumed electricity resource, and a consumer-choice prediction model is constructed to forecast the corresponding intra-day shares of total daily load according to this model. This is embedded within an optimization framework for managing the daily electricity usage. A series of transformations are employed, including the reformulation-linearization technique (RLT) to obtain a Mixed-Integer Programming (MIP) model representation of the resulting nonlinear optimization problem. In addition, various regulatory and pricing constraints are incorporated in conjunction with the specified profit and capacity utilization objectives.

  20. Remote sensing for urban planning

    NASA Technical Reports Server (NTRS)

    Davis, Bruce A.; Schmidt, Nicholas; Jensen, John R.; Cowen, Dave J.; Halls, Joanne; Narumalani, Sunil; Burgess, Bryan

    1994-01-01

    Utility companies are challenged to provide services to a highly dynamic customer base. With factory closures and shifts in employment becoming a routine occurrence, the utility industry must develop new techniques to maintain records and plan for expected growth. BellSouth Telecommunications, the largest of the Bell telephone companies, currently serves over 13 million residences and 2 million commercial customers. Tracking the movement of customers and scheduling the delivery of service are major tasks for BellSouth that require intensive manpower and sophisticated information management techniques. Through NASA's Commercial Remote Sensing Program Office, BellSouth is investigating the utility of remote sensing and geographic information system techniques to forecast residential development. This paper highlights the initial results of this project, which indicate a high correlation between the U.S. Bureau of Census block group statistics and statistics derived from remote sensing data.

  1. A distributed big data storage and data mining framework for solar-generated electricity quantity forecasting

    NASA Astrophysics Data System (ADS)

    Wang, Jianzong; Chen, Yanjun; Hua, Rui; Wang, Peng; Fu, Jia

    2012-02-01

    Photovoltaics is a method of generating electrical power by converting solar radiation into direct-current electricity using semiconductors that exhibit the photovoltaic effect. Photovoltaic power generation employs solar panels composed of a number of solar cells containing a photovoltaic material. Due to the growing demand for renewable energy sources, the manufacturing of solar cells and photovoltaic arrays has advanced considerably in recent years. Solar photovoltaics grew rapidly, albeit from a small base, to a total global capacity of 40,000 MW at the end of 2010, and more than 100 countries use solar photovoltaics. Driven by advances in technology and increases in manufacturing scale and sophistication, the cost of photovoltaics has declined steadily since the first solar cells were manufactured. Net metering and financial incentives, such as preferential feed-in tariffs for solar-generated electricity, have supported solar photovoltaic installations in many countries. However, the power generated by solar photovoltaics is strongly affected by the weather and other natural factors. Accurately predicting photovoltaic energy is important for intelligent power dispatch, in order to reduce energy dissipation and maintain the security of the power grid. In this paper, we propose a big data system, the Solar Photovoltaic Power Forecasting System (SPPFS), to calculate and predict the power output according to real-time conditions. In this system, we utilize a distributed mixed database to speed up the collection, storage, and analysis of the meteorological data. To improve the accuracy of power prediction, a neural network algorithm is incorporated into SPPFS. Extensive experiments show that the framework provides high forecast accuracy (error rate below 15%) and low computing latency by deploying the mixed distributed database architecture for solar-generated electricity.
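    The abstract does not specify SPPFS's network architecture, so as a stand-in the sketch below fits a minimal least-squares irradiance-to-power model on synthetic data and evaluates it against the same kind of error-rate target. Every number here is invented for illustration.

```python
import random

def fit_linear(x, y):
    """Ordinary least squares for power ~ a*irradiance + b (stdlib only)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

random.seed(0)
irradiance = [random.uniform(100, 1000) for _ in range(200)]  # W/m^2
power = [0.15 * g + random.gauss(0, 5) for g in irradiance]   # kW, noisy

a, b = fit_linear(irradiance, power)
predictions = [a * g + b for g in irradiance]
# Mean absolute percentage error, the "error rate" figure of merit.
mape = sum(abs(p - y) / abs(y) for p, y in zip(predictions, power)) / len(power)
```

    A real forecaster would replace the linear fit with a trained network and feed it live meteorological covariates from the distributed database.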

  2. SUVI Thematic Maps: A new tool for space weather forecasting

    NASA Astrophysics Data System (ADS)

    Hughes, J. M.; Seaton, D. B.; Darnel, J.

    2017-12-01

    The new Solar Ultraviolet Imager (SUVI) instruments aboard NOAA's GOES-R series satellites collect continuous, high-quality imagery of the Sun in six wavelengths. SUVI imagers produce at least one image every 10 seconds, or 8,640 images per day, considerably more data than observers can digest in real time. Over the projected 20-year lifetime of the four GOES-R series spacecraft, SUVI will provide critical imagery for space weather forecasters and produce an extensive but unwieldy archive. In order to condense the database into a dynamic and searchable form, we have developed solar thematic maps: maps of the Sun with key features, such as coronal holes, flares, bright regions, quiet corona, and filaments, identified. Thematic maps will be used in NOAA's Space Weather Prediction Center to improve forecaster response time to solar events and to generate several derivative products. Likewise, scientists use thematic maps to find observations of interest more easily. Using an expert-trained, naive Bayesian classifier to label each pixel, we create thematic maps in real time. We created software to collect expert classifications of solar features based on SUVI images. Using this software, we compiled a database of expert classifications, from which we could characterize the distribution of pixels associated with each theme. Given new images, the classifier assigns each pixel the most appropriate label according to the trained distribution. Here we describe the software to collect expert training and the successes and limitations of the classifier. The algorithm identifies coronal holes well but fails to consistently detect filaments and prominences. We compare the Bayesian classifier to an artificial neural network, one of our attempts to overcome these limitations. The results are very promising and encourage future research into an ensemble classification approach.
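    A pared-down version of the pixel-labeling step: fit per-theme Gaussians from expert-labeled pixel values, then assign each new pixel its maximum-likelihood theme. The theme names and intensity values below are placeholders, and real SUVI pixels are multi-channel.

```python
import math

def train(labeled):
    """Per-theme Gaussian statistics from expert-labeled pixel intensities.
    `labeled` maps theme name -> list of pixel values (one channel here)."""
    stats = {}
    for theme, vals in labeled.items():
        n = len(vals)
        mu = sum(vals) / n
        var = sum((v - mu) ** 2 for v in vals) / n or 1e-6  # avoid zero variance
        stats[theme] = (mu, var)
    return stats

def classify(stats, pixel):
    """Label a pixel with the theme of highest Gaussian log-likelihood
    (a flat prior over themes is assumed)."""
    def loglik(mu, var):
        return -0.5 * math.log(2 * math.pi * var) - (pixel - mu) ** 2 / (2 * var)
    return max(stats, key=lambda t: loglik(*stats[t]))

expert = {"coronal_hole": [5, 8, 6, 7], "bright_region": [200, 220, 210, 190]}
model = train(expert)
```

    Dark pixels then map to the coronal-hole theme, e.g. `classify(model, 10)`.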

  3. Forecasting Medicaid Expenditures for Antipsychotic Medications.

    PubMed

    Slade, Eric P; Simoni-Wastila, Linda

    2015-07-01

    The ongoing transition from use of mostly branded to mostly generic second-generation antipsychotic medications could bring about a substantial reduction in Medicaid expenditures for antipsychotic medications, a change with critical implications for formulary restrictions on second-generation antipsychotics in Medicaid. This study provided a forecast of the impact of generics on Medicaid expenditures for antipsychotic medications. Quarterly (N=816) state-level aggregate data on outpatient antipsychotic prescriptions in Medicaid between 2008 and 2011 were drawn from the Medicaid state drug utilization database. Annual numbers of prescriptions, expenditures, and cost per prescription were constructed for each antipsychotic medication. Forecasts of antipsychotic expenditures in calendar years 2016 and 2019 were developed on the basis of the estimated percentage reduction in Medicaid expenditures for risperidone, the only second-generation antipsychotic available generically throughout the study period. Two models of savings from generic risperidone use were estimated, one based on constant risperidone prices and the other based on variable risperidone prices. The sensitivity of the expenditure forecast to expected changes in Medicaid enrollment was also examined. In the main model, annual Medicaid expenditures for antipsychotics were forecasted to decrease by $1,794 million (48.8%) by 2016 and by $2,814 million (76.5%) by 2019. Adjustment for variable prices of branded medications and changes in Medicaid enrollment only moderately affected the magnitude of these reductions. Within five years, antipsychotic expenditures in Medicaid may decline to less than half their current levels. Such a spending reduction warrants a reassessment of the continued necessity of formulary restrictions for second-generation antipsychotics in Medicaid.
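    The core forecasting arithmetic is scaling the switching drugs' spending by a generic-to-brand price ratio. A toy version is sketched below; the baseline, switching share, and price ratio are assumptions chosen to echo the roughly 49% reduction forecast for 2016, not the paper's estimates.

```python
def project_expenditure(base_annual, switching_share, generic_price_ratio):
    """Annual spending after brands covering `switching_share` of baseline
    expenditure go generic at `generic_price_ratio` of the brand price."""
    savings = base_annual * switching_share * (1 - generic_price_ratio)
    return base_annual - savings

# Illustrative inputs only; none of these figures come from the paper's models.
base = 3676.0  # $ millions, hypothetical baseline
after_2016 = project_expenditure(base, switching_share=0.70, generic_price_ratio=0.30)
reduction = (base - after_2016) / base
```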

  4. HIV prevention interventions to reduce sexual risk for African Americans: the influence of community-level stigma and psychological processes.

    PubMed

    Reid, Allecia E; Dovidio, John F; Ballester, Estrellita; Johnson, Blair T

    2014-02-01

    Interventions to improve public health may benefit from consideration of how environmental contexts can facilitate or hinder their success. We examined the extent to which efficacy of interventions to improve African Americans' condom use practices was moderated by two indicators of structural stigma-Whites' attitudes toward African Americans and residential segregation in the communities where interventions occurred. A previously published meta-analytic database was re-analyzed to examine the interplay of community-level stigma with the psychological processes implied by intervention content in influencing intervention efficacy. All studies were conducted in the United States and included samples that were at least 50% African American. Whites' attitudes were drawn from the American National Election Studies, which collects data from nationally representative samples. Residential segregation was drawn from published reports. Results showed independent effects of Whites' attitudes and residential segregation on condom use effect sizes. Interventions were most successful when Whites' attitudes were more positive or when residential segregation was low. These two structural factors interacted: Interventions improved condom use only when communities had both relatively positive attitudes toward African Americans and lower levels of segregation. The effect of Whites' attitudes was more pronounced at longer follow-up intervals and for younger samples and those samples with more African Americans. Tailoring content to participants' values and needs, which may reduce African Americans' mistrust of intervention providers, buffered against the negative influence of Whites' attitudes on condom use. The structural factors uniquely accounted for variance in condom use effect sizes over and above intervention-level features and community-level education and poverty. Results highlight the interplay of social identity and environment in perpetuating intergroup disparities. 
Potential mechanisms for these effects are discussed along with public health implications. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Massage, a complementary therapy effectively promoting the health and well-being of older people in residential care settings: a review of the literature.

    PubMed

    McFeeters, Sarah; Pront, Leeanne; Cuthbertson, Lesley; King, Lindy

    2016-12-01

    To explore the potential benefits of massage within daily routine care of the older person in residential care settings. Globally, the proportion of people over 65 years is rapidly rising. Increased longevity means older people may experience a rise in physiological and psychological health problems. These issues potentially place an increased demand for quality long-term care for the older person. Complementary approaches such as massage appear to be needed in quality residential care. A critical literature review was undertaken. A literature review pertaining to massage in the older resident was conducted using a range of online databases. Fourteen studies dated 1993-2012 met the inclusion criteria and were critically evaluated as suitable resources for this review. Evidence suggests massage may be advantageous from client and nursing perspectives. Clients perceive massage to positively influence factors such as pain, sleep, emotional status and psychosocial health. Evidence also demonstrates massage to benefit the client and organisation by reducing the necessity for restraint and pharmacological intervention. Massage may be incorporated into care provision and adopted by care providers and family members as an additional strategy to enhance quality of life for older people. Massage offers a practical activity that can be used to enhance the health and well-being of the older person in residential care. Massage offers benefit for promoting health and well-being of the older person along with potential increased engagement of family in care provision. Integration of massage into daily care activities of the older person requires ongoing promotion and implementation. © 2016 John Wiley & Sons Ltd.

  6. HIV Prevention Interventions to Reduce Sexual Risk for African Americans: The Influence of Community-Level Stigma and Psychological Processes

    PubMed Central

    Reid, Allecia E.; Dovidio, John F.; Ballester, Estrellita; Johnson, Blair T.

    2013-01-01

    Interventions to improve public health may benefit from consideration of how environmental contexts can facilitate or hinder their success. We examined the extent to which efficacy of interventions to improve African Americans’ condom use practices was moderated by two indicators of structural stigma—Whites’ attitudes toward African Americans and residential segregation in the communities where interventions occurred. A previously published meta-analytic database was re-analyzed to examine the interplay of community-level stigma with the psychological processes implied by intervention content in influencing intervention efficacy. All studies were conducted in the United States and included samples that were at least 50% African American. Whites’ attitudes were drawn from the American National Election Studies, which collects data from nationally representative samples. Residential segregation was drawn from published reports. Results showed independent effects of Whites’ attitudes and residential segregation on condom use effect sizes. Interventions were most successful when Whites’ attitudes were more positive or when residential segregation was low. These two structural factors interacted: Interventions improved condom use only when communities had both relatively positive attitudes toward African Americans and lower levels of segregation. The effect of Whites’ attitudes was more pronounced at longer follow-up intervals and for younger samples and those samples with more African Americans. Tailoring content to participants’ values and needs, which may reduce African Americans’ mistrust of intervention providers, buffered against the negative influence of Whites’ attitudes on condom use. The structural factors uniquely accounted for variance in condom use effect sizes over and above intervention-level features and community-level education and poverty. Results highlight the interplay of social identity and environment in perpetuating intergroup disparities. 
Potential mechanisms for these effects are discussed along with public health implications. PMID:24507916

  7. Enhancing Community Based Early Warning Systems in Nepal with Flood Forecasting Using Local and Global Models

    NASA Astrophysics Data System (ADS)

    Dugar, Sumit; Smith, Paul; Parajuli, Binod; Khanal, Sonu; Brown, Sarah; Gautam, Dilip; Bhandari, Dinanath; Gurung, Gehendra; Shakya, Puja; Kharbuja, RamGopal; Uprety, Madhab

    2017-04-01

    Operationalising effective Flood Early Warning Systems (EWS) in developing countries like Nepal poses numerous challenges, with complex topography and geology, sparse networks of river and rainfall gauging stations and diverse socio-economic conditions. Despite these challenges, simple real-time monitoring-based EWSs have been in place for the past decade. A key constraint of these simple systems is the very limited lead time for response - as little as 2-3 hours, especially for rivers originating from steep mountainous catchments. Efforts to increase lead time for early warning are focusing on embedding forecasts into the existing early warning systems. In 2016, the Nepal Department of Hydrology and Meteorology (DHM) piloted an operational Probabilistic Flood Forecasting Model in major river basins across Nepal. This comprised a low-data approach to forecast water levels, developed jointly through a research/practitioner partnership with Lancaster University and WaterNumbers (UK) and the International NGO Practical Action. Using Data-Based Mechanistic Modelling (DBM) techniques, the model assimilated rainfall and water levels to generate localised hourly flood predictions, which are presented as probabilistic forecasts, increasing lead times from 2-3 hours to 7-8 hours. The Nepal DHM has simultaneously started utilizing forecasts from the Global Flood Awareness System (GLoFAS) that provides streamflow predictions at the global scale based upon distributed hydrological simulations using numerical ensemble weather forecasts from the ECMWF (European Centre for Medium-Range Weather Forecasts). The aforementioned global and local models have already affected the approach to early warning in Nepal, being operational during the 2016 monsoon in the West Rapti basin in Western Nepal. 
On 24 July 2016, GLoFAS hydrological forecasts for the West Rapti indicated a sharp rise in river discharge above 1500 m3/sec (equivalent to the river warning level at 5 meters) with 53% probability of exceeding the Medium Level Alert in two days. Rainfall stations upstream of the West Rapti catchment recorded heavy rainfall on 26 July, and localized forecasts from the probabilistic model at 8 am suggested that the water level would cross a pre-determined warning level in the next 3 hours. The Flood Forecasting Section at DHM issued a flood advisory, and disseminated SMS flood alerts to more than 13,000 at-risk people residing along the floodplains. Water levels crossed the danger threshold (5.4 meters) at 11 am, peaking at 8.15 meters at 10 pm. Extension of the warning lead time from probabilistic forecasts was significant in minimising the risk to lives and livelihoods as communities gained extra time to prepare, evacuate and respond. Likewise, longer timescale forecasts from GLoFAS could potentially be linked with no-regret early actions leading to improved preparedness and emergency response. These forecasting tools have contributed to enhancing the effectiveness and efficiency of existing community based systems, increasing the lead time for response. Nevertheless, extensive work is required on appropriate ways to interpret and disseminate probabilistic forecasts having longer (2-14 days) and shorter (3-5 hours) time horizons for operational deployment as there are numerous uncertainties associated with predictions.
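    The quoted 53% exceedance probability is simply the fraction of ensemble members forecasting a level above a threshold. A minimal sketch, in which the member values and the 5 m warning level are invented:

```python
def exceedance_probability(ensemble_levels, threshold):
    """Fraction of ensemble members forecasting a level above the threshold."""
    hits = sum(1 for lvl in ensemble_levels if lvl > threshold)
    return hits / len(ensemble_levels)

# Hypothetical 20-member water-level forecast against a 5 m warning level.
members = [4.2, 4.8, 5.1, 5.6, 4.9, 5.3, 6.0, 4.4, 5.2, 5.5,
           4.7, 5.0, 5.8, 4.6, 5.4, 4.3, 5.7, 4.1, 5.9, 5.05]
p = exceedance_probability(members, 5.0)
if p >= 0.5:
    advisory = "issue flood advisory and SMS alerts"
```

    Tying an advisory to a probability threshold like this is one simple way to turn probabilistic forecasts into the dissemination decisions described above.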

  8. Use of point-of-sale data to track usage patterns of residential pesticides: methodology development.

    PubMed

    Bekarian, Nyree; Payne-Sturges, Devon; Edmondson, Stuart; Chism, Bill; Woodruff, Tracey J

    2006-05-25

    Residential-use pesticides have been shown to be a major source of pesticide exposure to people in the United States. However, little is understood about the exposures to household pesticides and the resultant health effects. One reason that little is known about home-use pesticide exposure is the lack of comprehensive data on exposures to pesticides in the home. One method to help ascertain the amount of pesticides present in the home is use of point-of-sale data collected from marketing companies that track product sales to obtain the volume of pesticides sold for home-use. This provides a measure of home-use pesticide volume. We have constructed a searchable database containing sales data for home-use permethrin-containing pesticides sold by retail stores in the United States from January 1997 through December 2002 in an attempt to develop a tracking method for pesticides. This pilot project was conducted to determine if point-of-sale data would be effective in helping track the purchase of home-use permethrin-containing pesticides and if it would stand as a good model for tracking sales of other home-use pesticides. There are several limitations associated with this tracking method, including the availability of sales data, market coverage, and geographic resolution. As a result, a fraction of sales data potentially available for reporting is represented in this database. However, the database is sensitive to the number and type of merchants reporting permethrin sales. Further, analysis of the sale of individual products included in the database indicates that year-to-year variability has a greater impact on reported permethrin sales than the amount sold by each type of merchant. 
We conclude that, while nothing could completely replace a detailed exposure assessment to estimate exposures to home-use pesticides, a point-of-sale database is a useful tool in tracking the purchase of these types of pesticides to 1) detect anomalous trends in regional and seasonal pesticide sales warranting further investigation into the potential causes of the trends; 2) determine the most commonly purchased application types; and 3) compare relative trends in sales between indoor and outdoor use products as well as compare trends in sales between different active ingredients.

  9. Use of point-of-sale data to track usage patterns of residential pesticides: methodology development

    PubMed Central

    Bekarian, Nyree; Payne-Sturges, Devon; Edmondson, Stuart; Chism, Bill; Woodruff, Tracey J

    2006-01-01

    Background Residential-use pesticides have been shown to be a major source of pesticide exposure to people in the United States. However, little is understood about the exposures to household pesticides and the resultant health effects. One reason that little is known about home-use pesticide exposure is the lack of comprehensive data on exposures to pesticides in the home. One method to help ascertain the amount of pesticides present in the home is use of point-of-sale data collected from marketing companies that track product sales to obtain the volume of pesticides sold for home-use. This provides a measure of home-use pesticide volume. Methods We have constructed a searchable database containing sales data for home-use permethrin-containing pesticides sold by retail stores in the United States from January 1997 through December 2002 in an attempt to develop a tracking method for pesticides. This pilot project was conducted to determine if point-of-sale data would be effective in helping track the purchase of home-use permethrin-containing pesticides and if it would stand as a good model for tracking sales of other home-use pesticides. Results There are several limitations associated with this tracking method, including the availability of sales data, market coverage, and geographic resolution. As a result, a fraction of sales data potentially available for reporting is represented in this database. However, the database is sensitive to the number and type of merchants reporting permethrin sales. Further, analysis of the sale of individual products included in the database indicates that year-to-year variability has a greater impact on reported permethrin sales than the amount sold by each type of merchant. 
Conclusion We conclude that, while nothing could completely replace a detailed exposure assessment to estimate exposures to home-use pesticides, a point-of-sale database is a useful tool in tracking the purchase of these types of pesticides to 1) detect anomalous trends in regional and seasonal pesticide sales warranting further investigation into the potential causes of the trends; 2) determine the most commonly purchased application types; and 3) compare relative trends in sales between indoor and outdoor use products as well as compare trends in sales between different active ingredients. PMID:16725037

  10. Using FRET for Drought Mitigation

    NASA Astrophysics Data System (ADS)

    Osborne, H. D.; Palmer, C. K.; Hobbins, M.

    2016-12-01

    With the ongoing drought plaguing California and much of the Western United States, water agencies and the general public have a heightened need for short-term forecasts of evapotranspiration. The National Weather Service's (NWS) Forecast Reference Evapotranspiration (FRET) product suite can fill this need. The FRET product suite uses the Penman-Monteith Reference Evapotranspiration (ETrc) equation for a short canopy (12 cm grasses), adopted by the Environmental and Water Resources Institute of the American Society of Civil Engineers. FRET is calculated across the contiguous U.S. using temperatures, humidity, winds, and sky cover from Numerical Weather Prediction (NWP) models and adjusted by NWS forecasters with local expertise of terrain and weather patterns. The Weekly ETrc product is easily incorporated into drought-planning strategies, allowing water managers, the agricultural community, and the public to make better informed water-use decisions. FRET can assist with the decision making process for scheduling irrigation (e.g., farms, golf courses, vineyards) and timing of fertilizers. The California Department of Water Resources (CA DWR) also ingests FRET into its soil moisture models, and uses FRET to assist in determining the reservoir releases for the Feather River. The United States Bureau of Reclamation (USBR) also uses FRET in determining reservoir releases and assessing water temperature along the Sacramento and American Rivers. FRET is now operational on the National Digital Forecast Database (NDFD), permitting other agencies easy access to this nationwide data for all drought mitigation and planning purposes.
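    The short-canopy reference ET that FRET is based on can be computed directly from the forecast variables listed above using the ASCE standardized Penman-Monteith form. The sketch below uses the standard daily grass-reference constants (Cn = 900, Cd = 0.34); the input values are arbitrary.

```python
import math

def svp(t_c):
    """Saturation vapour pressure (kPa) at air temperature t_c (deg C)."""
    return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

def reference_et(t_c, rh, u2, rn, g=0.0, pressure=101.3):
    """Daily ASCE standardized reference ET (mm/day) for a short (grass)
    canopy. t_c: mean air temp (deg C); rh: relative humidity (0-1);
    u2: 2 m wind speed (m/s); rn: net radiation (MJ/m^2/day)."""
    delta = 4098 * svp(t_c) / (t_c + 237.3) ** 2  # slope of the SVP curve
    gamma = 0.000665 * pressure                    # psychrometric constant
    es, ea = svp(t_c), svp(t_c) * rh
    num = 0.408 * delta * (rn - g) + gamma * (900 / (t_c + 273)) * u2 * (es - ea)
    return num / (delta + gamma * (1 + 0.34 * u2))

# A warm, breezy, moderately dry day (illustrative inputs).
et = reference_et(t_c=25.0, rh=0.5, u2=2.0, rn=15.0)
```

    For these inputs the result lands in the typical summertime range of a few millimetres per day, which a water manager would compare against applied irrigation.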

  11. Simulating Residential Demand in Singapore through Five Decades of Demographic Change

    NASA Astrophysics Data System (ADS)

    Davis, N. R.; Fernández, J.

    2011-12-01

    Singapore's rapid and well-documented development over the last half-century provides an ideal case for studying urban metabolism. Extensive data [1, 2] facilitate the modeling of historical dynamics of population and resource consumption. This paper presents an agent-based population model that simulates key demographic factors - number, size, and relative income of households - through fifty years of development in Singapore. This is the first step in a broader study linking demographic factors to residential demand for urban land, materials, water, and energy. Previous studies of the resource demands of housing stock have accounted for demographics by modifying the important population driver with a single, aggregated "lifestyle" term [3, 4]. However, demographic changes that result from development can influence the nature of the residential sector, and warrant a closer look. Increasing levels of education and affluence coupled with decreasing birth rates have yielded an aging population and changing family structures in Singapore [5]. These factors all contribute to an increasingly resource-intense residential sector. Singaporeans' elevated per capita income and life expectancy have created demand for larger household area, which means a growing percentage of available land must be dedicated to residential use [6]. While the majority of Singapore's housing is public - a strategy designed to maximize land use efficiency - residents are increasingly seeking private alternatives [7]. In the private sector, lower density housing puts even greater pressure on the finite supply of undeveloped land. Agent-based modeling is used to study the selected aspects of demography. The population is disaggregated into historical time-series distributions of age, family size, education, and income. We propose a simplified methodology correlating average education level with birth rate, and income to categorize households and establish housing unit demand. 
Aggregated lifestyle variables have proven useful for simulating past resource consumption in some cases, but demographic shifts are important causal factors in future demand that would not be captured by these simple terms. For this reason disaggregated population modeling provides better insight into the size and income distributions of households that ultimately drive residential resource consumption. References [1] Yearbook of Statistics Singapore. Dept. of Statistics, Ministry of Trade & Industry, 1960-2011. [2] HDB Annual Report. Housing & Development Board, Ministry of National Development, 1960-2011. [3] B. Muller, "Stock dynamics for forecasting material flows-case study for housing in the Netherlands," Ecol Econ, vol. 59, no. 1, pp. 142-156, 2006. [4] H. Bergsdal, et al., "Dynamic material flow analysis for Norway's dwelling stock," Build Res Inf, vol. 35, no. 5, pp. 557-570, 2007. [5] D. Phillips and H. Bartlett, "Aging trends-Singapore," J Cross Cult Gerontol, vol. 10, no. 4, pp. 349-356, 1995. [6] T. Wong and A. Yap, Four decades of transformation: Land use in Singapore, 1960-2000. Eastern University Press, 2004. [7] -, "From universal public housing to meeting the increasing aspiration for private housing in Singapore," Habitat Int, vol. 27, no. 3, pp. 361-380, 2003.
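    A heavily simplified agent-based sketch of the demographic mechanism described (education rising, birth rate falling, households shrinking). Every rate and the education-birth link below are invented placeholders, not Singapore data.

```python
import random

random.seed(1)

def simulate(years=50, n0=200):
    """Minimal agent loop: each agent is (age, education in 0-1); rising
    mean education lowers the assumed birth rate, so the population ages
    while household sizes shrink (illustrative dynamics only)."""
    agents = [(random.randint(0, 60), random.random() * 0.3) for _ in range(n0)]
    history = []
    for year in range(years):
        mean_edu = sum(e for _, e in agents) / len(agents)
        birth_rate = max(0.005, 0.03 * (1 - mean_edu))   # assumed edu-birth link
        newborns = [(0, min(1.0, mean_edu + 0.1))
                    for _ in range(int(birth_rate * len(agents)))]
        agents = [(a + 1, e) for a, e in agents if a < 85] + newborns
        household_size = max(2.0, 5.0 - 0.05 * year)     # assumed shrinking households
        adults = sum(1 for a, _ in agents if a >= 21)
        history.append((year, adults / household_size, mean_edu))
    return history

history = simulate()
```

    Tracking households rather than raw population is the point: housing-unit demand can keep growing even as population growth slows.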

  12. Forecasting the effects of fertility control on overabundant ungulates: White-tailed deer in the National Capital Region

    USGS Publications Warehouse

    Raiho, Ann M.; Hooten, Mevin B.; Bates, Scott; Hobbs, N. Thompson

    2015-01-01

    Overabundant populations of ungulates have caused environmental degradation and loss of biological diversity in ecosystems throughout the world. Culling or regulated harvest is often used to control overabundant species. These methods are difficult to implement in national parks, other types of conservation reserves, or in residential areas where public hunting may be forbidden by policy. As a result, fertility control has been recommended as a non-lethal alternative for regulating ungulate populations. We evaluate this alternative using white-tailed deer in national parks in the vicinity of Washington, D.C., USA as a model system. Managers seek to reduce densities of white-tailed deer from the current average (50 deer per km²) to decrease harm to native plant communities caused by deer. We present a Bayesian hierarchical model using 13 years of population estimates from 8 national parks in the National Capital Region Network. We offer a novel way to evaluate management actions relative to goals using short-term forecasts. Our approach confirms past analyses that fertility control is incapable of rapidly reducing deer abundance. Fertility control can be combined with culling to maintain a population below carrying capacity with a high probability of success. This gives managers confronted with problematic overabundance a framework for implementing management actions with a realistic assessment of uncertainty.
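    The qualitative finding, that fertility control alone reduces abundance slowly while combining it with culling can hold a herd below carrying capacity, can be reproduced with a toy discrete logistic projection. All parameter values here are illustrative, not the paper's posterior estimates.

```python
def project(n0, years, r=0.3, k=60.0, cull=0.0, fertility_cut=0.0):
    """Discrete logistic projection (deer per km^2, illustrative values).
    `cull` removes a fraction of the herd each year; `fertility_cut`
    scales down the growth rate, mimicking contraception."""
    n = n0
    for _ in range(years):
        growth = r * (1 - fertility_cut) * n * (1 - n / k)
        n = max(0.0, n + growth - cull * n)
    return n

no_action = project(50, 5)
contraception = project(50, 5, fertility_cut=0.5)
cull_and_treat = project(50, 5, fertility_cut=0.5, cull=0.15)
```

    After five simulated years only the combined strategy pushes density below the starting level, mirroring the paper's conclusion.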

  13. Forecasting the Effects of Fertility Control on Overabundant Ungulates: White-Tailed Deer in the National Capital Region.

    PubMed

    Raiho, Ann M; Hooten, Mevin B; Bates, Scott; Hobbs, N Thompson

    2015-01-01

    Overabundant populations of ungulates have caused environmental degradation and loss of biological diversity in ecosystems throughout the world. Culling or regulated harvest is often used to control overabundant species. These methods are difficult to implement in national parks, other types of conservation reserves, or in residential areas where public hunting may be forbidden by policy. As a result, fertility control has been recommended as a non-lethal alternative for regulating ungulate populations. We evaluate this alternative using white-tailed deer in national parks in the vicinity of Washington, D.C., USA as a model system. Managers seek to reduce densities of white-tailed deer from the current average (50 deer per km²) to decrease harm to native plant communities caused by deer. We present a Bayesian hierarchical model using 13 years of population estimates from 8 national parks in the National Capital Region Network. We offer a novel way to evaluate management actions relative to goals using short-term forecasts. Our approach confirms past analyses that fertility control is incapable of rapidly reducing deer abundance. Fertility control can be combined with culling to maintain a population below carrying capacity with a high probability of success. This gives managers confronted with problematic overabundance a framework for implementing management actions with a realistic assessment of uncertainty.

  14. A spatiotemporal land-use regression model of winter fine particulate levels in residential neighbourhoods.

    PubMed

    Smargiassi, Audrey; Brand, Allan; Fournier, Michel; Tessier, François; Goudreau, Sophie; Rousseau, Jacques; Benjamin, Mario

    2012-07-01

    Residential wood burning can be a significant wintertime source of ambient fine particles in urban and suburban areas. We developed a statistical model to predict minute (min) levels of particles with median diameter of <1 μm (PM1) from mobile monitoring on evenings of winter weekends at different residential locations in Quebec, Canada, considering wood burning emissions. The 6 s PM1 levels were concurrently measured on 10 preselected routes travelled 3 to 24 times during the winters of 2008-2009 and 2009-2010 by vehicles equipped with a GRIMM or a dataRAM sampler and a Global Positioning System device. Route-specific and global land-use regression (LUR) models were developed using the following spatial and temporal covariates to predict 1-min-averaged PM1 levels: chimney density from property assessment data at sampling locations, "regional background" levels of particles with median diameter of <2.5 μm (PM2.5) and temperature and wind speed at hour of sampling, elevation at sampling locations and day of the week. In the various routes travelled, between 49% and 94% of the variability in PM1 levels was explained by the selected covariates. The effect of chimney density was not negligible in "cottage areas." The R² for the global model including all routes was 0.40. This LUR is the first to predict PM1 levels in both space and time with consideration of the effects of wood burning emissions. We show that the influence of chimney density, a proxy for wood burning emissions, varies by region and that a global model cannot be used to predict PM in regions that were not measured. Future work should consider using both survey data on wood burning intensity and information from numerical air quality forecast models, in LUR models, to improve the generalisation of the prediction of fine particulate levels.
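    A land-use regression is, at its core, ordinary least squares on spatial and temporal covariates. The sketch below recovers planted coefficients from noise-free synthetic data; the chimney densities, temperatures, and coefficient values are all made up for illustration.

```python
def ols(X, y):
    """Normal-equations OLS; each row of X is [1, chimney_density, temp_C]."""
    p = len(X[0])
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    for i in range(p):                      # Gauss-Jordan elimination
        piv = xtx[i][i]
        xtx[i] = [v / piv for v in xtx[i]]
        xty[i] /= piv
        for j in range(p):
            if j != i:
                f = xtx[j][i]
                xtx[j] = [a - f * b for a, b in zip(xtx[j], xtx[i])]
                xty[j] -= f * xty[i]
    return xty  # fitted coefficients

# Synthetic, noise-free 1-min PM1 observations: higher chimney density or
# lower temperature raises PM1, so OLS recovers the planted coefficients.
rows = [[1, d, t] for d, t in [(2, -10), (5, -12), (1, -5),
                               (8, -15), (4, -8), (6, -14)]]
pm1 = [0.5 + 1.2 * d - 0.3 * t for _, d, t in rows]
beta = ols(rows, pm1)
```

    Real LUR fits add many more covariates and report goodness of fit (the paper's global R² of 0.40) rather than exact recovery.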

  15. A Pilot Tsunami Inundation Forecast System for Australia

    NASA Astrophysics Data System (ADS)

    Allen, Stewart C. R.; Greenslade, Diana J. M.

    2016-12-01

    The Joint Australian Tsunami Warning Centre (JATWC) provides a tsunami warning service for Australia. Warnings are currently issued according to a technique that does not include explicit modelling at the coastline, including any potential coastal inundation. This paper investigates the feasibility of developing and implementing tsunami inundation modelling as part of the JATWC warning system. An inundation model was developed for a site in Southeast Australia, on the basis of the availability of bathymetric and topographic data and observations of past tsunamis. The model was forced using data from T2, the operational deep-water tsunami scenario database currently used for generating warnings. The model was evaluated not only for its accuracy but also for its computational speed, particularly with respect to operational applications. Limitations of the proposed forecast processes in the Australian context and areas requiring future improvement are discussed.

  16. Deep-water oilfield development cost analysis and forecasting: the Gulf of Mexico as an example

    NASA Astrophysics Data System (ADS)

    Shi, Mingyu; Wang, Jianjun; Yi, Chenggao; Bai, Jianhui; Wang, Jing

    2017-11-01

    The Gulf of Mexico (GoM) is one of the earliest offshore oil provinces to be developed, and it continues to produce increasingly valuable, efficient, secure and low-cost key technologies for deep-water development. Analysis of development expenditure in this area is therefore significant for the evaluation of deep-water oilfields worldwide. This article focuses on deep-water development concepts and EPC contract values in the GoM over the past 10 years as the basis for comparing and selecting concepts on economic grounds. In addition, QUETOR, which processes the largest upstream cost database, was used in this research to simulate and calculate the expenditure of the worked examples. By analyzing and forecasting deep-water oilfield development expenditure, this article explores the relationship between expenditure indices and the oil price.

  17. WHE-PAGER Project: A new initiative in estimating global building inventory and its seismic vulnerability

    USGS Publications Warehouse

    Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig

    2008-01-01

    The U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) Project and the Earthquake Engineering Research Institute’s World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates, in part using laboratory testing.

  18. Short-range forecast of Shershnevskoie (South Ural) water-storage algal blooms: preliminary results of predictors' choosing and membership functions' construction

    NASA Astrophysics Data System (ADS)

    Gayazova, Anna; Abdullaev, Sanjar

    2014-05-01

    Short-range forecasting of algal blooms in drinking water reservoirs and other waterbodies is an essential element of water treatment systems. In particular, Shershnevskoie reservoir, the source of drinking water for the city of Chelyabinsk (South Ural region of Russia), is exposed to interannual, seasonal and short-range fluctuations in the abundance of the blue-green alga Aphanizomenon flos-aquae and other dominant species, which lead to technological problems and economic costs and adversely affect water treatment quality. Since the composition, intensity and timing of blooms are affected not only by seasonal meteorological conditions but also by the ecological specifics of each waterbody, it is important to develop object-oriented forecasting and, in particular, to search for an optimal number of predictors. To this end, fuzzy logic and fuzzy artificial neural network models were first developed to predict blooms of the blue-green alga Microcystis aeruginosa (M. aeruginosa) in the nearby undrained Lake Smolino. These results subsequently served as the basis for deriving membership functions for the Shershnevskoie reservoir forecasting models. Time series, with a total length of about 138-159 days, of dominant-species seasonal abundance, water temperature, cloud cover, wind speed, mineralization, and phosphate and nitrate concentrations were obtained through field observations at Lake Smolino (Chelyabinsk) during the warm seasons of 2009 and 2011, with a time resolution of 2-7 days. Cross-correlation analysis of the data revealed the potential predictors of quasi-periodic oscillations in M. aeruginosa abundance: the abundance of the green alga Pediastrum duplex (P. duplex) and mineralization for 2009; and P. duplex abundance, water temperature and nitrate concentration for 2011. Based on the results of the cross-correlation analysis, one membership function ("P. duplex abundance") and one rule linking M. aeruginosa and P. duplex abundances were set up for the 2009 database. 
Similarly, for the 2011 database three rules were set up linking membership functions of temperature, P. duplex abundance, nitrate concentration and M. aeruginosa abundance. The developed fuzzy logic rules predicted intense M. aeruginosa outbreaks well. For the ANN forecasting method, a purpose-written program was used to train the fuzzy artificial neural network on the selected input predictor values and the output predicted factor values, setting up the predictive rules and membership functions automatically. As a result, two models based on mineralization and P. duplex abundance were developed for 2009. For 2011 four models were developed; the best result was obtained with the model based on temperature and P. duplex abundance. The developed forecasting methods were then applied to predict outbreaks of Aphanizomenon flos-aquae and M. aeruginosa in Shershnevskoie reservoir. For this purpose, long-term data on chemical parameters measured once a month, dominant-species abundance measured five times a week, and turbidity, water color, alkalinity and pH obtained daily were analyzed. Based on these empirical data, significant factors were determined, membership functions were set up, and preliminary models for Shershnevskoie reservoir were developed. As expected, these models differ significantly from those developed for Lake Smolino and should be tested on new data sets.
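    The fuzzy-rule approach described above can be illustrated with a triangular membership function and a single min-AND rule; the breakpoints, units, and the rule itself are hypothetical, chosen only to mirror the abstract's temperature and P. duplex predictors.

```python
def tri_membership(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical rule in the spirit of the abstract (illustrative breakpoints):
# IF P. duplex abundance is HIGH AND water temperature is WARM
# THEN M. aeruginosa outbreak risk is HIGH, with min-AND firing strength.
def outbreak_risk(p_duplex, temp):
    high_pduplex = tri_membership(p_duplex, 40, 80, 120)  # invented units
    warm_temp = tri_membership(temp, 15, 22, 29)          # degrees C
    return min(high_pduplex, warm_temp)
```

    A real system would aggregate several such rules and defuzzify their combined output; the sketch shows only the membership and firing-strength mechanics.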

  19. Simple statistical bias correction techniques greatly improve moderate resolution air quality forecast at station level

    NASA Astrophysics Data System (ADS)

    Curci, Gabriele; Falasca, Serena

    2017-04-01

    Deterministic air quality forecasting is routinely carried out by many local environmental agencies in Europe and throughout the world by means of Eulerian chemistry-transport models. The skill of these models in predicting the ground-level concentrations of relevant pollutants (ozone, nitrogen dioxide, particulate matter) a few days ahead has greatly improved in recent years, but it is not yet always compliant with the quality level required for decision making (e.g. the European Commission has set a maximum uncertainty of 50% on daily values of relevant pollutants). Post-processing of deterministic model output is thus still regarded as a useful tool to make the forecast more reliable. In this work, we test several bias correction techniques applied to a long-term dataset of air quality forecasts over Europe and Italy. We used the WRF-CHIMERE modelling system, which provides operational experimental chemical weather forecasts at CETEMPS (http://pumpkin.aquila.infn.it/forechem/), to simulate the years 2008-2012 at low resolution over Europe (0.5° x 0.5°) and moderate resolution over Italy (0.15° x 0.15°). We compared the simulated dataset with available observations from the European Environment Agency database (AirBase) and characterized model skill and compliance with EU legislation using the Delta tool from the FAIRMODE project (http://fairmode.jrc.ec.europa.eu/). The bias correction techniques adopted are, in order of complexity: (1) application of multiplicative factors calculated as the ratio of model-to-observed concentrations averaged over the previous days; (2) correction of the statistical distribution of model forecasts, in order to make it similar to that of the observations; (3) development and application of Model Output Statistics (MOS) regression equations. We illustrate the differences and advantages/disadvantages of the three approaches. All the methods are relatively easy to implement for other modelling systems.
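    A minimal sketch of techniques (1) and (2) on synthetic data follows; here the correction factor is written as the obs-to-model ratio so that it multiplies the forecast, and the distribution correction is done by empirical quantile mapping. This illustrates the general techniques only, not the authors' implementation.

```python
import numpy as np

def multiplicative_correction(forecast_today, past_forecasts, past_obs):
    """Technique (1): scale today's forecast by the mean obs/forecast
    ratio over the preceding days."""
    factor = np.mean(past_obs) / np.mean(past_forecasts)
    return forecast_today * factor

def quantile_mapping(forecast_today, past_forecasts, past_obs):
    """Technique (2): map the forecast through the observed distribution,
    so that corrected forecasts share the observations' distribution."""
    # Empirical quantile of today's value within past forecasts
    q = np.mean(past_forecasts <= forecast_today)
    return np.quantile(past_obs, q)
```

    With a model that systematically over-predicts by a factor of two, for example, technique (1) simply halves the forecast, while technique (2) also corrects any distortion in the shape of the forecast distribution.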

  20. Towards the Olympic Games: Guanabara Bay Forecasting System and its Application on the Floating Debris Cleaning Actions.

    NASA Astrophysics Data System (ADS)

    Pimentel, F. P.; Marques Da Cruz, L.; Cabral, M. M.; Miranda, T. C.; Garção, H. F.; Oliveira, A. L. S. C.; Carvalho, G. V.; Soares, F.; São Tiago, P. M.; Barmak, R. B.; Rinaldi, F.; dos Santos, F. A.; Da Rocha Fragoso, M.; Pellegrini, J. C.

    2016-02-01

    Marine debris is a widespread pollution issue that affects almost all water bodies and is remarkably relevant in estuaries and bays. The city of Rio de Janeiro will host the 2016 Olympic Games, and Guanabara Bay will be the venue for the sailing competitions. Having historically served as a deposit for all types of waste, this water body suffers from major environmental problems, one of them being the massive presence of floating garbage. It is therefore of great importance to count on effective contingency actions to address this issue. To this end, an operational ocean forecasting system was designed and is presently being used by the Rio de Janeiro State Government to manage and control the cleaning actions on the bay. The forecasting system makes use of high-resolution hydrodynamic and atmospheric models and a Lagrangian particle transport model, in order to provide probabilistic forecast maps of the areas where the debris is most likely accumulating. All the results are displayed on an interactive GIS web platform along with the tracks of the boats that collect the garbage, so that decision makers can easily command the actions, enhancing their efficiency. The integration of in situ data and advanced techniques such as Lyapunov exponent analysis is also being developed in the system, so as to increase its forecast reliability. Additionally, the system gathers and compiles in its database all the information on debris collection, including quantity, type, locations, accumulation areas and their correlation with the environmental factors that drive runoff and surface drift. Combining probabilistic, deterministic and statistical approaches, the forecasting system of Guanabara Bay has proven to be a powerful tool for environmental management and will be of great importance in helping to secure the safety and fairness of the Olympic sailing competitions. The system design, its components and main results are presented in this paper.

  1. Canadian Operational Air Quality Forecasting Systems: Status, Recent Progress, and Challenges

    NASA Astrophysics Data System (ADS)

    Pavlovic, Radenko; Davignon, Didier; Ménard, Sylvain; Munoz-Alpizar, Rodrigo; Landry, Hugo; Beaulieu, Paul-André; Gilbert, Samuel; Moran, Michael; Chen, Jack

    2017-04-01

    ECCC's Canadian Meteorological Centre Operations (CMCO) division runs a number of operational air quality (AQ)-related systems that revolve around the Regional Air Quality Deterministic Prediction System (RAQDPS). The RAQDPS generates 48-hour AQ forecasts and outputs hourly concentration fields of O3, PM2.5, NO2, and other pollutants twice daily on a North American domain with 10-km horizontal grid spacing and 80 vertical levels. A closely related AQ forecast system with near-real-time wildfire emissions, known as FireWork, has been run by CMCO during the Canadian wildfire season (April to October) since 2014; this system became operational in June 2016. CMCO's operational AQ forecast systems also benefit from several support systems, such as a statistical post-processing model called UMOS-AQ that is applied to enhance forecast reliability at point locations with AQ monitors. The Regional Deterministic Air Quality Analysis (RDAQA) system has also been connected to the RAQDPS since February 2013, and hourly surface objective analyses are now available for O3, PM2.5, NO2, PM10, SO2 and, indirectly, the Canadian Air Quality Health Index. As of June 2015, another version of the RDAQA has been connected to FireWork (RDAQA-FW). For verification purposes, CMCO developed a third support system called Verification for Air QUality Models (VAQUM), which has a geospatial relational database core and enables continuous monitoring of the AQ forecast systems' performance. Urban environments are particularly subject to air pollution, so in order to improve the services offered, ECCC has recently been investing effort in developing a high-resolution air quality prediction capability for urban areas in Canada. In this presentation, a comprehensive description of the ECCC AQ systems will be provided, along with a discussion of AQ system performance. Recent improvements, current challenges, and future directions of the Canadian operational AQ program will also be discussed.

  2. Forecasting the Solar Drivers of Severe Space Weather from Active-Region Magnetograms

    NASA Technical Reports Server (NTRS)

    Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor

    2012-01-01

    Large flares and fast CMEs are the drivers of the most severe space weather, including Solar Energetic Particle Events (SEP Events). Large flares and their co-produced CMEs are powered by the explosive release of free magnetic energy stored in non-potential magnetic fields of sunspot active regions. The free energy is stored in and released from the low-beta regime of the active region's magnetic field above the photosphere, in the chromosphere and low corona. From our work over the past decade and from similar work of several other groups, it is now well established that (1) a proxy of the free magnetic energy stored above the photosphere can be measured from photospheric magnetograms, and (2) an active region's rate of production of major CME/flare eruptions in the coming day or so is strongly correlated with its present measured value of the free-energy proxy. These results have led us to use the large database of SOHO/MDI full-disk magnetograms spanning Solar Cycle 23 to obtain empirical forecasting curves that, from an active region's present measured value of the free-energy proxy, give the active region's expected rates of production of major flares, CMEs, fast CMEs, and SEP Events in the coming day or so (Falconer et al. 2011, Space Weather, 9, S04003). We will present these forecasting curves and demonstrate the accuracy of their forecasts. In addition, we will show that the forecasts for major flares and fast CMEs can be made significantly more accurate by taking into account not only the value of the free-energy proxy but also the active region's recent productivity of major flares; specifically, whether the active region has produced a major flare (GOES class M or X) during the 24 hours before the time of the measured magnetogram. 
By empirically determining the conversion of the value of the free-energy proxy measured from a GONG or HMI magnetogram to that which would be measured from an MDI magnetogram, we have made GONG and HMI magnetograms usable with our MDI-based forecasting curves to forecast event rates.
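    An empirical forecasting curve of the kind described can be approximated by binning a catalog of region-days by proxy value and computing the per-bin event rate; the catalog below is synthetic, with an assumed logistic dependence of event probability on the proxy.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic catalog: a free-energy proxy value per region-day, and whether
# a major event occurred in the following day (all values are invented).
proxy = rng.uniform(0, 10, 2000)
p_event = 1 / (1 + np.exp(-(proxy - 6)))   # assumed: rate rises with proxy
events = rng.random(2000) < p_event

# Empirical forecasting curve: observed event rate per proxy bin
bins = np.linspace(0, 10, 6)
idx = np.digitize(proxy, bins) - 1
rates = np.array([events[idx == i].mean() for i in range(len(bins) - 1)])
```

    A region's measured proxy value is then looked up in `rates` to yield its expected event rate for the coming day; conditioning on recent flare history, as described above, amounts to building separate curves for the flare-quiet and flare-active subsets.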

  3. An operational integrated short-term warning solution for solar radiation storms: introducing the Forecasting Solar Particle Events and Flares (FORSPEF) system

    NASA Astrophysics Data System (ADS)

    Anastasiadis, Anastasios; Sandberg, Ingmar; Papaioannou, Athanasios; Georgoulis, Manolis; Tziotziou, Kostas; Jiggens, Piers; Hilgers, Alain

    2015-04-01

    We present a novel integrated prediction system for both solar flares and solar energetic particle (SEP) events, which is in place to provide short-term warnings for hazardous solar radiation storms. The FORSPEF system provides forecasting of solar eruptive events, such as solar flares with a projection to coronal mass ejections (CMEs) (occurrence and velocity), and the likelihood of occurrence of an SEP event. It also provides nowcasting of SEP events based on actual solar flare and CME near-real-time alerts, as well as SEP characteristics (peak flux, fluence, rise time, duration) per parent solar event. The prediction of solar flares relies on a morphological method based on the sophisticated derivation of the effective connected magnetic field strength (Beff) of potentially flaring active-region (AR) magnetic configurations, utilizing analysis of a large number of AR magnetograms. For the prediction of SEP events a new reductive statistical method has been implemented, based on a newly constructed database of solar flares, CMEs and SEP events covering the large time span 1984-2013. The method is based on flare location (longitude), flare size (maximum soft X-ray intensity), and the occurrence (or not) of a CME. Warnings are issued for all > C1.0 soft X-ray flares. The warning time in the forecasting scheme extends to 24 hours with a refresh rate of 3 hours, while the respective warning time for the nowcasting scheme depends on the availability of the near-real-time data and falls between 15-20 minutes. We discuss the modules of the FORSPEF system, their interconnection and the operational set-up. The dual approach in the development of FORSPEF (i.e. forecasting and nowcasting schemes) permits the refinement of predictions upon the availability of new data that characterize changes on the Sun and in interplanetary space, while the combined usage of solar flare and SEP forecasting methods makes FORSPEF an integrated forecasting solution. 
This work has been funded through the "FORSPEF: FORecasting Solar Particle Events and Flares", ESA Contract No. 4000109641/13/NL/AK

  4. A DDS-Based Energy Management Framework for Small Microgrid Operation and Control

    DOE PAGES

    Youssef, Tarek A.; El Hariri, Mohamad; Elsayed, Ahmed T.; ...

    2017-09-26

    The smart grid is seen as a power system with real-time communication and control capabilities between the consumer and the utility. This modern platform facilitates the optimization of energy usage based on several factors including environmental conditions, price preferences, and system technical issues. In this paper, a real-time energy management system (EMS) for microgrids or nanogrids was developed. The developed system involves an online optimization scheme to adapt its parameters based on previous, current, and forecasted future system states. The communication requirements for all EMS modules were analyzed, and all modules are integrated over a data distribution service (DDS) Ethernet network with appropriate quality of service (QoS) profiles. In conclusion, the developed EMS was emulated with actual residential energy consumption and irradiance data from Miami, Florida and proved its effectiveness in reducing consumers’ bills and achieving flat peak load profiles.

  5. Satellites monitor Atlanta regional development

    USGS Publications Warehouse

    Todd, William J.; Blackmon, C.C.; Rudasill, R.G.

    1979-01-01

    Since the adoption of a Regional Development Plan in 1975, the Atlanta Regional Commission has investigated methods for monitoring regional development patterns in a periodic, efficient manner. A promising approach appears to be the use of Landsat satellite data. In cooperation with the Earth Resources Observation Systems (EROS) Data Center, the commission used machine processing of digital temporal overlays of Landsat data collected in 1972, 1974 and 1976 to detect land use and land cover changes in the Atlanta metropolitan area. Results of the analysis revealed the conversion of forested and open space areas to residential, commercial and industrial land use in the urban-rural fringe zone from 1972 to 1974 and from 1974 to 1976. The study indicated that a land use and land cover change-detection program may be used to revise small-area forecasts of land use, population and employment made by planning models.

  6. An Investigation of Bomb Cyclone Climatology: Reanalysis vs. NCEP's CFS Model

    NASA Astrophysics Data System (ADS)

    Alvarez, F. M.; Eichler, T.; Gottschalck, J.

    2009-12-01

    Given the concerns and potential impacts of climate change, the need for climate models to simulate weather phenomena is as important as ever. An example of such phenomena is rapidly intensifying cyclones, also known as "bombs." These intense cyclones have devastating effects on residential and marine commercial interests as well as the transportation industry. In this study, we generate a climatology of rapid cyclogenesis using the National Centers for Environmental Prediction’s (NCEP) Climate Forecast System (CFS) model. Results are compared to NCEP’s global reanalysis data to determine if the CFS model is capable of producing a realistic extreme storm climatology. This represents the first step in quantifying rapidly intensifying cyclones in the CFS model, which will be useful in contributing towards future model improvements, as well as gauging its ability in determining the role of synoptic-scale storms in climate change.

  7. The invisible benefits of exercise.

    PubMed

    Ruby, Matthew B; Dunn, Elizabeth W; Perrino, Andrea; Gillis, Randall; Viel, Sasha

    2011-01-01

    To examine whether--and why--people underestimate how much they enjoy exercise. Across four studies, 279 adults predicted how much they would enjoy exercising, or reported their actual feelings after exercising. Main outcome measures were predicted and actual enjoyment ratings of exercise routines, as well as intention to exercise. Participants significantly underestimated how much they would enjoy exercising; this affective forecasting bias emerged consistently for group and individual exercise, and moderate and challenging workouts spanning a wide range of forms, from yoga and Pilates to aerobic exercise and weight training (Studies 1 and 2). We argue that this bias stems largely from forecasting myopia, whereby people place disproportionate weight on the beginning of a workout, which is typically unpleasant. We demonstrate that forecasting myopia can be harnessed (Study 3) or overcome (Study 4), thereby increasing expected enjoyment of exercise. Finally, Study 4 provides evidence for a mediational model, in which improving people's expected enjoyment of exercise leads to increased intention to exercise. People underestimate how much they enjoy exercise because of a myopic focus on the unpleasant beginning of exercise, but this tendency can be harnessed or overcome, potentially increasing intention to exercise. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  8. Risky family processes prospectively forecast shorter telomere length mediated through negative emotions.

    PubMed

    Brody, Gene H; Yu, Tianyi; Shalev, Idan

    2017-05-01

    This study was designed to examine prospective associations of risky family environments with subsequent levels of negative emotions and peripheral blood mononuclear cell telomere length (TL), a marker of cellular aging. A second purpose was to determine whether negative emotions mediate the hypothesized link between risky family processes and diminished telomere length. Participants were 293 adolescents (age 17 years at the first assessment) and their primary caregivers. Caregivers provided data on risky family processes when the youths were age 17 years, youths reported their negative emotions at age 18 years, and youths' TL was assayed from a blood sample at age 22 years. The results revealed that (a) risky family processes forecast heightened negative emotions (β = .316, p < .001) and diminished TL (β = -.199, p = .003) among youths, (b) higher levels of negative emotions forecast shorter TL (β = -.187, p = .012), and (c) negative emotions served as a mediator connecting risky family processes with diminished TL (indirect effect = -0.012, 95% CI [-0.036, -0.002]). These findings are consistent with the hypothesis that risky family processes presage premature cellular aging through effects on negative emotions, with potential implications for lifelong health. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
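    The reported indirect effect is, in essence, the product of the risk-to-emotions path (a) and the emotions-to-telomere path (b). A simplified product-of-coefficients sketch on synthetic data follows; unlike the study's model, the b path here is not adjusted for the predictor, and all coefficients and sample sizes are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000

# Synthetic standardized variables with assumed path coefficients:
risk = rng.normal(0, 1, n)                        # risky family processes
emotions = 0.3 * risk + rng.normal(0, 1, n)       # a-path (positive)
telomere = -0.2 * emotions + rng.normal(0, 1, n)  # b-path (negative)

def slope(x, y):
    """OLS slope of y on x (single centered predictor)."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

a = slope(risk, emotions)      # risk -> negative emotions
b = slope(emotions, telomere)  # negative emotions -> telomere length
indirect = a * b               # negative: risk shortens TL via emotions
```

    In practice the confidence interval for the indirect effect (the [-0.036, -0.002] above) is usually obtained by bootstrapping this product over resampled data.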

  9. Validation of the United States Marine Corps Qualified Candidate Population Model

    DTIC Science & Technology

    2003-03-01

    time. Fields are created in the database to support this forecasting. User forms and a macro are programmed in Microsoft VBA to develop the...at 0.001. To accomplish 50,000 iterations of a minimization problem, this study wrote a macro in the VBA programming language that guides the solver...success in the commissioning process. **To improve the diagnostics of this propensity model, other factors were considered as well. Applying SQL

  10. Gravity models of forest products trade: applications to forecasting and policy analysis

    Treesearch

    Joseph Buongiorno

    2016-01-01

    To predict the value of trade between countries, a differential gravity model of bilateral trade flowswas formulated and estimated with panel data from 2005 to 2014 for each of the commodity groups HS44 (wood and articles of wood), HS47 (pulp of wood, fibrous cellulosic material) and HS48 (paper and paperboard). The parameters were estimated with a large database by...
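    A gravity model of the general (non-differential) kind can be sketched as a log-linear OLS regression of trade value on the partners' economic sizes and their distance; the data and elasticities below are synthetic, not the paper's HS44/HS47/HS48 panel estimates.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 400

# Illustrative country-pair data (invented units):
gdp_i = rng.uniform(1, 100, n)     # exporter GDP
gdp_j = rng.uniform(1, 100, n)     # importer GDP
dist = rng.uniform(100, 10000, n)  # bilateral distance, km

# Gravity: T_ij = G * GDP_i^b1 * GDP_j^b2 / dist^b3, log-linearized for OLS,
# with assumed true elasticities b1 = b2 = 1.0 and b3 = 1.2.
log_trade = 0.5 + 1.0*np.log(gdp_i) + 1.0*np.log(gdp_j) - 1.2*np.log(dist) \
            + rng.normal(0, 0.3, n)

X = np.column_stack([np.ones(n), np.log(gdp_i), np.log(gdp_j), np.log(dist)])
beta, *_ = np.linalg.lstsq(X, log_trade, rcond=None)
```

    The fitted elasticities can then be applied to projected GDP paths to forecast bilateral flows, which is the spirit of the application described above.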

  11. Standard Port-Visit Cost Forecasting Model for U.S. Navy Husbanding Contracts

    DTIC Science & Technology

    2009-12-01

    Protocol (HTTP) server.35 2. MySQL . An open-source database.36 3. PHP . A common scripting language used for Web development.37 E. IMPLEMENTATION OF...Inc. (2009). MySQL Community Server (Version 5.1) [Software]. Available from http://dev.mysql.com/downloads/ 37 The PHP Group (2009). PHP (Version...Logistics Services MySQL My Structured Query Language NAVSUP Navy Supply Systems Command NC Non-Contract Items NPS Naval Postgraduate

  12. Sources of secondary organic aerosols over North China Plain in winter

    NASA Astrophysics Data System (ADS)

    Xing, L.; Li, G.; Tie, X.; Junji, C.; Long, X.

    2017-12-01

    Organic aerosol (OA) concentrations are simulated over the North China Plain (NCP) from 10 to 26 January 2014 using the Weather Research and Forecasting model coupled to chemistry (WRF-CHEM), with the goal of examining the impact of heterogeneous HONO sources on atmospheric oxidation capacity and consequently on wintertime SOA formation and its different formation pathways. Generally, the model reproduced the spatial and temporal distributions of PM2.5, SO2, NO2, and O3 concentrations well. Heterogeneous HONO formation contributed a major part of atmospheric HONO concentrations in Beijing. The heterogeneous HONO sources significantly increased the daily maximum OH concentrations, by 260% on average in Beijing, which enhanced the atmospheric oxidation capacity and consequently increased SOA concentrations by 80% in Beijing on average. During the severe haze pollution of 16 January 2014, the regional average HONO concentration over the NCP was 0.86 ppb, which increased SOA concentrations by 68% on average. The average mass fractions of ASOA (SOA from oxidation of anthropogenic VOCs), BSOA (SOA from oxidation of biogenic VOCs), PSOA (SOA from oxidation of evaporated POA), and GSOA (SOA from irreversible uptake of glyoxal and methylglyoxal) during the simulation period over the NCP were 24%, 5%, 26% and 45%, respectively; GSOA thus contributed the most to the total SOA mass over the NCP in winter. A model sensitivity simulation revealed that wintertime GSOA was mainly from primary residential sources, which on regional average constituted 87% of the total GSOA mass.

  13. The longevity of lava dome eruptions: analysis of the global DomeHaz database

    NASA Astrophysics Data System (ADS)

    Ogburn, S. E.; Wolpert, R.; Calder, E.; Pallister, J. S.; Wright, H. M. N.

    2015-12-01

    The likely duration of ongoing volcanic eruptions is a topic of great interest to volcanologists, volcano observatories, and communities near volcanoes. Lava dome forming eruptions can last from days to centuries, and can produce violent, difficult-to-forecast activity including vulcanian to plinian explosions and pyroclastic density currents. Periods of active dome extrusion are often interspersed with periods of relative quiescence, during which extrusion may slow or pause altogether, but persistent volcanic unrest continues. This contribution focuses on the durations of these longer-term unrest phases, hereafter eruptions, that include periods of both lava extrusion and quiescence. A new database of lava dome eruptions, DomeHaz, provides characteristics of 228 eruptions at 127 volcanoes, of which 177 eruptions have duration information. We find that while 78% of dome-forming eruptions do not continue for more than 5 years, the remainder can be very long-lived. The probability distributions of eruption durations are shown to be heavy-tailed and vary by magma composition. For this reason, eruption durations are modeled with generalized Pareto distributions whose governing parameters depend on each volcano's composition and eruption duration to date. Bayesian predictive distributions and associated uncertainties are presented for the remaining duration of ongoing eruptions of specified composition and duration to date. Forecasts of such natural events will always have large uncertainties, but the ability to quantify such uncertainty is key to effective communication with stakeholders and to mitigation of hazards. Projections are made for the remaining eruption durations of ongoing eruptions, including those at Soufrière Hills Volcano, Montserrat and Sinabung, Indonesia. 
This work provides a quantitative, transferable method and rationale on which to base long-term planning decisions for dome forming volcanoes of different compositions, regardless of the quality of an individual volcano's eruptive record, by leveraging a global database.
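    The generalized Pareto approach lends itself to a simple remaining-duration forecast: if the total duration has already exceeded u, the excess is again GPD with the same shape ξ and inflated scale σ + ξu. A sketch with illustrative parameters (not values fitted to DomeHaz):

```python
def gpd_survival(t, shape, scale):
    """P(X > t) for a generalized Pareto distribution with shape != 0."""
    return max(0.0, 1.0 + shape * t / scale) ** (-1.0 / shape)

def remaining_survival(t, elapsed, shape, scale):
    """P(duration > elapsed + t | duration > elapsed).
    GPD threshold-stability: the excess over `elapsed` is GPD with the
    same shape and scale inflated to scale + shape * elapsed."""
    return gpd_survival(t, shape, scale + shape * elapsed)

# With a heavy tail (shape > 0), an eruption that has already run for
# decades is MORE likely to continue another 5 years than a young one.
p_young = remaining_survival(5, elapsed=1, shape=0.8, scale=2.0)
p_old = remaining_survival(5, elapsed=20, shape=0.8, scale=2.0)
```

    This "the longer it has lasted, the longer it will likely last" behavior is exactly why heavy-tailed models, rather than exponential ones, are needed for dome-forming eruption durations.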

  14. Climate Change Impacts on Worldwide Coffee Production

    NASA Astrophysics Data System (ADS)

    Foreman, T.; Rising, J. A.

    2015-12-01

    Coffee (Coffea arabica and Coffea canephora) plays a vital role in many countries' economies, providing necessary income to 25 million people in tropical countries and supporting an $81 billion industry, making it one of the most valuable commodities in the world. At the same time, coffee is at the center of many issues of sustainability. It is vulnerable to climate change, with disease outbreaks becoming more common and suitable regions beginning to shift. We develop a statistical production model for coffee which incorporates temperature, precipitation, frost, and humidity effects using a new database of worldwide coffee production. We then use this model to project coffee yields and production into the future based on a variety of climate forecasts. This model can then be used together with a market model to forecast the locations of future coffee production as well as future prices, supply, and demand.

  15. Tsunami Forecasting in the Atlantic Basin

    NASA Astrophysics Data System (ADS)

    Knight, W. R.; Whitmore, P.; Sterling, K.; Hale, D. A.; Bahng, B.

    2012-12-01

    The mission of the West Coast and Alaska Tsunami Warning Center (WCATWC) is to provide advance tsunami warning and guidance to coastal communities within its Area-of-Responsibility (AOR). Predictive tsunami models, based on the shallow water wave equations, are an important part of the Center's guidance support. An Atlantic-based counterpart to the long-standing forecasting ability in the Pacific known as the Alaska Tsunami Forecast Model (ATFM) has now been developed. The Atlantic forecasting method is based on ATFM version 2, which contains advanced capabilities over the original model, including better handling of the dynamic interactions between grids, inundation over dry land, new forecast model products, an optional non-hydrostatic approach, and the ability to pre-compute larger and more finely gridded regions using parallel computational techniques. The wide and nearly continuous Atlantic shelf region presents a challenge for forecast models. Our solution to this problem has been to develop a single unbroken high resolution sub-mesh (currently 30 arc-seconds), trimmed to the shelf break. This allows for edge wave propagation and for kilometer scale bathymetric feature resolution. Terminating the fine mesh at the 2000m isobath keeps the number of grid points manageable while allowing for a coarse (4 minute) mesh to adequately resolve deep water tsunami dynamics. Higher resolution sub-meshes are then included around coastal forecast points of interest. The WCATWC Atlantic AOR includes the eastern U.S. and Canada, the U.S. Gulf of Mexico, Puerto Rico, and the Virgin Islands. Puerto Rico and the Virgin Islands are in very close proximity to well-known tsunami sources. Because travel times are under an hour and response must be immediate, our focus is on pre-computing many tsunami source "scenarios" and compiling those results into a database accessible and calibrated with observations during an event.
Seismic source evaluation determines the order of model pre-computation - starting with those sources that carry the highest risk. Model computation zones are confined to regions at risk to save computation time. For example, Atlantic sources have been shown to not propagate into the Gulf of Mexico. Therefore, fine grid computations are not performed in the Gulf for Atlantic sources. Outputs from the Atlantic model include forecast marigrams at selected sites, maximum amplitudes, drawdowns, and currents for all coastal points. The maximum amplitude maps will be supplemented with contoured energy flux maps which show more clearly the effects of bathymetric features on tsunami wave propagation. During an event, forecast marigrams will be compared to observations to adjust the model results. The modified forecasts will then be used to set alert levels between coastal breakpoints, and provided to emergency management.
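    The shallow water wave equations at the core of such forecast models can be illustrated with a toy 1D solver. Real models such as ATFM are 2D, nonlinear, nested, and handle inundation, none of which is attempted here; the depth, grid, and initial hump are toy values.

```python
import math

# Minimal 1D linear shallow-water solver on a staggered grid
# (forward-backward time stepping), purely illustrative.
g, H = 9.81, 4000.0          # gravity (m/s^2), uniform ocean depth (m)
nx, dx = 400, 5000.0         # 2000 km domain, 5 km cells
c = math.sqrt(g * H)         # shallow-water wave speed, ~198 m/s
dt = 0.5 * dx / c            # CFL-limited time step (Courant number 0.5)

# Initial Gaussian sea-surface hump at mid-domain; fluid at rest.
eta = [math.exp(-(((i - nx // 2) * dx / 50e3) ** 2)) for i in range(nx)]
u = [0.0] * (nx + 1)         # velocities live on cell faces; walls at ends

def step(eta, u):
    # Momentum: du/dt = -g * d(eta)/dx  (interior faces only)
    for i in range(1, nx):
        u[i] -= dt * g * (eta[i] - eta[i - 1]) / dx
    # Continuity: d(eta)/dt = -H * du/dx  (uses the freshly updated u)
    for i in range(nx):
        eta[i] -= dt * H * (u[i + 1] - u[i]) / dx

for _ in range(200):
    step(eta, u)

# The hump splits into two waves, each traveling at ~c; after 200 steps
# each has moved about 200 * dt * c / dx = 100 cells from the center.
print(round(200 * dt * c / dx))
```

    Note the CFL-limited time step: halving dx on a finer sub-mesh forces a smaller dt, which is why trimming the fine mesh to the shelf break keeps the computation manageable.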

  16. Waste Information Management System: One Year After Web Deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shoffner, P.A.; Geisler, T.J.; Upadhyay, H.

    2008-07-01

    The implementation of the Department of Energy (DOE) mandated accelerated cleanup program created significant potential technical impediments. The schedule compression required close coordination and a comprehensive review and prioritization of the barriers that impeded treatment and disposition of the waste streams at each site. Many issues related to site waste treatment and disposal were potential critical path issues under the accelerated schedules. In order to facilitate accelerated cleanup initiatives, waste managers at DOE field sites and at DOE Headquarters in Washington, D.C., needed timely waste forecast information regarding the volumes and types of waste that would be generated by DOE sites over the next 30 years. Each local DOE site has historically collected, organized, and displayed site waste forecast information in separate and unique systems. However, waste information from all sites needed a common application to allow interested parties to understand and view the complete complex-wide picture. A common application allows identification of total waste volumes, material classes, disposition sites, choke points, and technological or regulatory barriers to treatment and disposal. The Applied Research Center (ARC) at Florida International University (FIU) in Miami, Florida, has completed the deployment of this fully operational, web-based forecast system. New functional modules and annual waste forecast data updates have been added to ensure the long-term viability and value of this system. In conclusion: WIMS continues to successfully accomplish the goals and objectives set forth by DOE for this project. WIMS has replaced the historic process of each DOE site gathering, organizing, and reporting their waste forecast information utilizing different database and display technologies.
In addition, WIMS meets DOE's objective to have the complex-wide waste forecast information available to all stakeholders and the public in one easy-to-navigate system. The enhancements to WIMS made over the year since its web deployment include the addition of new DOE sites, an updated data set, and the ability to easily print the forecast data tables, the disposition maps, and the GIS maps. Future enhancements will include a high-level waste summary, a display of waste forecast by mode of transportation, and a user help module. The waste summary display module will provide a high-level summary view of the waste forecast data based on the selection of sites, facilities, material types, and forecast years. The waste summary report module will allow users to build custom filtered reports in a variety of formats, such as MS Excel, MS Word, and PDF. The user help module will provide a step-by-step explanation of various modules, using screen shots and general tutorials. The help module will also provide instructions for printing and margin/layout settings to assist users in using their local printers to print maps and reports. (authors)

  17. Profiles of youth in therapeutic group care: Associations with involuntary psychiatric examinations and readmissions.

    PubMed

    Yampolskaya, Svetlana; Mowery, Debra

    2017-01-01

    The study aims were to identify distinct subgroups among youth placed in therapeutic group care (TGC) and to examine the effect of specific constellations of risk factors on readmission to residential mental health care and involuntary psychiatric examination among youth in TGC. Several administrative databases were merged to examine outcomes for youth placed in TGC during fiscal years FY04-05 through FY07-08 (N = 1,009). Latent class analysis (LCA) was conducted. Two classes were identified: youth with multiple needs (Class 1) and lower risk youth (Class 2). Class 1 represented 45% of youth in TGC. Compared with Class 2, these youth had a greater probability of having physical health problems, parents with substance abuse problems, and more extensive histories of maltreatment. Compared with Class 2, youth with multiple needs were almost twice as likely to exhibit self-injurious behavior leading to involuntary mental health examinations, but they were less likely to be readmitted to residential mental health care with a higher level of restrictiveness, such as state inpatient psychiatric programs (SIPPs). Youth placed in Florida TGC represent a heterogeneous population, and services tailored to these youth's needs are important. Youth with multiple risk factors would benefit from interventions that address multiple areas of risk. Lower risk youth (Class 2) would benefit from interventions that focus on promoting mental health, especially among those who have experienced threatened harm, and on providing the services and supports necessary for stabilizing these youth in the community. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  18. A systematic review of inequalities in psychosocial outcomes for women with breast cancer according to residential location and Indigenous status in Australia.

    PubMed

    Youl, P H; Dasgupta, P; Youlden, D; Aitken, J F; Garvey, G; Zorbas, H; Chynoweth, J; Wallington, I; Baade, P D

    2016-10-01

    The aim of this systematic review was to examine variations in psychosocial outcomes by residential location and Indigenous status in women diagnosed with breast cancer (BC) in Australia. Systematic searches were undertaken using multiple databases covering articles between 1 January 1990 and 1 March 2015 focusing on adult women with BC in an Australian setting and measuring quality of life (QOL), psychological distress or psychosocial support. Thirteen quantitative and three qualitative articles were included. Two quantitative and one qualitative article were rated high quality, seven moderate and the remaining were low quality. No studies examining inequalities by Indigenous status were identified. Non-metropolitan women were more likely to record lower QOL relating to breast cancer-specific concerns and reported a lack of information and resources specific to their needs. Continuity of support, ongoing care and access to specialist and allied health professionals were major concerns for non-metropolitan women. Non-metropolitan women identified unmet needs in relation to travel, fear of cancer recurrence and lack of psychosocial support. Overall, there was a lack of evidence relating to variations in psychosocial outcomes for women with BC according to residential status or Indigenous status. While the review identified some specific concerns for non-metropolitan women with BC, it was limited by the lack of good quality studies using standardised measures. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Urban residential fire and flame injuries: a population based study

    PubMed Central

    DiGuiseppi, C; Edwards, P; Godward, C; Roberts, I; Wade, A

    2000-01-01

    Background—Fires are a leading cause of death, but non-fatal injuries from residential fires have not been well characterised. Methods—To identify residential fire injuries that resulted in an emergency department visit, hospitalisation, or death, computerised databases from emergency departments, hospitals, ambulance and helicopter services, the fire department, and the health department, and paper records from the local coroner and fire stations were screened in a deprived urban area between June 1996 and May 1997. Results—There were 131 fire-related injuries, primarily smoke inhalation (76%), an incidence of 36 (95% confidence interval (CI) 30 to 42)/100 000 person years. Forty one patients (32%) were hospitalised (11 (95% CI 8 to 15)/100 000 person years) and three people (2%) died (0.8 (95% CI 0.2 to 2.4)/100 000 person years). Injury rates were highest in those 0–4 (68 (95% CI 39 to 112)/100 000 person years) and ≥85 years (90 (95% CI 29 to 213)/100 000 person years). Rates did not vary by sex. Leading causes of injury were unintentional house fires (63%), assault (8%), clothing and nightwear ignition (6%), and controlled fires (for example, gas burners) (4%). Cooking (31%) and smoker's materials (18%) were leading fire sources. Conclusions—Because of the varied causes of fire and flame injuries, it is likely that diverse interventions, targeted to those at highest risk, that is, the elderly, young children, and the poor, may be required to address this important public health problem. PMID:11144621
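    The reported rate and confidence interval can be approximately reproduced from the case count alone using Byar's approximation to the exact Poisson interval. The person-years figure below is back-calculated from the published rate, so it is an assumption rather than the study's denominator.

```python
import math

# Approximate 95% CI for the 131 fire-related injuries via Byar's method.
cases = 131
person_years = cases / 36e-5   # implied by the reported 36/100,000 rate

def byar_ci(k, z=1.96):
    """Approximate 95% CI (lower, upper) for a Poisson count k."""
    lo = k * (1 - 1 / (9 * k) - z / (3 * math.sqrt(k))) ** 3
    up = (k + 1) * (1 - 1 / (9 * (k + 1)) + z / (3 * math.sqrt(k + 1))) ** 3
    return lo, up

lo, up = byar_ci(cases)
rate = cases / person_years * 1e5
print(round(rate), round(lo / person_years * 1e5), round(up / person_years * 1e5))
# roughly 36 (30 to 43), close to the paper's 36 (95% CI 30 to 42)
```

    The slight mismatch at the upper bound is expected: the paper may have used a different interval method or unrounded person-years.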

  20. Neighborhood × Serotonin Transporter Linked Polymorphic Region (5-HTTLPR) interactions for substance use from ages 10 to 24 years using a harmonized data set of African American children.

    PubMed

    Windle, Michael; Kogan, Steven M; Lee, Sunbok; Chen, Yi-Fu; Lei, Karlo Mankit; Brody, Gene H; Beach, Steven R H; Yu, Tianyi

    2016-05-01

    This study investigated the influences of neighborhood factors (residential stability and neighborhood disadvantage) and variants of the serotonin transporter linked polymorphic region (5-HTTLPR) genotype on the development of substance use among African American children aged 10-24 years. To accomplish this, a harmonized data set of five longitudinal studies was created via pooling overlapping age cohorts to establish a database with 2,689 children and 12,474 data points to span ages 10-24 years. A description of steps used in the development of the harmonized data set is provided, including how issues such as the measurement equivalence of constructs were addressed. A sequence of multilevel models was specified to evaluate Gene × Environment effects on growth of substance use across time. Findings indicated that residential instability was associated with higher levels and a steeper gradient of growth in substance use across time. The inclusion of the 5-HTTLPR genotype provided greater precision to the relationships in that higher residential instability, in conjunction with the risk variant of 5-HTTLPR (i.e., the short allele), was associated with the highest level and steepest gradient of growth in substance use across ages 10-24 years. The findings demonstrated how the creation of a harmonized data set increased statistical power to test Gene × Environment interactions for an understudied sample.
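    The Gene × Environment growth effect described here can be sketched as an interaction regression. The simulation below uses invented effect sizes and ordinary least squares rather than the study's multilevel models, purely to show how the interaction term steepens the age gradient.

```python
import numpy as np

# Simulated GxE growth pattern: substance use rises with age, faster under
# residential instability, fastest when instability co-occurs with the
# 5-HTTLPR short allele. All data and effect sizes are invented.
rng = np.random.default_rng(42)
n = 2000
age = rng.uniform(10, 24, n)
instab = rng.integers(0, 2, n)   # 1 = residentially unstable
short = rng.integers(0, 2, n)    # 1 = carries the short allele

use = (0.10 * (age - 10)                       # baseline growth
       + 0.05 * (age - 10) * instab            # steeper under instability
       + 0.08 * (age - 10) * instab * short    # steepest with the GxE combo
       + rng.normal(0, 0.5, n))

# OLS recovery of the three slopes (intercept column first).
X = np.column_stack([np.ones(n), age - 10,
                     (age - 10) * instab,
                     (age - 10) * instab * short])
beta, *_ = np.linalg.lstsq(X, use, rcond=None)
print(np.round(beta[1:], 2))  # baseline, +instability, +instability*short slopes
```

    In the study itself these slopes sit inside multilevel growth models with pooled cohorts, which is what the harmonization was needed for; the pooled sample size is what makes the three-way interaction estimable.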

  1. Tracking Expected Improvements of Decadal Prediction in Climate Services

    NASA Astrophysics Data System (ADS)

    Suckling, E.; Thompson, E.; Smith, L. A.

    2013-12-01

    Physics-based simulation models are ultimately expected to provide the best available (decision-relevant) probabilistic climate predictions, as they can capture the dynamics of the Earth system across a range of situations, including situations for which observations for the construction of empirical models are scant if not nonexistent. This fact in itself provides neither evidence that predictions from today's Earth System Models will outperform today's empirical models, nor a guide to the space and time scales on which today's model predictions are adequate for a given purpose. Empirical (data-based) models are employed to make probability forecasts on decadal timescales. The skill of these forecasts is contrasted with that of state-of-the-art climate models, and the challenges faced by each approach are discussed. The focus is on providing decision-relevant probability forecasts for decision support. An empirical model, known as Dynamic Climatology, is shown to be competitive with CMIP5 climate models on decadal-scale probability forecasts. Contrasting the skill of simulation models not only with each other but also with empirical models can reveal the space and time scales on which a generation of simulation models exploits its physical basis effectively. It can also quantify their ability to add information in the formation of operational forecasts. Difficulties (i) of information contamination, (ii) of the interpretation of probabilistic skill, and (iii) of artificial skill complicate each modelling approach, and are discussed. "Physics free" empirical models provide fixed, quantitative benchmarks for the evaluation of ever more complex climate models that are not available from (inter)comparisons restricted to only complex models. At present, empirical models can also provide a background term for blending in the formation of probability forecasts from ensembles of simulation models.
In weather forecasting this role is filled by the climatological distribution, and can significantly enhance the value of longer lead-time weather forecasts to those who use them. It is suggested that the direct comparison of simulation models with empirical models become a regular component of large model forecast intercomparison and evaluation. This would clarify the extent to which a given generation of state-of-the-art simulation models provides information beyond that available from simpler empirical models. It would also clarify current limitations in using simulation forecasting for decision support. No model-based probability forecast is complete without a quantitative estimate of its own irrelevance; this estimate is likely to increase as a function of lead time. A lack of decision-relevant quantitative skill would not bring the science-based foundation of anthropogenic warming into doubt. Similar levels of skill with empirical models do, however, suggest a clear quantification of limits, as a function of lead time, on the spatial and temporal scales at which decisions based on such model output are expected to prove maladaptive. Failing to clearly state such weaknesses of a given generation of simulation models, while clearly stating their strengths and their foundation, risks the credibility of science in support of policy in the long term.
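    The benchmarking the authors advocate can be sketched with the ignorance (logarithmic) score for binary probability forecasts. Every forecast and outcome below is invented for illustration; only the score definition is standard.

```python
import math

# Score a "simulation model" probability forecast against a fixed
# climatological (empirical) benchmark using the ignorance score.
outcomes = [1, 0, 0, 1, 1, 0, 1, 0]              # e.g. "warm decade" yes/no
p_model = [0.8, 0.3, 0.2, 0.7, 0.6, 0.4, 0.9, 0.1]
p_clim = [0.5] * len(outcomes)                   # climatology as the benchmark

def ignorance(probs, obs):
    """Mean ignorance score in bits; lower is better."""
    return -sum(math.log2(p if o else 1 - p)
                for p, o in zip(probs, obs)) / len(obs)

# Skill relative to climatology: bits of information the model adds.
skill = ignorance(p_clim, outcomes) - ignorance(p_model, outcomes)
print(round(skill, 2))
```

    A positive skill value means the model adds information over the climatological benchmark; zero or negative values flag lead times and scales at which the simulation model is not yet decision-relevant.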

  2. Analysis and forecasting of municipal solid waste in Nankana City using geo-spatial techniques.

    PubMed

    Mahmood, Shakeel; Sharif, Faiza; Rahman, Atta-Ur; Khan, Amin U

    2018-04-11

    The objective of this study was to analyze and forecast municipal solid waste (MSW) in Nankana City (NC), District Nankana, Province of Punjab, Pakistan. The study is based on primary data acquired through a questionnaire, Global Positioning System (GPS) surveys, and direct waste sampling and analysis. The inverse distance weighting (IDW) technique was applied to geo-visualize the spatial trend of MSW generation. Analysis revealed that the total MSW generated was 12,419,636 kg/annum (12,419.64 t), or 34,026.4 kg/day (34.03 t), or 0.46 kg/capita/day (kg/cap/day). The average wastes generated per day by the studied households, clinics, hospitals, and hotels were 3, 7.5, 20, and 15 kg, respectively. The residential sector was the top producer with 95.5% (32,511 kg/day), followed by the commercial sector with 1.9% (665 kg/day). On average, high-income and low-income households were generating 4.2 kg/household/day (kg/hh/day) and 1.7 kg/hh/day of waste, respectively. Similarly, large families were generating more waste (4.4 kg/hh/day) than small families (1.8 kg/hh/day). The physical constituents of MSW generated in the study area, which has a population of about 70,000, included paper (7%); compostable matter (61%); plastics (9%); fine earth, ashes, ceramics, and stones (20.4%); and others (2.6%). The spatial trend of MSW generation varies: the city center has a high generation rate, which declines toward the periphery. Based on the current population growth and MSW generation rate, NC is expected to generate 2.8 times more waste by the year 2050. It is therefore imperative to develop a proper solid waste management plan to reduce the risk of environmental degradation and protect human health. This study provides insights into MSW generation rate, physical composition, and forecasting, which are vital for management strategies.
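    The IDW technique used to geo-visualize generation rates is simple to sketch. The sample locations and per-capita rates below are invented, not the surveyed NC data; only the weighting formula is the standard IDW method.

```python
import math

# Minimal inverse distance weighting (IDW) interpolator. Each unknown
# location takes a weighted average of sampled values, with weights
# proportional to 1 / distance**power.
samples = [  # (x, y, waste generation in kg/cap/day) -- invented values
    (0.0, 0.0, 0.80),   # city centre: high generation
    (1.0, 0.0, 0.50),
    (0.0, 1.0, 0.45),
    (1.5, 1.5, 0.20),   # periphery: low generation
]

def idw(x, y, samples, power=2):
    num = den = 0.0
    for sx, sy, v in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return v                       # exact hit on a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

centre, edge = idw(0.1, 0.1, samples), idw(1.4, 1.4, samples)
print(round(centre, 2), round(edge, 2))  # generation declines toward the periphery
```

    Evaluating `idw` over a grid of points yields the continuous surface used for geo-visualization.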

  3. On the future of carbonaceous aerosol emissions

    NASA Astrophysics Data System (ADS)

    Streets, D. G.; Bond, T. C.; Lee, T.; Jang, C.

    2004-12-01

    This paper presents the first model-based forecasts of future emissions of the primary carbonaceous aerosols, black carbon (BC) and organic carbon (OC). The forecasts build on a recent 1996 inventory of emissions that contains detailed fuel, technology, sector, and world-region specifications. The forecasts are driven by four IPCC scenarios, A1B, A2, B1, and B2, out to 2030 and 2050, incorporating not only changing patterns of fuel use but also technology development. Emissions from both energy generation and open biomass burning are included. We project that global BC emissions will decline from 8.0 Tg in 1996 to 5.3-7.3 Tg by 2030 and to 4.3-6.1 Tg by 2050, across the range of scenarios. We project that OC emissions will decline from 34 Tg in 1996 to 24-30 Tg by 2030 and to 21-28 Tg by 2050. The introduction of advanced technology with lower emission rates, as well as a shift away from the use of traditional solid fuels in the residential sector, more than offsets the increased combustion of fossil fuels worldwide. Environmental pressures and a diminishing demand for new agricultural land lead to a slow decline in the amount of open biomass burning. Although emissions of BC and OC are generally expected to decline around the world, some regions, particularly South America, northern Africa, the Middle East, South Asia, Southeast Asia, and Oceania, show increasing emissions in several scenarios. Particularly difficult to control are BC emissions from the transport sector, which increase under most scenarios. We expect that the BC/OC emission ratio for energy sources will rise from 0.5 to as much as 0.8, signifying a shift toward net warming of the climate system due to carbonaceous aerosols. When biomass burning is included, however, the BC/OC emission ratios are for the most part invariant across scenarios at about 0.2.

  4. Search of medical literature for indoor carbon monoxide exposure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brennan, T.; Ivanovich, M.

    1995-12-01

    This report documents a literature search on carbon monoxide. The search was limited to the medical and toxicological databases at the National Library of Medicine (MEDLARS). The databases searched were Medline, Toxline and TOXNET. Searches were performed using a variety of strategies. Combinations of the following keywords were used: carbon, monoxide, accidental, residential, occult, diagnosis, misdiagnosis, heating, furnace, and indoor. The literature was searched from 1966 to the present. Over 1000 references were identified and summarized. The major findings of the search are: (1) Acute and subacute carbon monoxide exposures result in a large number of symptoms affecting the brain, kidneys, respiratory system, retina, and motor functions. (2) Acute and subacute carbon monoxide (CO) poisonings have been misdiagnosed on many occasions. (3) Very few systematic investigations have been made into the frequency and consequences of carbon monoxide poisonings.

  5. Establishment of a national database to link epidemiological and molecular data from norovirus outbreaks in Ireland.

    PubMed

    Kelly, S; Foley, B; Dunford, L; Coughlan, S; Tuite, G; Duffy, M; Mitchell, S; Smyth, B; O'Neill, H; McKeown, P; Hall, W; Lynch, M

    2008-11-01

    A prospective study of norovirus outbreaks in Ireland was carried out over a 1-year period from 1 October 2004 to 30 September 2005. Epidemiological and molecular data on norovirus outbreaks in the Republic of Ireland (ROI) and Northern Ireland (NI) were collected and combined in real time in a common database. Most reported outbreaks occurred in hospitals and residential institutions, and person-to-person spread was the predominant mode of transmission. The predominant circulating norovirus strain was the GII.4-2004 strain, with a small number of outbreaks due to GII.2. This study represents the first time that enhanced epidemiological and virological data on norovirus outbreaks in Ireland have been described. The link established between the epidemiological and virological institutions during the course of this study has been continued, and the data are being used as a source for the Foodborne Viruses in Europe Network (DIVINE-NET).

  6. Identifying residential neighbourhood types from settlement points in a machine learning approach.

    PubMed

    Jochem, Warren C; Bird, Tomas J; Tatem, Andrew J

    2018-05-01

    Remote sensing techniques are now commonly applied to map and monitor urban land uses to measure growth and to assist with development and planning. Recent work in this area has highlighted the use of textures and other spatial features that can be measured in very high spatial resolution imagery. Far less attention has been given to using geospatial vector data (i.e. points, lines, polygons) to map land uses. This paper presents an approach to distinguish residential settlement types (regular vs. irregular) using an existing database of settlement points locating structures. Nine data features describing the density, distance, angles, and spacing of the settlement points are calculated at multiple spatial scales. These data are analysed alone and with five common remote sensing measures on elevation, slope, vegetation, and nighttime lights in a supervised machine learning approach to classify land use areas. The method was tested in seven provinces of Afghanistan (Balkh, Helmand, Herat, Kabul, Kandahar, Kunduz, Nangarhar). Overall accuracy ranged from 78% in Kandahar to 90% in Nangarhar. This research demonstrates the potential to accurately map land uses from even the simplest representation of structures.
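    Point-pattern features of the kind described (density, distance, spacing statistics on settlement points) can be sketched as follows. The feature definition and the regular-vs-irregular contrast are illustrative, not the paper's exact nine features.

```python
import math, random

# Regular (planned) settlements have near-uniform nearest-neighbour
# spacing; irregular ones do not. A simple separating feature is the
# coefficient of variation (CV) of nearest-neighbour distances.
random.seed(1)

regular = [(x * 10.0, y * 10.0) for x in range(10) for y in range(10)]
irregular = [(random.uniform(0, 90), random.uniform(0, 90)) for _ in range(100)]

def nn_distances(pts):
    """Distance from each point to its nearest neighbour (brute force)."""
    out = []
    for i, (x, y) in enumerate(pts):
        out.append(min(math.hypot(x - px, y - py)
                       for j, (px, py) in enumerate(pts) if j != i))
    return out

def spacing_cv(pts):
    """CV of nearest-neighbour spacing: low for regular grids."""
    d = nn_distances(pts)
    mean = sum(d) / len(d)
    var = sum((v - mean) ** 2 for v in d) / len(d)
    return math.sqrt(var) / mean

print(round(spacing_cv(regular), 2), round(spacing_cv(irregular), 2))
```

    In the full approach, features like this are computed at multiple spatial scales and fed into a supervised classifier together with the elevation, slope, vegetation, and nighttime-light covariates.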

  7. Water Data Report: An Annotated Bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunham Whitehead, Camilla; Melody, Moya

    2007-05-01

    This report and its accompanying Microsoft Excel workbook summarize water data we found to support efforts of the Environmental Protection Agency's WaterSense program. WaterSense aims to extend the operating life of water and wastewater treatment facilities and prolong the availability of water resources by reducing residential and commercial water consumption through the voluntary replacement of inefficient water-using products with more efficient ones. WaterSense has an immediate need for water consumption data categorized by sector and, for the residential sector, per capita data available by region. This information will assist policy makers, water and wastewater utility planners, and others in defining and refining program possibilities. Future data needs concern water supply, wastewater flow volumes, water quality, and watersheds. This report focuses primarily on the immediate need for data regarding water consumption and product end-use. We found a variety of data on water consumption at the national, state, and municipal levels. We also found several databases related to water-consuming products. Most of the data are available in electronic form on the Web pages of the data-collecting organizations. In addition, we found national, state, and local data on water supply, wastewater, water quality, and watersheds.

  8. A High-Granularity Approach to Modeling Energy Consumption and Savings Potential in the U.S. Residential Building Stock

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Building simulations are increasingly used in various applications related to energy efficient buildings. For individual buildings, applications include: design of new buildings, prediction of retrofit savings, ratings, performance path code compliance and qualification for incentives. Beyond individual building applications, larger scale applications (across the stock of buildings at various scales: national, regional and state) include: codes and standards development, utility program design, regional/state planning, and technology assessments. For these sorts of applications, a set of representative buildings is typically simulated to predict performance of the entire population of buildings. Focusing on the U.S. single-family residential building stock, this paper will describe how multiple data sources for building characteristics are combined into a highly granular database that preserves the important interdependencies of the characteristics. We will present the sampling technique used to generate a representative set of thousands (up to hundreds of thousands) of building models. We will also present results of detailed calibrations against building stock consumption data.

  9. Review of Methods for Buildings Energy Performance Modelling

    NASA Astrophysics Data System (ADS)

    Krstić, Hrvoje; Teni, Mihaela

    2017-10-01

    Research presented in this paper gives a brief review of methods used for modelling the energy performance of buildings. It also gives a comprehensive review of the advantages and disadvantages of the available methods, as well as of the input parameters used for modelling. The European Directive EPBD obliges the implementation of an energy certification procedure, which gives insight into buildings' energy performance via existing energy certificate databases. Some of the methods for modelling buildings' energy performance mentioned in this paper were developed using data sets of buildings which have already undergone an energy certification procedure. Such a database is used in this paper, where the majority of buildings have already undergone some form of partial retrofitting (replacement of windows or installation of thermal insulation) but still have poor energy performance. The case study presented in this paper utilizes an energy certificate database of residential units in Croatia (over 400 buildings) to determine the dependence between buildings' energy performance and the variables in the database using statistical dependence tests. Building energy performance in the database is expressed as an energy efficiency rating (from A+ to G), which is based on the specific annual energy need for heating under referential climatic data [kWh/(m2a)]. The independent variables in the database are the surface areas and volume of the conditioned part of the building, building shape factor, energy used for heating, CO2 emission, building age and year of reconstruction. The research results presented in this paper give an insight into the possibilities of the methods used for modelling buildings' energy performance. Further, they give an analysis of the dependencies between building energy performance as the dependent variable and the independent variables from the database.
The presented results could be used to develop a new predictive model of building energy performance.
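    A statistical dependence test of the kind applied to the certificate database can be sketched with a Spearman rank correlation between building age and specific heating demand. The data values below are invented; the Croatian database itself is not reproduced in this abstract.

```python
# Spearman rank correlation from scratch (no ties in this toy data).

def rank(values):
    """1-based ranks of the values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))   # classic no-ties formula

age = [5, 12, 25, 33, 41, 52, 60, 75]           # years since construction
demand = [48, 70, 95, 120, 140, 180, 175, 220]  # kWh/(m2a), mostly increasing
rho = spearman(age, demand)
print(round(rho, 2))  # strong positive rank correlation
```

    A rank-based test is a reasonable choice here because energy ratings (A+ to G) are ordinal and the age-demand relationship need not be linear.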

  10. 77 FR 28519 - Test Procedure Guidance for Room Air Conditioners, Residential Dishwashers, and Residential...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-15

    ... Guidance for Room Air Conditioners, Residential Dishwashers, and Residential Clothes Washers: Public... procedures for room air conditioners, residential dishwashers, and residential clothes washers. DATES: DOE...'s existing test procedures for residential room air conditioners, residential dishwashers, and...

  11. Density, destinations or both? A comparison of measures of walkability in relation to transportation behaviors, obesity and diabetes in Toronto, Canada.

    PubMed

    Glazier, Richard H; Creatore, Maria I; Weyman, Jonathan T; Fazli, Ghazal; Matheson, Flora I; Gozdyra, Peter; Moineddin, Rahim; Kaufman-Shriqui, Vered; Booth, Gillian L

    2014-01-01

    The design of suburban communities encourages car dependency and discourages walking, characteristics that have been implicated in the rise of obesity. Walkability measures have been developed to capture these features of urban built environments. Our objective was to examine the individual and combined associations of residential density and the presence of walkable destinations, two of the most commonly used and potentially modifiable components of walkability measures, with transportation, overweight, obesity, and diabetes. We examined associations between a previously published walkability measure and transportation behaviors and health outcomes in Toronto, Canada, a city of 2.6 million people in 2011. Data sources included the Canada census, a transportation survey, a national health survey and a validated administrative diabetes database. We depicted interactions between residential density and the availability of walkable destinations graphically and examined them statistically using general linear modeling. Individuals living in more walkable areas were more than twice as likely to walk, bicycle or use public transit and were significantly less likely to drive or own a vehicle compared with those living in less walkable areas. Individuals in less walkable areas were up to one-third more likely to be obese or to have diabetes. Residential density and the availability of walkable destinations were each significantly associated with transportation and health outcomes. The combination of high levels of both measures was associated with the highest levels of walking or bicycling (p<0.0001) and public transit use (p<0.0026), and the lowest levels of automobile trips (p<0.0001) and diabetes prevalence (p<0.0001). We conclude that both residential density and the availability of walkable destinations are good measures of urban walkability and can be recommended for use by policy-makers, planners and public health officials.
In our setting, the combination of both factors provided additional explanatory power.

  12. EDGARv4 Gridded Anthropogenic Emissions of Persistent Organic Pollutants (POPs) from Power Generation, Residential and Transport Sectors: Regional Trends Analysis in East Asia.

    NASA Astrophysics Data System (ADS)

    Muntean, M.; Janssens-Maenhout, G.; Guizzardi, D.; Crippa, M.; Schaaf, E.; Olivier, J. G.; Dentener, F. J.

    2016-12-01

    Persistent organic pollutants (POPs) are toxic substances that are harmful to human health. Mitigation of these emissions is addressed internationally by the Convention on Long-range Transboundary Air Pollution and by the Stockholm Convention. A global insight into the evolution of POPs emissions is essential since these substances can be transported over long distances, bio-accumulate and damage the environment. The Emission Database for Global Atmospheric Research (EDGARv4) is currently being updated with POPs. We have estimated the global emissions of polychlorinated biphenyls (PCBs), polychlorinated dibenzo-p-dioxins (PCDDs), polychlorinated dibenzofurans (PCDFs), polycyclic aromatic hydrocarbons (PAHs) (benzo[a]pyrene (BaP), benzo[b]fluoranthene (BbF), benzo[k]fluoranthene (BkF), indeno[1,2,3-cd]pyrene (IcdP)) and hexachlorobenzene (HCB) from fuel combustion in the power generation, residential and transport sectors. This emissions inventory has been developed by using, as input to the EDGAR technology-based emissions calculation algorithm, the fossil fuel consumption data from the International Energy Agency (2014) and the emission factors from EMEP/EEA (2013). We provide a complete emission time series for the period 1970-2010 and discuss the trends. A comprehensive analysis of the contribution of the East Asia region to the global total will be provided for each substance of the POPs group. An example is presented in Figure 1 for BaP emissions from the residential sector; with emissions mainly from China, the East Asia region has a large share (32%) of the global total. We distributed the POPs emissions on gridmaps of 0.1°x0.1° resolution. Areas with high emissions in East Asia will be presented and discussed; Figure 2 shows the hot-spots in East Asia for BaP emissions from the residential sector. 
These emission gridmaps, used as input for the chemical transport models, contribute to the improvement of impact evaluation, which is a key element in measuring the effectiveness of mitigation measures.

  13. Density, Destinations or Both? A Comparison of Measures of Walkability in Relation to Transportation Behaviors, Obesity and Diabetes in Toronto, Canada

    PubMed Central

    Glazier, Richard H.; Creatore, Maria I.; Weyman, Jonathan T.; Fazli, Ghazal; Matheson, Flora I.; Gozdyra, Peter; Moineddin, Rahim; Shriqui, Vered Kaufman; Booth, Gillian L.

    2014-01-01

    The design of suburban communities encourages car dependency and discourages walking, characteristics that have been implicated in the rise of obesity. Walkability measures have been developed to capture these features of urban built environments. Our objective was to examine the individual and combined associations of residential density and the presence of walkable destinations, two of the most commonly used and potentially modifiable components of walkability measures, with transportation, overweight, obesity, and diabetes. We examined associations between a previously published walkability measure and transportation behaviors and health outcomes in Toronto, Canada, a city of 2.6 million people in 2011. Data sources included the Canada census, a transportation survey, a national health survey and a validated administrative diabetes database. We depicted interactions between residential density and the availability of walkable destinations graphically and examined them statistically using general linear modeling. Individuals living in more walkable areas were more than twice as likely to walk, bicycle or use public transit and were significantly less likely to drive or own a vehicle compared with those living in less walkable areas. Individuals in less walkable areas were up to one-third more likely to be obese or to have diabetes. Residential density and the availability of walkable destinations were each significantly associated with transportation and health outcomes. The combination of high levels of both measures was associated with the highest levels of walking or bicycling (p<0.0001) and public transit use (p<0.0026) and the lowest levels of automobile trips (p<0.0001), and diabetes prevalence (p<0.0001). We conclude that both residential density and the availability of walkable destinations are good measures of urban walkability and can be recommended for use by policy-makers, planners and public health officials. 
In our setting, the combination of both factors provided additional explanatory power. PMID:24454837

  14. Setting-related influences on physical inactivity of older adults in residential care settings: a review.

    PubMed

    Douma, Johanna G; Volkers, Karin M; Engels, Gwenda; Sonneveld, Marieke H; Goossens, Richard H M; Scherder, Erik J A

    2017-04-28

    Despite the detrimental effects of physical inactivity for older adults, aged residents of residential care settings in particular may spend much time in inactive behavior. This may be partly due to their poorer physical condition; however, other, setting-related factors may also influence the amount of inactivity. The aim of this review was to identify setting-related factors (including the social and physical environment) that may contribute to the amount of older adults' physical inactivity in a wide range of residential care settings (e.g., nursing homes, assisted care facilities). Five databases were systematically searched for eligible studies, using the key words 'inactivity', 'care facilities', and 'older adults', including their synonyms and MeSH terms. Additional studies were selected from references used in articles included from the search. Based on specific eligibility criteria, a total of 12 studies were included. The quality of the included studies was assessed using the Mixed Methods Appraisal Tool (MMAT). Based on studies using different methodologies (e.g., interviews and observations) and of different quality (assessed quality range: 25-100%), we report several aspects related to the physical environment and caregivers. Factors of the physical environment that may be related to physical inactivity included, among others, the environment's compatibility with the abilities of a resident, the presence of equipment, the accessibility, security, comfort, and aesthetics of the environment/corridors, and possibly the presence of some specific areas. Caregiver-related factors included staffing levels, the available time, and the amount and type of care being provided. Inactivity levels in residential care settings may be reduced by improving several features of the physical environment and with the help of caregivers. 
Intervention studies could be performed in order to gain more insight into causal effects of improving setting-related factors on physical inactivity of aged residents.

  15. Detection of mesoscale zones of atmospheric instabilities using remote sensing and weather forecasting model data

    NASA Astrophysics Data System (ADS)

    Winnicki, I.; Jasinski, J.; Kroszczynski, K.; Pietrek, S.

    2009-04-01

    The paper presents elements of research conducted in the Faculty of Civil Engineering and Geodesy of the Military University of Technology, Warsaw, Poland, concerning the application of mesoscale models and remote sensing data to determining meteorological conditions of aircraft flight directly related to atmospheric instabilities. The quality of meteorological support of aviation depends on prompt and effective forecasting of changes in weather conditions. The paper presents a computer module for detecting and monitoring zones of cloud cover, precipitation and turbulence along the aircraft flight route. It consists of programs and scripts for managing, processing and visualizing meteorological and remote sensing databases. The application was developed in Matlab® for Windows®. The module uses products of the COAMPS (Coupled Ocean/Atmosphere Mesoscale Prediction System) mesoscale non-hydrostatic model of the atmosphere developed by the US Naval Research Laboratory, a satellite image acquisition system for MSG-2 (Meteosat Second Generation) of the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), and meteorological radar data acquired from the Institute of Meteorology and Water Management (IMGW), Warsaw, Poland. The satellite image acquisition system and the COAMPS model are run operationally in the Faculty of Civil Engineering and Geodesy. The mesoscale model is run on an IA64 Feniks multiprocessor 64-bit computer cluster. The basic task of the module is to enable a complex analysis of data sets with miscellaneous information structures and to verify COAMPS results using satellite and radar data. The research is conducted using a uniform cartographic projection for all elements of the database. Satellite and radar images are transformed into the Lambert Conformal projection of COAMPS. This facilitates simultaneous interpretation and supports the decision-making process for the safe execution of flights. 
    Forecasts are based on horizontal distributions and vertical profiles of meteorological parameters produced by the module. Verification of forecasts includes research into spatial and temporal correlations between structures generated by the model, e.g. cloudiness and meteorological phenomena (fog, precipitation, turbulence), and structures identified on current satellite images. The developed module determines meteorological parameter fields for vertical profiles of the atmosphere. Interpolation procedures run at user-selected standard (pressure) or height levels of the model make it possible to determine weather conditions along any aircraft route. Basic parameters of the procedures determining, e.g., flight safety include: cloud base, visibility, cloud cover, turbulence coefficient, icing and precipitation intensity. Determination of icing and turbulence characteristics is based on both standard and new methods (from other mesoscale models). The research also includes investigating new-generation mesoscale models, especially remote sensing data assimilation; this is required by the need to develop and introduce objective methods of forecasting weather conditions. Current research in the Faculty of Civil Engineering and Geodesy concerns validation of the mesoscale module performance.
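
    The route-sampling step described in this record can be sketched as bilinear interpolation of one gridded model field (e.g. cloud cover or a turbulence coefficient on a single model level) at aircraft waypoints. This is a minimal illustration, not the module's actual Matlab implementation; the function name and grid layout are assumptions.

```python
import numpy as np

def sample_field_along_route(field, lats, lons, route):
    """Bilinearly interpolate a 2-D model field at route waypoints.
    `field` is shaped (nlat, nlon); `lats`/`lons` are ascending 1-D axes;
    `route` is a list of (lat, lon) waypoints."""
    out = []
    for lat, lon in route:
        # locate the grid cell containing the waypoint
        i = np.clip(np.searchsorted(lats, lat) - 1, 0, len(lats) - 2)
        j = np.clip(np.searchsorted(lons, lon) - 1, 0, len(lons) - 2)
        # fractional position inside the cell
        ty = (lat - lats[i]) / (lats[i + 1] - lats[i])
        tx = (lon - lons[j]) / (lons[j + 1] - lons[j])
        v = (field[i, j] * (1 - ty) * (1 - tx)
             + field[i + 1, j] * ty * (1 - tx)
             + field[i, j + 1] * (1 - ty) * tx
             + field[i + 1, j + 1] * ty * tx)
        out.append(v)
    return np.array(out)
```

    Repeating this per model level yields the vertical profile of a parameter along the route.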

  16. Environmental and health disparities in residential communities of New Orleans: the need for soil lead intervention to advance primary prevention.

    PubMed

    Mielke, Howard W; Gonzales, Christopher R; Powell, Eric T; Mielke, Paul W

    2013-01-01

    Urban environments are the major sites for human habitation and this study evaluates soil lead (Pb) and blood Pb at the community scale of a U.S. city. There is no safe level of Pb exposure for humans, and novel primary Pb prevention strategies are requisite to mitigate children's Pb exposure and the health disparities observed in major cities. We produced a rich source of environmental and Pb exposure data for metropolitan New Orleans by combining a large soil Pb database (n = 5467) with blood Pb databases (n = 55,551 pre-Katrina and 7384 post-Katrina) from the Louisiana Childhood Lead Poisoning Prevention Program (LACLPPP). Reanalysis of pre- and post-Hurricane Katrina soil samples indicates relatively unchanged soil Pb. The objective was to evaluate the New Orleans soil Pb and blood Pb databases for basic information about conditions that may merit innovative ways to pursue primary Pb exposure prevention. The city was divided into high Pb areas (median census tract soil Pb ≥ 100 mg/kg) and low Pb areas (median census tract soil Pb < 100 mg/kg). Soil and blood Pb concentrations within the high and low Pb areas of New Orleans were analyzed by permutation statistical methods. The high Pb areas are toward the interior of the city, where median soil Pb was 367, 313, 1228, and 103 mg/kg, respectively, for samples collected at busy streets, residential streets, house sides, and open space locations; the low Pb areas are in outlying neighborhoods of the city, where median soil Pb was 64, 46, 32, and 28 mg/kg, respectively, for busy streets, residential streets, house sides, and open spaces (P-values < 10^(-16)). Pre-Katrina children's blood Pb prevalence of ≥5 μg/dL was 58.5% and 24.8% for the high and low Pb areas, respectively, compared to post-Katrina prevalence of 29.6% and 7.5% for high and low Pb areas, respectively. Elevated soil Pb permeates interior areas of the city and children living there generally lack Pb-safe areas for outdoor play. 
    Soil Pb medians in outlying areas were safer by factors ranging from 3 to 38, depending on the specific location. Patterns of Pb deposition from many decades of accumulation have not been transformed by hastily conducted renovations during the seven-year interval since Hurricane Katrina. Low Pb soils available outside of cities can remedy soil Pb contamination within city interiors. Mapping soil Pb provides an overview of deposition characteristics and assists with planning and conducting primary Pb exposure prevention.
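
    The permutation comparison of soil Pb between areas can be illustrated with a minimal two-sample test on the difference of medians. This is a generic sketch, not the specific permutation procedure used by the authors:

```python
import numpy as np

def permutation_median_test(a, b, n_perm=9999, seed=0):
    """Two-sample permutation test on the difference of medians.
    Returns the observed difference and a two-sided p-value estimated
    by randomly relabeling the pooled samples."""
    rng = np.random.default_rng(seed)
    a, b = np.asarray(a, float), np.asarray(b, float)
    observed = np.median(a) - np.median(b)
    pooled = np.concatenate([a, b])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = np.median(pooled[:len(a)]) - np.median(pooled[len(a):])
        if abs(diff) >= abs(observed):
            count += 1
    # add-one correction keeps the estimated p-value away from zero
    return observed, (count + 1) / (n_perm + 1)
```

    Such tests make no distributional assumptions, which suits the strongly skewed soil Pb concentrations described here.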

  17. Mining Software Usage with the Automatic Library Tracking Database (ALTD)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadri, Bilel; Fahey, Mark R

    2013-01-01

    Tracking software usage is important for HPC centers, computer vendors, code developers and funding agencies to provide more efficient and targeted software support, and to forecast needs and guide HPC software efforts toward the Exascale era. However, accurately tracking software usage on HPC systems has been a challenging task. In this paper, we present a tool called the Automatic Library Tracking Database (ALTD) that has been developed and put in production on several Cray systems. The ALTD infrastructure prototype automatically and transparently stores information about the libraries linked into an application at compilation time and also the executables launched in a batch job. We will illustrate the usage of libraries, compilers and third-party software applications on a system managed by the National Institute for Computational Sciences.
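
    The core idea of link-time tracking can be sketched as a small store of (executable, library) records that is later queried for usage counts. This is a hypothetical illustration using SQLite, not ALTD's actual schema or implementation:

```python
import sqlite3

def init_db(conn):
    # One row per (executable, library) pair recorded at link time.
    conn.execute("""CREATE TABLE IF NOT EXISTS link_records (
        id INTEGER PRIMARY KEY,
        executable TEXT NOT NULL,
        library TEXT NOT NULL,
        link_time TEXT DEFAULT CURRENT_TIMESTAMP)""")

def record_link(conn, executable, libraries):
    # Called by a wrapped linker: store what this build pulled in.
    conn.executemany(
        "INSERT INTO link_records (executable, library) VALUES (?, ?)",
        [(executable, lib) for lib in libraries])

def library_usage(conn):
    # Number of distinct executables linking each library.
    return conn.execute(
        """SELECT library, COUNT(DISTINCT executable) AS n_exes
           FROM link_records GROUP BY library ORDER BY n_exes DESC"""
    ).fetchall()
```

    A second table keyed on batch-job launches would complete the picture of which linked libraries are actually exercised in production.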

  18. European cardiovascular mortality over the last three decades: evaluation of time trends, forecasts for 2016.

    PubMed

    Gaeta, M; Campanella, F; Gentile, L; Schifino, G M; Capasso, L; Bandera, F; Banfi, G; Arpesella, M; Ricci, C

    2017-01-01

    Circulatory diseases, in particular ischemic heart disease and stroke, represent the main causes of death worldwide, both in high-income and in middle- and low-income countries. Our aim is to provide a comprehensive report depicting circulatory disease mortality in Europe over the last 30 years and to address the sources of heterogeneity among different countries. Our study was performed using the WHO statistical information system - mortality database - and was restricted to the 28 countries belonging to the European Union (EU-28). We evaluated gender and age time series of all circulatory disease mortality, ischemic heart diseases, cerebrovascular diseases, and pulmonary and other circulatory diseases, and then produced forecasts for 2016. Mortality heterogeneity among countries was evaluated using the Cochran Q statistic and the I-squared index. Between 1985 and 2011 the standardized death rate (SDR) for deaths attributable to all circulatory system diseases decreased from 440.9 to 212.0 per 100,000 in EU-28, and a clear uniform reduction was observed. Heterogeneity among countries was found to be considerable; therefore, separate analyses were carried out by geographical area. We forecast a reduction in European cardiovascular mortality. Heterogeneity among countries could only in part be explained by geographical and health expenditure factors.
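
    The heterogeneity measures named in this record can be computed from per-country effect estimates and their variances. A minimal sketch with fixed-effect weights w = 1/v (generic formulas, not the authors' exact analysis):

```python
import numpy as np

def cochran_q_i2(estimates, variances):
    """Cochran's Q statistic and the I-squared index for a set of
    per-country effect estimates with known variances."""
    y = np.asarray(estimates, float)
    w = 1.0 / np.asarray(variances, float)  # fixed-effect weights
    pooled = np.sum(w * y) / np.sum(w)      # inverse-variance pooled estimate
    q = np.sum(w * (y - pooled) ** 2)       # weighted squared deviations
    df = len(y) - 1
    # I^2: share of total variability attributable to heterogeneity
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return q, i2
```

    Q is compared against a chi-squared distribution with df = k - 1, while I² expresses heterogeneity as a percentage independent of the number of countries.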

  19. Recent advances in the compilation of holocene relative Sea-level database in North America

    NASA Astrophysics Data System (ADS)

    Horton, B.; Vacchi, M.; Engelhart, S. E.; Nikitina, D.

    2015-12-01

    Reconstruction of relative sea level (RSL) has implications for the investigation of crustal movements, the calibration of earth rheology models and the reconstruction of ice sheets. In recent years, efforts have been made to create RSL databases following a standardized methodology. These regional databases provided a framework for developing our understanding of the primary mechanisms of RSL change since the Last Glacial Maximum, and a long-term baseline against which to gauge changes in sea level during the 20th century and forecasts for the 21st. Here we present two quality-controlled Holocene RSL databases compiled for North America. Along the Pacific coast of North America (British Columbia, Canada to California, USA), our re-evaluation of sea-level indicators from geological and archaeological investigations yields 841 RSL data-points, mainly from salt and freshwater wetlands or adjacent estuarine sediment as well as from isolation basins. Along the Atlantic coast of North America (Hudson Bay, Canada to South Carolina, USA), we are currently compiling a database including more than 2000 RSL data-points from isolation basins, salt and freshwater wetlands, beach ridges and intertidal deposits. We outline the difficulties encountered and the solutions adopted to compile databases across such different depositional environments. We address complex tectonics and the framework needed to compare such a large variability of RSL data-points. We discuss the implications of our results for glacio-isostatic adjustment (GIA) models in the two studied regions.

  20. Test operation of a real-time tsunami inundation forecast system using actual data observed by S-net

    NASA Astrophysics Data System (ADS)

    Suzuki, W.; Yamamoto, N.; Miyoshi, T.; Aoi, S.

    2017-12-01

    If tsunami inundation information can be forecast rapidly and stably before a large tsunami strikes, it would effectively help people realize the impending danger and the necessity of evacuation. Toward that goal, we have developed a prototype system to perform real-time tsunami inundation forecasting for Chiba prefecture, eastern Japan, using off-shore ocean bottom pressure data observed by the seafloor observation network for earthquakes and tsunamis along the Japan Trench (S-net) (Aoi et al., 2015, AGU). Because tsunami inundation simulation requires a large computation cost, we employ a database approach, searching pre-calculated tsunami scenarios that reasonably explain the observed S-net pressure data based on the multi-index method (Yamamoto et al., 2016, EPS). The scenario search is repeated regularly, not triggered by the occurrence of a tsunami event, and the forecast information is generated from the selected scenarios that meet the criterion. Test operation of the prototype system using actual observation data started in April 2017, and the performance and behavior of the system during non-tsunami periods have been examined. We found that the treatment of noise affecting the observed data is the main issue to be solved to improve the system. Even if the observed pressure data are filtered to extract tsunami signals, ordinary background noise or unusually large noise, such as high ocean waves due to storms, affects the comparison between the observed and scenario data. Due to this noise, tsunami scenarios are sometimes selected and a tsunami is forecast even though no tsunami event has actually occurred. In most such cases, the selected scenarios have fault models in the region along the Kurile or Izu-Bonin Trenches, far from the S-net region, or fault models below land. 
    Based on parallel operation of the forecast system with a different scenario search condition and examination of the fault models, we are improving the stability and performance of the forecast system. This work was supported by the Council for Science, Technology and Innovation (CSTI), Cross-ministerial Strategic Innovation Promotion Program (SIP), "Enhancement of societal resiliency against natural disasters" (funding agency: JST).
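
    The database approach described here, scoring pre-calculated scenarios against observed pressure data, can be sketched with two simple selection indices. This is an illustrative stand-in, not the actual multi-index method of Yamamoto et al. (2016); the index definitions and tolerances are assumptions:

```python
import numpy as np

def select_scenarios(obs, scenarios, rms_tol=0.5, amp_lo=0.5, amp_hi=2.0):
    """Pick pre-computed tsunami scenarios consistent with observed
    offshore pressure data. Each scenario is a vector of predicted
    gauge values aligned with `obs`. Two simple indices are used here:
    a normalized RMS misfit and an amplitude ratio; the operational
    method uses a richer set of criteria."""
    obs = np.asarray(obs, float)
    scale = np.sqrt(np.mean(obs ** 2)) + 1e-12
    selected = []
    for name, pred in scenarios.items():
        pred = np.asarray(pred, float)
        rms = np.sqrt(np.mean((pred - obs) ** 2)) / scale
        amp = (np.max(np.abs(pred)) + 1e-12) / (np.max(np.abs(obs)) + 1e-12)
        if rms <= rms_tol and amp_lo <= amp <= amp_hi:
            selected.append(name)
    return selected
```

    The noise problem described in the abstract shows up here directly: noisy `obs` can satisfy the criteria for some distant-fault scenario even when no tsunami occurred, which is why the selection thresholds matter.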

  1. Potential assessment of a neural network model with PCA/RBF approach for forecasting pollutant trends in Mong Kok urban air, Hong Kong.

    PubMed

    Lu, Wei-Zhen; Wang, Wen-Jian; Wang, Xie-Kang; Yan, Sui-Hang; Lam, Joseph C

    2004-09-01

    The forecasting of air pollutant trends has received much attention in recent years. It is an important and popular topic in environmental science, as concerns have been raised about the health impacts caused by unacceptable ambient air pollutant levels. Of greatest concern are metropolitan cities like Hong Kong. In Hong Kong, respirable suspended particulates (RSP), nitrogen oxides (NOx), and nitrogen dioxide (NO2) are major air pollutants due to the dominant usage of diesel fuel by commercial vehicles and buses. Hence, the study of the influence and trends of these pollutants is extremely significant to public health and the image of the city. The use of neural network techniques to predict air pollutant trends is regarded as a reliable and cost-effective method for the task of prediction. The work reported here involves developing an improved neural network model that combines the principal component analysis technique and the radial basis function network, and forecasts pollutant tendencies based on a recorded database. Compared with general neural network models, the proposed model features a simpler network architecture, a faster training speed, and a more satisfactory prediction performance. The improved model was evaluated with hourly time series of RSP, NOx and NO2 concentrations monitored at the Mong Kok Roadside Gaseous Monitoring Station in Hong Kong during the year 2000 and proved to be effective. The model developed is a potential tool for forecasting air quality parameters and is superior to traditional neural network methods.
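
    The PCA/RBF combination described in this record can be sketched as: project inputs onto the leading principal components, place Gaussian basis functions on the reduced features, and solve the output weights by least squares. A generic sketch, not the authors' exact architecture or training procedure:

```python
import numpy as np

def pca_fit(X, k):
    """Return the mean and top-k principal directions of X (rows = samples)."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:k]

def rbf_design(Z, centers, gamma):
    # Gaussian basis functions evaluated on the PCA-reduced features
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_pca_rbf(X, y, k=2, n_centers=10, gamma=1.0, seed=0):
    """Reduce inputs with PCA, pick RBF centers from the reduced training
    points, and solve the linear output weights by least squares."""
    mu, comps = pca_fit(X, k)
    Z = (X - mu) @ comps.T
    rng = np.random.default_rng(seed)
    centers = Z[rng.choice(len(Z), size=min(n_centers, len(Z)), replace=False)]
    G = rbf_design(Z, centers, gamma)
    w, *_ = np.linalg.lstsq(G, y, rcond=None)
    def predict(Xnew):
        Znew = (np.asarray(Xnew, float) - mu) @ comps.T
        return rbf_design(Znew, centers, gamma) @ w
    return predict
```

    The PCA step is what gives the simpler architecture the abstract mentions: the RBF layer operates on a few components rather than on the full input dimension.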

  2. The challenge of forecasting impacts of flash floods: test of a simplified hydraulic approach and validation based on insurance claim data

    NASA Astrophysics Data System (ADS)

    Le Bihan, Guillaume; Payrastre, Olivier; Gaume, Eric; Moncoulon, David; Pons, Frédéric

    2017-11-01

    Up to now, flash flood monitoring and forecasting systems, based on rainfall radar measurements and distributed rainfall-runoff models, generally aimed at estimating flood magnitudes - typically discharges or return periods - at selected river cross sections. The approach presented here goes one step further by proposing an integrated forecasting chain for the direct assessment of flash flood possible impacts on inhabited areas (number of buildings at risk in the presented case studies). The proposed approach includes, in addition to a distributed rainfall-runoff model, an automatic hydraulic method suited for the computation of flood extent maps on a dense river network and over large territories. The resulting catalogue of flood extent maps is then combined with land use data to build a flood impact curve for each considered river reach, i.e. the number of inundated buildings versus discharge. These curves are finally used to compute estimated impacts based on forecasted discharges. The approach has been extensively tested in the regions of Alès and Draguignan, located in the south of France, where well-documented major flash floods recently occurred. The article presents two types of validation results. First, the automatically computed flood extent maps and corresponding water levels are tested against rating curves at available river gauging stations as well as against local reference or observed flood extent maps. Second, a rich and comprehensive insurance claim database is used to evaluate the relevance of the estimated impacts for some recent major floods.
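
    The impact-curve step can be sketched as a simple interpolation from a reach's pre-computed (discharge, inundated buildings) table. The curve values below are hypothetical, not from the Alès or Draguignan case studies:

```python
import numpy as np

def impact_from_forecast(discharges, buildings, q_forecast):
    """Estimate the number of buildings at risk for a forecasted discharge
    by interpolating along a reach's pre-computed impact curve
    (discharge -> inundated buildings). Values beyond the tabulated range
    are clamped to the first/last impact value."""
    return float(np.interp(q_forecast, discharges, buildings))

# hypothetical impact curve for one river reach
q_tab = [50.0, 100.0, 200.0, 400.0]   # discharge, m^3/s
n_tab = [0.0, 12.0, 80.0, 260.0]      # inundated buildings
```

    Building one such curve per reach moves the expensive hydraulic computation offline, so the real-time chain only needs a table lookup per forecasted discharge.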

  3. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia B.; Adler, Robert; Hong, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that landslide forecasting may be more feasible at a regional scale. This study draws upon the prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship, and the results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited non-landslide event data for more comprehensive evaluation. 
    Algorithm performance may be further improved by incorporating additional triggering factors, such as tectonic activity, anthropogenic impacts and soil moisture, into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.
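
    The rainfall intensity-duration triggering relationship mentioned in this record is commonly expressed as a power law, I = α·D^(−β). A minimal check against such a threshold (the coefficients here are illustrative, not those calibrated for the Hurricane Mitch inventory):

```python
def exceeds_id_threshold(intensity_mm_h, duration_h, alpha=12.0, beta=0.6):
    """Check a rainfall event against a power-law intensity-duration
    triggering threshold I = alpha * D**(-beta). Longer-duration events
    need less intensity to trigger; alpha and beta are illustrative."""
    return intensity_mm_h >= alpha * duration_h ** (-beta)
```

    In an operational chain, cells where a susceptibility map is high and the observed rainfall exceeds this curve are the ones flagged for potential landslide activity.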

  4. A spatial database for landslides in northern Bavaria: A methodological approach

    NASA Astrophysics Data System (ADS)

    Jäger, Daniel; Kreuzer, Thomas; Wilde, Martina; Bemm, Stefan; Terhorst, Birgit

    2018-04-01

    Landslide databases provide essential information for hazard modeling, damage to buildings and infrastructure, mitigation, and research needs. This study presents the development of a landslide database system named WISL (Würzburg Information System on Landslides), currently storing detailed landslide data for northern Bavaria, Germany, in order to enable scientific queries as well as comparisons with other regional landslide inventories. WISL is based on free open source software solutions (PostgreSQL, PostGIS), ensuring good interoperability of the various software components and enabling further extensions with specific adaptations of self-developed software. Apart from that, WISL was designed to be particularly compatible for easy communication with other databases. As a central prerequisite for standardized, homogeneous data acquisition in the field, a customized data sheet for landslide description was compiled. This sheet also serves as an input mask for all data registration procedures in WISL. A variety of "in-database" solutions for landslide analysis provides the necessary scalability for the database, enabling operations at the local server. In its current state, WISL already enables extensive analysis and queries. This paper presents an example analysis of landslides in Oxfordian Limestones in the northeastern Franconian Alb, northern Bavaria. The results reveal widely differing landslides in terms of geometry and size. Further queries related to landslide activity classify the majority of the landslides as currently inactive; however, they clearly possess a certain potential for remobilization. Along with some active mass movements, a significant percentage of landslides potentially endangers residential areas or infrastructure. The main aspect of future enhancements of the WISL database relates to data extensions in order to increase research possibilities, as well as to transferring the system to other regions and countries.

  5. Forecasting the Solar Drivers of Solar Energetic Particle Events

    NASA Technical Reports Server (NTRS)

    Falconer, David A.; Moore, Ronald L.; Barghouty, Abdulnasser F.; Khazanov, Igor

    2012-01-01

    Large flares and fast CMEs are the drivers of the most severe space weather including Solar Energetic Particle Events (SEP Events). Large flares and their co-produced CMEs are powered by the explosive release of free magnetic energy stored in non-potential magnetic fields of sunspot active regions. The free energy is stored in and released from the low-beta regime of the active region's magnetic field above the photosphere, in the chromosphere and low corona. From our work over the past decade and from similar work of several other groups, it is now well established that (1) a proxy of the free magnetic energy stored above the photosphere can be measured from photospheric magnetograms, maps of the measured field in the photosphere, and (2) an active region's rate of production of major CME/flare eruptions in the coming day or so is strongly correlated with its present measured value of the free-energy proxy. These results have led us to use the large database of SOHO/MDI full-disk magnetograms spanning Solar Cycle 23 to obtain empirical forecasting curves that from an active region's present measured value of the free-energy proxy give the active region's expected rates of production of major flares, CMEs, fast CMEs, and SEP Events in the coming day or so (Falconer et al 2011, Space Weather, 9, S04003). We will present these forecasting curves and demonstrate the accuracy of their forecasts. In addition, we will show that the forecasts for major flares and fast CMEs can be made significantly more accurate by taking into account not only the value of the free energy proxy but also the active region's recent productivity of major flares; specifically, whether the active region has produced a major flare (GOES class M or X) during the past 24 hours before the time of the measured magnetogram.
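
    The empirical forecasting-curve idea, expected event rate in the coming day as a function of a measured free-energy proxy, can be sketched by binning historical region-day measurements and counting events per bin. A generic illustration; the binning and numbers are assumptions, not the published curves of Falconer et al.:

```python
import numpy as np

def build_forecast_curve(proxy_values, event_counts, bin_edges):
    """Empirical forecasting curve: expected number of events (e.g. major
    flares) in the next day per proxy bin, estimated as
    (total events) / (number of region-day measurements) in each bin."""
    proxy_values = np.asarray(proxy_values, float)
    event_counts = np.asarray(event_counts, float)
    rates = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (proxy_values >= lo) & (proxy_values < hi)
        rates.append(event_counts[mask].sum() / max(mask.sum(), 1))
    return np.array(rates)

def forecast_rate(curve, bin_edges, proxy):
    # Look up the bin containing a new measurement (clamped to the curve).
    i = np.clip(np.searchsorted(bin_edges, proxy) - 1, 0, len(curve) - 1)
    return curve[i]
```

    The refinement mentioned in the record, conditioning also on whether the region produced an M- or X-class flare in the prior 24 hours, would amount to building separate curves for the flare-active and flare-quiet subsets of region-days.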

  6. Using Haines Index coupled with fire weather model predicted from high resolution LAM forecasts to assess wildfire extreme behaviour in Southern Europe.

    NASA Astrophysics Data System (ADS)

    Gaetani, Francesco; Baptiste Filippi, Jean; Simeoni, Albert; D'Andrea, Mirko

    2010-05-01

    The Haines Index (HI) was developed by the USDA Forest Service to measure the atmosphere's contribution to the growth potential of a wildfire. The Haines Index combines two atmospheric factors that are known to have an effect on wildfires: stability and dryness. As an operational tool, the HI has proved its ability to predict plume-dominated, high-intensity wildfires. However, since the HI does not take into account fuel continuity, composition and moisture conditions, or the effects of wind and topography on fire behaviour, its use as a forecasting tool should be carefully considered. In this work we propose the use of the HI, predicted from HR Limited Area Model forecasts, coupled with a Fire Weather model (i.e., the RISICO system) fully operational in Italy since 2003. RISICO is based on dynamic models able to represent in space and in time the effects that the environment and plant physiology have on fuels and, in turn, on the potential behaviour of wildfires. The system automatically acquires from remote databases a thorough data set of input information of both in situ and spatial nature. Meteorological observations, radar data, Limited Area Model weather forecasts, EO data, and fuel data are managed by a Unified Interface able to process a wide set of different data. Specific semi-physical models are used in the system to simulate the dynamics of the fuels (load and moisture contents of dead and live fuel) and the potential fire behaviour (rate of spread and linear intensity). A preliminary validation of this approach will be provided with reference to Sardinia and Corsica, two major islands of the Mediterranean Sea frequently affected by extreme plume-dominated wildfires. A time series of about 3000 wildfires burnt in Sardinia and Corsica in 2007 and 2008 will be used to evaluate the capability of the HI, coupled with the outputs of the Fire Weather model, to forecast the actual risk in time and in space.
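
    The two-term structure of the HI can be sketched for its mid-level variant: a stability score from the 850-700 hPa temperature difference plus a dryness score from the 850 hPa dew-point depression, each rated 1-3. The thresholds below follow the commonly tabulated mid-level formulation and should be treated as indicative:

```python
def haines_index_mid(t850, t700, td850):
    """Mid-level Haines Index: a stability term (850-700 hPa temperature
    difference) plus a dryness term (850 hPa dew-point depression), each
    scored 1-3. Temperatures in degrees Celsius; thresholds follow the
    commonly tabulated mid-level variant."""
    a = t850 - t700                              # stability term
    stability = 1 if a <= 5 else (2 if a <= 10 else 3)
    b = t850 - td850                             # dryness term
    dryness = 1 if b <= 5 else (2 if b <= 12 else 3)
    return stability + dryness                   # 2 (low) .. 6 (high potential)
```

    Precisely because the HI reduces to these two upper-air terms, the coupling with a fuel-aware system such as RISICO is needed to account for fuel state, wind and topography.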

  7. Physician supply forecast: better than peering in a crystal ball?

    PubMed Central

    Roberfroid, Dominique; Leonard, Christian; Stordeur, Sabine

    2009-01-01

    Background Anticipating physician supply to tackle future health challenges is a crucial but complex task for policy planners. A number of forecasting tools are available, but the methods, advantages and shortcomings of such tools are not straightforward and not always well appraised. Therefore this paper had two objectives: to present a typology of existing forecasting approaches and to analyse the methodology-related issues. Methods A literature review was carried out in electronic databases Medline-Ovid, Embase and ERIC. Concrete examples of planning experiences in various countries were analysed. Results Four main forecasting approaches were identified. The supply projection approach defines the necessary inflow to maintain or to reach in the future an arbitrary predefined level of service offer. The demand-based approach estimates the quantity of health care services used by the population in the future to project physician requirements. The needs-based approach involves defining and predicting health care deficits so that they can be addressed by an adequate workforce. Benchmarking health systems with similar populations and health profiles is the last approach. These different methods can be combined to perform a gap analysis. The methodological challenges of such projections are numerous: most often static models are used and their uncertainty is not assessed; valid and comprehensive data to feed into the models are often lacking; and a rapidly evolving environment affects the likelihood of projection scenarios. As a result, the internal and external validity of the projections included in our review appeared limited. Conclusion There is no single accepted approach to forecasting physician requirements. The value of projections lies in their utility in identifying the current and emerging trends to which policy-makers need to respond. 
A genuine gap analysis, effective monitoring of key parameters, and comprehensive workforce planning are key elements in improving the usefulness of physician supply projections. PMID:19216772
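
In its simplest static form, the supply projection approach described above reduces to a stock-flow recursion. The sketch below uses hypothetical inputs (headcount, annual inflow and attrition rate are illustrative, not figures from the review):

```python
def project_supply(stock, inflow, attrition_rate, years):
    """Stock-flow projection: each year's workforce equals the previous
    stock minus attrition (retirement, exit) plus new entrants."""
    out = [stock]
    for _ in range(years):
        stock = stock * (1 - attrition_rate) + inflow
        out.append(round(stock))
    return out

# 10,000 physicians, 400 graduates entering per year, 3% annual attrition.
print(project_supply(10_000, 400, 0.03, 5))
```

A planner would invert the same recursion to find the inflow needed to hold supply at a target level, which is the "necessary inflow" question the approach poses.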

  8. The regional geological hazard forecast based on rainfall and WebGIS in Hubei, China

    NASA Astrophysics Data System (ADS)

    Zheng, Guizhou; Chao, Yi; Xu, Hongwen

    2008-10-01

    Various disasters have been a serious threat to humans and are increasing over time. The reduction and prevention of hazards is the largest problem faced by local governments. The study of disasters has drawn more and more attention, mainly due to increasing awareness of their socio-economic impact. Hubei province, one of the fastest economically developing provinces in China, suffered big economic losses from frequent geo-hazard events in recent years, with estimated damage of approximately 3000 million RMB. It is therefore important to establish an efficient way to mitigate potential damage and reduce the losses of property and life derived from disasters. This paper presents the procedure of setting up a regional geological hazard forecast and information releasing system for Hubei province, combining advanced techniques such as the World Wide Web (WWW), online databases and ASP, based on a WebGIS platform (MAPGIS-IMS) and rainfall information. A Web-based interface was developed in this system using a three-tiered architecture based on client-server technology. The study focused on the upload of rainfall data, the definition of rainfall threshold values, the creation of a geological disaster warning map and the forecasting of geo-hazards related to rainfall. Its purposes are to contribute to the management of massive individual and regional geological disaster spatial data, to help forecast the conditional probabilities of occurrence of the various disasters that might be posed by rainfall, and to release forecasting information for Hubei province in a timely manner via the internet throughout all levels of government, the private and nonprofit sectors, and the academic community. This system has worked efficiently and stably in the internet environment, in close connection with the meteorological observatory.
The Environment Station of Hubei Province is making increased use of our Web tool to assist the decision-making process in analyzing geo-hazards in Hubei Province. It would be even more helpful to present geo-hazard information to Hubei administrators.
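
The rainfall-threshold step of such a warning system can be sketched as a simple exceedance check. The threshold values below are purely illustrative, not those used by the Hubei system:

```python
def warning_level(rain_24h_mm, thresholds=(30.0, 60.0, 100.0)):
    """Map a 24-h rainfall total to a geo-hazard warning level by
    counting how many ordered threshold values it exceeds."""
    level = 0
    for t in thresholds:
        if rain_24h_mm >= t:
            level += 1
    return level  # 0 = no warning ... 3 = highest alert

print(warning_level(75.0))  # exceeds the first two thresholds -> 2
```

In an operational system the thresholds would vary by region and antecedent soil moisture, and the resulting level would drive the warning map served over the Web interface.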

  9. Contribution of insurance data to cost assessment of coastal flood damage to residential buildings: insights gained from Johanna (2008) and Xynthia (2010) storm events

    NASA Astrophysics Data System (ADS)

    André, C.; Monfort, D.; Bouzit, M.; Vinchon, C.

    2013-08-01

    There are a number of methodological issues involved in assessing damage caused by natural hazards. The first is the lack of data, due to the rarity of events and the widely different circumstances in which they occur. Thus, historical data, albeit scarce, should not be neglected when seeking to build ex-ante risk management models. This article analyses the input of insurance data for two recent severe coastal storm events, to examine what causal relationships may exist between hazard characteristics and the level of damage incurred by residential buildings. To do so, data was collected at two levels: from lists of about 4000 damage records, 358 loss adjustment reports were consulted, constituting a detailed damage database. The results show that for flooded residential buildings, over 75% of reconstruction costs are associated with interior elements, with damage to structural components remaining very localised and negligible. Further analysis revealed a high scatter between costs and water depth, suggesting that uncertainty remains high in drawing up damage functions with insurance data alone. Due to the paper format of the loss adjustment reports, and the lack of harmonisation between their contents, the collection stage called for a considerable amount of work. For future events, establishing a standardised process for archiving damage information could significantly contribute to the production of such empirical damage functions. Nevertheless, complementary sources of data on hazards and asset vulnerability parameters will definitely still be necessary for damage modelling; multivariate approaches, crossing insurance data with external material, should also be investigated more deeply.
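
A depth-damage function of the kind discussed here is, in its simplest form, a regression of repair cost on water depth. Below is a minimal least-squares sketch; the cost figures are hypothetical, not the insurance records analysed in the paper, and the high scatter the authors report means any such fit carries wide uncertainty.

```python
def fit_depth_damage(depths_m, costs):
    """Least-squares line cost = a + b * depth: a crude empirical
    depth-damage function (real insurance data scatter widely around it)."""
    n = len(depths_m)
    mx = sum(depths_m) / n
    my = sum(costs) / n
    b = sum((x - mx) * (y - my) for x, y in zip(depths_m, costs)) / \
        sum((x - mx) ** 2 for x in depths_m)
    a = my - b * mx
    return a, b

# Hypothetical per-dwelling repair costs in euros vs. water depth in metres.
depths = [0.2, 0.5, 0.8, 1.2, 1.5]
costs = [8_000, 15_000, 22_000, 33_000, 40_000]
a, b = fit_depth_damage(depths, costs)
print(round(a), round(b))  # intercept, and euros per metre of water depth
```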

  10. Contribution of insurance data to cost assessment of coastal flood damage to residential buildings: insights gained from Johanna (2008) and Xynthia (2010) storm events

    NASA Astrophysics Data System (ADS)

    André, C.; Monfort, D.; Bouzit, M.; Vinchon, C.

    2013-03-01

    There are a number of methodological issues involved in assessing damage caused by natural hazards. The first is the lack of data, due to the rarity of events and the widely different circumstances in which they occur. Thus, historical data, albeit scarce, should not be neglected when seeking to build ex-ante risk management models. This article analyses the input of insurance data for two recent severe coastal storm events, to examine what causal relationships may exist between hazard characteristics and the level of damage incurred by residential buildings. To do so, data was collected at two levels: from lists of about 4000 damage records, 358 loss adjustment reports were consulted, constituting a detailed damage database. The results show that for flooded residential buildings, over 75% of reconstruction costs are associated with interior elements, with damage to structural components remaining very localised and negligible. Further analysis revealed a high scatter between costs and water depth, suggesting that uncertainty remains high in drawing up damage functions with insurance data alone. Due to the paper format of the loss adjustment reports and the lack of harmonisation between their contents, the collection stage called for a considerable amount of work. For future events, establishing a standardised process for archiving damage information could significantly contribute to the production of such empirical damage functions. Nevertheless, complementary sources of data on hazards and asset vulnerability parameters will definitely still be necessary for damage modelling; multivariate approaches, crossing insurance data with external material, should also be investigated more deeply.

  11. A qualitative metasynthesis: family involvement in decision making for people with dementia in residential aged care.

    PubMed

    Petriwskyj, Andrea; Gibson, Alexandra; Parker, Deborah; Banks, Susan; Andrews, Sharon; Robinson, Andrew

    2014-06-01

    Involving people in decisions about their care is good practice and ensures optimal outcomes. Despite considerable research, in practice family involvement in decision making can be challenging for both care staff and families. The aim of this review was to identify and appraise existing knowledge about family involvement in decision making for people with dementia living in residential aged care. The present Joanna Briggs Institute meta-synthesis considered studies that investigate involvement of family members in decision making for people with dementia in residential aged care settings. While quantitative and qualitative studies were included in the review, this article presents the qualitative findings. A comprehensive search of studies was conducted in 15 electronic databases. The search was limited to papers published in English, from 1990 to 2013. Twenty-six studies were identified as relevant for this review; 16 were qualitative papers reporting on 15 studies. Two independent reviewers assessed the studies for methodological validity and extracted the data using the standardized Joanna Briggs Institute Qualitative Assessment and Review Instrument (JBI-QARI). The findings were synthesized using JBI-QARI. The findings related to the decisions encountered and made by family surrogates; family perceptions of, and preferences for, their role/s; factors regarding treatment decisions and the collaborative decision-making process; and outcomes for family decision makers. Results indicate varied and complex experiences and multiple factors influencing decision making. Communication and contacts between staff and families and the support available for families should be addressed, as well as the role of different stakeholders in decisions.

  12. 24 CFR 40.2 - Definition of “residential structure”.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... OWNED RESIDENTIAL STRUCTURES § 40.2 Definition of “residential structure”. (a) As used in this part, the term residential structure means a residential structure (other than a privately owned residential structure and a residential structure on a military reservation): (1) Constructed or altered by or on behalf...

  13. Hydro-meteorological evaluation of downscaled global ensemble rainfall forecasts

    NASA Astrophysics Data System (ADS)

    Gaborit, Étienne; Anctil, François; Fortin, Vincent; Pelletier, Geneviève

    2013-04-01

    Ensemble rainfall forecasts are of high interest for decision making, as they provide an explicit and dynamic assessment of the uncertainty in the forecast (Ruiz et al. 2009). However, for hydrological forecasting, their low resolution currently limits their use to large watersheds (Maraun et al. 2010). In order to bridge this gap, various implementations of the statistic-stochastic multi-fractal downscaling technique presented by Perica and Foufoula-Georgiou (1996) were compared, bringing Environment Canada's global ensemble rainfall forecasts from a 100 by 70-km resolution down to 6 by 4-km, increasing each pixel's rainfall variance while preserving its original mean. For comparison purposes, simpler methods were also implemented, such as bi-linear interpolation, which disaggregates global forecasts without modifying their variance. The downscaled meteorological products were evaluated using different scores and diagrams, from both meteorological and hydrological viewpoints. The meteorological evaluation was conducted by comparing the forecasted rainfall depths against nine days of observed values taken from the Québec City rain gauge database. These nine days correspond to strong precipitation events during the summer of 2009. For the hydrological evaluation, the hydrological models SWMM5 and (a modified version of) GR4J were implemented on a small 6 km² urban catchment located in the Québec City region. Ensemble hydrologic forecasts with a time step of 3 hours were then performed over a 3-month period of the summer of 2010 using the original and downscaled ensemble rainfall forecasts. The most important conclusions of this work are that the overall quality of the forecasts was preserved during the disaggregation procedure and that the disaggregated products using this variance-enhancing method were of similar quality to the bi-linear interpolation products.
However, the variance and dispersion of the different members were, of course, much improved for the variance-enhanced products compared to bi-linear interpolation, which is a decisive advantage. The disaggregation technique of Perica and Foufoula-Georgiou (1996) hence represents an interesting way of bridging the gap between the meteorological models' resolution and the high degree of spatial precision sometimes required by hydrological models in their precipitation representation. References: Maraun, D., Wetterhall, F., Ireson, A. M., Chandler, R. E., Kendon, E. J., Widmann, M., Brienen, S., Rust, H. W., Sauter, T., Themeßl, M., Venema, V. K. C., Chun, K. P., Goodess, C. M., Jones, R. G., Onof, C., Vrac, M., and Thiele-Eich, I. 2010. Precipitation downscaling under climate change: recent developments to bridge the gap between dynamical models and the end user. Reviews of Geophysics, 48(3): RG3003, [np]. DOI: 10.1029/2009RG000314. Perica, S., and Foufoula-Georgiou, E. 1996. Model for multiscale disaggregation of spatial rainfall based on coupling meteorological and scaling descriptions. Journal of Geophysical Research, 101(D21): 26347-26361. Ruiz, J., Saulo, C. and Kalnay, E. 2009. Comparison of Methods Used to Generate Probabilistic Quantitative Precipitation Forecasts over South America. Weather and Forecasting, 24: 319-336. DOI: 10.1175/2008WAF2007098.1. This work is distributed under the Creative Commons Attribution 3.0 Unported License together with an author copyright. This license does not conflict with the regulations of the Crown Copyright.
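
The core idea of a mean-preserving, variance-enhancing disaggregation can be sketched very simply. This is not the Perica and Foufoula-Georgiou multifractal scheme itself, only an illustration of the mass-conservation constraint using multiplicative random weights:

```python
import random

def disaggregate(coarse_value, n_fine, cv=0.5, rng=random.Random(42)):
    """Split one coarse-pixel rainfall value into n_fine sub-pixel values
    whose mean equals the coarse value (multiplicative weights renormalised
    to unit mean), so total water is conserved while sub-grid variance
    is introduced."""
    weights = [max(0.0, rng.gauss(1.0, cv)) for _ in range(n_fine)]
    s = sum(weights)
    # Renormalise so the fine-pixel mean is exactly the coarse value.
    return [coarse_value * w * n_fine / s for w in weights]

fine = disaggregate(10.0, 16)            # 10 mm over one coarse cell
print(round(sum(fine) / len(fine), 6))   # mean preserved -> 10.0
```

Bi-linear interpolation, by contrast, smooths between coarse pixels and so leaves the sub-grid variance unchanged, which is the distinction the evaluation above turns on.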

  14. Development of a Land Use Database for the Little Blackwater Watershed, Dorchester County, Maryland

    USGS Publications Warehouse

    Milheim, Lesley E.; Jones, John W.; Barlow, Roger A.

    2007-01-01

    Many agricultural and forested areas in proximity to National Wildlife Refuges (NWR) are under increasing economic pressure to develop lands for commercial or residential development. The upper portion of the Little Blackwater River watershed - a 27 square mile area within largely low-lying Dorchester County, Maryland, on the eastern shore of the Chesapeake Bay - is important to the U.S. Fish and Wildlife Service (USFWS) because it flows toward the Blackwater National Wildlife Refuge (BNWR), and developmental impacts of areas upstream from the BNWR are unknown. One of the primary concerns for the refuge is how storm-water runoff may affect living resources downstream. The Egypt Road project (fig. 1), for which approximately 600 residential units have been approved, has the potential to markedly change the land use and land cover on the west bank of the Little Blackwater River. In an effort to limit anticipated impacts, the Maryland Department of Natural Resources (Maryland DNR) recently decided to purchase some of the lands previously slated for development. Local topography, a high water table (typically 1 foot or less below the land surface), and hydric soils present a challenge for the best management of storm-water flow from developed surfaces. A spatial data coordination group was formed by the Dorchester County Soil and Conservation District to collect data to aid decisionmakers in watershed management and on the possible impacts of development on this watershed. Determination of streamflow combined with land cover and impervious-surface baselines will allow linking of hydrologic and geologic factors that influence the land surface. This baseline information will help planners, refuge managers, and developers discuss issues and formulate best management practices to mitigate development impacts on the refuge. 
In consultation with the Eastern Region Geospatial Information Office, the dataset selected as the baseline land-cover source was the June-July 2005 National Agricultural Imagery Program (NAIP) 1-meter resolution orthoimagery of Maryland. This publicly available, statewide dataset provided the imagery closest in time to the installation of a U.S. Geological Survey (USGS) Water Resources Discipline gaging station on the Little Blackwater River. It also captures land cover status just before major residential development occurs. This document describes the process used to create a land use database for the Little Blackwater watershed.

  15. Development of an Impervious-Surface Database for the Little Blackwater River Watershed, Dorchester County, Maryland

    USGS Publications Warehouse

    Milheim, Lesley E.; Jones, John W.; Barlow, Roger A.

    2007-01-01

    Many agricultural and forested areas in proximity to National Wildlife Refuges (NWR) are under increasing economic pressure for commercial or residential development. The upper portion of the Little Blackwater River watershed - a 27 square mile area within largely low-lying Dorchester County, Maryland, on the eastern shore of the Chesapeake Bay - is important to the U.S. Fish and Wildlife Service (USFWS) because it flows toward the Blackwater National Wildlife Refuge (BNWR), and developmental impacts of areas upstream from the BNWR are unknown. One of the primary concerns for the Refuge is how storm-water runoff may affect living resources downstream. The Egypt Road project (fig. 1), for which approximately 600 residential units have been approved, has the potential to markedly change the land use and land cover on the west bank of the Little Blackwater River. In an effort to limit anticipated impacts, the Maryland Department of Natural Resources (Maryland DNR) recently decided to purchase some of the lands previously slated for development. Local topography, a high water table (typically 1 foot or less below the land surface), and hydric soils present a challenge for the best management of storm-water flow from developed surfaces. A spatial data coordination group was formed by the Dorchester County Soil and Conservation District to collect data to aid decisionmakers in watershed management and on the possible impacts of development on this watershed. Determination of streamflow combined with land cover and impervious-surface baselines will allow linking of hydrologic and geologic factors that influence the land surface. This baseline information will help planners, refuge managers, and developers discuss issues and formulate best management practices to mitigate development impacts on the refuge. 
In consultation with the Eastern Region Geospatial Information Office, the dataset selected as the baseline land-cover source was the June-July 2005 National Agricultural Imagery Program (NAIP) 1-meter resolution orthoimagery of Maryland. This publicly available, statewide dataset provided the imagery closest in time to the installation of a U.S. Geological Survey (USGS) Water Resources Discipline gaging station on the Little Blackwater River. It also captures land cover status just before major residential development occurs. This document describes the process used to create a database of impervious surfaces for the Little Blackwater watershed.

  16. Cold Climate Foundation Retrofit Experimental Hygrothermal Performance: Cloquet Residential Research Facility Laboratory Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, Louise F.; Harmon, Anna C.

    2015-04-01

    Thermal and moisture problems in existing basements create a unique challenge because the exterior face of the wall is not easily or inexpensively accessible. This approach addresses thermal and moisture management from the interior face of the wall without disturbing the exterior soil and landscaping. This approach has the potential for improving durability, comfort, and indoor air quality. This project was funded jointly by the National Renewable Energy Laboratory (NREL) and Oak Ridge National Laboratory (ORNL). ORNL focused on developing a full basement wall system experimental database to enable others to validate hygrothermal simulation codes. NREL focused on testing the moisture durability of practical basement wall interior insulation retrofit solutions for cold climates. The project has produced a physically credible and reliable long-term hygrothermal performance database for retrofit foundation wall insulation systems in zone 6 and 7 climates that are fully compliant with the performance criteria in the 2009 Minnesota Energy Code. The experimental data were configured into a standard format that can be published online and that is compatible with standard commercially available spreadsheet and database software.

  17. The distribution of common construction materials at risk to acid deposition in the United States

    NASA Astrophysics Data System (ADS)

    Lipfert, Frederick W.; Daum, Mary L.

    Information on the geographic distribution of various types of exposed materials is required to estimate the economic costs of damage to construction materials from acid deposition. This paper focuses on the identification, evaluation and interpretation of data describing the distributions of exterior construction materials, primarily in the United States. This information could provide guidance on how data needed for future economic assessments might be acquired in the most cost-effective ways. Materials distribution surveys from 16 cities in the U.S. and Canada and five related databases from government agencies and trade organizations were examined. Data on residential buildings are more commonly available than on nonresidential buildings; little geographically resolved information on distributions of materials in infrastructure was found. Survey results generally agree with the appropriate ancillary databases, but the usefulness of the databases is often limited by their coarse spatial resolution. Information on those materials which are most sensitive to acid deposition is especially scarce. Since a comprehensive error analysis has never been performed on the data required for an economic assessment, it is not possible to specify the corresponding detailed requirements for data on the distributions of materials.

  18. Numerical Procedure to Forecast the Tsunami Parameters from a Database of Pre-Simulated Seismic Unit Sources

    NASA Astrophysics Data System (ADS)

    Jiménez, César; Carbonel, Carlos; Rojas, Joel

    2018-04-01

    We have implemented a numerical procedure to forecast the parameters of a tsunami, such as the arrival time of the front of the first wave and the maximum wave height at real and virtual tidal stations along the Peruvian coast. For this purpose, a database of pre-computed synthetic tsunami waveforms (or Green functions) was obtained from numerical simulations of seismic unit sources (dimension: 50 × 50 km²) for subduction zones from southern Chile to northern Mexico. A bathymetry resolution of 30 arc-sec (approximately 927 m) was used. The resulting tsunami waveform is obtained from the superposition of synthetic waveforms corresponding to several seismic unit sources contained within the tsunami source geometry. The numerical procedure was applied to the Chilean tsunami of April 1, 2014. The results show a very good correlation for stations with wave amplitudes greater than 1 m: for the Arica tide station an error (computed from the maximum heights of the observed and simulated waveforms) of 3.5% was obtained, for the Callao station the error was 12%, and the largest error, 53.5%, was at Chimbote. However, due to the low amplitude of the Chimbote wave (<1 m), the overestimate in this case is not important for evacuation purposes. The aim of the present research is tsunami early warning, where speed is required rather than accuracy, so the results should be taken as preliminary.
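
The superposition step is simple linear algebra over the pre-computed database. A minimal sketch with toy unit-source waveforms follows; the numbers are illustrative, not values from the Peruvian database:

```python
def superpose(greens, slips):
    """Tsunami waveform at a tide station as a linear superposition of
    pre-computed unit-source waveforms (Green functions), each scaled by
    the slip assigned to that unit source."""
    n = len(greens[0])
    return [sum(s * g[t] for s, g in zip(slips, greens)) for t in range(n)]

# Two toy unit-source waveforms sampled at 4 instants (metres per unit slip).
g1 = [0.0, 0.2, 0.5, 0.3]
g2 = [0.0, 0.0, 0.1, 0.4]
wave = superpose([g1, g2], slips=[2.0, 3.0])
print([round(v, 3) for v in wave])  # -> [0.0, 0.4, 1.3, 1.8]
print(round(max(wave), 3))          # forecast maximum wave height -> 1.8
```

Because only the scaling and summation are done at alert time, the expensive propagation modelling is amortised offline, which is what makes the procedure fast enough for early warning.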

  19. Numerical Procedure to Forecast the Tsunami Parameters from a Database of Pre-Simulated Seismic Unit Sources

    NASA Astrophysics Data System (ADS)

    Jiménez, César; Carbonel, Carlos; Rojas, Joel

    2017-09-01

    We have implemented a numerical procedure to forecast the parameters of a tsunami, such as the arrival time of the front of the first wave and the maximum wave height at real and virtual tidal stations along the Peruvian coast. For this purpose, a database of pre-computed synthetic tsunami waveforms (or Green functions) was obtained from numerical simulations of seismic unit sources (dimension: 50 × 50 km²) for subduction zones from southern Chile to northern Mexico. A bathymetry resolution of 30 arc-sec (approximately 927 m) was used. The resulting tsunami waveform is obtained from the superposition of synthetic waveforms corresponding to several seismic unit sources contained within the tsunami source geometry. The numerical procedure was applied to the Chilean tsunami of April 1, 2014. The results show a very good correlation for stations with wave amplitudes greater than 1 m: for the Arica tide station an error (computed from the maximum heights of the observed and simulated waveforms) of 3.5% was obtained, for the Callao station the error was 12%, and the largest error, 53.5%, was at Chimbote. However, due to the low amplitude of the Chimbote wave (<1 m), the overestimate in this case is not important for evacuation purposes. The aim of the present research is tsunami early warning, where speed is required rather than accuracy, so the results should be taken as preliminary.

  20. MINDMAP: establishing an integrated database infrastructure for research in ageing, mental well-being, and the urban environment.

    PubMed

    Beenackers, Mariëlle A; Doiron, Dany; Fortier, Isabel; Noordzij, J Mark; Reinhard, Erica; Courtin, Emilie; Bobak, Martin; Chaix, Basile; Costa, Giuseppe; Dapp, Ulrike; Diez Roux, Ana V; Huisman, Martijn; Grundy, Emily M; Krokstad, Steinar; Martikainen, Pekka; Raina, Parminder; Avendano, Mauricio; van Lenthe, Frank J

    2018-01-19

    Urbanization and ageing have important implications for public mental health and well-being. Cities pose major challenges for older citizens, but also offer opportunities to develop, test, and implement policies, services, infrastructure, and interventions that promote mental well-being. The MINDMAP project aims to identify the opportunities and challenges posed by urban environmental characteristics for the promotion and management of mental well-being and cognitive function of older individuals. MINDMAP aims to achieve its research objectives by bringing together longitudinal studies from 11 countries covering over 35 cities linked to databases of area-level environmental exposures and social and urban policy indicators. The infrastructure supporting integration of this data will allow multiple MINDMAP investigators to safely and remotely co-analyse individual-level and area-level data. Individual-level data is derived from baseline and follow-up measurements of ten participating cohort studies and provides information on mental well-being outcomes, sociodemographic variables, health behaviour characteristics, social factors, measures of frailty, physical function indicators, and chronic conditions, as well as blood-derived clinical biochemistry-based biomarkers and genetic biomarkers. Area-level information on physical environment characteristics (e.g. green spaces, transportation), socioeconomic and sociodemographic characteristics (e.g. neighbourhood income, residential segregation, residential density), and social environment characteristics (e.g. social cohesion, criminality) and national and urban social policies is derived from publicly available sources such as geoportals and administrative databases. The linkage, harmonization, and analysis of data from different sources are being carried out using piloted tools to optimize the validity of the research results and transparency of the methodology.
MINDMAP is a novel research collaboration that is combining population-based cohort data with publicly available datasets not typically used for ageing and mental well-being research. Integration of various data sources and observational units into a single platform will help to explain the differences in ageing-related mental and cognitive disorders both within as well as between cities in Europe, the US, Canada, and Russia and to assess the causal pathways and interactions between the urban environment and the individual determinants of mental well-being and cognitive ageing in older adults.

  1. A hydrogen energy carrier. Volume 2: Systems analysis

    NASA Technical Reports Server (NTRS)

    Savage, R. L. (Editor); Blank, L. (Editor); Cady, T. (Editor); Cox, K. (Editor); Murray, R. (Editor); Williams, R. D. (Editor)

    1973-01-01

    A systems analysis of hydrogen as an energy carrier in the United States indicated that it is feasible to use hydrogen in all energy use areas, except some types of transportation. These use areas are industrial, residential and commercial, and electric power generation. Saturation concept and conservation concept forecasts of future total energy demands were made. Projected costs of producing hydrogen from coal or from nuclear heat combined with thermochemical decomposition of water are in the range $1.00 to $1.50 per million Btu of hydrogen produced. Other methods are estimated to be more costly. The use of hydrogen as a fuel will require the development of large-scale transmission and storage systems. A pipeline system similar to the existing natural gas pipeline system appears practical, if design factors are included to avoid hydrogen environment embrittlement of pipeline metals. Conclusions from the examination of the safety, legal, environmental, economic, political and societal aspects of hydrogen fuel are that a hydrogen energy carrier system would be compatible with American values and the existing energy system.

  2. Calendar Year 2007 Program Benefits for U.S. EPA Energy Star Labeled Products: Expanded Methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanchez, Marla; Homan, Gregory; Lai, Judy

    2009-09-24

    This report provides a top-level summary of national savings achieved by the Energy Star voluntary product labeling program. To best quantify and analyze savings for all products, we developed a bottom-up product-based model. Each Energy Star product type is characterized by product-specific inputs that result in a product savings estimate. Our results show that through 2007, U.S. EPA Energy Star labeled products saved 5.5 Quads of primary energy and avoided 100 MtC of emissions. Although Energy Star labeled products encompass over forty product types, only five of those product types accounted for 65 percent of all Energy Star carbon reductions achieved to date: listed in order of savings magnitude, these are monitors, printers, residential light fixtures, televisions, and furnaces. The forecast shows that U.S. EPA's program is expected to save 12.2 Quads of primary energy and avoid 215 MtC of emissions over the period 2008-2015.
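
A bottom-up product-based model of this kind is, at its core, a sum over product types of units in service times per-unit savings. The sketch below uses hypothetical figures, not the report's actual inputs:

```python
def program_savings(products):
    """Bottom-up model: national savings are the sum over product types
    of units in service times per-unit annual savings."""
    return sum(units * per_unit for units, per_unit in products.values())

# (units in service, kWh saved per unit per year) - hypothetical numbers.
products = {
    "monitors": (50e6, 120.0),
    "printers": (20e6, 80.0),
    "light fixtures": (30e6, 60.0),
}
print(program_savings(products) / 1e9, "TWh/yr")  # -> 9.4 TWh/yr
```

The report's per-product inputs would also include shipment years, lifetimes, and baseline unit energy consumption, but the aggregation step is exactly this sum.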

  3. Applied Meteorology Unit (AMU) Quarterly Report Fourth Quarter FY-13

    NASA Technical Reports Server (NTRS)

    Bauman, William; Crawford, Winifred; Watson, Leela; Shafer, Jaclyn; Huddleston, Lisa

    2013-01-01

    Ms. Shafer completed the task to determine relationships between pressure gradients and peak winds at Vandenberg Air Force Base (VAFB), and began developing a climatology for the VAFB wind towers; Dr. Huddleston completed the task to develop a tool to help forecast the time of the first lightning strike of the day in the Kennedy Space Center (KSC)/Cape Canaveral Air Force Station (CCAFS) area; Dr. Bauman completed work on a severe weather forecast tool focused on the Eastern Range (ER), and also developed upper-winds analysis tools for VAFB and Wallops Flight Facility (WFF); Ms. Crawford processed and displayed radar data in the software she will use to create a dual-Doppler analysis over the east-central Florida and KSC/CCAFS areas; Mr. Decker completed developing a wind pairs database for the Launch Services Program to use when evaluating upper-level winds for launch vehicles; Dr. Watson continued work to assimilate observational data into the high-resolution model configurations she created for WFF and the ER.

  4. Estimation of PV energy production based on satellite data

    NASA Astrophysics Data System (ADS)

    Mazurek, G.

    2015-09-01

    Photovoltaic (PV) technology is an attractive source of power for systems without a connection to the power grid. Because of seasonal variations in solar radiation, designing such a power system requires careful analysis to provide the required reliability. In this paper we present results of three-year measurements of an experimental PV system located in Poland, based on a polycrystalline silicon module. Irradiation values calculated from ground measurements have been compared with data from solar radiation databases that employ calculations from satellite observations. Good agreement between the two data sources has been shown, especially during summer. When satellite data from the same time period are available, yearly and monthly PV energy production can be calculated with 2% and 5% accuracy, respectively. However, monthly production during winter appears to be overestimated, especially in January. The results of this work may be helpful in forecasting the performance of similar PV systems in Central Europe and allow more precise forecasts of PV system performance than those based only on tables of long-time-averaged values.
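The stated 2% yearly and 5% monthly accuracies are relative deviations between satellite-derived and ground-measured production. A minimal sketch of that comparison, with invented energy values rather than the Polish measurements:

```python
def percent_error(estimated, measured):
    """Absolute percent deviation of an estimate from a ground measurement."""
    return 100.0 * abs(estimated - measured) / measured

# Illustrative monthly PV yields in kWh (not the measured data):
satellite = [31.0, 45.0, 80.0]
ground = [30.0, 44.0, 81.0]

monthly = [percent_error(s, g) for s, g in zip(satellite, ground)]
yearly = percent_error(sum(satellite), sum(ground))
```

Monthly errors tend to be larger than the yearly error because month-to-month deviations partially cancel when summed over the year.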

  5. NOAA's Integrated Tsunami Database: Data for improved forecasts, warnings, research, and risk assessments

    NASA Astrophysics Data System (ADS)

    Stroker, Kelly; Dunbar, Paula; Mungov, George; Sweeney, Aaron; McCullough, Heather; Carignan, Kelly

    2015-04-01

    The National Oceanic and Atmospheric Administration (NOAA) has primary responsibility in the United States for tsunami forecasts, warnings, and research, and supports community resiliency. NOAA's National Geophysical Data Center (NGDC) and co-located World Data Service for Geophysics provide a unique collection of data enabling communities to ensure preparedness and resilience to tsunami hazards. Immediately following a damaging or fatal tsunami event there is a need for authoritative data and information. The NGDC Global Historical Tsunami Database (http://www.ngdc.noaa.gov/hazard/) includes all tsunami events, regardless of intensity, as well as earthquakes and volcanic eruptions that caused fatalities, moderate damage, or generated a tsunami. The long-term data from these events, including photographs of damage, provide clues to what might happen in the future. NGDC catalogs the information on global historical tsunamis and uses these data to produce qualitative tsunami hazard assessments at regional levels. In addition to the socioeconomic effects of a tsunami, NGDC also obtains water level data from the coasts and the deep ocean at stations operated by the NOAA/NOS Center for Operational Oceanographic Products and Services, the NOAA Tsunami Warning Centers, and the National Data Buoy Center (NDBC) and produces research-quality data to isolate seismic waves (in the case of the deep-ocean sites) and the tsunami signal. These water-level data provide evidence of sea-level fluctuation and possible inundation events. NGDC is also building high-resolution digital elevation models (DEMs) to support real-time forecasts, implemented at 75 US coastal communities. After a damaging or fatal event NGDC begins to collect and integrate data and information from many organizations into the hazards databases. Sources of data include our NOAA partners, the U.S. 
Geological Survey, the UNESCO Intergovernmental Oceanographic Commission (IOC) and International Tsunami Information Center, Smithsonian Institution's Global Volcanism Program, news organizations, etc. NGDC assesses the data and then works to promptly distribute the data and information. For example, when a major tsunami occurs, all of the related tsunami data are combined into one timely resource, posted in an online report, which includes: 1) event summary; 2) eyewitness and instrumental recordings from preliminary field surveys; 3) regional historical observations including similar past events and effects; 4) observed water heights and calculated tsunami travel times; and 5) near-field effects. This report is regularly updated to incorporate the most recent data and observations. Providing timely access to authoritative data and information ultimately benefits researchers, state officials, the media and the public. This paper will demonstrate the extensive collection of data and how it is used.

  6. Economic evaluation of pharmacist-led medication reviews in residential aged care facilities.

    PubMed

    Hasan, Syed Shahzad; Thiruchelvam, Kaeshaelya; Kow, Chia Siang; Ghori, Muhammad Usman; Babar, Zaheer-Ud-Din

    2017-10-01

    Medication reviews are a widely accepted approach known to have a substantial impact on patients' pharmacotherapy and safety. Numerous options to optimise pharmacotherapy in older people have been reported in the literature, including medication reviews, computerised decision support systems, management teams, and educational approaches. Pharmacist-led medication reviews are increasingly being conducted, aimed at attaining patient safety and medication optimisation. Cost-effectiveness is an essential aspect of a medication review evaluation. Areas covered: A systematic search of articles that examined the cost-effectiveness of medication reviews conducted in aged care facilities was performed using the relevant databases. Pharmacist-led medication reviews confer many benefits, such as attainment of biomarker targets for improved clinical outcomes and other clinical parameters, as well as concrete financial advantages in terms of reduced total medication costs and associated cost savings. Expert commentary: The cost-effectiveness of medication reviews is more consequential than ever before. A critical evaluation of pharmacist-led medication reviews in residential aged care facilities from an economic perspective is crucial in determining whether the time, effort, and direct and indirect costs involved justify conducting medication reviews for older people in aged care facilities.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logue, Jennifer M; Singer, Brett

    Range hood use during residential cooking is essential to maintaining good indoor air quality. However, widespread use will impact the energy demand of the U.S. housing stock. This paper describes a modeling study to determine site energy, source energy, and consumer costs for comprehensive range hood use. To estimate the energy impacts for all 113 million homes in the U.S., we extrapolated from the simulation of a representative weighted sample of 50,000 virtual homes developed from the 2009 Residential Energy Consumption Survey database. A physics-based simulation model that considered fan energy, energy to condition additional incoming air, and the effect on home heating and cooling due to exhausting the heat from cooking was applied to each home. Hoods performing at a level common to hoods currently in U.S. homes would require 19-33 TWh [69-120 PJ] of site energy and 31-53 TWh [110-190 PJ] of source energy, and would cost consumers $1.2-2.1 billion (U.S. $2010) annually in the U.S. housing stock. The average household would spend less than $15 annually. Reducing required airflow, e.g. with designs that promote better pollutant capture, has more energy-saving potential, on average, than improving fan efficiency.
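The extrapolation step (scaling a weighted virtual-home sample up to stock totals) can be sketched as follows; the weights and per-home energies are invented, not values from the RECS-derived sample.

```python
# Each simulated home carries a weight = number of real homes it represents.
homes = [
    # (weight, per-home site energy for hood operation, kWh/yr) -- illustrative
    (2_000_000, 250.0),
    (1_500_000, 180.0),
]

def stock_site_energy_twh(homes):
    """Weight each virtual home up to the housing stock, then convert kWh to TWh."""
    return sum(weight * kwh for weight, kwh in homes) / 1e9
```

The same weighted sum, applied to source energy or dollar cost per home, yields the other stock-level totals the abstract reports.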

  8. Long-term effect of September 11 on the political behavior of victims’ families and neighbors

    PubMed Central

    Hersh, Eitan D.

    2013-01-01

    This article investigates the long-term effect of September 11, 2001 on the political behaviors of victims’ families and neighbors. Relative to comparable individuals, family members and residential neighbors of victims have become—and have stayed—significantly more active in politics in the last 12 years, and they have become more Republican on account of the terrorist attacks. The method used to demonstrate these findings leverages the random nature of the terrorist attack to estimate a causal effect and exploits new techniques to link multiple, individual-level, governmental databases to measure behavioral change without relying on surveys or aggregate analysis. PMID:24324145

  9. Long-term effect of September 11 on the political behavior of victims' families and neighbors.

    PubMed

    Hersh, Eitan D

    2013-12-24

    This article investigates the long-term effect of September 11, 2001 on the political behaviors of victims' families and neighbors. Relative to comparable individuals, family members and residential neighbors of victims have become--and have stayed--significantly more active in politics in the last 12 years, and they have become more Republican on account of the terrorist attacks. The method used to demonstrate these findings leverages the random nature of the terrorist attack to estimate a causal effect and exploits new techniques to link multiple, individual-level, governmental databases to measure behavioral change without relying on surveys or aggregate analysis.

  10. Performance modeling and valuation of snow-covered PV systems: examination of a simplified approach to decrease forecasting error.

    PubMed

    Bosman, Lisa B; Darling, Seth B

    2018-06-01

    The advent of modern solar energy technologies can improve the costs of energy consumption at the global, national, and regional levels, ultimately spanning stakeholders from governmental entities to utility companies, corporations, and residential homeowners. For stakeholders experiencing the four seasons, accurately accounting for snow-related energy losses is important for effectively predicting photovoltaic energy generation and valuation. This paper provides an examination of a new, simplified approach to decrease snow-related forecasting error, in comparison to current solar energy performance models. The new method is proposed to allow model designers, and ultimately users, the opportunity to better understand the return on investment for solar energy systems located in snowy environments. It is validated using two different solar energy systems located near Green Bay, WI, USA: a 3.0-kW micro inverter system and a 13.2-kW central inverter system. Both systems were unobstructed, facing south, and set at a tilt of 26.56°. Data were collected beginning in May 2014 (micro inverter system) and October 2014 (central inverter system), through January 2018. In comparison to reference industry-standard solar energy prediction applications (PVWatts and PVsyst), the new method results in lower mean absolute percent errors per kilowatt hour of 0.039% and 0.055%, respectively, for the micro inverter system and central inverter system. The statistical analysis provides support for incorporating this new method into freely available, online, up-to-date prediction applications such as PVWatts and PVsyst.
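The headline comparison is a mean absolute percent error (MAPE) on predicted versus observed energy. A hedged sketch of that metric, with invented sample yields rather than the Green Bay data:

```python
def mape(predicted, observed):
    """Mean absolute percent error of predictions against observations (kWh)."""
    return 100.0 * sum(abs(p - o) / o for p, o in zip(predicted, observed)) / len(observed)

# Illustrative daily yields (kWh); a lower MAPE indicates better snow-loss handling.
observed = [4.0, 0.5, 3.0]
model_a = [4.1, 0.6, 2.9]   # hypothetical snow-aware model
model_b = [4.5, 1.5, 3.5]   # hypothetical model ignoring snow losses
better = "model_a" if mape(model_a, observed) < mape(model_b, observed) else "model_b"
```

Note that percent errors on near-zero snowy-day yields dominate the average, which is exactly why snow handling matters for this metric.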

  11. Space Monitoring Data Center at Moscow State University

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Bobrovnikov, Sergey; Barinova, Vera; Myagkova, Irina; Shugay, Yulia; Barinov, Oleg; Dolenko, Sergey; Mukhametdinova, Ludmila; Shiroky, Vladimir

    The Space Monitoring Data Center of Moscow State University provides operational information on the radiation state of near-Earth space. The Internet portal http://swx.sinp.msu.ru/ gives access to current data characterizing the level of solar activity and the geomagnetic and radiation conditions in the magnetosphere and heliosphere in real time. Operational data coming from space missions (ACE, GOES, ELECTRO-L1, Meteor-M1) at L1, LEO and GEO and from the Earth's surface are used to represent the geomagnetic and radiation state of the near-Earth environment. An online database of measurements is also maintained to allow quick comparison between current conditions and conditions experienced in the past. Models of the space environment running in autonomous mode are used to generalize the information obtained from observations to the whole magnetosphere. Interactive applications and operational forecasting services are built on these models. They automatically generate alerts on particle flux enhancements above threshold values, both for SEP and relativistic electrons, using data from LEO orbits. Special forecasting services give short-term forecasts of SEP penetration into the Earth's magnetosphere at low altitudes, as well as of relativistic electron fluxes at GEO. Velocities of recurrent high-speed solar wind streams at the Earth's orbit are predicted with a lead time of 3-4 days on the basis of automatic estimation of the coronal hole areas detected in images of the Sun received from the SDO satellite. By means of a neural network approach, online forecasting of the Dst and Kp indices 0.5-1.5 hours ahead is carried out, based on the solar wind and interplanetary magnetic field measured by the ACE satellite. A visualization system represents experimental and modeling data in 2D and 3D.

  12. Predicting drifter trajectories and particle dispersion in the Caribbean using a high resolution coastal ocean forecasting system

    NASA Astrophysics Data System (ADS)

    Solano, M.

    2016-02-01

    The present study discusses the accuracy of a high-resolution ocean forecasting system in predicting floating drifter trajectories and the uncertainty of modeled particle dispersion in coastal areas. Trajectories were calculated using an offline particle-tracking algorithm coupled to the operational model developed for the region of Puerto Rico by CariCOOS. Both a simple advection algorithm and the Larval TRANSport (LTRANS) model, a more sophisticated offline particle-tracking application, were coupled to the ocean model. Numerical results are compared with 12 floating drifters deployed in the near-shore of Puerto Rico during February and March 2015 and tracked for several days using Global Positioning Systems mounted on the drifters. In addition, the trajectories were also calculated with the AmSeas Navy Coastal Ocean Model (NCOM). The operational model is based on the Regional Ocean Modeling System (ROMS) with a uniform horizontal resolution of 1/100 degree (about 1.1 km). Initial, surface and open boundary conditions are taken from NCOM, except for wind stress, which is computed using winds from the National Digital Forecast Database. Probabilistic maps were created to quantify the uncertainty of particle trajectories at different locations. Results show that the forecasted trajectories are location dependent, with tidally active regions having the largest error. The trajectories predicted by the ROMS and NCOM models show good agreement on average; however, the two perform differently at particular locations. The effect of wind stress on the drifter trajectories is investigated to account for wind slippage. Furthermore, a real case scenario is presented in which simulated trajectories show good agreement with the actual drifter trajectories.
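The "simple advection algorithm" mentioned above can be sketched as forward-Euler integration of a particle through a velocity field read from a model; the toy field, step size, and function names below are illustrative, not the CariCOOS/ROMS configuration.

```python
def advect(x, y, u, v, dt, steps):
    """Forward-Euler particle tracking: position_{n+1} = position_n + velocity * dt."""
    track = [(x, y)]
    for _ in range(steps):
        x, y = x + u(x, y) * dt, y + v(x, y) * dt
        track.append((x, y))
    return track

# Toy case: a uniform 0.5 m/s eastward current, 60 s time step, 10 steps.
track = advect(0.0, 0.0, lambda x, y: 0.5, lambda x, y: 0.0, dt=60.0, steps=10)
```

In an offline tracker like this, `u` and `v` would interpolate stored model velocity fields in space and time rather than being analytic functions; higher-order schemes (e.g. Runge-Kutta) reduce the integration error that Euler stepping accumulates in strongly sheared flows.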

  13. Hailstorm forecast from stability indexes in Southwestern France

    NASA Astrophysics Data System (ADS)

    Melcón, Pablo; Merino, Andrés; Sánchez, José Luis; Dessens, Jean; Gascón, Estíbaliz; Berthet, Claude; López, Laura; García-Ortega, Eduardo

    2016-04-01

    Forecasting hailstorms is a difficult task because of their small spatial and temporal scales. Over recent decades, stability indexes have been commonly used in operational forecasting to provide a simplified representation of the thermodynamic characteristics of the atmosphere relevant to the onset of convective events. However, they are estimated from vertical profiles obtained by radiosondes, which are usually available only twice a day and have limited spatial representativeness. Numerical model predictions can be used to overcome these drawbacks, providing vertical profiles with higher spatiotemporal resolution. The main objective of this study is to create a tool for hail prediction in the southwest of France, one of the European regions where hailstorms have the highest incidence. The Association Nationale d'Etude et de Lutte contre les Fléaux Atmosphériques (ANELFA) maintains a dense hailpad network there in continuous operation, which has created an extensive database of hail events, used in this study as ground truth. The new technique aims to classify the spatial distribution of different stability indexes on hail days. These indexes were calculated from vertical profiles at 1200 UTC provided by the WRF numerical model and validated with radiosonde data from Bordeaux. Binary logistic regression is used to select the indexes that best represent the thermodynamic conditions related to the occurrence of hail in the zone. They are then combined in a single algorithm that surpasses the predictive power each has when used independently. Regression-equation results on hail days are used in a cluster analysis to identify the different spatial patterns given by the probability algorithm. This new tool can be used in operational forecasting, in combination with synoptic and mesoscale techniques, to properly define hail probability and distribution. 
Acknowledgements: The authors would like to thank the CEPA González Díez Foundation and the University of León for their financial support.
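The combination step described above, folding several stability indexes into a single hail-probability algorithm via logistic regression, has the following shape. The coefficients, intercept, and index values below are placeholders, not the fitted ANELFA/WRF values.

```python
import math

def hail_probability(indexes, coefs, intercept):
    """Logistic model: P(hail) = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    z = intercept + sum(b * x for b, x in zip(coefs, indexes))
    return 1.0 / (1.0 + math.exp(-z))

# Two hypothetical predictor indexes for one forecast day:
p = hail_probability([30.0, 2.5], [0.1, 0.8], intercept=-4.0)
```

In the study the coefficients would come from fitting against the hailpad ground truth, and the resulting daily probability fields feed the cluster analysis of spatial patterns.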

  14. Multi-centennial upper-ocean heat content reconstruction using online data assimilation

    NASA Astrophysics Data System (ADS)

    Perkins, W. A.; Hakim, G. J.

    2017-12-01

    The Last Millennium Reanalysis (LMR) provides an advanced paleoclimate ensemble data assimilation framework for multi-variate climate field reconstructions over the Common Era. Although reconstructions in this framework with full Earth system models remain prohibitively expensive, recent work has shown improved ensemble reconstruction validation using computationally inexpensive linear inverse models (LIMs). Here we leverage these techniques in pursuit of a new multi-centennial field reconstruction of upper-ocean heat content (OHC), synthesizing model dynamics with observational constraints from proxy records. OHC is an important indicator of internal climate variability and responds to planetary energy imbalances. Therefore, a consistent extension of the OHC record in time will help inform aspects of low-frequency climate variability. We use the Community Climate System Model version 4 (CCSM4) and Max Planck Institute (MPI) last millennium simulations to derive the LIMs, and the PAGES2K v.2.0 proxy database to perform annually resolved reconstructions of upper-OHC, surface air temperature, and wind stress over the last 500 years. Annual OHC reconstructions and uncertainties for both the global mean and regional basins are compared against observational and reanalysis data. We then investigate differences in dynamical behavior at decadal and longer time scales between the reconstruction and simulations in the last-millennium Coupled Model Intercomparison Project version 5 (CMIP5). Preliminary investigation of 1-year forecast skill for an OHC-only LIM shows largely positive spatial grid point local anomaly correlations (LAC) with a global average LAC of 0.37. Compared to 1-year OHC persistence forecast LAC (global average LAC of 0.30), the LIM outperforms the persistence forecasts in the tropical Indo-Pacific region, the equatorial Atlantic, and in certain regions near the Antarctic Circumpolar Current. 
In other regions, the forecast correlations are less than the persistence case but still positive overall.
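The skill metric quoted above can be sketched in code, assuming (as is conventional) that the local anomaly correlation (LAC) at a grid point is the Pearson correlation between the forecast and verification anomaly series, and that the persistence baseline simply carries the previous year's anomaly forward.

```python
def lac(forecast, verification):
    """Pearson correlation between forecast and verification anomaly series."""
    n = len(forecast)
    fm = sum(forecast) / n
    vm = sum(verification) / n
    cov = sum((f - fm) * (v - vm) for f, v in zip(forecast, verification))
    sf = sum((f - fm) ** 2 for f in forecast) ** 0.5
    sv = sum((v - vm) ** 2 for v in verification) ** 0.5
    return cov / (sf * sv)

def persistence(series):
    """Persistence baseline: the forecast for year n+1 is the anomaly at year n."""
    return series[:-1]
```

Comparing `lac` of the LIM forecast against `lac` of the persistence forecast at each grid point yields the regional skill differences the abstract reports.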

  15. Statistical Downscaling in Multi-dimensional Wave Climate Forecast

    NASA Astrophysics Data System (ADS)

    Camus, P.; Méndez, F. J.; Medina, R.; Losada, I. J.; Cofiño, A. S.; Gutiérrez, J. M.

    2009-04-01

    Wave climate at a particular site is defined by the statistical distribution of sea state parameters, such as significant wave height, mean wave period, mean wave direction, wind velocity, wind direction and storm surge. Nowadays, long-term time series of these parameters are available from reanalysis databases obtained by numerical models. The Self-Organizing Map (SOM) technique is applied to characterize the multi-dimensional wave climate, obtaining the relevant "wave types" spanning the historical variability. This technique summarizes the multiple dimensions of wave climate as a set of clusters projected onto a low-dimensional lattice with a spatial organization, providing Probability Density Functions (PDFs) on the lattice. On the other hand, wind and storm surge depend on the instantaneous local large-scale sea level pressure (SLP) fields, while waves depend on the recent history of these fields (say, 1 to 5 days). Thus, these variables are associated with large-scale atmospheric circulation patterns. In this work, a nearest-neighbors analog method is used to predict the monthly multi-dimensional wave climate. This method establishes relationships between large-scale atmospheric circulation patterns from numerical models (SLP fields as predictors) and local wave databases of observations (monthly wave climate SOM PDFs as the predictand) to set up statistical models. A wave reanalysis database, developed by Puertos del Estado (Ministerio de Fomento), is used as the historical time series of local variables. The simultaneous SLP fields calculated by the NCEP atmospheric reanalysis are used as predictors. Several applications with different sea level pressure grid sizes and temporal resolutions are compared to obtain the optimal statistical model that best represents the monthly wave climate at a particular site. 
In this work we examine the potential skill of this downscaling approach considering perfect-model conditions, but we will also analyze the suitability of this methodology to be used for seasonal forecast and for long-term climate change scenario projection of wave climate.
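The nearest-neighbors analog step described above reduces to: for a new large-scale SLP pattern, find the most similar historical pattern and take its observed wave climate as the prediction. A minimal sketch under that reading, with a squared-distance metric and toy data standing in for the flattened SLP fields and SOM PDFs:

```python
def analog_predict(slp_new, slp_history, wave_history):
    """Return the wave climate paired with the closest historical SLP field."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(slp_history)), key=lambda i: dist(slp_new, slp_history[i]))
    return wave_history[best]

# Toy fields with two grid points; labels stand in for monthly SOM PDFs.
prediction = analog_predict([1.0, 0.0], [[0.0, 0.0], [1.1, 0.1]], ["calm", "storm"])
```

In practice the method would average over the k nearest analogs rather than take a single neighbor, and the distance would be computed in a reduced (e.g. principal-component) space of the SLP fields.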

  16. Challenges in Defining Tsunami Wave Height

    NASA Astrophysics Data System (ADS)

    Stroker, K. J.; Dunbar, P. K.; Mungov, G.; Sweeney, A.; Arcos, N. P.

    2017-12-01

    The NOAA National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height, NCEI will consider adding an additional field for the maximum peak measurement.
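The two wave-height definitions being compared can be made concrete: the maximum peak is the largest excursion above ambient sea level in the de-tided record, while the maximum amplitude is taken as half the peak-to-trough range. A minimal sketch with an invented water-level series:

```python
def max_peak(levels):
    """Largest excursion above ambient (zero) sea level in a de-tided record."""
    return max(levels)

def max_amplitude(levels):
    """Half the peak-to-trough range of the de-tided record."""
    return (max(levels) - min(levels)) / 2.0

# Illustrative de-tided water-level record in metres (not the Illapel data):
series = [0.0, 0.8, -0.5, 0.3, -0.2]
```

For this toy record the maximum peak (0.8 m) exceeds the maximum amplitude (0.65 m), matching the study's observation that the peak measure is usually the larger of the two.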

  17. Challenges in Defining Tsunami Wave Heights

    NASA Astrophysics Data System (ADS)

    Dunbar, Paula; Mungov, George; Sweeney, Aaron; Stroker, Kelly; Arcos, Nicolas

    2017-08-01

    The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) and co-located World Data Service for Geophysics maintain the global tsunami archive consisting of the historical tsunami database, imagery, and raw and processed water level data. The historical tsunami database incorporates, where available, maximum wave heights for each coastal tide gauge and deep-ocean buoy that recorded a tsunami signal. These data are important because they are used for tsunami hazard assessment, model calibration, validation, and forecast and warning. There have been ongoing discussions in the tsunami community about the correct way to measure and report these wave heights. It is important to understand how these measurements might vary depending on how the data were processed and the definition of maximum wave height. On September 16, 2015, an 8.3 Mw earthquake located 48 km west of Illapel, Chile generated a tsunami that was observed all over the Pacific region. We processed the time-series water level data for 57 coastal tide gauges that recorded this tsunami and compared the maximum wave heights determined from different definitions. We also compared the maximum wave heights from the NCEI-processed data with the heights reported by the NOAA Tsunami Warning Centers. We found that in the near field different methods of determining the maximum tsunami wave heights could result in large differences due to possible instrumental clipping. We also found that the maximum peak is usually larger than the maximum amplitude (½ peak-to-trough), but the differences for the majority of the stations were <20 cm. For this event, the maximum tsunami wave heights determined by either definition (maximum peak or amplitude) would have validated the forecasts issued by the NOAA Tsunami Warning Centers. 
Since there is currently only one field in the NCEI historical tsunami database to store the maximum tsunami wave height for each tide gauge and deep-ocean buoy, NCEI will consider adding an additional field for the maximum peak measurement.

  18. Prediction of 222Rn in Danish dwellings using geology and house construction information from central databases.

    PubMed

    Andersen, Claus E; Raaschou-Nielsen, Ole; Andersen, Helle Primdal; Lind, Morten; Gravesen, Peter; Thomsen, Birthe L; Ulbak, Kaare

    2007-01-01

    A linear regression model has been developed for the prediction of indoor 222Rn in Danish houses. The model provides proxy radon concentrations for about 21,000 houses in a Danish case-control study on the possible association between residential radon and childhood cancer (primarily leukaemia). The model was calibrated against radon measurements in 3116 houses. An independent dataset with 788 house measurements was used for model performance assessment. The model includes nine explanatory variables, of which the most important ones are house type and geology. All explanatory variables are available from central databases. The model was fitted to log-transformed radon concentrations and it has an R² of 40%. The uncertainty associated with individual predictions of (untransformed) radon concentrations is about a factor of 2.0 (one standard deviation). The comparison with the independent test data shows that the model makes sound predictions and that errors of radon predictions are only weakly correlated with the estimates themselves (R² = 10%).
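Because the model is fitted on log-transformed concentrations, producing a proxy value means evaluating a linear predictor and exponentiating back. A sketch of that prediction step; the intercept, coefficients, and dummy features are invented placeholders, not the fitted Danish values.

```python
import math

def predict_radon(intercept, coefs, features):
    """Evaluate the log-linear model and back-transform: exp(b0 + sum(b_i * x_i))."""
    log_pred = intercept + sum(b * x for b, x in zip(coefs, features))
    return math.exp(log_pred)

# Hypothetical dummy features, e.g. house type = single-family, geology = clay till:
proxy = predict_radon(3.5, [0.4, -0.2], [1.0, 1.0])
```

The quoted "factor of 2.0" uncertainty corresponds to the standard deviation of the residuals on the log scale: a prediction of 50 Bq/m³ plausibly spans roughly 25 to 100 Bq/m³.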

  19. [THE USE OF OPEN REAL ESTATE DATABASES FOR THE ANALYSIS OF INFLUENCE OF CONCOMITANT FACTORS ON THE STATE OF THE URBAN POPULATION'S HEALTH].

    PubMed

    Zheleznyak, E V; Khripach, L V

    2015-01-01

    A new method is suggested for assessing certain social and lifestyle factors in hygienic health examination of the urban population, based on open real-estate databases covering residential areas of a given city. Using the Moscow FlatInfo portal and a sample of 140 residents of the city of Moscow, we studied the distribution of the factors available for analysis: the typical design of the building in which a studied citizen resides, its year of construction, and the market price of 1 m² of housing space in that building. The latter value is a quantitative integrated assessment of the social and lifestyle quality of housing, depending on the type and technical condition of the building, the neighborhood environment, the infrastructure of the region and many other factors, and may be a useful supplemental index in hygienic research.

  20. Ground-source heat pump case studies and utility programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lienau, P.J.; Boyd, T.L.; Rogers, R.L.

    1995-04-01

    Ground-source heat pump systems are one of the promising new energy technologies that have shown a rapid increase in usage over the past ten years in the United States. These systems offer substantial benefits to consumers and utilities in energy (kWh) and demand (kW) savings. The purpose of this study was to determine what monitored data were available, mainly from electric utilities, on heat pump performance, energy savings and demand reduction for residential, school and commercial building applications. In order to verify the performance, information was collected for 253 case studies, mainly from utilities throughout the United States. The case studies were compiled into a database, organized into general information, system information, ground system information, system performance, and additional information. Information was also developed on the status of demand-side management ground-source heat pump programs at about 60 electric utilities and rural electric cooperatives, covering marketing, incentive programs, barriers to market penetration, the number of units installed in each service area, and benefits.

  1. Establishment of a national database to link epidemiological and molecular data from norovirus outbreaks in Ireland

    PubMed Central

    KELLY, S.; FOLEY, B.; DUNFORD, L.; COUGHLAN, S.; TUITE, G.; DUFFY, M.; MITCHELL, S.; SMYTH, B.; O'NEILL, H.; McKEOWN, P.; HALL, W.; LYNCH, M.

    2008-01-01

    A prospective study of norovirus outbreaks in Ireland was carried out over a 1-year period from 1 October 2004 to 30 September 2005. Epidemiological and molecular data on norovirus outbreaks in the Republic of Ireland (ROI) and Northern Ireland (NI) were collected and combined in real time in a common database. Most reported outbreaks occurred in hospitals and residential institutions, and person-to-person spread was the predominant mode of transmission. The predominant circulating norovirus strain was the GII.4-2004 strain, with a small number of outbreaks due to GII.2. This study represents the first time that enhanced epidemiological and virological data on norovirus outbreaks in Ireland have been described. The link established between the epidemiological and virological institutions during the course of this study has been continued, and the data are being used as a source for the Foodborne Viruses in Europe Network (DIVINE-NET). PMID:18252027

  2. A seamless global hydrological monitoring and forecasting system for water resources assessment and hydrological hazard early warning

    NASA Astrophysics Data System (ADS)

    Sheffield, Justin; He, Xiaogang; Wood, Eric; Pan, Ming; Wanders, Niko; Zhan, Wang; Peng, Liqing

    2017-04-01

    Sustainable management of water resources and mitigation of the impacts of hydrological hazards are becoming ever more important at large scales because of inter-basin, inter-country and inter-continental connections in water dependent sectors. These include water resources management, food production, and energy production, whose needs must be weighed against the water needs of ecosystems and preservation of water resources for future generations. The strains on these connections are likely to increase with climate change and increasing demand from burgeoning populations and rapid development, with potential for conflict over water. At the same time, network connections may provide opportunities to alleviate pressures on water availability through more efficient use of resources such as trade in water dependent goods. A key constraint on understanding, monitoring and identifying solutions to increasing competition for water resources and hazard risk is the availability of hydrological data for monitoring and forecasting water resources and hazards. We present a global online system that provides continuous and consistent water products across time scales, from the historic instrumental period, to real-time monitoring, short-term and seasonal forecasts, and climate change projections. The system is intended to provide data and tools for analysis of historic hydrological variability and trends, water resources assessment, monitoring of evolving hazards and forecasts for early warning, and climate change scale projections of changes in water availability and extreme events. The system is particularly useful for scientists and stakeholders interested in regions with less available in-situ data, and where forecasts have the potential to help decision making.
The system is built on a database of high-resolution climate data from 1950 to present that merges available observational records with bias-corrected reanalysis and satellite data, which then drives a coupled land surface model-flood inundation model to produce hydrological variables and indices at daily, 0.25-degree resolution, globally. The system is updated in near real-time (< 2 days) using satellite precipitation and weather model data, and produces short-term forecasts (out to 7 days) based on the Global Forecast System (GFS) and seasonal forecasts (up to 6 months) based on the U.S. National Multi-Model Ensemble (NMME). Climate change projections are based on bias-corrected and downscaled CMIP5 climate data that are used to force the hydrological model. Example products from the system include real-time and forecast drought indices for precipitation, soil moisture, and streamflow, and flood magnitude and extent indices. The model outputs are complemented by satellite-based products and indices for vegetation health (MODIS NDVI) and soil moisture (SMAP). We show examples of the validation of the system at regional scales, including how local information can significantly improve predictions, and examples of how the system can be used to understand large-scale water resource issues, and in real-world contexts for early warning, decision making and planning.

  3. Development, Use, and Impact of a Global Laboratory Database During the 2014 Ebola Outbreak in West Africa.

    PubMed

    Durski, Kara N; Singaravelu, Shalini; Teo, Junxiong; Naidoo, Dhamari; Bawo, Luke; Jambai, Amara; Keita, Sakoba; Yahaya, Ali Ahmed; Muraguri, Beatrice; Ahounou, Brice; Katawera, Victoria; Kuti-George, Fredson; Nebie, Yacouba; Kohar, T Henry; Hardy, Patrick Jowlehpah; Djingarey, Mamoudou Harouna; Kargbo, David; Mahmoud, Nuha; Assefa, Yewondwossen; Condell, Orla; N'Faly, Magassouba; Van Gurp, Leon; Lamanu, Margaret; Ryan, Julia; Diallo, Boubacar; Daffae, Foday; Jackson, Dikena; Malik, Fayyaz Ahmed; Raftery, Philomena; Formenty, Pierre

    2017-06-15

    The international impact, rapid widespread transmission, and reporting delays during the 2014 Ebola outbreak in West Africa highlighted the need for a global, centralized database to inform outbreak response. The World Health Organization and Emerging and Dangerous Pathogens Laboratory Network addressed this need by supporting the development of a global laboratory database. Specimens were collected in the affected countries from patients and dead bodies meeting the case definitions for Ebola virus disease. Test results were entered in nationally standardized spreadsheets and consolidated onto a central server. From March 2014 through August 2016, 256,343 specimens tested for Ebola virus disease were captured in the database. Thirty-one specimen types were collected, and a variety of diagnostic tests were performed. Regular analysis of data described the functionality of laboratory and response systems, positivity rates, and the geographic distribution of specimens. With data standardization and end user buy-in, the collection and analysis of large amounts of data with multiple stakeholders and collaborators across various user-access levels was made possible and contributed to outbreak response needs. The usefulness and value of a multifunctional global laboratory database is far reaching, with uses including virtual biobanking, disease forecasting, and adaption to other disease outbreaks. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail: journals.permissions@oup.com.
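The routine analyses mentioned above (e.g. positivity rates by geographic area) can be sketched as a simple aggregation over specimen test records. This is a hedged illustration only: the field names `location` and `result` are assumptions, not the actual schema of the WHO laboratory database.

```python
# Illustrative sketch (not the actual WHO system): positivity rates per
# location from specimen test records. Keys 'location' and 'result' are
# hypothetical field names.
from collections import defaultdict

def positivity_by_location(records):
    """Return {location: share of specimens testing positive}."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for rec in records:
        totals[rec["location"]] += 1
        if rec["result"] == "positive":
            positives[rec["location"]] += 1
    return {loc: positives[loc] / totals[loc] for loc in totals}
```

Grouping the same records by specimen type or collection week follows the same pattern.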

  4. GIRAFE, a campaign forecast tool for anthropogenic and biomass burning plumes

    NASA Astrophysics Data System (ADS)

    Fontaine, Alain; Mari, Céline; Drouin, Marc-Antoine; Lussac, Laure

    2015-04-01

    GIRAFE (reGIonal ReAl time Fire plumEs, http://girafe.pole-ether.fr, alain.fontaine@obs-mip.fr) is a forecast tool supported by the French atmospheric chemistry data centre Ether (CNES and CNRS), built on the Lagrangian particle dispersion model FLEXPART coupled with ECMWF meteorological fields and emission inventories. GIRAFE was used during the CHARMEX campaign (Chemistry-Aerosol Mediterranean Experiment http://charmex.lsce.ipsl.fr) in order to provide daily 5-day plume trajectory forecasts over the Mediterranean Sea. For this field experiment, the Lagrangian model was used to simulate carbon monoxide pollution plumes emitted by either anthropogenic or biomass burning sources. Sources from major industrial areas such as Fos-Berre or the Po valley were extracted from the MACC-TNO inventory. Biomass burning sources were estimated based on MODIS fire detection. Comparison with MACC and CHIMERE APIFLAME models revealed that GIRAFE followed pollution plumes from small and short-duration fires which were not captured by low-resolution models. GIRAFE was used as a decision-making tool to schedule field campaign activities such as airborne operations or balloon launches. Thanks to recent features, GIRAFE is able to read the ECCAD database (http://eccad.pole-ether.fr) inventories. Global inventories such as MACCITY and ECLIPSE will be used to predict CO plume trajectories from major urban and industrial sources over West Africa for the DACCIWA campaign (Dynamic-Aerosol-Chemistry-Cloud interactions in West Africa).

  5. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    PubMed

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2017-07-01

    In recent years, various types of terrorist attacks have occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value and, after the temporary effect gradually decayed, remained 9.0% greater than the predicted level. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014. © 2016 Society for Risk Analysis.
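The step-change component of such an intervention analysis can be illustrated with a deliberately minimal sketch. This is not the paper's model: the study combines the intervention term with a fitted time-series process, whereas here the step effect is just a difference of pre/post means, and the decay parameter is purely hypothetical.

```python
# Minimal sketch of the step-change idea in intervention analysis, assuming
# y_t = mu + omega * I_t + noise, where I_t = 1 from index t0 onward.

def step_intervention_effect(series, t0):
    """Estimate the baseline level mu and the step effect omega at index t0."""
    before, after = series[:t0], series[t0:]
    mu = sum(before) / len(before)
    omega = sum(after) / len(after) - mu
    return mu, omega

def forecast_decaying_effect(mu, omega, decay, horizon):
    """Forecast with a geometrically decaying temporary effect.
    (`decay` is a purely hypothetical parameter, not estimated from the GTD.)"""
    return [mu + omega * decay ** h for h in range(1, horizon + 1)]
```

With a synthetic monthly probability series that jumps from 0.10 to 0.26, the estimated step effect is 0.16, mirroring the kind of abrupt increase the abstract describes.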

  6. Forgetting to remember our experiences: People overestimate how much they will retrospect about personal events.

    PubMed

    Tully, Stephanie; Meyvis, Tom

    2017-12-01

    People value experiences in part because of the memories they create. Yet, we find that people systematically overestimate how much they will retrospect about their experiences. This overestimation results from people focusing on their desire to retrospect about experiences, while failing to consider the experience's limited enduring accessibility in memory. Consistent with this view, we find that desirability is a stronger predictor of forecasted retrospection than it is of reported retrospection, resulting in greater overestimation when the desirability of retrospection is higher. Importantly, the desire to retrospect does not change over time. Instead, past experiences become less top-of-mind over time and, as a result, people simply forget to remember. In line with this account, our results show that obtaining physical reminders of an experience reduces the overestimation of retrospection by increasing how much people retrospect, bringing their realized retrospection more in line with their forecasts (and aspirations). We further observe that the extent to which reported retrospection falls short of forecasted retrospection reliably predicts declining satisfaction with an experience over time. Despite this potential negative consequence of retrospection falling short of expectations, we suggest that the initial overestimation itself may in fact be adaptive. This possibility and other potential implications of this work are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. Performance of the Prognocean Plus system during the El Niño 2015/2016: predictions of sea level anomalies as tools for forecasting El Niño

    NASA Astrophysics Data System (ADS)

    Świerczyńska-Chlaściak, Małgorzata; Niedzielski, Tomasz; Miziński, Bartłomiej

    2017-04-01

    The aim of this paper is to present the performance of the Prognocean Plus system, which produces long-term predictions of sea level anomalies, during the El Niño 2015/2016. The main objective of this work is to identify ocean areas in which long-term forecasts of sea level anomalies during the El Niño 2015/2016 reveal considerable accuracy. At present, the system produces prognoses using four data-based models and their combinations: a polynomial-harmonic model, an autoregressive model, a threshold autoregressive model and a multivariate autoregressive model. The system offers weekly forecasts, with lead time up to 12 weeks. Several statistics that describe the efficiency of the available prediction models in the four seasons used for estimating the Oceanic Niño Index (ONI) are calculated. The accuracies/skills of the prediction models were computed at specific locations in the equatorial Pacific, namely the geometrically determined central points of all Niño regions. For these locations, we focused on the forecasts that targeted the local maximum of sea level driven by the El Niño 2015/2016. As a result, a series of "spaghetti" graphs (for each point, season and model) as well as plots presenting the prognostic performance of every model - for all lead times, seasons and locations - were created. It is found that the Prognocean Plus system has the potential to become a new solution that may enhance diagnostic discussions on El Niño development. The forecasts produced by the threshold autoregressive model, for lead times of 5-6 weeks and 9 weeks, within the Niño1+2 region for the November-to-January (NDJ) season anticipated the culmination of the El Niño 2015/2016. The longest forecasts (8-12 weeks) were found to be the most accurate in the phase of transition from El Niño to normal conditions (the multivariate autoregressive model, central point of the Niño1+2 region, the December-to-February season).
The study was conducted to verify the ability and usefulness of sea level anomaly forecasts in predicting phenomena that are controlled by the ocean-atmosphere processes, such as El Niño Southern Oscillation or North Atlantic Oscillation. The results may support further investigations into long-term forecasting of the quantitative indices of these oscillations, solely based on prognoses of sea level change. In particular, comparing the accuracies of prognoses of the North Atlantic Oscillation index remains one of the tasks of the research project no. 2016/21/N/ST10/03231, financed by the National Science Center of Poland.
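As a generic illustration of the threshold autoregressive idea named above (the actual model orders, thresholds and fitting procedure of Prognocean Plus are not specified here), a two-regime TAR with one lag can be fitted by per-regime least squares and iterated forward to produce a forecast:

```python
# Hedged sketch of a two-regime threshold autoregressive (TAR) model:
# y_t = a_i + b_i * y_{t-1}, with regime i chosen by whether y_{t-1} <= r.
import math

def _ols_line(xs, zs):
    """Simple linear regression z = a + b*x (returns a, b)."""
    if not xs:
        return 0.0, 0.0
    n = len(xs)
    mx, mz = sum(xs) / n, sum(zs) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (z - mz) for x, z in zip(xs, zs)) / sxx if sxx else 0.0
    return mz - b * mx, b

def fit_tar(y, r):
    """Fit one line per regime, split on the lagged value against threshold r."""
    lo = [(y[t - 1], y[t]) for t in range(1, len(y)) if y[t - 1] <= r]
    hi = [(y[t - 1], y[t]) for t in range(1, len(y)) if y[t - 1] > r]
    fit = lambda pairs: _ols_line([p for p, _ in pairs], [q for _, q in pairs])
    return fit(lo), fit(hi)

def tar_forecast(y_last, params, r, horizon):
    """Iterate the fitted piecewise map forward for `horizon` steps."""
    (a1, b1), (a2, b2) = params
    out = []
    for _ in range(horizon):
        a, b = (a1, b1) if y_last <= r else (a2, b2)
        y_last = a + b * y_last
        out.append(y_last)
    return out
```

A 12-step forecast from such a fit corresponds, loosely, to the 12-week lead times the system offers; real systems add noise models, multiple lags and data-driven threshold selection.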

  8. Existing generating assets squeezed as new project starts slow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, R.B.; Tiffany, E.D.

    Most forecasting reports concentrate on political or regulatory events to predict future industry trends. Frequently overlooked are the more empirical performance trends of the principal power generation technologies. Solomon and Associates queried its many power plant performance databases and crunched some numbers to identify those trends. Areas of investigation included reliability, utilization (net output factor and net capacity factor) and cost (operating costs). An in-depth analysis for North America and Europe is presented in this article, by region and by generation technology. 4 figs., 2 tabs.

  9. Molecular Oxygen in the Thermosphere: Issues and Measurement Strategies

    NASA Astrophysics Data System (ADS)

    Picone, J. M.; Hedin, A. E.; Drob, D. P.; Meier, R. R.; Bishop, J.; Budzien, S. A.

    2002-05-01

    We review the state of empirical knowledge regarding the distribution of molecular oxygen in the lower thermosphere (100-200 km), as embodied by the new NRLMSISE-00 empirical atmospheric model, its predecessors, and the underlying databases. For altitudes above 120 km, the two major classes of data (mass spectrometer and solar ultraviolet [UV] absorption) disagree significantly regarding the magnitude of the O2 density and the dependence on solar activity. As a result, the addition of the Solar Maximum Mission (SMM) data set (based on solar UV absorption) to the NRLMSIS database has directly impacted the new model, increasing the complexity of the model's formulation and generally reducing the thermospheric O2 density relative to MSISE-90. Beyond interest in the thermosphere itself, this issue materially affects detailed models of ionospheric chemistry and dynamics as well as modeling of the upper atmospheric airglow. Because these are key elements of both experimental and operational systems which measure and forecast the near-Earth space environment, we present strategies for augmenting the database through analysis of existing data and through future measurements in order to resolve this issue.

  10. Shuttle landing facility cloud cover study: Climatological analysis and two tenths cloud cover rule evaluation

    NASA Technical Reports Server (NTRS)

    Atchison, Michael K.; Schumann, Robin; Taylor, Greg; Warburton, John; Wheeler, Mark; Yersavich, Ann

    1993-01-01

    The two-tenths cloud cover rule in effect for all End Of Mission (EOM) STS landings at the Kennedy Space Center (KSC) states: 'for scattered cloud layers below 10,000 feet, cloud cover must be observed to be less than or equal to 0.2 at the de-orbit burn go/no-go decision time (approximately 90 minutes before landing time)'. This rule was designed to protect against a ceiling (below 10,000 feet) developing unexpectedly within the next 90 minutes (i.e., after the de-orbit burn decision and before landing). The Applied Meteorological Unit (AMU) developed and analyzed a database of cloud cover amounts and weather conditions at the Shuttle Landing Facility for a five-year (1986-1990) period. The data indicate the best time to land the shuttle at KSC is during the summer while the worst time is during the winter. The analysis also shows the highest frequency of landing opportunities occurs for the 0100-0600 UTC and 1300-1600 UTC time periods. The worst time of the day to land a shuttle is near sunrise and during the afternoon. An evaluation of the two-tenths cloud cover rule for most data categorizations has shown that there is a significant difference in the proportions of weather violations one and two hours subsequent to initial conditions of 0.2 and 0.3 cloud cover. However, for the May, October, 700-mb northerly wind, 1500 UTC, and 1600 UTC categories, there is some evidence that the 0.2 cloud cover rule may be overly conservative. This possibility requires further investigation. As a result of these analyses, the AMU developed nomograms to help the Spaceflight Meteorological Group (SMG) and the Cape Canaveral Forecast Facility (CCFF) forecast cloud cover for EOM and Return to Launch Site (RTLS) at KSC.
Future work will include updating the two-tenths database, further analysis of the data for several categorizations, and developing a proof-of-concept artificial neural network to provide forecast guidance on weather constraint violations for shuttle landings.
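The comparison of violation proportions described above can be sketched as a standard two-proportion z-test. This is a generic illustration, not the AMU's documented procedure, and the counts used in the usage example are invented.

```python
# Hedged sketch: two-proportion z-test for H0: the two violation proportions
# are equal (e.g. violations following 0.2 vs. 0.3 initial cloud cover).
import math

def two_proportion_z(x1, n1, x2, n2):
    """Return the z statistic and two-sided p-value for comparing x1/n1 to x2/n2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p_value
```

For instance, 30 violations in 100 cases versus 10 in 100 yields a z statistic well above the usual 1.96 cutoff, i.e. a significant difference in proportions.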

  11. Health status of UK care home residents: a cohort study

    PubMed Central

    Gordon, Adam Lee; Franklin, Matthew; Bradshaw, Lucy; Logan, Pip; Elliott, Rachel; Gladman, John R.F.

    2014-01-01

    Background: UK care home residents are often poorly served by existing healthcare arrangements. Published descriptions of residents’ health status have been limited by lack of detail and use of data derived from surveys drawn from social, rather than health, care records. Aim: to describe in detail the health status and healthcare resource use of UK care home residents. Design and setting: a 180-day longitudinal cohort study of 227 residents across 11 UK care homes, 5 nursing and 6 residential, selected to be representative for nursing/residential status and dementia registration. Method: Barthel index (BI), Mini-mental state examination (MMSE), Neuropsychiatric index (NPI), Mini-nutritional index (MNA), EuroQoL-5D (EQ-5D), 12-item General Health Questionnaire (GHQ-12), diagnoses and medications were recorded at baseline and BI, NPI, GHQ-12 and EQ-5D at follow-up after 180 days. National Health Service (NHS) resource use data were collected from databases of local healthcare providers. Results: of a total of 323 residents, 227 were recruited. The median BI was 9 (IQR: 2.5–15.5), MMSE 13 (4–22) and number of medications 8 (5.5–10.5). The mean number of diagnoses per resident was 6.2 (SD: 4). Thirty per cent were malnourished, and 66% had evidence of behavioural disturbance. Residents had contact with the NHS on average once per month. Conclusion: residents from both residential and nursing settings are dependent, cognitively impaired, have mild frequent behavioural symptoms, multimorbidity, polypharmacy and frequently use NHS resources. Effective care for such a cohort requires broad expertise from multiple disciplines delivered in a co-ordinated and managed way. PMID:23864424

  12. Effectiveness and Implementation of Evidence-Based Practices in Residential Care Settings

    PubMed Central

    James, Sigrid; Alemi, Qais; Zepeda, Veronica

    2013-01-01

    Purpose Prompted by calls to implement evidence-based practices (EBPs) into residential care settings (RCS), this review addresses three questions: (1) Which EBPs have been tested with children and youth within the context of RCS? (2) What is the evidence for their effectiveness within such settings? (3) What implementation issues arise when transporting EBPs into RCS? Methods Evidence-based psychosocial interventions and respective outcome studies, published from 1990–2012, were identified through a multi-phase search process, involving the review of four major clearinghouse websites and relevant electronic databases. To be included, effectiveness had to have been previously established through a comparison group design regardless of the setting, and interventions tested subsequently with youth in RCS. All outcome studies were evaluated for quality and bias using a structured appraisal tool. Results Ten interventions matching a priori criteria were identified: Adolescent Community Reinforcement Approach, Aggression Replacement Training, Dialectical Behavioral Therapy, Ecologically-Based Family Therapy, Eye Movement and Desensitization Therapy, Functional Family Therapy, Multimodal Substance Abuse Prevention, Residential Student Assistance Program, Solution-Focused Brief Therapy, and Trauma Intervention Program for Adjudicated and At-Risk Youth. Interventions were tested in 13 studies, which were conducted in different types of RCS, using a variety of study methods. Outcomes were generally positive, establishing the relative effectiveness of the interventions with youth in RCS across a range of psychosocial outcomes. However, concerns about methodological bias and confounding factors remain. Most studies addressed implementation issues, reporting on treatment adaptations, training and supervision, treatment fidelity and implementation barriers. 
Conclusion The review unearthed a small but important body of knowledge that demonstrates that EBPs can be implemented in RCS with encouraging results. PMID:23606781

  13. Effectiveness and Implementation of Evidence-Based Practices in Residential Care Settings.

    PubMed

    James, Sigrid; Alemi, Qais; Zepeda, Veronica

    2013-04-01

    Prompted by calls to implement evidence-based practices (EBPs) into residential care settings (RCS), this review addresses three questions: (1) Which EBPs have been tested with children and youth within the context of RCS? (2) What is the evidence for their effectiveness within such settings? (3) What implementation issues arise when transporting EBPs into RCS? Evidence-based psychosocial interventions and respective outcome studies, published from 1990-2012, were identified through a multi-phase search process, involving the review of four major clearinghouse websites and relevant electronic databases. To be included, effectiveness had to have been previously established through a comparison group design regardless of the setting, and interventions tested subsequently with youth in RCS. All outcome studies were evaluated for quality and bias using a structured appraisal tool. Ten interventions matching a priori criteria were identified: Adolescent Community Reinforcement Approach, Aggression Replacement Training, Dialectical Behavioral Therapy, Ecologically-Based Family Therapy, Eye Movement and Desensitization Therapy, Functional Family Therapy, Multimodal Substance Abuse Prevention, Residential Student Assistance Program, Solution-Focused Brief Therapy, and Trauma Intervention Program for Adjudicated and At-Risk Youth. Interventions were tested in 13 studies, which were conducted in different types of RCS, using a variety of study methods. Outcomes were generally positive, establishing the relative effectiveness of the interventions with youth in RCS across a range of psychosocial outcomes. However, concerns about methodological bias and confounding factors remain. Most studies addressed implementation issues, reporting on treatment adaptations, training and supervision, treatment fidelity and implementation barriers. The review unearthed a small but important body of knowledge that demonstrates that EBPs can be implemented in RCS with encouraging results.

  14. Family involvement in decision making for people with dementia in residential aged care: a systematic review of quantitative literature.

    PubMed

    Petriwskyj, Andrea; Gibson, Alexandra; Parker, Deborah; Banks, Susan; Andrews, Sharon; Robinson, Andrew

    2014-06-01

    Ensuring older adults' involvement in their care is accepted as good practice and is vital, particularly for people with dementia, whose care and treatment needs change considerably over the course of the illness. However, involving family members in decision making on people's behalf is still practically difficult for staff and family. The aim of this review was to identify and appraise the existing quantitative evidence about family involvement in decision making for people with dementia living in residential aged care. The present Joanna Briggs Institute (JBI) metasynthesis assessed studies that investigated involvement of family members in decision making for people with dementia in residential aged care settings. While quantitative and qualitative studies were included in the review, this paper presents the quantitative findings. A comprehensive search of 15 electronic databases was performed. The search was limited to papers published in English, from 1990 to 2013. Twenty-six studies were identified as being relevant; 10 were quantitative, with 1 mixed method study. Two independent reviewers assessed the studies for methodological validity and extracted the data using the JBI Meta Analysis of Statistics Assessment and Review Instrument (JBI-MAStARI). The findings were synthesized and presented in narrative form. The findings related to decisions encountered and made by family surrogates, variables associated with decisions, surrogates' perceptions of, and preferences for, their roles, as well as outcomes for people with dementia and their families. The results identified patterns within, and variables associated with, surrogate decision making, all of which highlight the complexity and variation regarding family involvement. Attention needs to be paid to supporting family members in decision making in collaboration with staff.

  15. A randomized trial of assertive continuing care and contingency management for adolescents with substance use disorders.

    PubMed

    Godley, Mark D; Godley, Susan H; Dennis, Michael L; Funk, Rodney R; Passetti, Lora L; Petry, Nancy M

    2014-02-01

    Most adolescents relapse within 90 days of discharge from residential substance use treatment. We hypothesized that contingency management (CM), assertive continuing care (ACC), and their combination (CM + ACC) would each be more effective than usual continuing care (UCC). Following residential treatment, 337 adolescents were randomized to 4 continuing care conditions: UCC alone, CM, ACC, or CM + ACC. UCC was available across all conditions. Outcome measures over 12 months included percentage of days abstinent from alcohol, heavy alcohol, marijuana, and any alcohol or other drugs (AOD) using self-reports and toxicology testing and remission status at 12 months. CM had significantly higher rates of abstinence than UCC for heavy alcohol use, t(297) = 2.50, p < .01, d = 0.34; any alcohol use, t(297) = 2.58, p < .01, d = 0.36; or any AOD use, t(297) = 2.12, p = .01, d = 0.41; and had a higher rate in remission, odds ratio (OR) = 2.45, 90% confidence interval (CI) [1.18, 5.08], p = .02. ACC had significantly higher rates of abstinence than UCC from heavy alcohol use, t(297) = 2.66, p < .01, d = 0.31; any alcohol use, t(297) = 2.63, p < .01, d = 0.30; any marijuana use, t(297) = 1.95, p = .02, d = 0.28; or any AOD use, t(297) = 1.88, p = .02, d = 0.30; and had higher rates in remission, OR = 2.31, 90% CI [1.10, 4.85], p = .03. The ACC + CM condition was not significantly different from UCC on any outcomes. CM and ACC are promising continuing care approaches after residential treatment. Future research should seek to further improve their effectiveness. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  16. Moving to an active lifestyle? A systematic review of the effects of residential relocation on walking, physical activity and travel behaviour.

    PubMed

    Ding, Ding; Nguyen, Binh; Learnihan, Vincent; Bauman, Adrian E; Davey, Rachel; Jalaludin, Bin; Gebel, Klaus

    2018-06-01

    To synthesise the literature on the effects of neighbourhood environmental change through residential relocation on physical activity, walking and travel behaviour. Systematic review following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines (PROSPERO registration number CRD42017077681). Electronic databases for peer-reviewed and grey literature were systematically searched to March 2017, followed by forward and backward citation tracking. A study was eligible for inclusion if it (1) measured changes in neighbourhood built environment attributes as a result of residential relocation (either prospectively or retrospectively); (2) included a measure of physical activity, walking, cycling or travel modal change as an outcome; (3) was quantitative and (4) included an English abstract or summary. A total of 23 studies was included in the review. Among the eight retrospective longitudinal studies, there was good evidence for the relationship between relocation and walking (consistency score (CS)>90%). For the 15 prospective longitudinal studies, the evidence for the effects of environmental change/relocation on physical activity or walking was weak to moderate (CS mostly <45%), even weaker for effects on other outcomes, including physical activity, cycling, public transport use and driving. Results from risk of bias analyses support the robustness of the findings. The results are encouraging for the retrospective longitudinal relocation studies, but weaker evidence exists for the methodologically stronger prospective longitudinal relocation studies. The evidence base is currently limited, and continued longitudinal research should extend the plethora of cross-sectional studies to build higher-quality evidence. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  17. The AMMA information system

    NASA Astrophysics Data System (ADS)

    Fleury, Laurence; Brissebrat, Guillaume; Boichard, Jean-Luc; Cloché, Sophie; Mière, Arnaud; Moulaye, Oumarou; Ramage, Karim; Favot, Florence; Boulanger, Damien

    2015-04-01

    In the framework of the African Monsoon Multidisciplinary Analyses (AMMA) programme, several tools have been developed in order to boost the data and information exchange between researchers from different disciplines. The AMMA information system includes (i) a user-friendly data management and dissemination system, (ii) quasi real-time display websites and (iii) a scientific paper exchange collaborative tool. The AMMA information system is enriched by past and ongoing projects (IMPETUS, FENNEC, ESCAPE, QweCI, ACASIS, DACCIWA...) addressing meteorology, atmospheric chemistry, extreme events, health, adaptation of human societies... It is becoming a reference information system on environmental issues in West Africa. (i) The projects include airborne, ground-based and ocean measurements, social science surveys, satellite data use, modelling studies and value-added product development. Therefore, the AMMA data portal provides access to a large amount and variety of data: - 250 local observation datasets that cover many geophysical components (atmosphere, ocean, soil, vegetation) and human activities (agronomy, health). They have been collected by operational networks since 1850, long-term monitoring research networks (CATCH, IDAF, PIRATA...) and intensive scientific campaigns; - 1350 outputs of a socio-economic questionnaire; - 60 operational satellite products and several research products; - 10 output sets of meteorological and ocean operational models and 15 of research simulations. Data documentation complies with international metadata standards, and data are delivered in standard formats. The data request interface takes full advantage of the database relational structure and enables users to build multi-criteria requests (period, area, property, property value…). The AMMA data portal counts about 900 registered users, and 50 data requests every month.
The AMMA databases and data portal have been developed and are operated jointly by SEDOO and ESPRI in France: http://database.amma-international.org. The complete system is fully duplicated and operated by CRA in Niger: http://amma.agrhymet.ne/amma-data. (ii) A day-to-day chart display application has been designed and operated to monitor meteorological and environmental information and to meet the observational teams' needs during the AMMA 2006 SOP (http://aoc.amma-international.org) and the FENNEC 2011 campaign (http://fenoc.sedoo.fr). At present the websites constitute a synthetic view of the campaigns and a preliminary investigation tool for researchers. Since 2011, the same application has enabled a group of French and Senegalese researchers and forecasters to exchange, in near real time, physical indices and diagnostics calculated from operational numerical weather forecasts, satellite products and in situ operational observations throughout the monsoon season, in order to better assess, understand and anticipate the intraseasonal variability of the monsoon (http://misva.sedoo.fr). Another similar website is dedicated to the diagnosis and forecasting of heat waves in West Africa (http://acasis.sedoo.fr); it aims to become an operational component of national early warning systems. (iii) A collaborative WIKINDX tool has been set up online to gather scientific publications, theses and communications of interest: http://biblio.amma-international.org. At present the bibliographic database counts about 1200 references and is the most exhaustive document collection on the West African monsoon available to all. Every scientist is invited to make use of the AMMA online tools and data. Scientists or project leaders who have management needs for existing or future datasets concerning West Africa are welcome to use the AMMA database framework and to contact ammaAdmin@sedoo.fr.

  18. Corps Water Management System (CWMS) Decision Support Modeling and Integration Use in the June 2007 Texas Floods

    NASA Astrophysics Data System (ADS)

    Charley, W. J.; Luna, M.

    2007-12-01

    The U.S. Army Corps of Engineers Corps Water Management System (CWMS) is a comprehensive data acquisition and hydrologic modeling system for short-term decision support of water control operations in real time. It encompasses data collection, validation and transformation, data storage, visualization, real-time model simulation for decision-making support, and data dissemination. CWMS uses an Oracle database and Sun Solaris workstations for data processing, storage and the execution of models, with a client application (the Control and Visualization Interface, or CAVI) that can run on a Windows PC. CWMS was used by the Lower Colorado River Authority (LCRA) to make hydrologic forecasts of flows on the Lower Colorado River and to operate reservoirs during the June 2007 event in Texas. The LCRA receives real-time observed gridded spatial rainfall data from OneRain, Inc., which is produced by adjusting NEXRAD rainfall data with precipitation gages. These data are used, along with future precipitation estimates, for hydrologic forecasting by the rainfall-runoff modeling program HEC-HMS. Forecasted flows from HEC-HMS are combined with observed flows and reservoir information to simulate LCRA's reservoir operations and help engineers make release decisions based on the results. The river hydraulics program, HEC-RAS, computes river stages and water surface profiles for the computed flow. An inundation boundary and depth map of water in the flood plain can be calculated from the HEC-RAS results using ArcInfo. By varying future precipitation and releases, engineers can evaluate different "What if?" scenarios. What was described as an "extraordinary cluster of thunderstorms" that stalled over Burnet and Llano counties in Texas on June 27, 2007, dropped 17 to 19 inches of rainfall over a 6-hour period. The storm was classified as greater than a 500-year event, and the resulting flows on some of the smaller tributaries as 100-year or greater.
CWMS was used by LCRA for flood forecasting and reservoir operations. The models accurately forecast the flows and allowed engineers to determine that only four floodgates needed to be opened at Mansfield Dam in the chain of Highland Lakes. CWMS also forecast the peak of the flood well before it happened. Smaller rain storms continued for a period of weeks, and CWMS was used throughout the event to calculate lake levels and to schedule gate closures along with hydro-generation.

  19. Interventions for promoting smoke alarm ownership and function.

    PubMed

    DiGuiseppi, C; Higgins, J P

    2001-01-01

    Residential fires caused at least 67 deaths and 2,500 non-fatal injuries to children aged 0-16 in the United Kingdom in 1998. Smoke alarm ownership is associated with a reduced risk of residential fire death. We evaluated interventions to promote residential smoke alarms, to assess their effect on smoke alarm ownership, smoke alarm function, fires and burns and other fire-related injuries. We searched the Cochrane Controlled Trials Register, Cochrane Injuries Group database, MEDLINE, EMBASE, PsycLIT, CINAHL, ERIC, Dissertation Abstracts, International Bibliography of Social Sciences, ISTP, FIREDOC and LRC. Conference proceedings, published case studies, and bibliographies were systematically searched, and investigators and relevant organisations were contacted, to identify trials. Randomised, quasi-randomised or non-randomised controlled trials completed or published after 1969 evaluating an intervention to promote residential smoke alarms were eligible. Two reviewers independently extracted data and assessed trial quality. We identified 26 trials, of which 13 were randomised. Overall, counselling and educational interventions had only a modest effect on the likelihood of owning an alarm (OR=1.26; 95% CI: 0.87 to 1.82) or having a functional alarm (OR=1.19; 0.85 to 1.66). Counselling as part of primary care child health surveillance had greater effects on ownership (OR=1.96; 1.03 to 3.72) and function (OR=1.72; 0.78 to 3.80). Results were sensitive to trial quality, however, and effects on fire-related injuries were not reported. In two non-randomised trials, direct provision of free alarms significantly increased functioning alarms and reduced fire-related injuries. Media and community education showed little benefit in non-randomised trials. Counselling as part of child health surveillance may increase smoke alarm ownership and function, but its effects on injuries are unevaluated.
Community smoke alarm give-away programmes apparently reduce fire-related injuries, but these trials were not randomised and results must be interpreted cautiously. Further efforts to promote smoke alarms in primary care or through give-away programmes should be evaluated by adequately designed randomised controlled trials measuring injury outcomes.

  20. The health impact of residential retreats: a systematic review.

    PubMed

    Naidoo, Dhevaksha; Schembri, Adrian; Cohen, Marc

    2018-01-10

    Unhealthy lifestyles are a major factor in the development and exacerbation of many chronic diseases. Improving lifestyles through immersive residential experiences that promote healthy behaviours is a focus of the health retreat industry. This systematic review aims to identify and explore published studies on the health, wellbeing and economic impact of retreat experiences. MEDLINE, CINAHL and PsycINFO databases were searched for residential retreat studies in English published prior to February 2017. Studies were included if they were written in English, involved an intervention program in a residential setting of one or more nights, and included before-and-after data related to the health of participants. Studies that did not meet the above criteria or contained only descriptive data from interviews or case studies were excluded. A total of 23 studies including eight randomised controlled trials, six non-randomised controlled trials and nine longitudinal cohort studies met the inclusion criteria. These studies included a total of 2592 participants from diverse geographical and demographic populations and a great heterogeneity of outcome measures, with seven studies examining objective outcomes such as blood pressure or biological markers of disease, and 16 studies examining subjective outcomes that mostly involved self-reported questionnaires on psychological and spiritual measures. All studies reported post-retreat health benefits ranging from immediately after to five years post-retreat. Study populations varied widely and most studies had small sample sizes, poorly described methodology and little follow-up data, and no studies reported on health economic outcomes or adverse effects, making it difficult to draw definite conclusions about specific conditions, safety or return on investment.
Health retreat experiences appear to provide health benefits, including for people with chronic diseases such as multiple sclerosis, various cancers, HIV/AIDS, heart conditions and mental health conditions. Future research with larger numbers of subjects and longer follow-up periods is needed to investigate the health impact of different retreat experiences and the clinical populations most likely to benefit. Further studies are also needed to determine the economic benefits of retreat experiences for individuals, as well as for businesses, health insurers and policy makers.

  1. Development of urban water consumption models for the City of Los Angeles

    NASA Astrophysics Data System (ADS)

    Mini, C.; Hogue, T. S.; Pincetl, S.

    2011-12-01

    Population growth and rapid urbanization coupled with uncertain climate change are causing new challenges for meeting urban water needs. In arid and semi-arid regions, increasing drought periods and decreasing precipitation have led to water supply shortages and cities are struggling with trade-offs between the water needs of growing urban populations and the well-being of urban ecosystems. The goal of the current research is to build models that can represent urban water use patterns in semi-arid cities by identifying the determinants that control both total and outdoor residential water use over the Los Angeles urban domain. The initial database contains monthly water use records aggregated to the zip code level collected from the Los Angeles Department of Water and Power (LADWP) from 2000 to 2010. Residential water use was normalized per capita and was correlated with socio-demographic, economic, climatic and vegetation characteristics across the City for the 2000-2010 period. Results show that ethnicity, per capita income, and the average number of persons per household are linearly related to total water use per capita. Inter-annual differences in precipitation and implementation of conservation measures affect water use levels across the City. The high variability in water use patterns across the City also appears strongly influenced by income and education levels. The temporal analysis of vegetation indices in the studied neighborhoods shows little correlation between precipitation patterns and vegetation greenness. Urban vegetation appears well-watered, presenting the same greenness activity over the study period despite an overall decrease in water use across the City. We hypothesize that over-watering is occurring and that outdoor water use represents a significant part of the residential water budget in various regions of the City. 
A multiple regression model has been developed that integrates these fundamental controlling factors to simulate residential water use patterns across the City. The performance of the linear regression model is being tested and compared with other algorithm-based simulations for improved modeling of urban water consumption in the region. Ultimately, project results will contribute to the implementation of sustainable strategies targeted to specific urban areas for a growing population under uncertain climate variability.
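The per-capita normalization and correlation analysis described in this record can be sketched in a few lines; the variable names and sample values below are hypothetical stand-ins, since the actual study works with zip-code-level LADWP records.

```python
import statistics

def per_capita(monthly_use, population):
    """Normalize zip-code-level water use to per-capita values."""
    return [u / p for u, p in zip(monthly_use, population)]

def pearson(x, y):
    """Pearson correlation between per-capita use and one covariate."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# e.g. correlate per-capita use (gallons) with per-capita income across zip codes
use = per_capita([3.2e6, 5.1e6, 2.4e6], [40000, 55000, 30000])
r = pearson(use, [52000, 71000, 48000])
```

Predictors such as income, education and household size would each get such a correlation before entering the multiple regression model.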

  2. Analyzing the efficiency of short-term air quality plans in European cities, using the CHIMERE air quality model.

    PubMed

    Thunis, P; Degraeuwe, B; Pisoni, E; Meleux, F; Clappier, A

    2017-01-01

    Regional and local authorities have the obligation to design air quality plans and to assess their impacts when concentration levels exceed the limit values. Because these limit values cover both short-term (day) and long-term (year) effects, air quality plans also follow these two formats. In this work, we propose a methodology to analyze modeled air quality forecast results, looking at emission reductions for different sectors (residential, transport, agriculture, etc.) with the aim of supporting policy makers in assessing the impact of short-term action plans. Regarding PM10, results highlight the diversity of responses across European cities in terms of magnitude and type, which raises the need for area-specific air quality plans. Action plans extended from 1 to 3 days (i.e., emission reductions applied for 24 and 72 h, respectively) point to the added value of trans-city coordinated actions. The largest benefits are seen in central Europe (Vienna, Prague), while major cities (e.g., Paris) already solve a large part of the problem on their own. Eastern Europe would particularly benefit from plans based on emission reductions in the residential sector, while in northern cities agriculture seems to be the key sector on which to focus attention. Transport plays a key role in most cities, whereas the impact of industry is limited to a few cities in south-eastern Europe. For NO2, short-term action plans focusing on traffic emission reductions are efficient in all cities, owing to the local character of this type of pollution. It is important, however, to stress that these results remain dependent on the months selected for this study.

  3. Has upwelling strengthened along worldwide coasts over 1982-2010?

    NASA Astrophysics Data System (ADS)

    Varela, R.; Álvarez, I.; Santos, F.; Decastro, M.; Gómez-Gesteira, M.

    2015-05-01

    Changes in coastal upwelling strength have been widely studied since 1990 when Bakun proposed that global warming can induce the intensification of upwelling in coastal areas. Whether present wind trends support this hypothesis remains controversial, as results of previous studies seem to depend on the study area, the length of the time series, the season, and even the database used. In this study, temporal and spatial trends in the coastal upwelling regime worldwide were investigated during upwelling seasons from 1982 to 2010 using a single wind database (Climate Forecast System Reanalysis) with high spatial resolution (0.3°). Of the major upwelling systems, increasing trends were only observed in the coastal areas of Benguela, Peru, Canary, and northern California. A tendency for an increase in upwelling-favourable winds was also identified along several less studied regions, such as the western Australian and southern Caribbean coasts.

  4. Has upwelling strengthened along worldwide coasts over 1982-2010?

    PubMed Central

    Varela, R.; Álvarez, I.; Santos, F.;  deCastro, M.; Gómez-Gesteira, M.

    2015-01-01

    Changes in coastal upwelling strength have been widely studied since 1990 when Bakun proposed that global warming can induce the intensification of upwelling in coastal areas. Whether present wind trends support this hypothesis remains controversial, as results of previous studies seem to depend on the study area, the length of the time series, the season, and even the database used. In this study, temporal and spatial trends in the coastal upwelling regime worldwide were investigated during upwelling seasons from 1982 to 2010 using a single wind database (Climate Forecast System Reanalysis) with high spatial resolution (0.3°). Of the major upwelling systems, increasing trends were only observed in the coastal areas of Benguela, Peru, Canary, and northern California. A tendency for an increase in upwelling-favourable winds was also identified along several less studied regions, such as the western Australian and southern Caribbean coasts. PMID:25952477

  5. Clear-Sky Probability for the August 21, 2017, Total Solar Eclipse Using the NREL National Solar Radiation Database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, Aron M; Roberts, Billy J; Kutchenreiter, Mark C

    The National Renewable Energy Laboratory (NREL) and collaborators have created a clear-sky probability analysis to help guide viewers of the August 21, 2017, total solar eclipse, the first continent-spanning eclipse in nearly 100 years in the United States. Using cloud and solar data from NREL's National Solar Radiation Database (NSRDB), the analysis provides cloudless sky probabilities specific to the date and time of the eclipse. Although this paper is not intended to be an eclipse weather forecast, the detailed maps can help guide eclipse enthusiasts to likely optimal viewing locations. Additionally, high-resolution data are presented for the centerline of the path of totality, representing the likelihood of cloudless skies and atmospheric clarity. The NSRDB provides industry, academia, and other stakeholders with high-resolution solar irradiance data to support feasibility analyses for photovoltaic and concentrating solar power generation projects.
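At heart, a clear-sky probability of this kind is a historical frequency for a given grid cell at the eclipse date and time. A minimal sketch follows; the binary cloud flags and the year range are hypothetical, and the NSRDB analysis works from gridded cloud-property data rather than simple labels.

```python
def clear_sky_probability(cloud_flags):
    """Fraction of historical years with a cloud-free sky at the given
    date and time for one grid cell (a simple frequency estimate)."""
    if not cloud_flags:
        raise ValueError("no historical observations")
    return sum(1 for f in cloud_flags if f == "clear") / len(cloud_flags)

# e.g. hypothetical cloud classifications on August 21 at eclipse time, one cell
history = ["clear", "cloudy", "clear", "clear", "cloudy", "clear"]
p = clear_sky_probability(history)  # 4 clear years out of 6
```

Mapping such a probability for every cell along the centerline of totality yields the kind of viewing-guidance map the record describes.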

  6. Communicating with residential electrical devices via a vehicle telematics unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roth, Rebecca C.; Pebbles, Paul H.

    A method of communicating with residential electrical devices using a vehicle telematics unit includes receiving information identifying a residential electrical device to control; displaying in a vehicle one or more controlled features of the identified residential electrical device; receiving from a vehicle occupant a selection of the displayed controlled features of the residential electrical device; sending an instruction from the vehicle telematics unit to the residential electrical device via a wireless carrier system in response to the received selection; and controlling the residential electrical device using the sent instruction.

  7. 12 CFR 541.16 - Improved residential real estate.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 5 2011-01-01 2011-01-01 false Improved residential real estate. 541.16... REGULATIONS AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.16 Improved residential real estate. The term improved residential real estate means residential real estate containing offsite or other improvements...

  8. 12 CFR 541.16 - Improved residential real estate.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 6 2012-01-01 2012-01-01 false Improved residential real estate. 541.16... REGULATIONS AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.16 Improved residential real estate. The term improved residential real estate means residential real estate containing offsite or other improvements...

  9. 12 CFR 141.16 - Improved residential real estate.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Improved residential real estate. 141.16... REGULATIONS AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 141.16 Improved residential real estate. The term improved residential real estate means residential real estate containing offsite or other improvements...

  10. 12 CFR 141.16 - Improved residential real estate.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 1 2012-01-01 2012-01-01 false Improved residential real estate. 141.16... REGULATIONS AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 141.16 Improved residential real estate. The term improved residential real estate means residential real estate containing offsite or other improvements...

  11. 12 CFR 141.16 - Improved residential real estate.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 1 2013-01-01 2013-01-01 false Improved residential real estate. 141.16... REGULATIONS AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 141.16 Improved residential real estate. The term improved residential real estate means residential real estate containing offsite or other improvements...

  12. 12 CFR 541.16 - Improved residential real estate.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 6 2013-01-01 2012-01-01 true Improved residential real estate. 541.16 Section... REGULATIONS AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.16 Improved residential real estate. The term improved residential real estate means residential real estate containing offsite or other improvements...

  13. 12 CFR 541.16 - Improved residential real estate.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 6 2014-01-01 2012-01-01 true Improved residential real estate. 541.16 Section... REGULATIONS AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.16 Improved residential real estate. The term improved residential real estate means residential real estate containing offsite or other improvements...

  14. Birth, growth and progresses through the last twelve years of a regional scale landslide warning system

    NASA Astrophysics Data System (ADS)

    Fanti, Riccardo; Segoni, Samuele; Rosi, Ascanio; Lagomarsino, Daniela; Catani, Filippo

    2017-04-01

    SIGMA is a regional landslide warning system that operates in the Emilia Romagna region (Italy). In this work, we depict its birth and the continuous development process, still ongoing, after over a decade of operational use. Traditionally, landslide rainfall thresholds are defined by the empirical correspondence between a rainfall database and a landslide database. However, in the early stages of the research, a complete catalogue of dated landslides was not available. Therefore, the prototypal version of SIGMA was based on rainfall thresholds defined by means of a statistical analysis performed over the rainfall time series. SIGMA was purposely designed to take into account both shallow and deep-seated landslides, and it was based on the hypothesis that anomalous or extreme values of accumulated rainfall are responsible for landslide triggering. The statistical distribution of the rainfall series was analyzed, and multiples of the standard deviation (σ) were used as thresholds to discriminate between ordinary and extraordinary rainfall events. In the warning system, the measured and the forecasted rainfall are compared with these thresholds. Since the response of slope stability to rainfall may be complex, SIGMA is based on a decision algorithm aimed at identifying short but exceptionally intense rainfalls and mild but exceptionally prolonged rains: while the former are commonly associated with shallow landslides, the latter are mainly associated with deep-seated landslides. In the first case, the rainfall threshold is defined by high σ values and short durations (i.e. a few days); in the second case, σ values are lower but the decision algorithm checks long durations (i.e. some months). The exact definitions of "high" and "low" σ values and of "short" and "long" durations varied over time as they were adjusted during the evolution of the model.
Indeed, since 2005, constant work has been carried out to gather and organize newly available data (rainfall recordings and landslide occurrences) and to use them to define more robust relationships between rainfall and landslide triggering, with the final aim of increasing the forecasting effectiveness of the warning system. The updated rainfall and landslide databases were used to periodically perform a quantitative validation and to analyze the errors affecting the system forecasts. The error characterization was used to implement a continuous process of updating and modification of SIGMA, which included:
- Main model upgrades (generalization from a pilot test site to the whole Emilia Romagna region; calibration against well-documented landslide events to define specific σ levels for each territorial unit; definition of different alert levels according to the number of expected landslides).
- Ordinary updates (periodically, the new landslide and rainfall data were used to re-calibrate the thresholds, taking into account a more robust sample).
- Model tuning (set-up of the optimal version of the decision algorithm, including different definitions of "long" and "short" periods; selection of the optimal reference rain gauge for each territorial unit; modification of the boundaries of some territorial units).
- Additional features (definition of a module that takes into account the effect of snow melt and snow accumulation; coupling with a landslide susceptibility model to improve the spatial accuracy of the model).
- Various performance tests (including comparison with alternative versions of SIGMA or with thresholds based on rainfall intensity and duration).
This process has led to an evolution of the warning system and to a documented improvement of its forecasting effectiveness.
Landslide forecasting at regional scale is a very complex task, but as time passes by and with the systematic gathering of new substantial data and the continuous progresses of research, uncertainties can be progressively reduced and a warning system can be set that increases its performances and reliability with time.
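The σ-based decision logic described in this record can be sketched in a few lines. This is only an illustration, not the operational SIGMA code: the multipliers, durations and per-territorial-unit calibration below are placeholders.

```python
import statistics

def sigma_threshold(rain_series, multiplier):
    """Threshold as the series mean plus a multiple of its standard
    deviation; operational σ levels are calibrated per territorial unit."""
    return statistics.mean(rain_series) + multiplier * statistics.stdev(rain_series)

def alerts(short_accum, long_accum, short_series, long_series):
    """Short, exceptionally intense rainfall flags shallow landslides;
    long, mild but anomalously persistent rainfall flags deep-seated ones."""
    raised = []
    if short_accum > sigma_threshold(short_series, 3.0):  # high sigma, few days
        raised.append("shallow")
    if long_accum > sigma_threshold(long_series, 1.5):    # lower sigma, months
        raised.append("deep-seated")
    return raised
```

In the warning system both measured and forecasted accumulations would be passed through such checks, with thresholds recalibrated as new landslide and rainfall data arrive.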

  15. Surface Current Skill Assessment of Global and Regional forecast models.

    NASA Astrophysics Data System (ADS)

    Allen, A. A.

    2016-02-01

    The U.S. Coast Guard has been using SAROPS since January 2007 at all fifty of its operational centers to plan search and rescue missions. SAROPS relies on an Environmental Data Server (EDS) that integrates global, national, and regional ocean and meteorological observation and forecast data. The server manages spatial and temporal aggregation of hindcast, nowcast, and forecast data so the SAROPS controller has the best available data for search planning. The EDS harvests a wide range of global and regional forecasts and data, including NOAA NCEP's global HYCOM model (RTOFS), the U.S. Navy's Global HYCOM model, the 5 NOAA NOS Great Lakes models and a suite of other regional forecasts from NOS and IOOS Regional Associations. The EDS also integrates surface drifter data, as the U.S. Coast Guard regularly deploys Self-Locating Datum Marker Buoys (SLDMBs) during SAR cases; a significant set of drifter data has been collected and the archive continues to grow. These data are critically useful during real-time SAR planning, but also represent a valuable scientific dataset for analyzing surface currents. In 2014, a new initiative was started by the U.S. Coast Guard to evaluate the skill of the various models to support the decision-making process during search and rescue planning. This analysis falls into 2 categories: historical analysis of drifter tracks and model predictions to provide skill assessment of models in different regions, and real-time analysis of models and drifter tracks during a SAR incident. The EDS, using Liu and Weisberg's (2014) method, autonomously determines surface skill measurements of the co-located models' simulated surface trajectories versus the actual drift of the SLDMBs (CODE/Davis-style surface drifters, GPS-positioned at 30-min intervals). Surface skill measurements are archived in a database and are retrievable by users via lat/long/time cubes. This paper will focus on the comparison of models in the period from 23 August to 21 September 2015.
Surface skill was determined for the following regions: the California Coast, the Gulf of Mexico, and the South and Mid-Atlantic Bights. Skill was determined for the two versions of the NCEP Global RTOFS, the Navy's Global HYCOM model, and, where appropriate, the local regional models.
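The trajectory skill assessment referenced above can be illustrated with a small sketch in the spirit of the Liu and Weisberg skill score: cumulative model-drifter separation is normalized by the cumulative length of the observed drifter track, and the score is floored at zero. Planar distances and the tolerance value n = 1 are simplifying assumptions here; an operational version would use great-circle distances on lat/long positions.

```python
import math

def dist(p, q):
    """Planar distance between two (x, y) positions."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def skill_score(observed, modeled, n=1.0):
    """Skill = max(0, 1 - c/n), where c is the sum of model-drifter
    separations divided by the sum of cumulative observed path lengths."""
    cum_len = 0.0   # observed path length up to the current step
    sum_sep = 0.0   # accumulated model-drifter separation
    sum_len = 0.0   # accumulated cumulative path lengths (normalizer)
    for i in range(1, len(observed)):
        cum_len += dist(observed[i - 1], observed[i])
        sum_sep += dist(observed[i], modeled[i])
        sum_len += cum_len
    if sum_len == 0:
        return 0.0
    c = sum_sep / sum_len
    return max(0.0, 1.0 - c / n)
```

A perfect trajectory scores 1.0, and a model whose separation grows as fast as the drifter travels scores 0; scores archived per region and model support the comparisons described in the record.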

  16. [Development of forecasting models for fatal road traffic injuries].

    PubMed

    Tan, Aichun; Tian, Danping; Huang, Yuanxiu; Gao, Lin; Deng, Xin; Li, Li; He, Qiong; Chen, Tianmu; Hu, Guoqing; Wu, Jing

    2014-02-01

    To develop forecasting models for fatal road traffic injuries and to provide evidence for predicting future trends in road traffic injuries. Data on road traffic injury mortality, including factors such as gender and age in different countries, were obtained from the World Health Organization Mortality Database. Other information on GDP per capita, urbanization, motorization and education was collected from online resources of the World Bank, WHO, the United Nations Population Division and other agencies. We fitted logarithmic models of road traffic injury mortality by gender and age group, including predictors of GDP per capita, urbanization, motorization and education. Sex- and age-specific forecasting models developed by WHO, which include GDP per capita, education and time, etc., were also fitted. The coefficient of determination (R(2)) was used to compare the performance of our models against the WHO models. A total of 2 626 sets of data were collected from 153 countries/regions for both genders, between 1965 and 2010. The forecasting models of road traffic injury mortality based on GDP per capita, motorization, urbanization and education were statistically significant (P < 0.001), and the coefficients of determination for males in the age groups 0-4, 5-14, 15-24, 25-34, 35-44, 45-54, 55-64 and 65+ were 22.7%, 31.1%, 51.8%, 52.3%, 44.9%, 41.8%, 40.1% and 25.5%, respectively, while the coefficients for these age groups in females were 22.9%, 32.6%, 51.1%, 49.3%, 41.3%, 35.9%, 30.7% and 20.1%, respectively. The WHO models based on GDP per capita, education and time were statistically significant (P < 0.001), with coefficients of determination of 14.9%, 22.0%, 31.5%, 33.1%, 30.7%, 28.5%, 27.7% and 17.8% for males, and 14.1%, 20.6%, 30.4%, 31.8%, 26.7%, 24.3%, 17.3% and 8.8% for females, respectively. The forecasting models we developed appeared to perform better than those developed by WHO.
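As an illustration of the model fitting and R² comparison described in this record, the sketch below fits a logarithmic mortality model by ordinary least squares. The data are synthetic and the data-generating coefficients are invented for the demo; they do not come from the study.

```python
import numpy as np

# Synthetic stand-in for the WHO mortality records (illustrative only).
rng = np.random.default_rng(0)
n = 200
gdp = rng.uniform(1e3, 5e4, n)     # GDP per capita
motor = rng.uniform(50, 800, n)    # vehicles per 1000 people
urban = rng.uniform(20, 95, n)     # % urban population
edu = rng.uniform(4, 14, n)        # mean years of schooling
# hypothetical data-generating process for the demo
mort = 40 - 3.0 * np.log(gdp) + 0.01 * motor - 0.05 * urban - 0.5 * edu \
       + rng.normal(0, 1, n)

# logarithmic model: mortality ~ log(GDP) + motorization + urbanization + education
X = np.column_stack([np.ones(n), np.log(gdp), motor, urban, edu])
beta, *_ = np.linalg.lstsq(X, mort, rcond=None)
pred = X @ beta
ss_res = np.sum((mort - pred) ** 2)
ss_tot = np.sum((mort - mort.mean()) ** 2)
r2 = 1 - ss_res / ss_tot  # coefficient of determination
```

In the study, a model of this shape would be fitted separately for each sex and age group, and the resulting R² values compared against the WHO models.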

  17. 12 CFR 541.23 - Residential real estate.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 5 2011-01-01 2011-01-01 false Residential real estate. 541.23 Section 541.23... AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.23 Residential real estate. The terms residential real estate... home used in part for business); (c) Other real estate used for primarily residential purposes other...

  18. 12 CFR 541.23 - Residential real estate.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 6 2012-01-01 2012-01-01 false Residential real estate. 541.23 Section 541.23... AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.23 Residential real estate. The terms residential real estate... home used in part for business); (c) Other real estate used for primarily residential purposes other...

  19. 12 CFR 541.23 - Residential real estate.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 6 2014-01-01 2012-01-01 true Residential real estate. 541.23 Section 541.23... AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.23 Residential real estate. The terms residential real estate... home used in part for business); (c) Other real estate used for primarily residential purposes other...

  20. 12 CFR 541.23 - Residential real estate.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 6 2013-01-01 2012-01-01 true Residential real estate. 541.23 Section 541.23... AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.23 Residential real estate. The terms residential real estate... home used in part for business); (c) Other real estate used for primarily residential purposes other...

  1. 12 CFR 141.23 - Residential real estate.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 1 2013-01-01 2013-01-01 false Residential real estate. 141.23 Section 141.23... AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 141.23 Residential real estate. The terms residential real estate... home used in part for business); (c) Other real estate used for primarily residential purposes other...

  2. 12 CFR 141.23 - Residential real estate.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 1 2014-01-01 2014-01-01 false Residential real estate. 141.23 Section 141.23... AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 141.23 Residential real estate. The terms residential real estate... home used in part for business); (c) Other real estate used for primarily residential purposes other...

  3. 12 CFR 541.23 - Residential real estate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Residential real estate. 541.23 Section 541.23... AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 541.23 Residential real estate. The terms residential real estate... home used in part for business); (c) Other real estate used for primarily residential purposes other...

  4. 12 CFR 141.23 - Residential real estate.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 1 2012-01-01 2012-01-01 false Residential real estate. 141.23 Section 141.23... AFFECTING FEDERAL SAVINGS ASSOCIATIONS § 141.23 Residential real estate. The terms residential real estate... home used in part for business); (c) Other real estate used for primarily residential purposes other...

  5. Fatal falls in the U.S. residential construction industry.

    PubMed

    Dong, Xiuwen Sue; Wang, Xuanwen; Largay, Julie A; Platner, James W; Stafford, Erich; Cain, Chris Trahan; Choi, Sang D

    2014-09-01

    Falls from heights remain the most common cause of workplace fatalities among residential construction workers in the United States. This paper examines patterns and trends of fall fatalities in U.S. residential construction between 2003 and 2010 by analyzing two large national datasets. Almost half of the fatalities in residential construction were from falls. In the residential roofing industry, 80% of fatalities were from falls. In addition, about one-third of fatal falls in residential construction were among self-employed workers. Workers who were older than 55 years, were Hispanic foreign-born, or employed in small establishments (1-10 employees) also had higher proportions of fatal falls in residential construction compared to those in nonresidential construction. The findings suggest that fall safety within the residential construction industry lags behind commercial construction and industrial settings. Fall prevention in residential construction should be enhanced to better protect construction workers in this sector. © 2014 Wiley Periodicals, Inc.

  6. Flash flood warnings for ungauged basins based on high-resolution precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Demargne, Julie; Javelle, Pierre; Organde, Didier; de Saint Aubin, Céline; Janet, Bruno

    2016-04-01

    Early detection of flash floods, which are typically triggered by severe rainfall events, is still challenging due to large meteorological and hydrologic uncertainties at the spatial and temporal scales of interest. Moreover, the rapid rise of waters necessarily limits the lead time of warnings to alert communities and activate effective emergency procedures. To better anticipate such events and mitigate their impacts, the French national service in charge of flood forecasting (SCHAPI) is implementing a national flash flood warning system for small-to-medium (up to 1000 km²) ungauged basins based on a discharge-threshold flood warning method called AIGA (Javelle et al. 2014). The current deterministic AIGA system has been run in real-time in the South of France since 2005 and has been tested in the RHYTMME project (rhytmme.irstea.fr/). It ingests the operational radar-gauge QPE grids from Météo-France to run a simplified hourly distributed hydrologic model at a 1-km² resolution every 15 minutes. This produces real-time peak discharge estimates along the river network, which are subsequently compared to regionalized flood frequency estimates to provide warnings according to the AIGA-estimated return period of the ongoing event. The calibration and regionalization of the hydrologic model have recently been enhanced for implementing the national flash flood warning system for the entire French territory by 2016. To further extend the effective warning lead time, the flash flood warning system is being enhanced to ingest Météo-France's AROME-NWC high-resolution precipitation nowcasts. The AROME-NWC system combines the most recent available observations with forecasts from the nowcasting version of the AROME convection-permitting model (Auger et al. 2015). 
AROME-NWC pre-operational deterministic precipitation forecasts, produced every hour at a 2.5-km resolution for a 6-hr forecast horizon, were provided for 3 significant rain events in September and November 2014 and ingested as time-lagged ensembles. The time-lagged approach is a practical way of accounting for the atmospheric forecast uncertainty when no extensive forecast archive is available for statistical modelling. The evaluation on 185 basins in the South of France showed significant improvements in terms of flash flood event detection and effective warning lead-time, compared to warnings from the current AIGA setup (without any future precipitation). Various verification metrics (e.g., Relative Mean Error, Brier Skill Score) show the skill of ensemble precipitation and flow forecasts compared to single-valued persistence benchmarks. Planned enhancements include integrating additional probabilistic NWP products (e.g., AROME precipitation ensembles on a longer forecast horizon), accounting for and reducing hydrologic uncertainties from the model parameters and initial conditions via data assimilation, and developing a comprehensive observational and post-event damage database to determine decision-relevant warning thresholds for flood magnitude and probability. Javelle, P., Demargne, J., Defrance, D., Arnaud, P., 2014. Evaluating flash flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, doi: 10.1080/02626667.2014.923970 Auger, L., Dupont, O., Hagelin, S., Brousseau, P., Brovelli, P., 2015. AROME-NWC: a new nowcasting tool based on an operational mesoscale forecasting system. Quarterly Journal of the Royal Meteorological Society, 141: 1603-1611, doi: 10.1002/qj.2463
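    The discharge-threshold principle behind AIGA, comparing a model-estimated peak discharge against regionalized return-period thresholds, can be sketched in a few lines. The threshold values, warning-level names, and function below are hypothetical, chosen only to illustrate the comparison step.

    ```python
    # Hypothetical regionalized flood-frequency thresholds (m^3/s) for one
    # river reach, e.g. discharges of increasing return period
    thresholds = {"yellow": 40.0, "orange": 85.0, "red": 140.0}

    def aiga_style_warning(peak_discharge: float) -> str:
        """Map an estimated peak discharge to a warning level by
        comparing it against return-period thresholds in ascending order."""
        level = "none"
        for name, q in thresholds.items():
            if peak_discharge >= q:
                level = name
        return level

    print(aiga_style_warning(95.0))  # exceeds the second threshold -> "orange"
    ```

    In the operational system this comparison is done every 15 minutes at each cell of the 1-km² river network, with thresholds derived from regionalized flood frequency analysis.
    
    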

  7. 1996-2004 Trends in the Single-Family Housing Market: Spatial Analysis of the Residential Sector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dave M.; Elliott, Douglas B.

    2006-09-05

    This report provides a detailed geographic analysis of two specific topics affecting the residential sector. First, we performed an analysis of new construction market trends using annual building permit data. We report summarized tables and national maps to help illustrate market conditions. Second, we performed a detailed geographic analysis of the housing finance market. We analyzed mortgage application data to provide citable statistics and a detailed geographic summarization of the residential housing picture in the US for each year in the 1996-2004 period. The databases were linked to geographic information system tools to provide various map series detailing the results geographically. Looking at these results geographically may suggest potential new markets for TD programs addressing the residential sector that have not been considered previously. For example, we show which lenders affect which regions and which income or mortgage product classes. These results also highlight the issue of housing affordability. Energy efficiency R&D programs focused on developing new technology for the residential sector must be conscious of the costs of products resulting from research that will eventually impact the home owner or new home buyer. Results indicate that home values as a proportion of median family income in Building America communities are closely aligned with the national average. Other key findings: • The share of home building and home buying activity continues to rise steadily in the Hot-Dry and Hot-Humid climate zones, while the Mixed-Humid and Cold climate zone shares continue to decline. Other zones remain relatively stable in terms of share of housing activity. • The proportion of home buyers having three times the median family income for their geography has been steadily increasing during the study period. 
• Growth in the Hispanic/Latino population, and to a lesser degree in the Asian population, has translated into proportional increases in the share of home purchasing by both groups. White home buyers continue to decline as a proportion of all home buyers. • The low interest rate climate resulted in lenders moving back to conventional financing, as opposed to government-backed financing, for cases that would be harder to finance in higher-rate environments. Government loan products are one mechanism for effecting energy efficiency gains in the residential sector. • The rate environment and concurrent deregulation of the finance industry resulted in unprecedented merger and acquisition activity among financial institutions during the study period. This study conducted a thorough accounting of this merger activity to inform the market share analysis provided. • The home finance industry quartiles feature 5 lenders making up the first quartile of home purchase loans, 18 lenders making up the second, 111 lenders making up the third, and the remaining nearly 8,000 lenders making up the fourth.

  8. Liver cancer mortality rate model in Thailand

    NASA Astrophysics Data System (ADS)

    Sriwattanapongse, Wattanavadee; Prasitwattanaseree, Sukon

    2013-09-01

    Liver cancer has been a leading cause of death in Thailand. The purpose of this study was to model and forecast the liver cancer mortality rate in Thailand using death certificate reports. A retrospective analysis of the liver cancer mortality rate was conducted. A total of 123,280 liver cancer deaths were obtained from the national vital registration database for the 10-year period from 2000 to 2009, provided by the Ministry of Interior and coded as cause-of-death using ICD-10 by the Ministry of Public Health. A multivariate regression model was used for modeling and forecasting age-specific liver cancer mortality rates in Thailand. Liver cancer mortality increased with increasing age for each sex and was also higher in the North East provinces. The trends of liver cancer mortality remained stable in most age groups, with increases during the ten-year period (2000 to 2009) in the Northern and Southern regions. Liver cancer mortality was higher in males and increased with increasing age. Liver cancer control measures need to remain in place on a sustained, long-term basis given Thailand's high liver cancer burden.

  9. Applied Meteorology Unit (AMU) Quarterly Report Third Quarter FY · 13

    NASA Technical Reports Server (NTRS)

    Bauman, William; Crawford, Winifred; Watson, Leela; Shafer, Jaclyn; Huddleston, Lisa

    2013-01-01

    The AMU team worked on seven tasks for their customers: (1) Ms. Crawford completed the objective lightning forecast tool for east-central Florida airports and delivered the tool and the final report to the customers. (2) Ms. Shafer continued work for Vandenberg Air Force Base on an automated tool to relate pressure gradients to peak winds. (3) Dr. Huddleston updated and delivered the tool that shows statistics on the timing of the first lightning strike of the day in the Kennedy Space Center (KSC)/Cape Canaveral Air Force Station (CCAFS) area. (4) Dr. Bauman continued work on a severe weather forecast tool focused on the Eastern Range (ER). (5) Ms. Crawford acquired the software and radar data needed to create a dual-Doppler analysis over the east-central Florida and KSC/CCAFS areas. (6) Mr. Decker continued developing a wind pairs database for the Launch Services Program to use when evaluating upper-level winds for launch vehicles. (7) Dr. Watson continued work to assimilate observational data into the high-resolution model configurations she created for Wallops Flight Facility and the ER.

  10. Assimilating a decade of hydrometeorological ship measurements across the North American Great Lakes

    NASA Astrophysics Data System (ADS)

    Fries, K. J.; Kerkez, B.

    2015-12-01

    We use a decade of measurements made by the Volunteer Observing Ships (VOS) program on the North American Great Lakes to derive spatial estimates of over-lake air temperature, sea surface temperature, dewpoint, and wind speed. This Lagrangian data set, which annually comprises over 200,000 point observations from over 80,000 ship reports across a 244,000 square kilometer study area, is assimilated using a Gaussian Process machine learning algorithm. This algorithm fits a model for each hydrometeorological variable using a combination of latitude, longitude, and season of the year, as well as predictions made by the National Digital Forecast Database (NDFD) and Great Lakes Coastal Forecasting System (GLCFS) operational models. We show that our data-driven method significantly improves the spatial and temporal estimation of over-lake hydrometeorological variables, while simultaneously providing uncertainty estimates that can be used to improve historical and future predictions on dense spatial and temporal scales. This method stands to improve the prediction of water levels on the Great Lakes, which comprise over 90% of America's surface fresh water, and impact the lives of millions of people living in the basin.
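    A minimal sketch of Gaussian Process regression of this kind, interpolating scattered point observations to an unsampled location while also returning an uncertainty estimate, might look as follows. The coordinates, temperature field, and kernel settings are synthetic assumptions, not the study's actual configuration (which also uses season and operational-model predictions as features).

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(1)
    # Synthetic ship reports: (lat, lon) locations over a lake-sized domain
    X = rng.uniform([41.0, -88.0], [49.0, -76.0], size=(150, 2))
    # Synthetic surface temperature field: smooth trend plus observation noise
    y = 10 + 0.5 * (X[:, 0] - 45) - 0.3 * (X[:, 1] + 82) + rng.normal(0, 0.2, 150)

    # RBF kernel for spatial smoothness, WhiteKernel for measurement noise
    kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=0.04)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

    # Predict at an unsampled location, with a posterior standard deviation
    mean, std = gp.predict(np.array([[45.0, -82.0]]), return_std=True)
    print(f"estimate {mean[0]:.2f} +/- {std[0]:.2f}")
    ```

    The posterior standard deviation is the uncertainty estimate the abstract refers to: it grows where ship reports are sparse and shrinks where they are dense.
    
    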

  11. Moving beyond the residential neighborhood to explore social inequalities in exposure to area-level disadvantage: Results from the Interdisciplinary Study on Inequalities in Smoking.

    PubMed

    Shareck, Martine; Kestens, Yan; Frohlich, Katherine L

    2014-05-01

    The focus, in place and health research, on a single, residential, context overlooks the fact that individuals are mobile and experience other settings in the course of their daily activities. Socio-economic characteristics are associated with activity patterns, as well as with the quality of places where certain groups conduct activities, i.e. their non-residential activity space. Examining how measures of exposure to resources, and inequalities thereof, compare between residential and non-residential contexts is required. Baseline data from 1890 young adults (18-25 years-old) participating in the Interdisciplinary Study of Inequalities in Smoking, Montreal, Canada (2011-2012), were analyzed. Socio-demographic and activity location data were collected using a validated, self-administered questionnaire. Area-level material deprivation was measured within 500-m road-network buffer zones around participants' residential and activity locations. Deprivation scores in the residential area and non-residential activity space were compared between social groups. Multivariate linear regression was used to estimate associations between individual- and area-level characteristics and non-residential activity space deprivation, and to explore whether these characteristics attenuated the education-deprivation association. Participants in low educational categories lived and conducted activities in more disadvantaged areas than university students/graduates. Educational inequalities in exposure to area-level deprivation were larger in the non-residential activity space than in the residential area for the least educated, but smaller for the intermediate group. Adjusting for selected covariates such as transportation resources and residential deprivation did not significantly attenuate the education-deprivation associations. 
Results support the existence of social isolation in residential areas and activity locations, whereby less educated individuals tend to be confined to more disadvantaged areas than their more educated counterparts. They also highlight the relevance of investigating both residential and non-residential contexts when studying inequalities in health-relevant exposures. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Time series modelling to forecast prehospital EMS demand for diabetic emergencies.

    PubMed

    Villani, Melanie; Earnest, Arul; Nanayakkara, Natalie; Smith, Karen; de Courten, Barbora; Zoungas, Sophia

    2017-05-05

    Acute diabetic emergencies are often managed by prehospital Emergency Medical Services (EMS). The projected growth in prevalence of diabetes is likely to result in rising demand for prehospital EMS that are already under pressure. The aims of this study were to model the temporal trends and provide forecasts of prehospital attendances for diabetic emergencies. A time series analysis on monthly cases of hypoglycemia and hyperglycemia was conducted using data from the Ambulance Victoria (AV) electronic database between 2009 and 2015. Using the seasonal autoregressive integrated moving average (SARIMA) modelling process, different models were evaluated. The most parsimonious model with the highest accuracy was selected. Forty-one thousand four hundred fifty-four prehospital diabetic emergencies were attended over a seven-year period with an increase in the annual median monthly caseload between 2009 (484.5) and 2015 (549.5). Hypoglycemia (70%) and people with type 1 diabetes (48%) accounted for most attendances. The SARIMA (0,1,0,12) model provided the best fit, with a MAPE of 4.2% and predicts a monthly caseload of approximately 740 by the end of 2017. Prehospital EMS demand for diabetic emergencies is increasing. SARIMA time series models are a valuable tool to allow forecasting of future caseload with high accuracy and predict increasing cases of prehospital diabetic emergencies into the future. The model generated by this study may be used by service providers to allow appropriate planning and resource allocation of EMS for diabetic emergencies.
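    A SARIMA forecast of a monthly caseload like the one above can be sketched with statsmodels. The series below is synthetic (trend plus annual seasonality), and the order shown is one plausible reading of the "(0,1,0,12)" notation in the abstract (a first difference plus a seasonal difference at lag 12), not the study's fitted model.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(2)
    # Synthetic monthly caseload, Jan 2009 - Dec 2015 (84 months)
    idx = pd.date_range("2009-01-01", periods=84, freq="MS")
    trend = np.linspace(480, 550, 84)                  # rising demand
    season = 25 * np.sin(2 * np.pi * idx.month / 12)   # annual cycle
    cases = pd.Series(trend + season + rng.normal(0, 10, 84), index=idx)

    # Parsimonious SARIMA: first difference + seasonal difference at lag 12
    model = SARIMAX(cases, order=(0, 1, 0),
                    seasonal_order=(0, 1, 0, 12)).fit(disp=False)
    forecast = model.forecast(steps=24)  # extend through the end of 2017
    print(f"Dec 2017 forecast: {forecast.iloc[-1]:.0f}")
    ```

    In practice several candidate (p,d,q)(P,D,Q,s) orders would be compared, as in the study, and the one with the lowest out-of-sample error (e.g. MAPE) retained.
    
    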

  13. Using SMAP Data to Investigate the Role of Soil Moisture Variability on Realtime Flood Forecasting

    NASA Astrophysics Data System (ADS)

    Krajewski, W. F.; Jadidoleslam, N.; Mantilla, R.

    2017-12-01

    The Iowa Flood Center has developed a regional high-resolution flood-forecasting model for the state of Iowa that decomposes the landscape into hillslopes of about 0.1 km2. For the model to benefit, through data assimilation, from SMAP observations of soil moisture (SM) at scales of approximately 100 km2, we are testing a framework to connect SMAP-scale observations to the small-scale SM variability calculated by our rainfall-runoff models. As a step in this direction, we performed data analyses of 15-min point SM observations using a network of about 30 TDR instruments spread throughout the state. We developed a stochastic point-scale SM model that captures 1) SM increases due to rainfall inputs, and 2) SM decay during dry periods. We use a power law model to describe soil moisture decay during dry periods, and a single parameter logistic curve to describe precipitation feedback on soil moisture. We find that the parameters of the models behave as time-independent random variables with stationary distributions. Using data-based simulation, we explore differences in the dynamical range of variability of hillslope and SMAP-scale domains. The simulations allow us to predict the runoff field and streamflow hydrographs for the state of Iowa during the three largest flooding periods (2008, 2014, and 2016). We also use the results to determine the reduction in forecast uncertainty from assimilation of unbiased SMAP-scale soil moisture observations.
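    The two behaviors the abstract describes, power-law soil moisture decay during dry periods and a bounded, logistic-style response to rainfall, can be sketched as a point-scale update rule. All parameter values and the function below are illustrative assumptions, not the distributions fitted from the TDR network.

    ```python
    import numpy as np

    def soil_moisture_step(theta, rain_mm, dt_hr, t_dry_hr,
                           a=0.01, b=0.8, k=0.02, theta_max=0.45):
        """One 15-min update of a point-scale soil moisture model:
        power-law dry-down and a logistic-damped rainfall response
        (illustrative parameters, not those fitted in the study)."""
        if rain_mm > 0:
            # Logistic feedback: infiltration efficiency drops near saturation
            gain = k * rain_mm / (1 + np.exp(12 * (theta / theta_max - 0.8)))
            return min(theta + gain, theta_max)
        # Power-law decay during dry periods: loss ~ a * t_dry^(-b) per hour
        return max(theta - a * max(t_dry_hr, 0.25) ** (-b) * dt_hr, 0.0)

    theta = 0.30
    for step in range(8):  # two hours of dry-down in 15-min steps
        theta = soil_moisture_step(theta, 0.0, 0.25, 0.25 * (step + 1))
    theta_wet = soil_moisture_step(theta, 5.0, 0.25, 0.0)
    print(f"after dry-down {theta:.3f}, after 5 mm rain {theta_wet:.3f}")
    ```

    Treating parameters like `a`, `b`, and `k` as random variables with stationary distributions, as the abstract describes, is what allows simulation of the sub-grid variability within a SMAP-scale pixel.
    
    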

  14. Potential assessment of the "support vector machine" method in forecasting ambient air pollutant trends.

    PubMed

    Lu, Wei-Zhen; Wang, Wen-Jian

    2005-04-01

    Monitoring and forecasting of air quality parameters are popular and important topics of atmospheric and environmental research today, due to the health impact of exposure to the air pollutants present in urban air. Accurate models for air pollutant prediction are needed because such models allow forecasting and diagnosing potential compliance or non-compliance in both the short and the long term. Artificial neural networks (ANN) are regarded as a reliable and cost-effective method for such tasks and have produced some promising results to date. Although ANN has attracted growing attention from environmental researchers, its inherent drawbacks, e.g., local minima, over-fitting during training, poor generalization performance, and the need to determine the appropriate network architecture, impede its practical application. The support vector machine (SVM), a novel type of learning machine based on statistical learning theory, can be used for regression and time series prediction and has been reported to perform well, with some promising results. The work presented in this paper examines the feasibility of applying SVM to predict air pollutant levels in advancing time series, based on the monitored air pollutant database for the Hong Kong downtown area. At the same time, the functional characteristics of SVM are investigated. Experimental comparisons between the SVM model and the classical radial basis function (RBF) network demonstrate that SVM is superior to the conventional RBF network in predicting air quality parameters with different time series, and offers better generalization performance than the RBF model.
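    SVM regression on a lag-embedded time series, the general setup this abstract describes, can be sketched as below. The pollutant series is synthetic and the hyperparameters are illustrative; the paper's actual data and tuning are not reproduced here.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(3)
    # Synthetic hourly pollutant concentration: daily cycle plus noise
    t = np.arange(600)
    series = 50 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, 600)

    # Lag embedding: predict hour t from the previous 24 hours
    lags = 24
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    y = series[lags:]

    split = 500  # train on the first 500 windows, test on the rest
    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
    model.fit(X[:split], y[:split])

    pred = model.predict(X[split:])
    rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
    print(f"test RMSE = {rmse:.2f}")
    ```

    The comparison in the paper would fit an RBF network on the same lagged features and contrast its out-of-sample error with the SVR's.
    
    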

  15. Affective forecasting about hedonic loss and adaptation: Implications for damage awards.

    PubMed

    Greene, Edie; Sturm, Kristin A; Evelo, Andrew J

    2016-06-01

    In tort lawsuits, plaintiffs may seek damages for loss of enjoyment of life, so-called hedonic loss, which occurred as a result of an accident or injury. In 2 studies, we examined how people judge others' adaptation and hedonic loss after an injury. Laypeople's forecasts of hedonic loss are relevant to concerns about whether jurors appropriately compensate plaintiffs. Longitudinal data of subjective well-being (e.g., Binder & Coad, 2013) show that hedonic loss is domain-specific: Many physical impairments (e.g., strokes) inflict less hedonic loss than many persistent yet invisible ailments (e.g., mental illness and conditions that cause chronic pain). We used vignette methodology to determine whether laypeople (n = 68 community members and 65 students in Study 1; 87 community members and 93 students in Study 2) and rehabilitation professionals (n = 47 in Study 2) were aware of this fact. In Study 1, participants' ratings of hedonic loss subsequent to a physical injury and a comparably severe psychological impairment did not differ. In Study 2, ratings of short- and long-term hedonic loss stemming from paraplegia and chronic back pain showed that neither laypeople nor professionals understood that hedonic loss is domain-specific. These findings imply that observers may forecast a future for people who suffered serious physical injuries as grimmer than it is likely to be, and a future for people who experience chronic pain and psychological disorders as rosier than is likely. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Confronting the demand and supply of snow seasonal forecasts for ski resorts : the case of French Alps

    NASA Astrophysics Data System (ADS)

    Dubois, Ghislain

    2017-04-01

    Alpine ski resorts are highly dependent on snow, whose availability is characterized by both high inter-annual variability and a gradual diminution due to climate change. Because of this dependency on climatic resources, the ski industry is increasingly affected by climate change: higher temperatures limit snowfall, increase melting, and limit the possibilities for technical snow-making. Therefore, since the seventies, managers have drastically improved their practices, both to adapt to climate change and to this inter-annual variability of snow conditions. Through slope preparation and maintenance, snow stock management, and artificial snow-making, a typical resort can keep approximately the same season duration with 30% less snow; the ski industry has become an activity of high technicity. The EUPORIAS FP7 project (www.euporias.eu) developed, between 2012 and 2016, a deep understanding of the supply and demand conditions for the provision of climate services disseminating seasonal forecasts. In particular, we developed a case study that allowed us to conduct several activities for a better understanding of the demand for, and the business model of, future services applied to the ski industry. The investigations conducted in France inventoried the existing tools and databases, assessed the decision-making processes and data needs of ski operators, and provided evidence that some discernible skill of seasonal forecasts exists. This case study formed the basis of the recently funded PROSNOW H2020 project. We will present the main results of the EUPORIAS project for the ski industry.

  17. Rural-metropolitan disparities in ovarian cancer survival: a statewide population-based study.

    PubMed

    Park, Jihye; Blackburn, Brenna E; Rowe, Kerry; Snyder, John; Wan, Yuan; Deshmukh, Vikrant; Newman, Michael; Fraser, Alison; Smith, Ken; Herget, Kim; Burt, Lindsay; Werner, Theresa; Gaffney, David K; Lopez, Ana Maria; Mooney, Kathi; Hashibe, Mia

    2018-06-01

    To investigate rural-metropolitan disparities in ovarian cancer survival, we assessed ovarian cancer mortality and differences in prognostic factors by rural-metropolitan residence. The Utah Population Database was used to identify ovarian cancer cases diagnosed between 1997 and 2012. Residential location information at the time of cancer diagnosis was used to stratify rural-metropolitan residence. All-cause death and ovarian cancer death risks were estimated using Cox proportional hazard regression models. Among 1661 patients diagnosed with ovarian cancer, 11.8% were living in rural counties of Utah. Although ovarian cancer patients residing in rural counties had different characteristics compared with metropolitan residents, we did not observe an association between rural residence and the risk of all-cause or ovarian cancer-specific death after adjusting for confounders. However, among rural residents, ovarian cancer mortality risk was very high for older age at diagnosis and for mucinous carcinoma, and low for those overweight at baseline. Rural residence was not significantly associated with the risk of ovarian cancer death. Nevertheless, patients residing in rural versus metropolitan areas had different factors affecting the risk of all-cause mortality and cancer-specific death. Further research is needed to quantify how mortality risk can differ by residential location, accounting for degree of health care access and lifestyle-related factors. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. How do community-dwelling LGBT people perceive sexuality in residential aged care? A systematic literature review.

    PubMed

    Mahieu, Lieslot; Cavolo, Alice; Gastmans, Chris

    2018-01-22

    To investigate what empirical studies report on the perceptions of community-dwelling LGBT adults regarding sexuality and sexual expression in residential aged care (RAC), and how their sexuality should be addressed in RAC. Relevant papers were identified through electronic searches in databases; and by reference tracking and citation tracking. Data were extracted using a standardized data extraction form and were compared, related, and synthesized using thematic analyses. We evaluated the methodological quality of the studies. Eighteen articles were identified. Three major topics emerged regarding sexuality in RAC: (1) factors affecting LGBT people's perceptions, subdivided into (a) discrimination, (b) loss of sexual identity, (c) failure to acknowledge the same-sex partner, and (d) lack of privacy; (2) LGBT-specific RAC facilities; and (3) characteristics of LGBT friendly RAC facilities and caregivers. LGBT people have clear perceptions about how sexuality and sexual expression is or should be managed in RAC. Despite the general increase in acceptance of sexual minorities, many community-dwelling LGBT people believe older LGBT residents are discriminated against because of their sexual orientation or gender identity. Taking into account these opinions is crucial for increasing accessibility of RAC to LGBT people and to ensure the quality of the provided care.

  19. Investigation on the impacts of low-sulfur fuel used in residential heating and oil-fired power plants on PM2.5-concentrations and its composition in Fairbanks, Alaska

    NASA Astrophysics Data System (ADS)

    Leelasakultum, Ketsiri

    The effects of using low-sulfur fuel for oil-heating and oil-burning facilities on the PM2.5-concentrations at breathing level in an Alaska city surrounded by vast forested areas were examined with the Weather Research and Forecasting model coupled with chemistry packages that were modified for the subarctic. Simulations were performed in forecast mode for a cold season using the National Emission Inventory 2008 and alternatively emissions that represent the use of low-sulfur fuel for oil-heating and oil-burning facilities while keeping the emissions of other sources the same as in the reference simulation. The simulations suggest that introducing low-sulfur fuel would decrease the monthly mean 24h-averaged PM2.5-concentrations over the city's PM2.5-nonattainment area by 4%, 9%, 8%, 6%, 5% and 7% in October, November, December, January, February and March, respectively. The quarterly mean relative response factors for PM2.5-concentrations of 0.96 indicate that, with a design value of 44.7 µg/m³, introducing low-sulfur fuel would lead to a new design value of 42.9 µg/m³ that still exceeds the US National Ambient Air Quality Standard of 35 µg/m³. The magnitude of the relation between the relative response of sulfate and nitrate changes differs with temperature. The simulations suggest that in the city, PM2.5-concentrations would decrease more on days with low atmospheric boundary layer heights, low hydrometeor mixing ratio, low downward shortwave radiation and low temperatures. Furthermore, a literature review of other emission control measure studies is given, and recommendations for future studies are made based on the findings.

  20. Reducing residential solid fuel combustion through electrified space heating leads to substantial air quality, health and climate benefits in China's Beijing-Tianjin-Hebei region

    NASA Astrophysics Data System (ADS)

    Yang, J.; Mauzerall, D. L.

    2017-12-01

    During periods of high pollution in winter, household space heating can contribute more than half of PM2.5 concentrations in China's Beijing-Tianjin-Hebei (BTH) region. The majority of rural households and some urban households in the region still heat with small stoves and solid fuels such as raw coal, coal briquettes and biomass. Thus, reducing emissions from residential space heating has become a top priority of the Chinese government's air pollution mitigation plan. Electrified space heating is a promising alternative to solid fuel. However, there is little analysis of the air quality and climate implications of choosing various electrified heating devices and utilizing different electricity sources. Here we conduct an integrated assessment of the air quality, human health and climate implications of various electrified heating scenarios in the BTH region using the Weather Research and Forecasting model with Chemistry. We use the Multi-resolution Emission Inventory for China for the year 2012 as our base case and design two electrification scenarios in which either direct resistance heaters or air source heat pumps are installed to replace all household heating stoves. We initially assume all electrified heating devices use electricity from supercritical coal-fired power plants. We find that installing air source heat pumps reduces CO2 emissions and premature deaths due to PM2.5 pollution more than resistance heaters, relative to the base case. The increased health and climate benefits of heat pumps occur because they have a higher heat conversion efficiency and thus require less electricity for space heating than resistance heaters. We also find that, with the same heat pump installation, a hybrid electricity source (40% of the electricity generated from renewable sources and the rest from coal) further reduces both CO2 emissions and premature deaths compared with using electricity only from coal. 
Our study demonstrates the air pollution and CO2 mitigation potential and public health benefits of using electrified space heating. In particular, we find air source heat pumps could bring more climate and health benefits than direct resistance heaters. Our results also support policies to integrate renewable energy sources with the reduction of solid fuel combustion for residential space heating.
