Ishii, Audrey L.; Soong, David T.; Sharpe, Jennifer B.
2010-01-01
Illinois StreamStats (ILSS) is a Web-based application for computing selected basin characteristics and flood-peak quantiles, based on the regional flood-frequency equations most recently published as of 2010 (Soong and others, 2004), at any rural stream location in Illinois. Limited streamflow statistics, including general statistics, flow durations, and base flows, also are available for U.S. Geological Survey (USGS) streamflow-gaging stations. ILSS can be accessed on the Web at http://streamstats.usgs.gov/ by selecting the State Applications hyperlink and choosing Illinois from the pull-down menu. ILSS was implemented for Illinois by obtaining and projecting ancillary geographic information system (GIS) coverages; populating the StreamStats database with streamflow-gaging station data; hydroprocessing the 30-meter digital elevation model (DEM) for Illinois to conform to streams represented in the National Hydrography Dataset 1:100,000 stream coverage; and customizing the Web-based Extensible Markup Language (XML) programs for computing basin characteristics for Illinois. The basin characteristics computed by ILSS then were compared to the basin characteristics used in the published study, and adjustments were applied to the XML algorithms for slope and basin length. Testing of ILSS was accomplished by comparing flood quantiles computed by ILSS at an approximately random sample of 170 streamflow-gaging stations with the published flood quantile estimates. Differences between the log-transformed flood quantiles were not statistically significant at the 95-percent confidence level for the State as a whole, nor by the regions determined by each equation, except for region 1, in the northwest corner of the State. In region 1, the average difference in flood quantile estimates ranged from 3.76 percent for the 2-year flood quantile to 4.27 percent for the 500-year flood quantile. The total number of stations in region 1 was small (21), and the mean difference was not large (less than one-tenth of the average prediction error for the regression-equation estimates). The sensitivity of the flood-quantile estimates to differences in the computed basin characteristics is determined and presented in tables. A test of usage consistency was conducted by having at least 7 new users compute flood quantile estimates at 27 locations. The average maximum deviation of the estimate from the mode value at each site was 1.31 percent after four mislocated sites were removed. A comparison of manual 100-year flood-quantile computations with ILSS at 34 sites indicated no statistically significant difference. ILSS appears to be an accurate, reliable, and effective tool for flood-quantile estimates.
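[Illustrative note] A minimal sketch of the kind of paired comparison described above: flood quantiles from two methods at the same stations are compared on the log scale with a paired t-test at the 95-percent level. All data values are hypothetical, not the ILSS results.

```python
# Sketch: compare log-transformed flood quantiles from two methods at the
# same stations, as in the ILSS verification. Data values are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
published_q100 = rng.lognormal(mean=6.0, sigma=0.8, size=170)  # published estimates
ilss_q100 = published_q100 * rng.lognormal(0.0, 0.05, size=170)  # second method

# Paired t-test on the log-transformed quantiles; a two-sided p-value above
# 0.05 means the difference is not significant at the 95-percent level.
t_stat, p_value = stats.ttest_rel(np.log10(ilss_q100), np.log10(published_q100))
mean_pct_diff = 100 * np.mean(ilss_q100 / published_q100 - 1)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}, mean difference = {mean_pct_diff:.2f}%")
```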
NASA Astrophysics Data System (ADS)
Formetta, Giuseppe; Bell, Victoria; Stewart, Elizabeth
2018-02-01
Regional flood frequency analysis is one of the most commonly applied methods for estimating extreme flood events at ungauged sites or locations with short measurement records. It is based on (i) the definition of a homogeneous group (pooling group) of catchments and (ii) the use of the pooling-group data to estimate flood quantiles. Although many methods to define a pooling group (pooling schemes, PS) are based on catchment physiographic similarity measures, in the last decade methods based on flood seasonality similarity have been proposed. In this paper, two seasonality-based PS are proposed and tested, both in terms of the homogeneity of the pooling groups they generate and in terms of the accuracy of the resulting extreme flood estimates. The method has been applied to 420 catchments in Great Britain (treated as both gauged and ungauged) and compared against the current Flood Estimation Handbook (FEH) PS. Results for gauged sites show that, compared to the current PS, the seasonality-based PS performs better both in terms of the homogeneity of the pooling group and in terms of the accuracy of flood quantile estimates. For ungauged locations, a national-scale hydrological model has been used for the first time to quantify flood seasonality. Results show that in 75% of the tested locations the seasonality-based PS improves the accuracy of the flood quantile estimates; the remaining 25% were located in highly urbanized, groundwater-dependent catchments. The promising results support the aspiration that large-scale hydrological models can complement traditional methods for estimating design floods.
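[Illustrative note] A minimal sketch of a flood-seasonality similarity measure of the kind used to build pooling groups: each site is summarized by Burn-type circular statistics of its annual-maximum dates, and the pooling group is formed from the nearest sites in that seasonality space. The data and group size are synthetic assumptions, not the scheme of the cited paper.

```python
# Seasonality-based pooling sketch with synthetic annual-maximum dates.
import numpy as np

def seasonality_coords(days_of_max, year_len=365.25):
    """Mean-date vector (x, y) of annual-maximum dates; the vector length
    measures how concentrated the flood season is."""
    theta = 2 * np.pi * np.asarray(days_of_max) / year_len
    return np.cos(theta).mean(), np.sin(theta).mean()

rng = np.random.default_rng(1)
# Hypothetical day-of-year of annual maxima for 20 sites x 30 years
sites = [rng.normal(loc=rng.uniform(0, 365), scale=30, size=30) % 365
         for _ in range(20)]
coords = np.array([seasonality_coords(s) for s in sites])

# Pooling group for site 0: the k sites nearest in seasonality space
k = 7
d = np.linalg.norm(coords - coords[0], axis=1)
pooling_group = np.argsort(d)[1:k + 1]
print("pooling group for site 0:", pooling_group)
```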
Confidence intervals for expected moments algorithm flood quantile estimates
Cohn, Timothy A.; Lane, William L.; Stedinger, Jery R.
2001-01-01
Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient “weighting” procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed‐form method has been available for quantifying the uncertainty of EMA‐based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood‐quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25‐ to 100‐year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.
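[Illustrative note] A minimal sketch of the kind of Monte Carlo experiment used to check confidence-interval coverage for a flood quantile estimator. For tractability it uses a lognormal parent and a moments estimator in log space rather than EMA with the LP3 distribution; the variance formula is the standard asymptotic normal-theory one, not the paper's EMA expressions.

```python
# Monte Carlo coverage check for quantile confidence intervals (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, sigma, n, p = 6.0, 0.6, 50, 0.99           # parent, record length, AEP 0.01
true_q = mu + stats.norm.ppf(p) * sigma        # true 100-year quantile (log space)
z_p, z_ci = stats.norm.ppf(p), stats.norm.ppf(0.975)

covered, n_sim = 0, 20_000
for _ in range(n_sim):
    x = rng.normal(mu, sigma, n)
    q_hat = x.mean() + z_p * x.std(ddof=1)
    # Asymptotic variance of mu_hat + z_p * sigma_hat for normal data
    var_q = x.var(ddof=1) / n * (1 + z_p**2 / 2)
    lo, hi = q_hat - z_ci * np.sqrt(var_q), q_hat + z_ci * np.sqrt(var_q)
    covered += lo <= true_q <= hi
print(f"empirical coverage of nominal 95% CI: {covered / n_sim:.3f}")
```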
A hierarchical Bayesian GEV model for improving local and regional flood quantile estimates
NASA Astrophysics Data System (ADS)
Lima, Carlos H. R.; Lall, Upmanu; Troy, Tara; Devineni, Naresh
2016-10-01
We estimate local and regional Generalized Extreme Value (GEV) distribution parameters for flood frequency analysis in a multilevel, hierarchical Bayesian framework, to explicitly model and reduce uncertainties. As prior information for the model, we assume that the GEV location and scale parameters for each site come from independent log-normal distributions, whose mean parameter scales with the drainage area. From empirical and theoretical arguments, the shape parameter for each site is shrunk towards a common mean. Non-informative prior distributions are assumed for the hyperparameters and the MCMC method is used to sample from the joint posterior distribution. The model is tested using annual maximum series from 20 streamflow gauges located in an 83,000 km2 flood prone basin in Southeast Brazil. The results show a significant reduction in the uncertainty of flood quantile estimates over the traditional GEV model, particularly for sites with shorter records. For return periods within the range of the data (around 50 years), the Bayesian credible intervals for the flood quantiles tend to be narrower than the classical confidence limits based on the delta method. As the return period increases beyond the range of the data, the confidence limits from the delta method become unreliable and the Bayesian credible intervals provide a way to estimate satisfactory confidence bands for the flood quantiles considering parameter uncertainties and regional information. In order to evaluate the applicability of the proposed hierarchical Bayesian model for regional flood frequency analysis, we estimate flood quantiles for three randomly chosen out-of-sample sites and compare with classical estimates using the index flood method. The posterior distributions of the scaling law coefficients are used to define the predictive distributions of the GEV location and scale parameters for the out-of-sample sites given only their drainage areas and the posterior distribution of the average shape parameter is taken as the regional predictive distribution for this parameter. While the index flood method does not provide a straightforward way to consider the uncertainties in the index flood and in the regional parameters, the results obtained here show that the proposed Bayesian method is able to produce adequate credible intervals for flood quantiles that are in accordance with empirical estimates.
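[Illustrative note] A minimal single-site sketch of the Bayesian idea above: a GEV fitted by random-walk Metropolis, with the shape parameter shrunk toward a regional mean through its prior. Priors, tuning constants, and the regional shape value are illustrative assumptions, not those of the cited study.

```python
# Metropolis sampling of GEV parameters with a shrinkage prior on the shape.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
data = stats.genextreme.rvs(c=-0.1, loc=1000, scale=300, size=40, random_state=rng)
xi_regional, xi_sd = 0.1, 0.1          # assumed regional shape prior

def log_post(theta):
    mu, log_sig, xi = theta
    # scipy's genextreme uses c = -xi relative to the hydrological convention
    ll = stats.genextreme.logpdf(data, c=-xi, loc=mu, scale=np.exp(log_sig)).sum()
    lp = stats.norm.logpdf(xi, xi_regional, xi_sd)   # shrinkage prior on shape
    return ll + lp if np.isfinite(ll) else -np.inf

theta = np.array([data.mean(), np.log(data.std()), 0.0])
step = np.array([30.0, 0.05, 0.03])
samples = []
for i in range(20_000):
    prop = theta + step * rng.standard_normal(3)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    if i > 5_000:                       # discard burn-in
        samples.append(theta.copy())

samples = np.array(samples)
# Posterior of the 100-year flood and a 95% credible interval
q100 = stats.genextreme.ppf(0.99, c=-samples[:, 2],
                            loc=samples[:, 0], scale=np.exp(samples[:, 1]))
print(np.percentile(q100, [2.5, 50, 97.5]))
```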
NASA Astrophysics Data System (ADS)
Costa, Veber; Fernandes, Wilson
2017-11-01
Extreme flood estimation has been a key research topic in hydrological sciences. Reliable estimates of such events are necessary as structures for flood conveyance are continuously evolving in size and complexity and, as a result, their failure-associated hazards become more and more pronounced. Due to this fact, several estimation techniques intended to improve flood frequency analysis and reduce uncertainty in extreme quantile estimation have been addressed in the literature in the last decades. In this paper, we develop a Bayesian framework for the indirect estimation of extreme flood quantiles from rainfall-runoff models. In the proposed approach, an ensemble of long daily rainfall series is simulated with a stochastic generator, which models extreme rainfall amounts with an upper-bounded distribution function, namely, the 4-parameter lognormal model. The rationale behind the generation model is that physical limits for rainfall amounts, and consequently for floods, exist and, by imposing an appropriate upper bound on the probabilistic model, more plausible estimates can be obtained for those rainfall quantiles with very low exceedance probabilities. Daily rainfall time series are converted into streamflows by routing each realization of the synthetic ensemble through a conceptual hydrologic model, the Rio Grande rainfall-runoff model. Calibration of parameters is performed through a nonlinear regression model, by means of the specification of a statistical model for the residuals that is able to accommodate autocorrelation, heteroscedasticity and nonnormality. By combining the outlined steps in a Bayesian structure of analysis, one is able to properly summarize the resulting uncertainty and estimate more accurate credible intervals for a set of flood quantiles of interest. The method for indirect extreme flood estimation was applied to the American River catchment, at Folsom Dam, in the state of California, USA. Results show that most floods, including exceptionally large non-systematic events, were reasonably estimated with the proposed approach. In addition, by accounting for uncertainties in each modeling step, one is able to obtain a better understanding of the influential factors in large flood formation dynamics.
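[Illustrative note] A minimal sketch of generating extreme rainfall from an upper-bounded model. As a simple stand-in for the study's 4-parameter lognormal, the gap (b - x) below an assumed physical upper bound b is taken lognormal, which guarantees x < b; all parameter values are illustrative.

```python
# Upper-bounded stochastic rainfall generation (simplified stand-in).
import numpy as np

rng = np.random.default_rng(3)
b = 500.0                          # assumed physical upper bound on daily rain (mm)
mu, sigma = np.log(450.0), 0.35    # lognormal parameters of the gap below b

n_years, events_per_year = 10_000, 60
annual_max = np.empty(n_years)
for y in range(n_years):
    # depth = b - lognormal gap; clip at zero for occasional large gaps
    depths = np.clip(b - rng.lognormal(mu, sigma, size=events_per_year), 0.0, None)
    annual_max[y] = depths.max()   # bounded above by b by construction

# Even very rare quantiles cannot exceed the imposed bound
print("99.99th percentile of annual maxima:", np.percentile(annual_max, 99.99))
print("upper bound b:", b)
```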
Flood frequency analysis - the challenge of using historical data
NASA Astrophysics Data System (ADS)
Engeland, Kolbjorn
2015-04-01
Estimates of high flood quantiles are needed for many applications; e.g., dam safety assessments are based on the 1000-year flood, whereas the dimensioning of important infrastructure requires estimates of the 200-year flood. The flood quantiles are estimated by fitting a parametric distribution to a dataset of high flows comprising either annual maximum values or peaks over a selected threshold. Since the record length is short compared to the return period of the desired flood quantile, the estimated flood magnitudes are based on a high degree of extrapolation. For example, the longest time series available in Norway are around 120 years, so any estimate of a 1000-year flood requires extrapolation. One solution is to extend the temporal dimension of a data series by including information about historical floods that occurred before streamflow was systematically gauged. Such information can be flood marks or written documentation about flood events. The aim of this study was to evaluate the added value of using historical flood data for at-site flood frequency estimation. The historical floods were included in two ways, by assuming that (1) the size of (all) floods above a high threshold within a time interval is known, or (2) only the number of floods above a high threshold for a time interval is known. We used a Bayesian model formulation, with MCMC used for model estimation. This estimation procedure allowed us to estimate the predictive uncertainty of flood quantiles (i.e., both sampling and parameter uncertainty are accounted for). We tested the methods using 123 years of systematic data from Bulken in western Norway. In 2014 the largest flood in the systematic record was observed. From written documentation and flood marks we had information on three severe floods in the 18th century that likely exceeded the 2014 flood. We evaluated the added value in two ways. First we used the 123-year-long streamflow time series and investigated the effect of having several shorter series that could be supplemented with a limited number of known large flood events. Then we used the three historical floods from the 18th century combined with the whole record and subsets of the 123 years of systematic observations. In the latter case several challenges were identified: (i) the difficulty of converting water levels to streamflows, owing to man-made changes in the river profile; and (ii) the stationarity of the data might be questioned, since the three largest historical floods occurred during the "Little Ice Age", under climatic conditions different from today's.
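[Illustrative note] A minimal sketch of the two ways of including historical floods described above, written for a Gumbel model fitted by maximum likelihood: an M-year historical period with perception threshold X0, where case (1) uses the known sizes of the exceedances and case (2) only their number k. Data, threshold, and the Gumbel choice are assumptions for illustration.

```python
# Likelihoods combining systematic and historical flood information (sketch).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(11)
sys_data = stats.gumbel_r.rvs(loc=500, scale=150, size=50, random_state=rng)
X0, M = 1200.0, 150                                # threshold, historical years
hist_floods = np.array([1250.0, 1400.0, 1600.0])   # case (1): known magnitudes
k = len(hist_floods)                               # case (2): only the count

def nll_case1(theta):
    loc, scale = theta
    if scale <= 0:
        return np.inf
    ll = stats.gumbel_r.logpdf(sys_data, loc, scale).sum()
    ll += stats.gumbel_r.logpdf(hist_floods, loc, scale).sum()
    ll += (M - k) * stats.gumbel_r.logcdf(X0, loc, scale)  # years below threshold
    return -ll

def nll_case2(theta):
    loc, scale = theta
    if scale <= 0:
        return np.inf
    F0 = stats.gumbel_r.cdf(X0, loc, scale)
    ll = stats.gumbel_r.logpdf(sys_data, loc, scale).sum()
    ll += (M - k) * np.log(F0) + k * np.log1p(-F0)         # binomial exceedance term
    return -ll

for nll in (nll_case1, nll_case2):
    res = optimize.minimize(nll, x0=[500.0, 150.0], method="Nelder-Mead")
    loc, scale = res.x
    print(nll.__name__, "Q1000 =", stats.gumbel_r.ppf(1 - 1 / 1000, loc, scale))
```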
Flood quantile estimation at ungauged sites by Bayesian networks
NASA Astrophysics Data System (ADS)
Mediero, L.; Santillán, D.; Garrote, L.
2012-04-01
Estimating flood quantiles at a site for which no observed measurements are available is essential for water resources planning and management. Ungauged sites have no observations of the magnitude of floods, but some site and basin characteristics are known. The most common technique used is multiple regression analysis, which relates physical and climatic basin characteristics to flood quantiles. Regression equations are fitted from flood frequency data and basin characteristics at gauged sites. Regression equations are a rigid technique that assumes linear relationships between variables and cannot take measurement errors into account. In addition, the prediction intervals are estimated in a very simplistic way from the variance of the residuals of the estimated model. Bayesian networks are a probabilistic computational structure taken from the field of Artificial Intelligence, which has been widely and successfully applied in many scientific fields like medicine and informatics, but application to the field of hydrology is recent. Bayesian networks infer the joint probability distribution of several related variables from observations through nodes, which represent random variables, and links, which represent causal dependencies between them. A Bayesian network is more flexible than regression equations, as it captures non-linear relationships between variables. In addition, the probabilistic nature of Bayesian networks allows the different sources of estimation uncertainty to be taken into account, as they give a probability distribution as the result. A homogeneous region in the Tagus Basin was selected as the case study. A regression equation was fitted taking the basin area, the annual maximum 24-hour rainfall for a given recurrence interval, and the mean height as explanatory variables. Flood quantiles at ungauged sites were estimated by Bayesian networks. Bayesian networks need to be learnt from a sufficiently large data set. As observational data are scarce, a stochastic generator of synthetic data was developed. Synthetic basin characteristics were randomised, keeping the statistical properties of observed physical and climatic variables in the homogeneous region. The synthetic flood quantiles were stochastically generated taking the regression equation as a basis. The learnt Bayesian network was validated by the reliability diagram, the Brier score and the ROC diagram, which are common measures used in the validation of probabilistic forecasts. Summarising, flood quantile estimation through Bayesian networks supplies information about the prediction uncertainty, as a probability distribution of discharges is given as the result. Therefore, the Bayesian network model has application as decision support for water resources planning and management.
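[Illustrative note] A crude sketch of the workflow above: synthetic basins are generated around a fitted regression, and the flood quantile at an "ungauged" site is then expressed as a conditional distribution given its characteristics. A real Bayesian network would learn the joint distribution; here simple conditioning on binned predictors stands in for it, and all numbers are hypothetical.

```python
# Synthetic-data generator plus conditional-distribution lookup (stand-in
# for a learned Bayesian network).
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
area = rng.lognormal(5.0, 1.0, n)            # basin area, km^2
p24 = rng.lognormal(4.5, 0.3, n)             # design 24-h rainfall, mm
# Hypothetical regression with multiplicative error, mimicking the generator
q10 = np.exp(0.8 * np.log(area) + 1.2 * np.log(p24) - 4.0
             + rng.normal(0, 0.3, n))

# "Ungauged" site: condition on the bin containing its characteristics
target_area, target_p24 = 250.0, 95.0
mask = (np.abs(np.log(area / target_area)) < 0.1) & \
       (np.abs(np.log(p24 / target_p24)) < 0.1)
posterior = q10[mask]
print("samples in bin:", mask.sum())
print("Q10 median and 90% band:", np.percentile(posterior, [5, 50, 95]))
```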
NASA Astrophysics Data System (ADS)
Lima, C. H.; Lall, U.
2010-12-01
Flood frequency analysis most often relies on stationarity assumptions, where distribution moments (e.g., mean, standard deviation) and associated flood quantiles do not change over time. In this sense, one expects that flood magnitudes and their frequency of occurrence will remain as observed in the historical record. However, evidence of inter-annual and decadal climate variability and anthropogenic change, as well as an apparent increase in the number and magnitude of flood events across the globe, have made the stationarity assumption questionable. Here, we show how to estimate flood quantiles (e.g., the 100-year flood) at ungauged basins without assuming stationarity. A statistical model based on the well-known flow-area scaling law is proposed to estimate flood flows at ungauged basins. The slope and intercept scaling-law coefficients are assumed to be time varying, and a hierarchical Bayesian model is used to include climate information and reduce parameter uncertainties. Cross-validated results from 34 streamflow gauges located in a nested basin in Brazil show that the proposed model is able to estimate flood quantiles at ungauged basins with remarkable skill compared with data-based estimates using the full record. The model as developed in this work is also able to simulate sequences of flood flows under global climate change, provided an appropriate climate index derived from a General Circulation Model is used as a predictor. The time-varying flood frequency estimates can be used for pricing flood insurance, in a forecast mode for flood preparedness, and for the timing and siting of infrastructure investments. [Figure: non-stationary 95% interval estimates for the 100-year flood compared with the 95% interval estimated from data, and the average distribution of the 100-year flood.]
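[Illustrative note] A minimal sketch of a flow-area scaling law with a time-varying intercept driven by a climate index, log Q = a + b*CI_t + theta*log A. It is fitted here by ordinary least squares for clarity; the cited work uses a hierarchical Bayesian fit, and all data are synthetic.

```python
# Scaling-law regression with a climate-index covariate (sketch).
import numpy as np

rng = np.random.default_rng(9)
n_sites, n_years = 34, 40
area = rng.lognormal(6, 1, n_sites)                  # drainage areas, km^2
ci = rng.standard_normal(n_years)                    # climate index by year
a, b, theta = -1.0, 0.3, 0.75                        # "true" coefficients

logA = np.tile(np.log(area), n_years)
CI = np.repeat(ci, n_sites)
logq = a + b * CI + theta * logA + rng.normal(0, 0.3, n_sites * n_years)

X = np.column_stack([np.ones_like(logq), CI, logA])
coef, *_ = np.linalg.lstsq(X, logq, rcond=None)
print("estimated (a, b, theta):", coef)

# Flood estimate at an ungauged basin of 500 km^2 in a year with CI = 1.5
print("Q estimate:", np.exp(coef[0] + coef[1] * 1.5 + coef[2] * np.log(500.0)))
```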
A Study on Regional Rainfall Frequency Analysis for Flood Simulation Scenarios
NASA Astrophysics Data System (ADS)
Jung, Younghun; Ahn, Hyunjun; Joo, Kyungwon; Heo, Jun-Haeng
2014-05-01
Recently, climate change has been observed in Korea as well as worldwide. Rainstorms have gradually increased, and the resulting damage has grown. Managing flood control facilities is therefore increasingly important as the frequency and magnitude of severe rainstorms increase. For managing flood control facilities in at-risk regions, data sets such as elevation, gradient, channel, land use, and soil data should be compiled. Using this information, disaster situations can be simulated to secure evacuation routes for various rainfall scenarios. The aim of this study is to investigate and determine extreme rainfall quantile estimates in Uijeongbu City using the index flood method with L-moments parameter estimation. Regional frequency analysis trades space for time by using annual maximum rainfall data from nearby or similar sites to derive estimates for any given site in a homogeneous region. Regional frequency analysis based on pooled data is recommended for estimating rainfall quantiles at sites with record lengths less than 5T, where T is the return period of interest. Many variables relevant to precipitation can be used for grouping a region in regional frequency analysis. For regionalization of the Han River basin, the k-means method is applied for grouping regions by meteorological and geomorphological variables. The results from the k-means method are compared for each region using various probability distributions. In the final step of the regionalization analysis, a goodness-of-fit measure is used to evaluate the accuracy of a set of candidate distributions, and rainfall quantiles are obtained by the index flood method based on the appropriate distribution. These rainfall quantiles, based on various scenarios, are then used as input data for disaster simulations. Keywords: Regional Frequency Analysis; Scenarios of Rainfall Quantile. Acknowledgements: This research was supported by a grant, 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-12-NH-57], from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
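[Illustrative note] A minimal sketch of the two building blocks named above: sample L-moments for a site (via unbiased probability-weighted moments) and k-means grouping of sites by attributes. The attribute values are synthetic, and the cluster count is an assumption.

```python
# Sample L-moments and k-means regionalization (sketch).
import numpy as np
from sklearn.cluster import KMeans

def sample_lmoments(x):
    """First three L-moments and L-skewness t3 from unbiased PWMs."""
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2

rng = np.random.default_rng(2)
print(sample_lmoments(rng.gumbel(100, 30, size=40)))

# Group sites into candidate regions by meteorological/geomorphological
# attributes (here: three standardized synthetic attributes per site).
attrs = rng.normal(size=(30, 3))
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    (attrs - attrs.mean(0)) / attrs.std(0))
print("region labels:", labels)
```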
Quantification of Uncertainty in the Flood Frequency Analysis
NASA Astrophysics Data System (ADS)
Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.
2017-12-01
Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, the selection of the distribution, and the estimation of distribution parameters, flood quantile estimation has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval, as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified for carrying out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (the Bow River at Calgary and at Banff, Canada) are used. A major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the extreme flood event that occurred in 2013. In addition, the efficacy of the proposed method was verified against standard bootstrap-based sampling approaches, and the proposed method was found to be more reliable for modeling extreme floods than the bootstrap methods.
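[Illustrative note] A minimal sketch of the bootstrap baseline mentioned above: resample the annual maxima, refit a distribution, and take percentiles of the resulting quantile ensemble as a prediction interval. A Gumbel fit stands in for whatever distribution the study used, and the data are synthetic.

```python
# Bootstrap ensemble of 100-year flood quantiles (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(13)
ams = stats.gumbel_r.rvs(loc=300, scale=90, size=60, random_state=rng)

n_boot, q100 = 2000, []
for _ in range(n_boot):
    resample = rng.choice(ams, size=len(ams), replace=True)
    loc, scale = stats.gumbel_r.fit(resample)
    q100.append(stats.gumbel_r.ppf(0.99, loc, scale))

lo, med, hi = np.percentile(q100, [2.5, 50, 97.5])
print(f"100-year flood: {med:.0f} (95% prediction interval {lo:.0f}-{hi:.0f})")
```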
Archfield, Stacey A.; Pugliese, Alessio; Castellarin, Attilio; Skøien, Jon O.; Kiang, Julie E.
2013-01-01
In the United States, estimation of flood frequency quantiles at ungauged locations has been largely based on regional regression techniques that relate measurable catchment descriptors to flood quantiles. More recently, spatial interpolation techniques of point data have been shown to be effective for predicting streamflow statistics (i.e., flood flows and low-flow indices) in ungauged catchments. The literature reports successful applications of two techniques, canonical kriging, CK (or physiographical-space-based interpolation, PSBI), and topological kriging, TK (or top-kriging). CK performs the spatial interpolation of the streamflow statistic of interest in the two-dimensional space of catchment descriptors. TK predicts the streamflow statistic along river networks taking both the catchment area and nested nature of catchments into account. It is of interest to understand how these spatial interpolation methods compare with generalized least squares (GLS) regression, one of the most common approaches to estimate flood quantiles at ungauged locations. By means of a leave-one-out cross-validation procedure, the performance of CK and TK was compared to GLS regression equations developed for the prediction of 10-, 50-, 100-, and 500-year floods for 61 streamgauges in the southeast United States. TK substantially outperforms GLS and CK for the study area, particularly for large catchments. The performance of TK over GLS highlights an important distinction between the treatments of spatial correlation when using regression-based or spatial interpolation methods to estimate flood quantiles at ungauged locations. The analysis also shows that coupling TK with CK slightly improves the performance of TK; however, the improvement is marginal when compared to the improvement in performance over GLS.
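[Illustrative note] A minimal sketch of the leave-one-out cross-validation used for such comparisons: each gauge is withheld in turn, the model is refitted on the remaining gauges, and the withheld quantile is predicted. A simple log-log regression on drainage area stands in for the GLS and kriging competitors, with synthetic data.

```python
# Leave-one-out cross-validation of a quantile-prediction model (sketch).
import numpy as np

rng = np.random.default_rng(21)
n = 61
log_area = rng.normal(6, 1, n)
log_q100 = 1.0 + 0.8 * log_area + rng.normal(0, 0.25, n)  # synthetic "observed"

errors = []
for i in range(n):
    keep = np.arange(n) != i
    X = np.column_stack([np.ones(keep.sum()), log_area[keep]])
    coef, *_ = np.linalg.lstsq(X, log_q100[keep], rcond=None)
    pred = coef[0] + coef[1] * log_area[i]
    errors.append(pred - log_q100[i])

print("LOO RMSE in log space:", np.sqrt(np.mean(np.square(errors))))
```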
NASA Astrophysics Data System (ADS)
Lee, J. Y.; Chae, B. S.; Wi, S.; Kim, T. W.
2017-12-01
Various climate change scenarios project that rainfall in South Korea will increase by 3-10% in the future. This increased rainfall will also significantly affect the future frequency of floods. This study analyzed future flood probabilities to investigate the stability of existing and newly installed hydraulic structures and the possibility of increasing flood damage in mid-sized watersheds in South Korea. To achieve this goal, we first clarified the relationship between flood quantiles obtained from flood-frequency analysis (FFA) and from design rainfall-runoff analysis (DRRA) in gauged watersheds. Then, after synthetically generating regional natural flow data according to RCP climate change scenarios, we developed mathematical formulas to estimate future flood quantiles in ungauged watersheds, based on the regression between DRRA and FFA incorporated with regional natural flows. Finally, we developed a flood risk map to investigate the change of flood risk, in terms of the return period, for the past, present, and future. The results indicate that future flood quantiles and risks would increase under the RCP climate change scenarios. Because regional flood risk was identified to increase in the future compared with the present status, comprehensive flood control will be needed to cope with future extreme floods.
NASA Astrophysics Data System (ADS)
Odry, Jean; Arnaud, Patrick
2016-04-01
The SHYREG method (Aubert et al., 2014) associates a stochastic rainfall generator and a rainfall-runoff model to produce rainfall and flood quantiles on a 1 km2 mesh covering the whole French territory. The rainfall generator is based on the description of rainy events by descriptive variables following probability distributions and is characterised by high stability. This stochastic generator is fully regionalised, and the rainfall-runoff transformation is calibrated with a single parameter. Thanks to the stability of the approach, calibration can be performed against only the flood quantiles associated with observed frequencies, which can be extracted from relatively short time series. The aggregation of SHYREG flood quantiles to the catchment scale is performed using a single areal reduction factor technique for the whole territory. Past studies demonstrated the accuracy of SHYREG flood quantile estimation for catchments where flow data are available (Arnaud et al., 2015). Nevertheless, the parameter of the rainfall-runoff model is calibrated independently for each target catchment. As a consequence, this parameter plays a corrective role and compensates for approximations and modelling errors, which makes it difficult to identify its proper spatial pattern. It is an inherent objective of the SHYREG approach to be completely regionalised in order to provide a complete and accurate flood quantile database throughout France. Consequently, it appears necessary to identify the model configuration in which the calibrated parameter can be regionalised with acceptable performance. Re-evaluation of some of the method's hypotheses is a necessary step before regionalisation. In particular, the inclusion or modification of the spatial variability of imposed parameters (such as production and transfer reservoir size, base flow addition, and the quantile aggregation function) should lead to more realistic values of the single calibrated parameter. The objective of the work presented here is to develop a SHYREG evaluation scheme focusing on both local and regional performance. Indeed, it is necessary to maintain the accuracy of at-site flood quantile estimation while identifying a configuration leading to a satisfactory spatial pattern of the calibrated parameter. This ability to be regionalised can be appraised by combining common regionalisation techniques with split-sample validation tests on a set of around 1,500 catchments representing the whole diversity of French physiography. Also, the presence of many nested catchments and a size-based split-sample validation make it possible to assess the relevance of the calibrated parameter's spatial structure inside the largest catchments. The application of this multi-objective evaluation leads to the selection of a version of SHYREG more suitable for regionalisation. References: Arnaud, P., Cantet, P., Aubert, Y., 2015. Relevance of an at-site flood frequency analysis method for extreme events based on stochastic simulation of hourly rainfall. Hydrological Sciences Journal, in press. DOI:10.1080/02626667.2014.965174. Aubert, Y., Arnaud, P., Ribstein, P., Fine, J.A., 2014. The SHYREG flow method - application to 1605 basins in metropolitan France. Hydrological Sciences Journal, 59(5): 993-1005. DOI:10.1080/02626667.2014.902061.
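[Illustrative note] A toy sketch of the simulation-based idea behind SHYREG: simulate many years of rainy events from a stochastic generator, transform each event to a flood peak with a simple one-parameter rainfall-runoff rule, and read quantiles off the empirical distribution of simulated annual maxima. The distributions and the runoff rule are placeholders, not the actual SHYREG components.

```python
# Stochastic rainfall generator + one-parameter runoff rule -> quantiles.
import numpy as np

rng = np.random.default_rng(17)
n_years = 50_000
lam, mean_depth = 8.0, 40.0      # events/year, mean event rainfall (mm)
runoff_coef = 0.35               # the single calibrated parameter, by analogy

annual_max = np.zeros(n_years)
for y in range(n_years):
    n_events = rng.poisson(lam)
    if n_events:
        depths = rng.exponential(mean_depth, n_events)
        peaks = runoff_coef * depths          # crude rainfall-to-peak rule
        annual_max[y] = peaks.max()

# Empirical flood quantiles from the simulated series
for T in (10, 100, 1000):
    print(f"Q{T}: {np.quantile(annual_max, 1 - 1 / T):.1f}")
```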
NASA Astrophysics Data System (ADS)
Nobert, Joel; Mugo, Margaret; Gadain, Hussein
Reliable estimation of flood magnitudes corresponding to required return periods, vital for structural design purposes, is hampered by the lack of hydrological data in the study area of the Lake Victoria Basin in Kenya. Use of regional information, derived from data at gauged sites and regionalized for use at any location within a homogeneous region, improves the reliability of design flood estimation. Therefore, the regional index flood method has been applied. Based on data from 14 gauged sites, the basin was delineated into two homogeneous regions using elevation variation (90-m DEM), the spatial annual rainfall pattern, and Principal Component Analysis of seasonal rainfall patterns (from 94 rainfall stations). At-site annual maximum series were modelled using the three-parameter log-normal (LN3), log-logistic (LLG), generalized extreme value (GEV), and log-Pearson type 3 (LP3) distributions. The parameters of the distributions were estimated using the method of probability weighted moments. Goodness-of-fit tests were applied, and the GEV was identified as the most appropriate model for each site. Based on the GEV model, flood quantiles were estimated and regional frequency curves derived from the averaged at-site growth curves. Using the least squares regression method, relationships were developed between the index flood, defined as the Mean Annual Flood (MAF), and catchment characteristics. The relationships indicated that area, mean annual rainfall, and altitude were the three significant variables that most influence the index flood. Thereafter, flood magnitudes in ungauged catchments within a homogeneous region were estimated from the derived equations for the index flood and quantiles from the regional curves. These estimates will improve flood risk estimation and support water management and engineering decisions and actions.
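[Illustrative note] A minimal sketch of the index-flood calculation described above: a regression gives the Mean Annual Flood (MAF) from catchment characteristics, and regional growth factors scale it to the required return periods. The coefficients and growth factors are illustrative, not those derived for the Lake Victoria Basin.

```python
# Index-flood estimation at an ungauged catchment (sketch).
import numpy as np

# Hypothetical fitted relation:
# ln(MAF) = b0 + b1*ln(area) + b2*ln(MAR) + b3*ln(altitude)
b0, b1, b2, b3 = -7.0, 0.85, 1.1, -0.2
area, mar, alt = 450.0, 1300.0, 1800.0          # km^2, mm/yr, m

maf = np.exp(b0 + b1 * np.log(area) + b2 * np.log(mar) + b3 * np.log(alt))

# Regional growth curve (quantile / MAF) from averaged at-site GEV curves
growth = {10: 1.9, 50: 2.9, 100: 3.4}
for T, g in growth.items():
    print(f"Q{T} = {maf * g:.1f} m^3/s")
```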
NASA Astrophysics Data System (ADS)
Yin, Yixing; Chen, Haishan; Xu, Chong-Yu; Xu, Wucheng; Chen, Changchun; Sun, Shanlei
2016-05-01
The regionalization methods, which "trade space for time" by pooling information from different locations in the frequency analysis, are efficient tools to enhance the reliability of extreme quantile estimates. This paper aims at improving the understanding of the regional frequency of extreme precipitation by using regionalization methods, and at providing scientific background and practical assistance in formulating regional development strategies for water resources management in one of the most developed and flood-prone regions in China, the Yangtze River Delta (YRD) region. To achieve these goals, the L-moment-based index-flood (LMIF) method, one of the most popular regionalization methods, is used in the regional frequency analysis of extreme precipitation, with special attention paid to inter-site dependence and its influence on the accuracy of quantile estimates, which has not been considered by most studies using the LMIF method. Extensive data screening for stationarity, serial dependence, and inter-site dependence was carried out first. The entire YRD region was then categorized into four homogeneous regions through cluster analysis and homogeneity analysis. Based on goodness-of-fit statistics and L-moment ratio diagrams, the generalized extreme-value (GEV) and generalized normal (GNO) distributions were identified as the best-fitted distributions for most of the sub-regions, and estimated quantiles for each region were obtained. Monte Carlo simulation was used to evaluate the accuracy of the quantile estimates taking inter-site dependence into consideration. The results showed that the root-mean-square errors (RMSEs) were larger and the 90 % error bounds were wider with inter-site dependence than without it, for both the regional growth curve and the quantile curve. The spatial patterns of extreme precipitation with a return period of 100 years were finally obtained, indicating two regions with the highest precipitation extremes and a large region with low precipitation extremes. However, the regions with low precipitation extremes are the most developed and densely populated regions of the country, and floods there will cause great loss of human life and property damage due to the high vulnerability. The study methods and procedure demonstrated in this paper provide a useful reference for frequency analysis of precipitation extremes in large regions, and the findings of the paper will be beneficial for flood control and management in the study area.
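[Illustrative note] A minimal sketch of how inter-site dependence inflates quantile-estimate errors: correlated annual maxima are generated with a Gaussian copula, and the RMSE of an index-flood-style regional quantile estimate is compared with the independent case. The margins, correlation, and estimator are illustrative assumptions.

```python
# Effect of inter-site correlation on regional quantile accuracy (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(23)
n_sites, n_years, rho = 15, 50, 0.5
C = np.full((n_sites, n_sites), rho) + (1 - rho) * np.eye(n_sites)
L = np.linalg.cholesky(C)
true_q100 = stats.gumbel_r.ppf(0.99, loc=100, scale=30)

def regional_q100(correlated):
    z = rng.standard_normal((n_years, n_sites))
    if correlated:
        z = z @ L.T                                # impose inter-site correlation
    u = stats.norm.cdf(z)                          # Gaussian copula
    x = stats.gumbel_r.ppf(u, loc=100, scale=30)   # identical Gumbel margins
    loc, scale = zip(*(stats.gumbel_r.fit(x[:, j]) for j in range(n_sites)))
    return np.mean(stats.gumbel_r.ppf(0.99, np.array(loc), np.array(scale)))

for flag in (False, True):
    est = np.array([regional_q100(flag) for _ in range(300)])
    rmse = np.sqrt(np.mean((est - true_q100) ** 2))
    print(f"correlated={flag}: RMSE = {rmse:.2f}")
```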
Over, Thomas M.; Saito, Riki J.; Veilleux, Andrea G.; Sharpe, Jennifer B.; Soong, David T.; Ishii, Audrey L.
2016-06-28
This report provides two sets of equations for estimating peak discharge quantiles at annual exceedance probabilities (AEPs) of 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, 0.005, and 0.002 (recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively) for watersheds in Illinois based on annual maximum peak discharge data from 117 watersheds in and near northeastern Illinois. One set of equations was developed through a temporal analysis with a two-step least squares-quantile regression technique that measures the average effect of changes in the urbanization of the watersheds used in the study. The resulting equations can be used to adjust rural peak discharge quantiles for the effect of urbanization, and in this study the equations also were used to adjust the annual maximum peak discharges from the study watersheds to 2010 urbanization conditions. The other set of equations was developed by a spatial analysis. This analysis used generalized least-squares regression to fit the peak discharge quantiles computed from the urbanization-adjusted annual maximum peak discharges from the study watersheds to drainage-basin characteristics. The peak discharge quantiles were computed by using the Expected Moments Algorithm following the removal of potentially influential low floods defined by a multiple Grubbs-Beck test. To improve the quantile estimates, regional skew coefficients were obtained from a newly developed regional skew model in which the skew increases with the urbanized land use fraction. The drainage-basin characteristics used as explanatory variables in the spatial analysis include drainage area, the fraction of developed land, the fraction of land with poorly drained soils or likely water, and the basin slope estimated as the ratio of the basin relief to basin perimeter. This report also provides the following: (1) examples to illustrate the use of the spatial and urbanization-adjustment equations for estimating peak discharge quantiles at ungaged sites and to improve flood-quantile estimates at and near a gaged site; (2) the urbanization-adjusted annual maximum peak discharges and peak discharge quantile estimates at streamgages from 181 watersheds including the 117 study watersheds and 64 additional watersheds in the study region that were originally considered for use in the study but later deemed to be redundant. The urbanization-adjustment equations, spatial regression equations, and peak discharge quantile estimates developed in this study will be made available in the web application StreamStats, which provides automated regression-equation solutions for user-selected stream locations. Figures and tables comparing the observed and urbanization-adjusted annual maximum peak discharge records by streamgage are provided at https://doi.org/10.3133/sir20165050 for download.
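[Illustrative note] A minimal sketch of a generalized least-squares fit of peak-discharge quantiles to basin characteristics, the form of spatial analysis described above. The error covariance here is a simple diagonal of record-length-based variances; the study's actual GLS weighting is more elaborate, and all data are synthetic.

```python
# GLS regression of flood quantiles on basin characteristics (sketch).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(31)
n = 117
log_area = rng.normal(4, 1, n)
developed = rng.uniform(0, 1, n)                  # fraction of developed land
log_q100 = 1.5 + 0.75 * log_area + 0.6 * developed + rng.normal(0, 0.2, n)

X = sm.add_constant(np.column_stack([log_area, developed]))
record_years = rng.integers(20, 90, n)
sigma = np.diag(0.04 * 50 / record_years)         # shorter record -> larger variance

fit = sm.GLS(log_q100, X, sigma=sigma).fit()
print("coefficients:", fit.params)
print("standard errors:", fit.bse)
```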
Regional L-Moment-Based Flood Frequency Analysis in the Upper Vistula River Basin, Poland
NASA Astrophysics Data System (ADS)
Rutkowska, A.; Żelazny, M.; Kohnová, S.; Łyp, M.; Banasik, K.
2017-02-01
The Upper Vistula River basin was divided into pooling groups with similar dimensionless frequency distributions of annual maximum river discharge. Cluster analysis and the Hosking and Wallis (HW) L-moment-based method were used to divide the set of 52 mid-sized catchments into disjoint clusters with similar morphometric, land use, and rainfall variables, and to test homogeneity within the clusters. Finally, alternative divisions into three and four pooling groups were obtained. Two methods for identifying the regional distribution function were used: the HW method and the method of Kjeldsen and Prosdocimi based on a bivariate extension of the HW measure. Subsequently, the flood quantile estimates were calculated using the index flood method. Ordinary least squares (OLS) and generalised least squares (GLS) regression techniques were used to relate the index flood to catchment characteristics. The predictive performance of the regression scheme for the southern part of the Upper Vistula River basin was improved by using GLS instead of OLS. The results of the study can be recommended for the estimation of flood quantiles at ungauged sites, in flood risk mapping applications, and in engineering hydrology to help design flood protection structures.
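[Illustrative note] A minimal sketch of the Hosking-Wallis heterogeneity measure H1 for a candidate pooling group: the observed dispersion of at-site L-CVs is compared with its distribution in simulated homogeneous regions. For brevity the simulations use a Gumbel parent rather than the kappa distribution of the full HW procedure, and the data are synthetic.

```python
# Hosking-Wallis-style heterogeneity measure H1 (simplified sketch).
import numpy as np

def lcv(x):
    """Sample L-CV (t = l2 / l1) from unbiased PWMs."""
    x = np.sort(x)
    n = len(x)
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    return (2 * b1 - b0) / b0

def v_statistic(samples):
    t = np.array([lcv(s) for s in samples])
    n = np.array([len(s) for s in samples])
    t_bar = np.sum(n * t) / n.sum()
    return np.sqrt(np.sum(n * (t - t_bar) ** 2) / n.sum())

rng = np.random.default_rng(37)
region = [rng.gumbel(100, 30, size=rng.integers(30, 60)) for _ in range(12)]
v_obs = v_statistic(region)

sims = [v_statistic([rng.gumbel(100, 30, size=len(s)) for s in region])
        for _ in range(500)]
H1 = (v_obs - np.mean(sims)) / np.std(sims)
print(f"H1 = {H1:.2f}  (H1 < 1 suggests an acceptably homogeneous group)")
```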
The weighted function method: A handy tool for flood frequency analysis or just a curiosity?
NASA Astrophysics Data System (ADS)
Bogdanowicz, Ewa; Kochanek, Krzysztof; Strupczewski, Witold G.
2018-04-01
The idea of the Weighted Function (WF) method for estimating the Pearson type 3 (Pe3) distribution, introduced by Ma in 1984, has been revised and successfully applied to the shifted inverse Gaussian (IGa3) distribution. The conditions of WF applicability to a shifted distribution have also been formulated. The accuracy of WF flood quantiles for both the Pe3 and IGa3 distributions was assessed by Monte Carlo simulations, under both true and false distributional assumptions, against the maximum likelihood (MLM), moments (MOM), and L-moments (LMM) methods. Three datasets of annual peak flows from Polish catchments serve as case studies to compare the performance of the WF, MOM, MLM, and LMM methods on real flood data. For the hundred-year flood, the WF method showed explicit superiority only over the MLM, being surpassed by the MOM and especially the LMM, both for true and false distributional assumptions, with respect to relative bias and relative root mean square error. Generally, the WF method performs well for hydrological sample sizes and constitutes a good alternative for the estimation of upper flood quantiles.
Log Pearson type 3 quantile estimators with regional skew information and low outlier adjustments
Griffis, V.W.; Stedinger, Jery R.; Cohn, T.A.
2004-01-01
The recently developed expected moments algorithm (EMA) [Cohn et al., 1997] does as well as maximum likelihood estimators at estimating log-Pearson type 3 (LP3) flood quantiles using systematic and historical flood information. Needed extensions include use of a regional skewness estimator and its precision to be consistent with Bulletin 17B. Another issue addressed by Bulletin 17B is the treatment of low outliers. A Monte Carlo study compares the performance of Bulletin 17B using the entire sample with and without regional skew with estimators that use regional skew and censor low outliers, including an extended EMA estimator, the conditional probability adjustment (CPA) from Bulletin 17B, and an estimator that uses probability plot regression (PPR) to compute substitute values for low outliers. Estimators that neglect regional skew information do much worse than estimators that use an informative regional skewness estimator. For LP3 data the low outlier rejection procedure generally results in no loss of overall accuracy, and the differences between the MSEs of the estimators that used an informative regional skew are generally modest in the skewness range of real interest. Samples contaminated to model actual flood data demonstrate that estimators which give special treatment to low outliers significantly outperform estimators that make no such adjustment.
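[Illustrative note] A minimal sketch of probability plot regression (PPR) for low outliers: flows below a censoring threshold are dropped from the fit, log flows of the retained observations are regressed on their normal plotting-position quantiles, and substitute values are computed for the censored positions. The threshold here is fixed by hand rather than chosen by a low-outlier test, and the data are synthetic.

```python
# Probability plot regression substitutes for low outliers (sketch).
import numpy as np
from scipy import stats

rng = np.random.default_rng(41)
flows = np.sort(10 ** rng.normal(3.0, 0.4, 60))
flows[:3] = [8.0, 15.0, 40.0]                    # inject low outliers
threshold = 100.0

n = len(flows)
pp = (np.arange(1, n + 1) - 0.375) / (n + 0.25)  # Blom plotting positions
z = stats.norm.ppf(pp)
retained = flows >= threshold

slope, intercept, *_ = stats.linregress(z[retained], np.log10(flows[retained]))
substitutes = 10 ** (intercept + slope * z[~retained])

adjusted = flows.copy()
adjusted[~retained] = substitutes
print("substituted values:", np.round(substitutes, 1))
print("log-space mean before/after:",
      np.log10(flows).mean().round(3), np.log10(adjusted).mean().round(3))
```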
NASA Astrophysics Data System (ADS)
Yin, Yixing; Chen, Haishan; Xu, Chongyu; Xu, Wucheng; Chen, Changchun
2014-05-01
The regionalization methods, which "trade space for time" by including several at-site data records in the frequency analysis, are an efficient tool to improve the reliability of extreme quantile estimates. With the main aims of improving the understanding of the regional frequency of extreme precipitation and providing scientific and practical background and assistance in formulating regional development strategies for water resources management in one of the most developed and flood-prone regions in China, the Yangtze River Delta (YRD) region, this paper uses the L-moment-based index-flood (LMIF) method, one of the popular regionalization methods, in the regional frequency analysis of extreme precipitation; attention was paid to inter-site dependence and its influence on the accuracy of quantile estimates, which has not been considered in most studies using the LMIF method. Extensive data screening for stationarity, serial dependence, and inter-site dependence was carried out first. The entire YRD region was then categorized into four homogeneous regions through cluster analysis and homogeneity analysis. Based on goodness-of-fit statistics and L-moment ratio diagrams, the generalized extreme-value (GEV) and generalized normal (GNO) distributions were identified as the best-fit distributions for most of the sub-regions, and estimated quantiles for each region were obtained. Monte Carlo simulation was used to evaluate the accuracy of the quantile estimates taking inter-site dependence into consideration. The results showed that the root mean square errors (RMSEs) were larger and the 90% error bounds were wider with inter-site dependence than without it, for both the regional growth curve and the quantile curve. The spatial patterns of extreme precipitation with a return period of 100 years were obtained, indicating two regions with the highest precipitation extremes (the southeastern coastal area of Zhejiang Province and the southwestern part of Anhui Province) and a large region with low precipitation extremes in the northern and middle parts of Zhejiang Province, Shanghai City, and Jiangsu Province. However, the central areas with low precipitation extremes are the most developed and densely populated regions in the study area, so floods there would cause great loss of human life and property. These findings will contribute to formulating regional development strategies for policymakers and stakeholders in water resources management against the menace of frequently occurring floods.
NASA Astrophysics Data System (ADS)
Lopez, Patricia; Verkade, Jan; Weerts, Albrecht; Solomatine, Dimitri
2014-05-01
Hydrological forecasting is subject to many sources of uncertainty, including those originating in initial state, boundary conditions, model structure and model parameters. Although uncertainty can be reduced, it can never be fully eliminated. Statistical post-processing techniques constitute an often used approach to estimate the hydrological predictive uncertainty, where a model of forecast error is built using a historical record of past forecasts and observations. The present study focuses on the use of the Quantile Regression (QR) technique as a hydrological post-processor. It estimates the predictive distribution of water levels using deterministic water level forecasts as predictors. This work aims to thoroughly verify uncertainty estimates using the implementation of QR that was applied in an operational setting in the UK National Flood Forecasting System, and to inter-compare forecast quality and skill in various configurations of QR. These configurations are (i) 'classical' QR, (ii) QR constrained by a requirement that quantiles do not cross, (iii) QR derived on time series that have been transformed into the Normal domain (Normal Quantile Transformation, NQT), and (iv) a piecewise linear derivation of QR models. The QR configurations are applied to fourteen hydrological stations on the Upper Severn River with different catchment characteristics. Results of each QR configuration are conditionally verified for progressively higher flood levels, in terms of commonly used verification metrics and skill scores. These include the Brier score (BS), the continuous ranked probability score (CRPS) and corresponding skill scores, as well as the Relative Operating Characteristic score (ROCS). Reliability diagrams are also presented and analysed. The results indicate that none of the four Quantile Regression configurations clearly outperforms the others.
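[Illustrative note] A minimal sketch of the 'classical' QR configuration: conditional quantiles of the observed water level are regressed on the deterministic forecast, one model per quantile, with a check for quantile crossing. The data are synthetic stand-ins for a forecast/observation archive.

```python
# Quantile regression as a hydrological post-processor (sketch).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(43)
fcst = rng.uniform(0.5, 5.0, 2000)                    # forecast level (m)
obs = fcst + rng.normal(0, 0.15 + 0.05 * fcst)        # heteroscedastic error
df = pd.DataFrame({"fcst": fcst, "obs": obs})

taus = [0.05, 0.25, 0.5, 0.75, 0.95]
models = {tau: smf.quantreg("obs ~ fcst", df).fit(q=tau) for tau in taus}

new = pd.DataFrame({"fcst": [3.2]})
preds = [float(np.asarray(models[tau].predict(new))[0]) for tau in taus]
print(dict(zip(taus, np.round(preds, 2))))
# Non-crossing check: predicted quantiles should be monotone in tau
assert all(a <= b for a, b in zip(preds, preds[1:])), "quantile crossing"
```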
NASA Astrophysics Data System (ADS)
Gaál, Ladislav; Szolgay, Ján; Bacigál, Tomáš; Kohnová, Silvia
2010-05-01
Copula-based estimation methods for hydro-climatological extremes have been gaining increasing attention from researchers and practitioners in the last couple of years. Unlike traditional estimation methods based on bivariate cumulative distribution functions (CDFs), copulas are a relatively flexible statistical tool that allows dependencies between two or more variables, such as flood peaks and flood volumes, to be modelled without strict assumptions on the marginal distributions. The dependence structure and the reliability of joint estimates of hydro-climatological extremes, mainly in the right tail of the joint CDF, depend not only on the particular copula adopted but also on the data available for estimating the marginal distributions of the individual variables. Generally, data samples for frequency modelling have limited temporal extent, which is a considerable drawback for frequency analyses in practice. Therefore, it is advisable to use statistical methods that improve any part of the copula construction process and result in more reliable design values of hydrological variables. The scarcity of the data sample, mostly in the extreme tail of the joint CDF, can be bypassed, e.g., by using a considerably larger amount of data simulated by rainfall-runoff analysis or by including historical information on the variables under study. The latter approach of data extension is used here to make the quantile estimates of the individual margins of the copula more reliable. In the presented paper it is proposed to use historical information in the frequency analysis of the marginal distributions in the framework of Bayesian Markov chain Monte Carlo (MCMC) simulations. Generally, a Bayesian approach allows for a straightforward combination of different sources of information on floods (e.g., flood data from systematic measurements and historical flood records) in terms of a product of the corresponding likelihood functions. The MCMC algorithm, in turn, is a numerical approach for sampling from the resulting posterior distributions. Bayesian MCMC methods therefore provide an attractive way to estimate the uncertainty in parameters and quantile metrics of frequency distributions. The applicability of the method is demonstrated in a case study of the hydroelectric power station Orlík on the Vltava River. This site has a key role in the flood protection of Prague, the capital of the Czech Republic. The record length of the available flood data is 126 years, from the period 1877-2002; the flood event observed in 2002, which caused extensive damage and numerous casualties, is treated as a historical one. To estimate the joint probabilities of flood peaks and volumes, different copulas are fitted and their goodness of fit is evaluated by bootstrap simulations. Finally, selected quantiles of flood volumes conditioned on given flood peaks are derived and compared with those obtained by the traditional method used in the practice of water management specialists on the Vltava River.
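[Illustrative note] A minimal sketch of a copula-based joint estimate for flood peak and volume: a Gumbel-Hougaard copula links the two margins, and the joint exceedance probability of a design pair follows by inclusion-exclusion. The margins and the copula parameter are illustrative assumptions, not the fitted models of the case study.

```python
# Joint peak-volume exceedance probability via a Gumbel copula (sketch).
import numpy as np
from scipy import stats

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v), theta >= 1."""
    return np.exp(-((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1 / theta))

peak_dist = stats.gumbel_r(loc=1000, scale=400)   # flood peak margin (m^3/s)
vol_dist = stats.gumbel_r(loc=150, scale=60)      # flood volume margin (hm^3)
theta = 2.5                                       # copula dependence parameter

qp, qv = 2500.0, 400.0                            # design peak and volume
u, v = peak_dist.cdf(qp), vol_dist.cdf(qv)

# P(P > qp and V > qv) by inclusion-exclusion on the copula
p_joint = 1 - u - v + gumbel_copula(u, v, theta)
print(f"joint exceedance probability: {p_joint:.5f}"
      f"  (joint return period ~ {1 / p_joint:.0f} years)")
```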
Flood Frequency Analysis With Historical and Paleoflood Information
NASA Astrophysics Data System (ADS)
Stedinger, Jery R.; Cohn, Timothy A.
1986-05-01
An investigation is made of flood quantile estimators which can employ "historical" and paleoflood information in flood frequency analyses. Two categories of historical information are considered: "censored" data, where the magnitudes of historical flood peaks are known; and "binomial" data, where only threshold exceedance information is available. A Monte Carlo study employing the two-parameter lognormal distribution shows that maximum likelihood estimators (MLEs) can extract the equivalent of an additional 10-30 years of gage record from a 50-year period of historical observation. The MLE routines are shown to be substantially better than an adjusted-moment estimator similar to the one recommended in Bulletin 17B of the United States Water Resources Council Hydrology Committee (1982). The MLE methods performed well even when floods were drawn from other than the assumed lognormal distribution.
A comparison of three approaches to non-stationary flood frequency analysis
NASA Astrophysics Data System (ADS)
Debele, S. E.; Strupczewski, W. G.; Bogdanowicz, E.
2017-08-01
Non-stationary flood frequency analysis (FFA) is applied to the statistical analysis of seasonal flow maxima from Polish and Norwegian catchments. Three non-stationary estimation methods, namely maximum likelihood (ML), two-stage (WLS/TS) and GAMLSS (generalized additive model for location, scale and shape parameters), are compared in the context of capturing the effect of non-stationarity on the estimation of time-dependent moments and design quantiles. The use of a multimodel approach is recommended to reduce errors in quantile magnitudes due to model misspecification. The results of calculations based on observed seasonal daily flow maxima and computer simulation experiments showed that GAMLSS gave the best results with respect to the relative bias and root mean square error of the estimates of the trend in the standard deviation and of the constant shape parameter, while WLS/TS provided better accuracy in the estimates of the trend in the mean value. Among the three compared methods, the WLS/TS method is recommended for dealing with non-stationarity in short time series. Some practical aspects of the GAMLSS package application are also presented. A detailed discussion of general issues related to the consequences of climate change in FFA is presented in the second part of the article, entitled "Around and about an application of the GAMLSS package in non-stationary flood frequency analysis".
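[Illustrative note] A minimal sketch of the ML variant of a non-stationary fit: a Gumbel model with a linear trend in the location parameter, mu(t) = a + b*t, estimated by minimizing the negative log-likelihood. Synthetic data with an imposed trend stand in for the seasonal maxima.

```python
# Non-stationary Gumbel fit by maximum likelihood (sketch).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(47)
t = np.arange(60)
data = stats.gumbel_r.rvs(loc=200 + 1.5 * t, scale=50, random_state=rng)

def nll(theta):
    a, b, log_scale = theta
    return -stats.gumbel_r.logpdf(data, loc=a + b * t,
                                  scale=np.exp(log_scale)).sum()

res = optimize.minimize(nll, x0=[200.0, 0.0, np.log(50.0)], method="Nelder-Mead")
a, b, log_scale = res.x
print(f"trend in location: {b:.2f} per year")

# Time-dependent 100-year quantile at the start and end of the record
for year in (0, 59):
    q = stats.gumbel_r.ppf(0.99, loc=a + b * year, scale=np.exp(log_scale))
    print(f"Q100 at t={year}: {q:.0f}")
```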
Magnitude of flood flows for selected annual exceedance probabilities in Rhode Island through 2010
Zarriello, Phillip J.; Ahearn, Elizabeth A.; Levin, Sara B.
2012-01-01
Heavy persistent rains from late February through March 2010 caused severe widespread flooding in Rhode Island that set or nearly set record flows and water levels at many long-term streamgages in the State. In response, the U.S. Geological Survey, in partnership with the Federal Emergency Management Agency, conducted a study to update estimates of flood magnitudes at streamgages and regional equations for estimating flood flows at ungaged locations. This report provides information needed for flood plain management, transportation infrastructure design, flood insurance studies, and other purposes that can help minimize future flood damages and risks. The magnitudes of floods were determined from the annual peak flows at 43 streamgages in Rhode Island (20 sites), Connecticut (14 sites), and Massachusetts (9 sites) using the standard Bulletin 17B log-Pearson type III method and a modification of this method called the expected moments algorithm (EMA) for 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probability (AEP) floods. Annual-peak flows were analyzed for the period of record through the 2010 water year; however, records were extended at 23 streamgages using the maintenance of variance extension (MOVE) procedure to best represent the longest period possible for determining the generalized skew and flood magnitudes. Generalized least square regression equations were developed from the flood quantiles computed at 41 streamgages (2 streamgages in Rhode Island with reported flood quantiles were not used in the regional regression because of regulation or redundancy) and their respective basin characteristics to estimate magnitude of floods at ungaged sites. Of 55 basin characteristics evaluated as potential explanatory variables, 3 were statistically significant—drainage area, stream density, and basin storage. The pseudo-coefficient of determination (pseudo-R2) indicates these three explanatory variables explain 95 to 96 percent of the variance in the flood magnitudes from 20- to 0.2-percent AEPs. Estimates of uncertainty of the at-site and regression flood magnitudes are provided and were combined with their respective estimated flood quantiles to improve estimates of flood flows at streamgages. This region has a long history of urban development, which is considered to have an important effect on flood flows. This study includes basins that have an impervious area ranging from 0.5 to 37 percent. Although imperviousness provided some explanatory power in the regression, it was not statistically significant at the 95-percent confidence level for any of the AEPs examined. Influence of urbanization on flood flows indicates a complex interaction with other characteristics that confounds a statistical explanation of its effects. Standard methods for calculating magnitude of floods for given AEP are based on the assumption of stationarity, that is, the annual peak flows exhibit no significant trend over time. A subset of 16 streamgages with 70 or more years of unregulated systematic record indicates all but 4 streamgages have a statistically significant positive trend at the 95-percent confidence level; three of these are statistically significant at about the 90-percent confidence level or above. If the trend continues linearly in time, the estimated magnitude of floods for any AEP, on average, will increase by 6, 13, and 21 percent in 10, 20, and 30 years' time, respectively. In 2010, new peaks of record were set at 18 of the 21 active streamgages in Rhode Island. 
The updated flood frequency analysis indicates that the peaks at these streamgages ranged from the 2- to the 0.2-percent AEP. Many streamgages in the State peaked at a 0.5- or 0.2-percent AEP, except for streamgages in the Blackstone River Basin, which peaked between the 4- and 2-percent AEPs.
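The MOVE record extension used in the Rhode Island study transfers information from a long-record index gage to a short-record site while preserving the mean and variance of the short series. A minimal MOVE.1 sketch in Python, assuming made-up flows and illustrative variable names (the general idea, not the report's exact procedure):

```python
import numpy as np

def move1_extend(short_y, long_x_concurrent, long_x_extra):
    """Extend a short annual-peak record with MOVE.1 (maintenance of
    variance extension): transfer values from a nearby long-record gage
    so that the extended series preserves the mean and variance.

    short_y            -- peaks at the short-record site (concurrent years)
    long_x_concurrent  -- peaks at the long-record site, same years
    long_x_extra       -- peaks at the long-record site, non-concurrent years
    """
    y, x = np.log(short_y), np.log(long_x_concurrent)   # work in log space
    r = np.corrcoef(x, y)[0, 1]                         # sign of the relation
    slope = np.sign(r) * y.std(ddof=1) / x.std(ddof=1)  # variance-preserving
    y_hat = y.mean() + slope * (np.log(long_x_extra) - x.mean())
    return np.exp(y_hat)                                # back-transform

# illustrative use with made-up flows (m^3/s)
rng = np.random.default_rng(1)
x_all = np.exp(rng.normal(5.0, 0.5, 80))      # 80-yr long-record index gage
y_conc = np.exp(0.9 * np.log(x_all[50:]) + rng.normal(0, 0.2, 30))
extended = move1_extend(y_conc, x_all[50:], x_all[:50])
print(extended[:5])
```

Unlike ordinary regression, the slope is the ratio of standard deviations, so the extended values reproduce the variance of the observed series instead of shrinking toward the mean.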
Regional maximum rainfall analysis using L-moments at the Titicaca Lake drainage, Peru
NASA Astrophysics Data System (ADS)
Fernández-Palomino, Carlos Antonio; Lavado-Casimiro, Waldo Sven
2017-08-01
The present study investigates the application of the index-flood, L-moments-based regional frequency analysis procedure (RFA-LM) to the annual maximum 24-h rainfall (AM) of 33 rainfall gauge stations (RGs) to estimate rainfall quantiles at the Titicaca Lake drainage (TL). The study region was chosen because it is characterised by frequent floods that affect agricultural production and infrastructure. First, detailed quality analyses and verification of the RFA-LM assumptions were conducted. For this purpose, different tests for outlier verification, homogeneity, stationarity, and serial independence were employed. The application of the RFA-LM procedure then allowed us to consider the TL as a single, hydrologically homogeneous region in terms of its maximum rainfall frequency. That is, the region can be modelled by a generalised normal (GNO) distribution, chosen according to the Z test for goodness of fit, the L-moments (LM) ratio diagram, and an additional evaluation of the precision of the regional growth curve. Because of the low density of RGs in the TL, it was important to produce maps of the AM design quantiles estimated using RFA-LM. Therefore, the ordinary kriging (OK) interpolation technique was used. These maps will be a useful tool for determining the different AM quantiles at any point of interest for hydrologists in the region.
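The index-flood step of an RFA-LM procedure can be sketched compactly: each site's record is rescaled by its index value (here the mean, i.e. the first L-moment), the dimensionless data are pooled, and a single regional growth curve supplies quantiles for every site. The sketch below substitutes a Gumbel distribution for the paper's GNO to keep the quantile function short; the data and station count are invented:

```python
import numpy as np

EULER = 0.5772156649

def sample_l_moments(x):
    """First two sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    b0 = x.mean()
    b1 = np.sum((np.arange(1, n + 1) - 1) / (n - 1) * x) / n
    return b0, 2.0 * b1 - b0                 # l1, l2

def gumbel_quantile_from_lmoments(l1, l2, T):
    """Gumbel parameters from L-moments, then the T-year quantile."""
    alpha = l2 / np.log(2.0)
    xi = l1 - EULER * alpha
    return xi - alpha * np.log(-np.log(1.0 - 1.0 / T))

# index-flood step: pool dimensionless records, fit one regional growth curve
rng = np.random.default_rng(0)
sites = [rng.gumbel(40, 12, 35) * s for s in (0.6, 1.0, 1.7)]  # fake AM series
scaled = np.concatenate([x / sample_l_moments(x)[0] for x in sites])
l1r, l2r = sample_l_moments(scaled)
growth_100 = gumbel_quantile_from_lmoments(l1r, l2r, 100.0)
for x in sites:                    # site quantile = index value x growth curve
    print(sample_l_moments(x)[0] * growth_100)
```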
Historical floods in flood frequency analysis: Is this game worth the candle?
NASA Astrophysics Data System (ADS)
Strupczewski, Witold G.; Kochanek, Krzysztof; Bogdanowicz, Ewa
2017-11-01
In flood frequency analysis (FFA), the profit from inclusion of historical information on the largest pre-instrumental floods depends primarily on the reliability of the information, i.e. the accuracy of the magnitude and return period of the floods. This study focuses on the possible theoretical maximum gain in the accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either the one largest (XM1) or the two largest (XM1 and XM2) flood peak flows in a historical M-year long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of an unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error, and the probability of overestimation of the quantile with a 100-year return period. The results of the research indicate that the maximal profit of including pre-instrumental floods in the FFA may prove smaller than the cost of reconstructing the historical hydrological information.
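The likelihood underlying such experiments combines the density of the systematic record, the density of the known historical maximum, and a censoring term for the remaining historical years. A hedged sketch for a Gumbel parent with one historical flood XM1 over an M-year period (synthetic data; the log-scale parameterisation just keeps the scale positive):

```python
import numpy as np
from scipy.stats import gumbel_r
from scipy.optimize import minimize

def neg_log_lik(theta, sys_rec, xm1, M):
    """Gumbel negative log-likelihood for N systematic peaks plus the
    single largest flood xm1 known from an M-year historical period;
    the other M - 1 historical years are censored below xm1."""
    loc, log_scale = theta
    scale = np.exp(log_scale)                         # keep scale positive
    ll = gumbel_r.logpdf(sys_rec, loc, scale).sum()
    ll += gumbel_r.logpdf(xm1, loc, scale)            # the known big flood
    ll += (M - 1) * gumbel_r.logcdf(xm1, loc, scale)  # censored years
    return -ll

rng = np.random.default_rng(7)
sys_rec = gumbel_r.rvs(100, 30, size=50, random_state=rng)  # N = 50 years
xm1, M = 320.0, 150               # assumed historical flood and period length
fit = minimize(neg_log_lik, x0=[sys_rec.mean(), np.log(sys_rec.std())],
               args=(sys_rec, xm1, M))
loc, scale = fit.x[0], np.exp(fit.x[1])
print("100-yr quantile:", gumbel_r.ppf(0.99, loc, scale))
```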
Comparison between changes in flood hazard and risk in Spain using historical information
NASA Astrophysics Data System (ADS)
Llasat, Maria-Carmen; Mediero, Luis; Garrote, Luis; Gilabert, Joan
2015-04-01
Recently, the COST Action ES0901 "European procedures for flood frequency estimation (FloodFreq)" had as its objective "the comparison and evaluation of methods for flood frequency estimation under the various climatologic and geographic conditions found in Europe". The Action highlighted the improvement that regional analyses offer over at-site estimates in terms of the uncertainty of quantile estimates. In the case of Spain, a regional analysis was carried out at a national scale, which allows the flow threshold corresponding to a given return period to be identified from the observed flow series recorded at a gauging station. In addition, Mediero et al. (2014) studied the possible influence of non-stationarity on flood series for the period 1942-2009. In parallel, Barnolas and Llasat (2007), among others, collected documentary information on catastrophic flood events in Spain over the last centuries. Traditionally, the first approach ("top-down") identifies a flood as catastrophic when it exceeds the 500-year return period flood, whereas the second one ("bottom-up") accounts for flood damages (Llasat et al., 2005). This study presents a comparison between both approaches, discussing the potential factors that can lead to discrepancies between them, as well as accounting for information about major changes experienced in the catchments that could lead to changes in flood hazard and risk.
High Risk Flash Flood Rainstorm Mapping Based on Regional L-moments Approach
NASA Astrophysics Data System (ADS)
Ding, Hui; Liao, Yifan; Lin, Bingzhang
2017-04-01
The difficulties and complexities of elaborating flash flood early-warning and forecasting systems prompt hydrologists to develop techniques that can substantially reduce the disastrous outcome of a flash flood in advance. An approach to specifying, in terms of rainfall intensity, those areas of a relatively large region that are at high risk of flash floods is proposed in this paper. It is accomplished through the design of the High Risk Flash Flood Rainstorm Area (HRFFRA), based on statistical analysis of historical rainfall data, synoptic analysis of prevailing storm rainfalls, and field surveys of historical flash flood events in the region. A HRFFRA is defined as an area potentially subject to intense precipitation of a given duration and return period that may cause a flash flood disaster in the area. This paper presents in detail the development of the HRFFRA through the application of the end-to-end Regional L-moments Approach (RLMA) to precipitation frequency analysis, in combination with spatial interpolation techniques, in Jiangxi Province, South China Mainland. Issues addressed include the concept of a hydrometeorologically homogeneous region, the precision of frequency analysis in terms of parameter estimation, the accuracy of quantiles in terms of uncertainty, and the consistency adjustments of quantiles over durations and space. At the end of the paper, the mapping of the HRFFRA and an internet-based, visualized, user-friendly data server for the HRFFRA are introduced. Key words: HRFFRA; Flash Flood; RLMA; rainfall intensity; hydrometeorologically homogeneous region.
Regional estimation of extreme suspended sediment concentrations using watershed characteristics
NASA Astrophysics Data System (ADS)
Tramblay, Yves; Ouarda, Taha B. M. J.; St-Hilaire, André; Poulin, Jimmy
2010-01-01
The number of stations monitoring daily suspended sediment concentration (SSC) has been decreasing since the 1980s in North America, while suspended sediment is considered a key variable for water quality. The objective of this study is to test the feasibility of regionalising extreme SSC, i.e. estimating extreme SSC values for ungauged basins. Annual maximum SSC for 72 rivers in Canada and the USA were modelled with probability distributions in order to estimate quantiles corresponding to different return periods. Regionalisation techniques, originally developed for flood prediction in ungauged basins, were tested using the climatic, topographic, land cover and soils attributes of the watersheds. Two approaches were compared, using either physiographic characteristics or the seasonality of extreme SSC to delineate the regions. Multiple regression models to estimate SSC quantiles as a function of watershed characteristics were built in each region and compared to a global model including all sites. Regional estimates of SSC quantiles were compared with the local values. Results show that regional estimation of extreme SSC is more efficient than a global regression model including all sites. Groups/regions of stations were identified using either the watershed characteristics or the seasonality of occurrence of extreme SSC values, providing a method to better describe extreme SSC events. The most important variables for predicting extreme SSC are the percentage of clay in the soils, precipitation intensity and forest cover.
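The regional regression step described above is ordinary multiple regression of log-transformed quantiles on watershed attributes. A toy sketch with invented attribute values for eight gauged watersheds (the data and coefficients are purely illustrative):

```python
import numpy as np

# hypothetical table: one row per gauged watershed in a region
clay_pct   = np.array([12.,  5., 30., 22.,  8., 17., 25., 10.])
precip_int = np.array([14., 10., 22., 18., 11., 16., 20., 12.])  # mm/h index
forest_pct = np.array([60., 80., 20., 35., 75., 50., 30., 65.])
q10_ssc    = np.array([310., 120., 980., 640., 150., 420., 760., 240.])  # mg/L

# multiple regression on log-transformed quantiles, as in regional FFA
X = np.column_stack([np.ones_like(clay_pct), clay_pct, precip_int, forest_pct])
beta, *_ = np.linalg.lstsq(X, np.log(q10_ssc), rcond=None)

# estimate the 10-yr SSC quantile at an ungauged basin from its attributes
x_new = np.array([1.0, 15.0, 13.0, 55.0])
print("predicted Q10 SSC (mg/L):", np.exp(x_new @ beta))
```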
Towards a systematic approach to comparing distributions used in flood frequency analysis
NASA Astrophysics Data System (ADS)
Bobée, B.; Cavadias, G.; Ashkar, F.; Bernier, J.; Rasmussen, P.
1993-02-01
The estimation of flood quantiles from available streamflow records has been a topic of extensive research in this century. However, the large number of distributions and estimation methods proposed in the scientific literature has led to a state of confusion, and a gap prevails between theory and practice. This concerns both at-site and regional flood frequency estimation. To facilitate the work of "hydrologists, designers of hydraulic structures, irrigation engineers and planners of water resources", the World Meteorological Organization recently published a report which surveys and compares current methodologies, and recommends a number of statistical distributions and estimation procedures. This report is an important step towards the clarification of this difficult topic, but we think that it does not effectively satisfy the needs of practitioners as intended, because it contains some statements which are not statistically justified and which require further discussion. In the present paper we review commonly used procedures for flood frequency estimation, point out some of the reasons for the present state of confusion concerning the advantages and disadvantages of the various methods, and propose the broad lines of a possible comparison strategy. We recommend that the results of such comparisons be discussed in an international forum of experts, with the purpose of attaining a more coherent and broadly accepted strategy for estimating floods.
Evaluating changes to reservoir rule curves using historical water-level data
Mower, Ethan; Miranda, Leandro E.
2013-01-01
Flood control reservoirs are typically managed through rule curves (i.e. target water levels) which control the storage and release timing of flood waters. Changes to rule curves are often contemplated and requested by various user groups and management agencies with no information available about the actual flood risk of such requests. Methods of estimating flood risk in reservoirs are not easily available to those unfamiliar with hydrological models that track water movement through a river basin. We developed a quantile regression model that uses readily available daily water-level data to estimate risk of spilling. Our model provided a relatively simple process for estimating the maximum applicable water level under a specific flood risk for any day of the year. This water level represents an upper-limit umbrella under which water levels can be operated in a variety of ways. Our model allows the visualization of water-level management under a user-specified flood risk and provides a framework for incorporating the effect of a changing environment on water-level management in reservoirs, but is not designed to replace existing hydrological models. The model can improve communication and collaboration among agencies responsible for managing natural resources dependent on reservoir water levels.
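The day-of-year quantile-regression idea can be sketched with harmonic predictors: the fitted upper conditional quantile of daily water level traces an annual envelope below which levels stay with the chosen probability. A minimal sketch assuming synthetic levels and a 0.95 quantile as the risk target (statsmodels' QuantReg; not the authors' exact formulation):

```python
import numpy as np
import statsmodels.api as sm

# synthetic daily water levels (m) over 20 years: seasonal cycle plus noise
rng = np.random.default_rng(3)
doy = np.tile(np.arange(1, 366), 20)
level = 100 + 2 * np.sin(2 * np.pi * doy / 365) + rng.gamma(2.0, 0.4, doy.size)

# harmonic terms of day-of-year as predictors of the water-level quantile
X = sm.add_constant(np.column_stack([np.sin(2 * np.pi * doy / 365),
                                     np.cos(2 * np.pi * doy / 365)]))
# the 0.95 conditional quantile approximates a 5% spill-risk envelope
fit = sm.QuantReg(level, X).fit(q=0.95)

day = 150  # maximum applicable water level for a given calendar day
x_d = [1, np.sin(2 * np.pi * day / 365), np.cos(2 * np.pi * day / 365)]
print("upper-limit level, day 150:", fit.predict(x_d))
```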
NASA Astrophysics Data System (ADS)
Gaál, Ladislav; Szolgay, Ján; Kohnová, Silvia; Hlavčová, Kamila; Viglione, Alberto
2010-01-01
The paper deals with at-site flood frequency estimation in the case when information on hydrological events of extraordinary magnitude from the past is also available. For the joint frequency analysis of systematic observations and historical data, the Bayesian framework is chosen, which, through adequately defined likelihood functions, allows for the incorporation of different sources of hydrological information, e.g., maximum annual flood peaks, historical events, as well as measurement errors. The distribution of the parameters of the fitted distribution function and the confidence intervals of the flood quantiles are derived by means of the Markov chain Monte Carlo (MCMC) simulation technique. The paper presents a sensitivity analysis related to the choice of the most influential parameters of the statistical model, which are the length of the historical period
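A random-walk Metropolis sampler is one way to realise the MCMC step for a likelihood that mixes systematic peaks with a censored historical period. The sketch below uses a Gumbel model and flat priors purely for illustration; the paper's actual likelihood and priors may differ:

```python
import numpy as np
from scipy.stats import gumbel_r

def log_post(theta, sys_rec, xm1, M):
    """Log-posterior (flat priors) for a Gumbel fit combining a systematic
    record with one historical maximum xm1 over an M-year period."""
    loc, log_scale = theta
    scale = np.exp(log_scale)
    lp = gumbel_r.logpdf(sys_rec, loc, scale).sum()
    lp += gumbel_r.logpdf(xm1, loc, scale)            # the known big flood
    lp += (M - 1) * gumbel_r.logcdf(xm1, loc, scale)  # censored years
    return lp

def metropolis(sys_rec, xm1, M, n=5000, step=0.05, seed=0):
    """Plain random-walk Metropolis over (loc, log scale)."""
    rng = np.random.default_rng(seed)
    theta = np.array([sys_rec.mean(), np.log(sys_rec.std())])
    lp = log_post(theta, sys_rec, xm1, M)
    draws = []
    for _ in range(n):
        prop = theta + step * rng.normal(size=2) * [sys_rec.std(), 1.0]
        lp_prop = log_post(prop, sys_rec, xm1, M)
        if np.log(rng.uniform()) < lp_prop - lp:       # accept/reject
            theta, lp = prop, lp_prop
        draws.append(theta.copy())
    return np.array(draws[n // 2:])                    # discard burn-in

sys_rec = gumbel_r.rvs(100, 30, size=50, random_state=1)  # synthetic record
draws = metropolis(sys_rec, xm1=320.0, M=150)
q100 = gumbel_r.ppf(0.99, draws[:, 0], np.exp(draws[:, 1]))
print("90% credible interval for Q100:", np.percentile(q100, [5, 95]))
```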
Cohn, T.A.; Lane, W.L.; Baier, W.G.
1997-01-01
This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
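The published EMA operates on the three moments of the log-Pearson type III; the shape of the iteration is easier to see with a two-parameter normal model for log flows. In the hedged sketch below, the expected moments of the censored below-threshold years are recomputed from the current parameter estimates at each pass (all data invented):

```python
import numpy as np
from scipy.stats import norm

def ema_normal(sys_logq, hist_logq, threshold, M, iters=50):
    """Stylized EMA iteration: fit a normal to log flows when, besides the
    systematic record, k historical peaks above `threshold` are known and
    the remaining M - k historical years are only known to lie below it."""
    known = np.concatenate([sys_logq, hist_logq])
    n_cens = M - len(hist_logq)
    mu, sd = sys_logq.mean(), sys_logq.std(ddof=1)    # initial estimates
    for _ in range(iters):
        a = (threshold - mu) / sd
        lam = norm.pdf(a) / norm.cdf(a)
        e1 = mu - sd * lam                            # E[X | X < threshold]
        var_c = sd**2 * (1.0 - a * lam - lam**2)      # Var[X | X < threshold]
        e2 = var_c + e1**2                            # E[X^2 | X < threshold]
        # method-of-moments update over all N + M "observations"
        ntot = len(known) + n_cens
        m1 = (known.sum() + n_cens * e1) / ntot
        m2 = ((known**2).sum() + n_cens * e2) / ntot
        mu, sd = m1, np.sqrt(m2 - m1**2)
    return mu, sd

rng = np.random.default_rng(5)
sys_logq = rng.normal(4.0, 0.6, 40)                   # 40-yr systematic record
mu, sd = ema_normal(sys_logq, hist_logq=np.array([6.1]), threshold=5.8, M=120)
print("100-yr log-flow:", norm.ppf(0.99, mu, sd))
```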
NASA Astrophysics Data System (ADS)
Arnaud, Patrick; Cantet, Philippe; Odry, Jean
2017-11-01
Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist, ranging from classical, purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case for the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model, and an attempt is made to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters, and it takes into account the dependence between the uncertainties of the rainfall model and of the hydrological calibration. Indeed, the uncertainties on the flow quantiles are of the same order of magnitude as those associated with the use of a statistical law with two parameters (here the generalised extreme value Type I distribution) and clearly lower than those associated with the use of a three-parameter law (here the generalised extreme value Type II distribution). For extreme flood quantiles, the uncertainties are mostly due to the rainfall generator because of the progressive saturation of the hydrological model.
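The bootstrap component can be illustrated generically: resample the calibration data, refit, and read confidence bounds off the resampled quantiles. A sketch for a plain Gumbel quantile rather than the SHYREG parameter itself (synthetic record, nonparametric resampling):

```python
import numpy as np
from scipy.stats import gumbel_r

def bootstrap_quantile_ci(am_flows, T=100, n_boot=500, seed=0):
    """Nonparametric bootstrap of the T-year Gumbel quantile: resample the
    annual maxima, refit each replicate, report a 90% interval and median."""
    rng = np.random.default_rng(seed)
    q = np.empty(n_boot)
    for b in range(n_boot):
        sample = rng.choice(am_flows, size=len(am_flows), replace=True)
        loc, scale = gumbel_r.fit(sample)              # MLE refit per replicate
        q[b] = gumbel_r.ppf(1 - 1 / T, loc, scale)
    return np.percentile(q, [5, 50, 95])

flows = gumbel_r.rvs(200, 60, size=45, random_state=42)   # fake 45-yr record
print(bootstrap_quantile_ci(flows))
```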
Nonstationary decision model for flood risk decision scaling
NASA Astrophysics Data System (ADS)
Spence, Caitlin M.; Brown, Casey M.
2016-11-01
Hydroclimatic stationarity is increasingly questioned as a default assumption in flood risk management (FRM), but successor methods are not yet established. Some potential successors depend on estimates of future flood quantiles, but methods for estimating future design storms are subject to high levels of uncertainty. Here we apply a Nonstationary Decision Model (NDM) to flood risk planning within the decision scaling framework. The NDM combines a nonstationary probability distribution of annual peak flow with optimal selection of flood management alternatives using robustness measures. The NDM incorporates structural and nonstructural FRM interventions and valuation of flows supporting ecosystem services to calculate the expected cost of a given FRM strategy. A search for the minimum-cost strategy under incrementally varied representative scenarios, extending across the plausible range of flood trend and value of the natural flow regime, discovers candidate FRM strategies that are evaluated and compared through a decision scaling analysis (DSA). The DSA selects a management strategy that is optimal or close to optimal across the broadest range of scenarios or across the set of scenarios deemed most likely to occur according to estimates of future flood hazard. We illustrate the decision framework using a stylized example flood management decision based on the Iowa City flood management system, which has recently experienced unprecedented high-flow episodes. The DSA indicates a preference for combining infrastructural and nonstructural adaptation measures to manage flood risk and makes clear that options-based approaches cannot be assumed to be "no" or "low regret."
England, John F.; Salas, José D.; Jarrett, Robert D.
2003-01-01
The expected moments algorithm (EMA) [Cohn et al., 1997] and the Bulletin 17B [Interagency Committee on Water Data, 1982] historical weighting procedure (B17H) for the log Pearson type III distribution are compared by Monte Carlo computer simulation for cases in which historical and/or paleoflood data are available. The relative performance of the estimators was explored for three cases: fixed‐threshold exceedances, a fixed number of large floods, and floods generated from a different parent distribution. EMA can effectively incorporate four types of historical and paleoflood data: floods where the discharge is explicitly known, unknown discharges below a single threshold, floods with unknown discharge that exceed some level, and floods with discharges described in a range. The B17H estimator can utilize only the first two types of historical information. Including historical/paleoflood data in the simulation experiments significantly improved the quantile estimates in terms of mean square error and bias relative to using gage data alone. EMA performed significantly better than B17H in nearly all cases considered. B17H performed as well as EMA for estimating X100 in some limited fixed‐threshold exceedance cases. EMA performed comparatively much better in other fixed‐threshold situations, for the single large flood case, and in cases when estimating extreme floods equal to or greater than X500. B17H did not fully utilize historical information when the historical period exceeded 200 years. Robustness studies using GEV‐simulated data confirmed that EMA performed better than B17H. Overall, EMA is preferred to B17H when historical and paleoflood data are available for flood frequency analysis.
Impact of climate change on Gironde Estuary
NASA Astrophysics Data System (ADS)
Laborie, Vanessya; Hissel, François; Sergent, Philippe
2014-05-01
Within the THESEUS European project, a simplified mathematical model for storm surge levels in the Bay of Biscay was adjusted on 10 events at Le Verdon using wind and pressure fields from CLM/SGA, so that the water levels at Le Verdon have the same statistical quantiles as the observed tide records for the period [1960-2000]. The analysis of future storm surge levels shows a decrease in their quantiles at Le Verdon, whereas there is an increase in the quantiles of total water levels. This increase is smaller than the sea level rise and becomes smaller still farther upstream in the estuary. A numerical model of the Gironde Estuary was then used to evaluate future water levels at 6 locations of the estuary from Le Verdon to Bordeaux and to assess the changes in the quantiles of water levels during the 21st century using ONERC's pessimistic scenario for sea level rise (60 cm). The model was fed by several data sources: wind fields at Royan and Mérignac interpolated from the grid of the European climatologic model CLM/SGA, a tide signal at Le Verdon, and the discharges of the Garonne (at La Réole), the Dordogne (at Pessac) and the Isle (at Libourne). A series of flood maps for different return periods between 2 and 100 years and for four time periods ([1960-1999], [2010-2039], [2040-2069] and [2070-2099]) has been built for the region of Bordeaux. Quantiles of water levels in the floodplain have also been calculated. The impact of climate change on the evolution of flooded areas in the Gironde Estuary and on quantiles of water levels in the floodplain mainly depends on the sea level rise. Areas that are not currently flooded for low return periods will be inundated in 2100. The influence of river discharges and dike breaching should also be taken into account for more accurate results.
Interquantile Shrinkage in Regression Models
Jiang, Liewen; Wang, Huixia Judy; Bondell, Howard D.
2012-01-01
Conventional analysis using quantile regression typically focuses on fitting the regression model at different quantiles separately. However, in situations where the quantile coefficients share some common feature, joint modeling of multiple quantiles to accommodate the commonality often leads to more efficient estimation. One example of common features is that a predictor may have a constant effect over one region of quantile levels but varying effects in other regions. To automatically perform estimation and detection of the interquantile commonality, we develop two penalization methods. When the quantile slope coefficients indeed do not change across quantile levels, the proposed methods will shrink the slopes towards constant and thus improve the estimation efficiency. We establish the oracle properties of the two proposed penalization methods. Through numerical investigations, we demonstrate that the proposed methods lead to estimations with competitive or higher efficiency than the standard quantile regression estimation in finite samples. Supplemental materials for the article are available online. PMID:24363546
What do we gain with Probabilistic Flood Loss Models?
NASA Astrophysics Data System (ADS)
Schroeter, K.; Kreibich, H.; Vogel, K.; Merz, B.; Lüdtke, S.
2015-12-01
The reliability of flood loss models is a prerequisite for their practical usefulness. Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions cast in a probabilistic framework. For model evaluation we use empirical damage data available from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows model performance to be studied in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), as well as reliability, which is represented by the proportion of observations that fall within the 95-quantile and 5-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Confronting uncertainty in flood damage predictions
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno
2015-04-01
Reliable flood damage models are a prerequisite for the practical usefulness of model results. Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data available from computer-aided telephone interviews compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows model performance to be studied in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), as well as reliability, which is represented by the proportion of observations that fall within the 95-quantile and 5-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
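The split-sample evaluation metrics named above (mean bias, mean absolute error, and the 5-95 percent interval hit rate) can be computed directly from an ensemble model. A toy sketch with fabricated depth/duration damage records and scikit-learn's bagged trees (illustrative only; the study's predictor set is richer):

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(11)
# hypothetical damage records: water depth (m), duration (d) -> relative loss
X = np.column_stack([rng.uniform(0, 3, 400), rng.uniform(1, 14, 400)])
y = np.clip(0.15 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 0.05, 400), 0, 1)
X_fit, X_val, y_fit, y_val = X[:300], X[300:], y[:300], y[300:]

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200,
                         random_state=0).fit(X_fit, y_fit)
# per-record predictive distribution from the ensemble members
member = np.stack([t.predict(X_val) for t in model.estimators_])
lo, hi = np.quantile(member, [0.05, 0.95], axis=0)

bias = (model.predict(X_val) - y_val).mean()        # systematic deviation
mae = np.abs(model.predict(X_val) - y_val).mean()   # precision
hit_rate = ((y_val >= lo) & (y_val <= hi)).mean()   # reliability (coverage)
print(bias, mae, hit_rate)
```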
A methodology to derive Synthetic Design Hydrographs for river flood management
NASA Astrophysics Data System (ADS)
Tomirotti, Massimo; Mignosa, Paolo
2017-12-01
The design of flood protection measures requires in many cases not only the estimation of peak discharges but also of the volume of floods and their time distribution. A typical solution to this kind of problem is the formulation of Synthetic Design Hydrographs (SDHs). In this paper a methodology to derive SDHs is proposed on the basis of the estimation of the Flow-Duration-Frequency (FDF) reduction curve and of a Peak-Duration (PD) relationship, furnishing, respectively, the quantiles of the maximum average discharge and the average peak position for each duration. The methodology is intended to synthesize the main features of the historical floods in a unique SDH for each return period. The shape of the SDH is not selected a priori but results from the behaviour of the FDF and PD curves, allowing the variability of the shapes of the observed hydrographs to be accounted for in a very convenient way at the local time scale. The validation of the methodology is performed with reference to flood routing problems in reservoirs, lakes and rivers. The results obtained demonstrate the capability of the SDHs to describe the effects of different hydraulic systems on the statistical regime of floods, even in the presence of strong modifications induced on the probability distribution of peak flows.
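An FDF reduction curve takes, for each duration D, the annual maxima of the D-day moving-average discharge and estimates a quantile over the years. A compact sketch with synthetic daily flows and an empirical quantile standing in for a fitted distribution:

```python
import numpy as np

def fdf_quantiles(daily_q, durations, T=50):
    """Flow-Duration-Frequency reduction curve: for each duration D, take
    each year's maximum D-day average discharge, then an empirical quantile
    over the years (a fitted distribution would replace this in practice)."""
    p = 1 - 1 / T
    out = {}
    for d in durations:
        kernel = np.ones(d) / d                     # D-day moving average
        annual_max = [np.convolve(yr, kernel, mode="valid").max()
                      for yr in daily_q]
        out[d] = np.quantile(annual_max, p)
    return out

rng = np.random.default_rng(2)
years = [rng.gamma(2.0, 30.0, 365) for _ in range(40)]   # fake daily flows
print(fdf_quantiles(years, durations=[1, 3, 7, 15]))
```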
Simultaneous multiple non-crossing quantile regression estimation using kernel constraints
Liu, Yufeng; Wu, Yichao
2011-01-01
Quantile regression (QR) is a very useful statistical tool for learning the relationship between the response variable and covariates. For many applications, one often needs to estimate multiple conditional quantile functions of the response variable given covariates. Although one can estimate multiple quantiles separately, it is of great interest to estimate them simultaneously. One advantage of simultaneous estimation is that multiple quantiles can share strength among them to gain better estimation accuracy than individually estimated quantile functions. Another important advantage of joint estimation is the feasibility of incorporating simultaneous non-crossing constraints of QR functions. In this paper, we propose a new kernel-based multiple QR estimation technique, namely simultaneous non-crossing quantile regression (SNQR). We use kernel representations for QR functions and apply constraints on the kernel coefficients to avoid crossing. Both unregularised and regularised SNQR techniques are considered. Asymptotic properties such as asymptotic normality of linear SNQR and oracle properties of the sparse linear SNQR are developed. Our numerical results demonstrate the competitive performance of our SNQR over the original individual QR estimation. PMID:22190842
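A much simpler device with the same non-crossing goal as SNQR, useful as a baseline, is monotone rearrangement: fit each quantile level separately and then sort the fitted values across levels at every point of the predictor grid. A sketch with synthetic heteroscedastic data (this is rearrangement, not the paper's kernel-constraint method):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
x = rng.uniform(0, 10, 500)
y = 1.0 + 0.5 * x + rng.normal(0, 1 + 0.2 * x)      # heteroscedastic response
X = sm.add_constant(x)

taus = [0.1, 0.25, 0.5, 0.75, 0.9]
fits = [sm.QuantReg(y, X).fit(q=t) for t in taus]   # separate fits may cross

x_grid = sm.add_constant(np.linspace(0, 10, 101))
pred = np.stack([f.predict(x_grid) for f in fits])  # shape: taus x grid
pred_nc = np.sort(pred, axis=0)  # rearrangement: sorted across tau, no crossing
print(np.all(np.diff(pred_nc, axis=0) >= 0))
```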
Prediction of flood quantiles at ungaged watersheds in Louisiana : final report.
DOT National Transportation Integrated Search
1989-12-01
Four popular regional flood frequency methods were compared using Louisiana stream flow series. The state was divided into four homogeneous regions and all undistorted, long-term stream gages were used in the analysis. The GEV, TCEV, regional LP3 and...
NASA Astrophysics Data System (ADS)
Lam, Daryl; Thompson, Chris; Croke, Jacky; Sharma, Ashneel; Macklin, Mark
2017-03-01
Using a combination of stream gauge, historical, and paleoflood records to extend extreme flood records has proven to be useful in improving flood frequency analysis (FFA). The approach has typically been applied in localities with long historical records and/or suitable river settings for paleoflood reconstruction from slack-water deposits (SWDs). However, many regions around the world have neither extensive historical information nor bedrock gorges suitable for SWDs preservation and paleoflood reconstruction. This study from subtropical Australia demonstrates that confined, semialluvial channels such as macrochannels provide relatively stable boundaries over the 1000-2000 year time period and the preserved SWDs enabled paleoflood reconstruction and their incorporation into FFA. FFA for three sites in subtropical Australia with the integration of historical and paleoflood data using Bayesian Inference methods showed a significant reduction in uncertainty associated with the estimated discharge of a flood quantile. Uncertainty associated with estimated discharge for the 1% Annual Exceedance Probability (AEP) flood is reduced by more than 50%. In addition, sensitivity analysis of possible within-channel boundary changes shows that FFA is not significantly affected by any associated changes in channel capacity. Therefore, a greater range of channel types may be used for reliable paleoflood reconstruction by evaluating the stability of inset alluvial units, thereby increasing the quantity of temporal data available for FFA. The reduction in uncertainty, particularly in the prediction of the ≤1% AEP design flood, will improve flood risk planning and management in regions with limited temporal flood data.
Modelling probabilities of heavy precipitation by regional approaches
NASA Astrophysics Data System (ADS)
Gaal, L.; Kysely, J.
2009-09-01
Extreme precipitation events are associated with large negative consequences for human society, mainly as they may trigger floods and landslides. The recent series of flash floods in central Europe (affecting several isolated areas) on June 24-28, 2009, the worst over several decades in the Czech Republic in terms of the number of persons killed and the extent of damage to buildings and infrastructure, is an example. Estimates of growth curves and design values (corresponding e.g. to 50-yr and 100-yr return periods) of precipitation amounts, together with their uncertainty, are important in hydrological modelling and other applications. The interest in high quantiles of precipitation distributions is also related to possible climate change effects, as climate model simulations tend to project increased severity of precipitation extremes in a warmer climate. The present study compares, in terms of Monte Carlo simulation experiments, several methods for modelling probabilities of precipitation extremes that make use of ‘regional approaches': the estimation of distributions of extremes takes into account data in a ‘region' (‘pooling group'), in which one may assume that the distributions at individual sites are identical apart from a site-specific scaling factor (a condition referred to as ‘regional homogeneity'). In other words, all data in a region - often weighted in some way - are taken into account when estimating the probability distribution of extremes at a given site. The advantage is that sampling variations in the estimates of model parameters and high quantiles are to a large extent reduced compared to the single-site analysis. We focus on the ‘region-of-influence' (ROI) method, which is based on the identification of unique pooling groups (forming the database for the estimation) for each site under study. The similarity of sites is evaluated in terms of a set of site attributes related to the distributions of extremes. The issue of the size of the region is linked with a built-in test of regional homogeneity of the data. Once a pooling group is delineated, weights based on a dissimilarity measure are assigned to the individual sites involved in the pooling group, and all (weighted) data are employed in the estimation of model parameters and high quantiles at a given location. The ROI method is compared with the Hosking-Wallis (HW) regional frequency analysis, which is based on delineating fixed regions (instead of flexible pooling groups) and assigning unit weights to all sites in a region. The comparison of the performance of the individual regional models makes use of data on annual maxima of 1-day precipitation amounts at 209 stations covering the Czech Republic, with altitudes ranging from 150 to 1490 m a.s.l. We conclude that the ROI methodology is superior to the HW analysis, particularly for very high quantiles (100-yr return values). Another advantage of the ROI approach is that subjective decisions - unavoidable when fixed regions are formed in the HW analysis - may efficiently be suppressed, and almost all settings of the ROI method may be justified by the results of the simulation experiments. The differences between (any) regional method and single-site analysis are very pronounced and suggest that the at-site estimation is highly unreliable. The ROI method is then applied to estimate high quantiles of precipitation amounts at individual sites. The estimates and their uncertainty are compared with those from a single-site analysis.
We focus on the eastern part of the Czech Republic, i.e. an area with complex orography and a particularly pronounced role of Mediterranean cyclones in producing precipitation extremes. The design values are compared with precipitation amounts recorded during the recent heavy precipitation events, including the one associated with the flash flood on June 24, 2009. We also show that the ROI methodology may easily be transferred to the analysis of precipitation extremes in climate model outputs. It efficiently reduces (random) variations in the estimates of parameters of the extreme value distributions in individual gridboxes that result from large spatial variability of heavy precipitation, and represents a straightforward tool for ‘weighting’ data from neighbouring gridboxes within the estimation procedure. The study is supported by the Grant Agency of AS CR under project B300420801.
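The core ROI mechanics reduce to distances in standardized attribute space plus a weighting rule; the homogeneity test that sizes the pooling group is omitted in the sketch below (the attribute table and weights are invented):

```python
import numpy as np

def roi_pooling_group(attributes, target, k=15):
    """Region-of-influence sketch: standardize site attributes, rank sites
    by Euclidean distance to the target site, keep the k nearest, and give
    each an inverse-distance weight (homogeneity testing omitted here)."""
    A = np.asarray(attributes, float)
    Z = (A - A.mean(axis=0)) / A.std(axis=0)   # standardize each attribute
    d = np.linalg.norm(Z - Z[target], axis=1)
    order = np.argsort(d)[1:k + 1]             # skip the target site itself
    w = 1.0 / (1e-9 + d[order])
    return order, w / w.sum()

# fake attribute table: area (km2), % forest, hydrogeological index, MAP (mm)
rng = np.random.default_rng(4)
attrs = np.column_stack([rng.lognormal(5, 1, 60), rng.uniform(0, 90, 60),
                         rng.uniform(0, 1, 60), rng.normal(800, 150, 60)])
sites, weights = roi_pooling_group(attrs, target=0, k=10)
print(sites, weights.round(3))
```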
Regional Frequency Analysis of Annual Maximum Streamflow in Gipuzkoa (Spain)
NASA Astrophysics Data System (ADS)
Erro, J.; López, J. J.
2012-04-01
Extreme streamflow events have been an important cause of recent flooding in Gipuzkoa, and any change in the magnitude of such events may have severe impacts upon urban structures such as dams, urban drainage systems and flood defences, and cause failures to occur. A regional frequency analysis of annual maximum streamflow was therefore developed for Gipuzkoa, using the well-known L-moments approach together with the index-flood procedure, and following the four steps that characterize it: initial screening of the data, identification of homogeneous regions, choice of the appropriate frequency distribution, and estimation of quantiles for different return periods. The preliminary study, completed in 2009, was based on the observations recorded at 22 stations distributed throughout the area. A primary filtering of the data revealed the absence of jumps, inconsistencies and changes in trends within the series, and the discordancy measures showed that none of the sites used in the analysis had to be considered discordant with the others. Regionalization was performed by cluster analysis, grouping the stations according to eight physical site characteristics: latitude, longitude, drainage basin area, elevation, main channel length of the basin, slope, annual mean rainfall and annual maximum rainfall. It resulted in two groups - one cluster with the 18 sites of small-medium basin area, and a second cluster with the 4 remaining sites of major basin area - in which the homogeneity criteria were tested and satisfied. However, the short length of the series, together with the introduction of the observations of 2010 and the inclusion of a historic extreme streamflow event that occurred in northern Spain in November 2011, completely changed the results. With this consideration and adjustment, all of Gipuzkoa could be treated as a homogeneous region. The goodness-of-fit measures indicated that the Generalized Logistic (GLO) is the only suitable distribution to characterize Gipuzkoa. Using the regional L-moment algorithm, quantiles associated with return periods of interest were estimated, and Monte Carlo simulation was used to compute the RMSE, bias and error bounds of the estimates.
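Ward's clustering on standardized site attributes, as used for the regionalization step, is a few lines with SciPy; the attribute values below are placeholders:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(12)
# hypothetical attributes for 22 stations: lat, lon, area, elevation, etc.
attrs = rng.normal(size=(22, 8))
Z = (attrs - attrs.mean(axis=0)) / attrs.std(axis=0)   # standardize columns

tree = linkage(Z, method="ward")          # Ward's minimum-variance clustering
groups = fcluster(tree, t=2, criterion="maxclust")     # cut into two regions
print(groups)
```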
A comparison of moment-based methods of estimation for the log Pearson type 3 distribution
NASA Astrophysics Data System (ADS)
Koutrouvelis, I. A.; Canavos, G. C.
2000-06-01
The log Pearson type 3 distribution is a very important model in statistical hydrology, especially for modeling annual flood series. In this paper we compare the various methods based on moments for estimating quantiles of this distribution. Besides the methods of direct and mixed moments which were found most successful in previous studies and the well-known indirect method of moments, we develop generalized direct moments and generalized mixed moments methods and a new method of adaptive mixed moments. The last method chooses the orders of two moments for the original observations by utilizing information contained in the sample itself. The results of Monte Carlo experiments demonstrated the superiority of this method in estimating flood events of high return periods when a large sample is available and in estimating flood events of low return periods regardless of the sample size. In addition, a comparison of simulation and asymptotic results shows that the adaptive method may be used for the construction of meaningful confidence intervals for design events based on the asymptotic theory even with small samples. The simulation results also point to the specific members of the class of generalized moments estimates which maintain small values for bias and/or mean square error.
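The indirect method of moments mentioned above fits the Pearson type III to the moments of the log-transformed floods and back-transforms the quantile. A sketch using SciPy's pearson3 and a synthetic record (a common sample-skew formula is assumed, not necessarily the paper's variant):

```python
import numpy as np
from scipy.stats import pearson3

def lp3_quantile_indirect(am_flows, T):
    """Indirect method of moments for the log Pearson type 3: take moments
    (mean, std, skew) of the log-transformed annual floods, then invert the
    fitted Pearson III in log space and back-transform."""
    z = np.log10(am_flows)
    n = len(z)
    mean, std = z.mean(), z.std(ddof=1)
    skew = n * np.sum((z - mean)**3) / ((n - 1) * (n - 2) * std**3)
    zT = pearson3.ppf(1 - 1 / T, skew, loc=mean, scale=std)
    return 10.0 ** zT

rng = np.random.default_rng(8)
flows = 10 ** rng.normal(2.5, 0.3, 60)    # fake 60-yr annual-peak record
print("Q100:", lp3_quantile_indirect(flows, 100))
```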
NASA Astrophysics Data System (ADS)
Konrad, Christopher P.; Dettinger, Michael D.
2017-11-01
Atmospheric rivers (ARs) have a significant role in generating floods across the western United States. We analyze daily streamflow for water years 1949 to 2015 from 5,477 gages in relation to water vapor transport by ARs using a 6 h chronology resolved to 2.5° latitude and longitude. The probability that an AR will generate 50 mm/d of runoff in a river on the Pacific Coast increases from 12% when daily mean water vapor transport, DVT, is greater than 300 kg m-1 s-1 to 54% when DVT > 600 kg m-1 s-1. Extreme runoff, represented by the 99th quantile of daily values, doubles from 80 mm/d at DVT = 300 kg m-1 s-1 to 160 mm/d at DVT = 500 kg m-1 s-1. Forecasts and predictions of water vapor transport by atmospheric rivers can support flood risk assessment and estimates of future flood frequencies and magnitude in the western United States.
NASA Astrophysics Data System (ADS)
Gaál, Ladislav; Kohnová, Silvia; Szolgay, Ján.
2010-05-01
During the last 10-15 years, Slovak hydrologists and water resources managers have been devoting considerable effort to developing statistical tools for modelling probabilities of flood occurrence in a regional context. Initially, these models followed concepts of regional flood frequency analysis based on fixed regions; later the Hosking and Wallis (HW; 1997) theory was adopted and modified. Nevertheless, it turned out that delineating homogeneous regions using these approaches is not a straightforward task, mostly due to the complex orography of the country. In this poster we revisit the flood frequency analyses accomplished so far for Slovakia by adopting one of the pooling approaches, the region-of-influence (ROI) approach (Burn, 1990). In the ROI approach, unique pooling groups of similar sites are defined for each site under study. The similarity of sites is defined through Euclidean distance in the space of site attributes that had also proved applicable in former cluster analyses: catchment area, afforested area, hydrogeological catchment index and the mean annual precipitation. The homogeneity of the proposed pooling groups is evaluated by the built-in homogeneity test of Lu and Stedinger (1992). Two alternatives of the ROI approach are examined: in the first one, the target size of the pooling groups is adjusted to the target return period T of the estimated flood quantiles, while in the other, the target size is fixed regardless of the target T. The statistical models of the ROI approach are compared with a conventional regionalization approach based on the HW methodology, in which the parameters of the flood frequency distributions were derived by means of L-moment statistics and a regional formula for the estimation of the index flood was derived by multiple regression methods using physiographic and climatic catchment characteristics. The comparison of the different frequency models is evaluated by means of the root mean square error computed from Monte Carlo simulations. The analysis is based on the annual peak discharges from 168 small and mid-sized catchments in Slovakia. The study is supported by the Grant Agency of AS CR under project B300420801; the Slovak Research and Development Agency under contract No. APVV-0443-07 and the Slovak VEGA Grant Agency under project No. 1/0103/10. Burn, D.H., 1990: Evaluation of regional flood frequency analysis with a region of influence approach. Water Resources Research, 26(10), 2257-2265. Hosking, J.R.M., Wallis, J.R., 1997: Regional frequency analysis: an approach based on L-moments. Cambridge University Press, Cambridge. Lu, L.-H., Stedinger, J.R., 1992: Sampling variance of normalized GEV/PWM quantile estimators and a regional homogeneity test. Journal of Hydrology, 138(1-2), 223-245.
Variable Selection for Nonparametric Quantile Regression via Smoothing Spline ANOVA
Lin, Chen-Yen; Bondell, Howard; Zhang, Hao Helen; Zou, Hui
2014-01-01
Quantile regression provides a more thorough view of the effect of covariates on a response. Nonparametric quantile regression has become a viable alternative to avoid restrictive parametric assumption. The problem of variable selection for quantile regression is challenging, since important variables can influence various quantiles in different ways. We tackle the problem via regularization in the context of smoothing spline ANOVA models. The proposed sparse nonparametric quantile regression (SNQR) can identify important variables and provide flexible estimates for quantiles. Our numerical study suggests the promising performance of the new procedure in variable selection and function estimation. Supplementary materials for this article are available online. PMID:24554792
Efficient Regressions via Optimally Combining Quantile Information
Zhao, Zhibiao; Xiao, Zhijie
2014-01-01
We develop a generally applicable framework for constructing efficient estimators of regression models via quantile regressions. The proposed method is based on optimally combining information over multiple quantiles and can be applied to a broad range of parametric and nonparametric settings. When combining information over a fixed number of quantiles, we derive an upper bound on the distance between the efficiency of the proposed estimator and the Fisher information. As the number of quantiles increases, this upper bound decreases and the asymptotic variance of the proposed estimator approaches the Cramér-Rao lower bound under appropriate conditions. In the case of non-regular statistical estimation, the proposed estimator leads to super-efficient estimation. We illustrate the proposed method for several widely used regression models. Both asymptotic theory and Monte Carlo experiments show the superior performance over existing methods. PMID:25484481
Opportunities of probabilistic flood loss models
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno
2016-04-01
Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of model results. Innovative multi-variate probabilistic modelling approaches are promising to capture and quantify the uncertainty involved and thus to improve the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks, and traditional stage-damage functions. For model evaluation we use empirical damage data available from computer-aided telephone interviews compiled after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records. One sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows model performance to be studied in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error), sharpness of the predictions, and reliability, which is represented by the proportion of observations that fall within the 95-quantile and 5-quantile predictive interval. The comparison of the uni-variable stage-damage function and the multi-variable model approach emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variable model reveals an additional source of uncertainty. However, the predictive performance in terms of mean bias, mean absolute error and reliability (hit rate) is clearly improved in comparison to the uni-variable stage-damage function. Overall, probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
Flood Change Assessment and Attribution in Austrian alpine Basins
NASA Astrophysics Data System (ADS)
Claps, Pierluigi; Allamano, Paola; Como, Anastasia; Viglione, Alberto
2016-04-01
The present paper aims to investigate the sensitivity of flood peaks to global warming in Austrian alpine basins. A group of 97 Austrian watersheds, with areas ranging from 14 to 6000 km2 and with average elevations ranging from 1000 to 2900 m a.s.l., has been considered. Annual maximum floods are available for the basins from 1890 to 2007 with two distinct densities of observation. In a first period, until 1950, an average of 42 concurrent flood-peak records is available. From 1951 to 2007 the density of observation increases to an average of 85 contemporary peaks. This information is very important with reference to the statistical tools used for the empirical assessment of change over time, namely linear quantile regression. Application of this tool to the data set unveils trends in extreme events, confirmed by statistical testing, for the 0.75 and 0.95 empirical quantiles. All applications are made with specific discharge (discharge/area) values. As in a previous approach, multiple quantile regressions have also been applied, confirming the presence of trends even when the possible interference of specific discharge with morphoclimatic parameters (i.e. mean elevation and catchment area) is accounted for. Application of the geomorphoclimatic model by Allamano et al. (2009) makes it possible to assess the extent to which the empirically observed increase in air temperature and annual rainfall can justify the attribution of the change derived by the empirical statistical tools. A comparison with data from Swiss alpine basins treated in a previous paper is finally undertaken.
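Linear quantile regression of specific discharge against time gives the trend in the upper empirical quantiles directly, together with confidence bounds on the slope. A sketch with a fabricated series (the 0.75 and 0.95 levels mirror those used in the study):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
year = np.arange(1890, 2008)
# fake specific discharges (m3/s/km2) with a mild upper-tail trend
q_spec = rng.gumbel(0.5, 0.2, year.size) * (1 + 0.002 * (year - 1890))

X = sm.add_constant(year - year.min())
for tau in (0.75, 0.95):            # slope of the upper empirical quantiles
    fit = sm.QuantReg(q_spec, X).fit(q=tau)
    print(tau, fit.params[1], fit.conf_int()[1])  # slope and its 95% CI
```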
NASA Astrophysics Data System (ADS)
Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar
2017-02-01
Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density leading to high vulnerability. Although significant scientific improvements have taken place in global models for weather forecasting, they are still not adequate at a regional scale (e.g., for an urban region), with high false-alarm rates and low detection. There has been a need to improve the weather forecast skill at a local scale with probabilistic outcomes. Here we develop a methodology based on quantile regression, where reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast, with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in the ensemble uncertainty of precipitation across realizations with respect to that of the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the value of implementing such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
Comparing the index-flood and multiple-regression methods using L-moments
NASA Astrophysics Data System (ADS)
Malekinezhad, H.; Nachtnebel, H. P.; Klik, A.
In arid and semi-arid regions, the length of records is usually too short to ensure reliable quantile estimates. Comparing index-flood and multiple-regression analyses based on L-moments was the main objective of this study. Factor analysis was applied to determine the main variables influencing flood magnitude. Ward's clustering and L-moments approaches were applied to several sites in the Namak-Lake basin in central Iran to delineate homogeneous regions based on site characteristics. The homogeneity test was done using L-moments-based measures. Several distributions were fitted to the regional flood data, and the index-flood and multiple-regression methods were compared as two regional flood frequency approaches. The results of factor analysis showed that length of the main waterway, compactness coefficient, mean annual precipitation, and mean annual temperature were the main variables affecting flood magnitude. The study area was divided into three regions based on Ward's clustering approach. The homogeneity test based on L-moments showed that all three regions were acceptably homogeneous. Five distributions were fitted to the annual peak flood data of the three homogeneous regions. Using the L-moment ratios and the Z-statistic criteria, the generalised extreme value (GEV) distribution was identified as the most robust of the five candidate distributions for all the proposed sub-regions of the study area, and in general it was concluded that the GEV distribution was the best-fit distribution for all three regions. The relative root mean square error (RRMSE) measure was applied to evaluate the performance of the index-flood and multiple-regression methods in comparison with the curve-fitting (plotting position) method. In general, the index-flood method gives more reliable estimates of flood magnitudes for different recurrence intervals. Therefore, this method should be adopted as the regional flood frequency method for the study area and the Namak-Lake basin in central Iran. To estimate floods of various return periods for gauged catchments in the study area, the mean annual peak flood of a catchment may be multiplied by the corresponding growth factors computed from the GEV distribution.
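A sketch of the index-flood computation described above, under stated assumptions (synthetic pooled data already scaled by the index flood; Hosking's textbook L-moment approximations for the GEV): the GEV is fitted by sample L-moments and its quantiles give the regional growth factors.

```python
import numpy as np
from math import gamma, log

def sample_lmoments(x):
    # Unbiased probability-weighted-moment estimators of l1, l2 and L-skewness.
    x = np.sort(x)
    n = x.size
    i = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    return b0, 2 * b1 - b0, (6 * b2 - 6 * b1 + b0) / (2 * b1 - b0)

def gev_from_lmoments(l1, l2, t3):
    c = 2.0 / (3.0 + t3) - log(2) / log(3)
    k = 7.8590 * c + 2.9554 * c ** 2          # Hosking's shape approximation
    a = l2 * k / ((1 - 2 ** (-k)) * gamma(1 + k))
    xi = l1 - a * (1 - gamma(1 + k)) / k
    return xi, a, k

# Synthetic pooled sample of floods, pre-scaled by the index flood.
rng = np.random.default_rng(7)
u = rng.uniform(size=600)
xi0, a0, k0 = 0.8, 0.3, -0.15
scaled = xi0 + a0 * (1 - (-np.log(u)) ** k0) / k0

l1, l2, t3 = sample_lmoments(scaled)
xi, a, k = gev_from_lmoments(l1, l2, t3)
for T in (10, 50, 100):
    F = 1 - 1.0 / T
    growth = (xi + a * (1 - (-log(F)) ** k) / k) / l1   # regional growth factor
    print(f"T={T:4d}: growth factor = {growth:.2f}")
# A site's T-year flood = its mean annual peak flood x the growth factor.
```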
SEMIPARAMETRIC QUANTILE REGRESSION WITH HIGH-DIMENSIONAL COVARIATES
Zhu, Liping; Huang, Mian; Li, Runze
2012-01-01
This paper is concerned with quantile regression for a semiparametric regression model, in which both the conditional mean and conditional variance function of the response given the covariates admit a single-index structure. This semiparametric regression model enables us to reduce the dimension of the covariates and simultaneously retains the flexibility of nonparametric regression. Under mild conditions, we show that the simple linear quantile regression offers a consistent estimate of the index parameter vector. This is a surprising and interesting result because the single-index model is possibly misspecified under the linear quantile regression. With a root-n consistent estimate of the index vector, one may employ a local polynomial regression technique to estimate the conditional quantile function. This procedure is computationally efficient, which is very appealing in high-dimensional data analysis. We show that the resulting estimator of the quantile function performs asymptotically as efficiently as if the true value of the index vector were known. The methodologies are demonstrated through comprehensive simulation studies and an application to a real dataset. PMID:24501536
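A rough illustration of the two-step idea on synthetic data (our own sketch, not the paper's code): a linear quantile regression recovers the single-index direction, and a one-dimensional nonparametric quantile fit on the estimated index then approximates the conditional quantile function. Gradient boosting with quantile loss stands in here for the paper's local polynomial step.

```python
import numpy as np
import statsmodels.api as sm
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(3)
n, p = 1000, 5
X = rng.normal(size=(n, p))
beta = np.array([2.0, 1.0, 0.0, 0.0, 0.0]) / np.sqrt(5)   # true index direction
index = X @ beta
y = np.sin(index) + (0.3 + 0.2 * np.abs(index)) * rng.normal(size=n)

tau = 0.75
# Step 1: linear quantile regression; the normalized slope estimates the index.
res = sm.QuantReg(y, sm.add_constant(X)).fit(q=tau)
b_hat = res.params[1:]
b_hat = b_hat / np.linalg.norm(b_hat)
print("|cos| angle to true index:", float(abs(b_hat @ beta)))

# Step 2: one-dimensional nonparametric quantile regression on the index.
u = (X @ b_hat).reshape(-1, 1)
gbr = GradientBoostingRegressor(loss="quantile", alpha=tau, n_estimators=200)
gbr.fit(u, y)
```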
Likelihood-based confidence intervals for estimating floods with given return periods
NASA Astrophysics Data System (ADS)
Martins, Eduardo Sávio P. R.; Clarke, Robin T.
1993-06-01
This paper discusses aspects of the calculation of likelihood-based confidence intervals for T-year floods, with particular reference to (1) the two-parameter gamma distribution; (2) the Gumbel distribution; (3) the two-parameter log-normal distribution, and other distributions related to the normal by Box-Cox transformations. Calculation of the confidence limits is straightforward using the Nelder-Mead algorithm with a constraint incorporated, although care is necessary to ensure convergence either of the Nelder-Mead algorithm, or of the Newton-Raphson calculation of maximum-likelihood estimates. Methods are illustrated using records from 18 gauging stations in the basin of the River Itajai-Acu, State of Santa Catarina, southern Brazil. A small and restricted simulation compared likelihood-based confidence limits with those given by use of the central limit theorem; for the same confidence probability, the confidence limits of the simulation were wider than those of the central limit theorem, which failed more frequently to contain the true quantile being estimated. The paper discusses possible applications of likelihood-based confidence intervals in other areas of hydrological analysis.
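A sketch (synthetic record, our own implementation) of the likelihood-based interval for the Gumbel case: the likelihood is reparametrized in terms of the T-year quantile q_T, the scale is profiled out with Nelder-Mead as in the approach above, and the interval collects the q_T values whose deviance stays below the chi-square cutoff.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(11)
x = rng.gumbel(loc=100.0, scale=25.0, size=40)     # 40 years of annual maxima
T = 100
c = np.log(-np.log(1 - 1.0 / T))                   # so that mu = q_T + beta*c

def negloglik(params, q=None):
    if q is None:                                  # free fit: (mu, log beta)
        mu, beta = params[0], np.exp(params[1])
    else:                                          # profile fit: mu tied to q_T
        beta = np.exp(params[0])
        mu = q + beta * c
    z = (x - mu) / beta
    return x.size * np.log(beta) + np.sum(z + np.exp(-z))

full = minimize(negloglik, x0=[x.mean(), np.log(x.std())], method="Nelder-Mead")
qhat = full.x[0] - np.exp(full.x[1]) * c           # ML estimate of q_T
cut = chi2.ppf(0.95, df=1) / 2.0                   # deviance threshold

inside = []
for q in np.linspace(qhat - 80, qhat + 150, 400):
    prof = minimize(negloglik, x0=[np.log(x.std())], args=(q,),
                    method="Nelder-Mead")
    if prof.fun - full.fun <= cut:
        inside.append(q)
print(f"q_{T} MLE = {qhat:.1f}, 95% profile CI = "
      f"({min(inside):.1f}, {max(inside):.1f})")
```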
Multiple imputation for cure rate quantile regression with censored data.
Wu, Yuanshan; Yin, Guosheng
2017-03-01
The main challenge in the context of cure rate analysis is that one never knows whether censored subjects are cured or uncured, or whether they are susceptible or insusceptible to the event of interest. Considering the susceptible indicator as missing data, we propose a multiple imputation approach to cure rate quantile regression for censored data with a survival fraction. We develop an iterative algorithm to estimate the conditionally uncured probability for each subject. By utilizing this estimated probability and Bernoulli sample imputation, we can classify each subject as cured or uncured, and then employ the locally weighted method to estimate the quantile regression coefficients with only the uncured subjects. Repeating the imputation procedure multiple times and taking an average over the resultant estimators, we obtain consistent estimators for the quantile regression coefficients. Our approach relaxes the usual global linearity assumption, so that we can apply quantile regression to any particular quantile of interest. We establish asymptotic properties for the proposed estimators, including both consistency and asymptotic normality. We conduct simulation studies to assess the finite-sample performance of the proposed multiple imputation method and apply it to a lung cancer study as an illustration. © 2016, The International Biometric Society.
GLOBALLY ADAPTIVE QUANTILE REGRESSION WITH ULTRA-HIGH DIMENSIONAL DATA
Zheng, Qi; Peng, Limin; He, Xuming
2015-01-01
Quantile regression has become a valuable tool to analyze heterogeneous covariate-response associations that are often encountered in practice. The development of quantile regression methodology for high dimensional covariates primarily focuses on examination of model sparsity at a single or multiple quantile levels, which are typically prespecified ad hoc by the users. The resulting models may be sensitive to the specific choices of the quantile levels, leading to difficulties in interpretation and erosion of confidence in the results. In this article, we propose a new penalization framework for quantile regression in the high dimensional setting. We employ adaptive L1 penalties, and more importantly, propose a uniform selector of the tuning parameter for a set of quantile levels to avoid some of the potential problems with model selection at individual quantile levels. Our proposed approach achieves consistent shrinkage of regression quantile estimates across a continuous range of quantile levels, enhancing the flexibility and robustness of the existing penalized quantile regression methods. Our theoretical results include the oracle rate of uniform convergence and weak convergence of the parameter estimators. We also use numerical studies to confirm our theoretical findings and illustrate the practical utility of our proposal. PMID:26604424
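A simplified illustration of the underlying idea on synthetic data: scikit-learn's L1-penalized QuantileRegressor fitted over a grid of quantile levels with one shared tuning parameter alpha. This plain L1 penalty with uniform tuning is only a stand-in for the paper's adaptively weighted scheme.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(5)
n, p = 300, 20
X = rng.normal(size=(n, p))
# Sparse signal with heteroscedastic noise driven by a third covariate.
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + (1 + 0.5 * np.abs(X[:, 2])) * rng.normal(size=n)

alpha = 0.05                                      # one tuning value for all taus
support = set()
for tau in np.linspace(0.1, 0.9, 9):
    qr = QuantileRegressor(quantile=tau, alpha=alpha, solver="highs").fit(X, y)
    support |= {j for j, b in enumerate(qr.coef_) if abs(b) > 1e-8}
print("variables selected at any quantile:", sorted(support))
```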
Implementing the national AIGA flash flood warning system in France
NASA Astrophysics Data System (ADS)
Organde, Didier; Javelle, Pierre; Demargne, Julie; Arnaud, Patrick; Caseri, Angelica; Fine, Jean-Alain; de Saint Aubin, Céline
2015-04-01
The French national hydro-meteorological and flood forecasting centre (SCHAPI) aims to implement a national flash flood warning system to improve flood alerts for small-to-medium (up to 1000 km2) ungauged basins. This system is based on the AIGA method, co-developed by IRSTEA over the last 10 years. The method, initially set up for the Mediterranean area, is based on a simple event-based hourly distributed hydrologic model run every 15 minutes (Javelle et al. 2014). The hydrologic model ingests operational radar-gauge rainfall grids from Météo-France at a 1-km² resolution to produce discharges at successive outlets along the river network. Discharges are then compared to regionalized flood quantiles of given return periods, and warnings (expressed as the range of the return period estimated in real time) are provided on a river network map. The main interest of the method is to provide forecasters and emergency services with a synthetic real-time view of the ongoing flood situation, information that is especially critical in ungauged flood-prone areas. In its enhanced national version, the hourly event-based distributed model is coupled to a continuous daily rainfall-runoff model, which provides baseflow and a soil moisture index (for each 1-km² pixel) at the beginning of the hourly simulation. The rainfall-runoff models were calibrated on a selection of 700 French hydrometric stations with the Météo-France radar-gauge reanalysis dataset for the 2002-2006 period. To estimate model parameters for ungauged basins, the two hydrologic models were regionalised by testing both regressions (using different catchment attributes, such as catchment area, soil type, and climate characteristics) and spatial proximity techniques (transposing parameters from neighbouring donor catchments), as well as different homogeneous hydrological areas. The best-performing regionalisation method was determined for each model through jack-knife cross-validation. The system performance was then evaluated with contingency criteria (e.g., Critical Success Index, Probability Of Detection, Success Ratio) using operational radar-gauge rainfall products from Météo-France for the 2009-2012 period. The regionalised parameters of the distributed model were finally adjusted for each homogeneous hydrological area to optimize the Heidke skill score (HSS) calculated with three levels of warnings (2-, 10- and 50-year flood quantiles). This work is currently being implemented by the SCHAPI to set up an automated national flash flood warning system by 2016. Planned improvements include developing a unique continuous model to be run at a sub-hourly timestep, discharge assimilation, and the integration of precipitation forecasts while accounting for the main sources of forecast uncertainty. Javelle, P., Demargne, J., Defrance, D., and Arnaud, P. 2014. Evaluating flash flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, DOI: 10.1080/02626667.2014.923970
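A sketch of the contingency-table evaluation mentioned above (our own helper with synthetic warnings and observations): warnings against observed threshold exceedances, scored with POD, success ratio, CSI and the Heidke skill score.

```python
import numpy as np

def contingency_scores(warned, observed):
    a = np.sum(warned & observed)        # hits
    b = np.sum(warned & ~observed)       # false alarms
    c = np.sum(~warned & observed)       # misses
    d = np.sum(~warned & ~observed)      # correct negatives
    pod = a / (a + c)                    # probability of detection
    sr = a / (a + b)                     # success ratio
    csi = a / (a + b + c)                # critical success index
    hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
    return dict(POD=pod, SR=sr, CSI=csi, HSS=hss)

rng = np.random.default_rng(2)
observed = rng.random(1000) < 0.05               # observed 2-year exceedances
warned = observed & (rng.random(1000) < 0.7)     # 70% detection ...
warned |= rng.random(1000) < 0.02                # ... plus some false alarms
print(contingency_scores(warned, observed))
```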
Harden, Tessa M.; O'Connor, Jim E.; Driscoll, Daniel G.; Stamm, John F.
2011-01-01
Flood-frequency analyses for the Black Hills area are important because of severe flooding of June 9-10, 1972, which was produced by a large mesoscale convective system and caused at least 238 deaths. Many 1972 peak flows are high outliers (by factors of 10 or more) in observed records that date to the early 1900s. An efficient means of reducing uncertainties for flood recurrence is to augment gaged records by using paleohydrologic techniques to determine ages and magnitudes of prior large floods (paleofloods). This report summarizes results of paleoflood investigations for Spring Creek, Rapid Creek (two reaches), Boxelder Creek (two subreaches), and Elk Creek. Stratigraphic records and resulting long-term flood chronologies, locally extending more than 2,000 years, were combined with observed and adjusted peak-flow values (gaged records) and historical flood information to derive flood-frequency estimates for the six study reaches. Results indicate that (1) floods as large as and even substantially larger than 1972 have affected most of the study reaches, and (2) incorporation of the paleohydrologic information substantially reduced uncertainties in estimating flood recurrence. Canyons within outcrops of Paleozoic rocks along the eastern flanks of the Black Hills provided excellent environments for (1) deposition and preservation of stratigraphic sequences of late-Holocene flood deposits, primarily in protected slack-water settings flanking the streams; and (2) hydraulic analyses for determination of associated flow magnitudes. The bedrock canyons ensure long-term stability of channel and valley geometry, thereby increasing confidence in hydraulic computations of ancient floods from modern channel geometry. Stratigraphic records of flood sequences, in combination with deposit dating by radiocarbon, optically stimulated luminescence, and cesium-137, provided paleoflood chronologies for 29 individual study sites. Flow magnitudes were estimated from elevations of flood deposits in conjunction with hydraulic calculations based on modern channel and valley geometry. Reach-scale paleoflood chronologies were interpreted for each study reach, which generally entailed correlation of flood evidence among multiple sites, chiefly based on relative position within stratigraphic sequences, unique textural characteristics, or results of age dating and flow estimation. The FLDFRQ3 and PeakfqSA analytical models (assuming log-Pearson Type III frequency distributions) were used for flood-frequency analyses for as many as four scenarios: (1) analysis of gaged records only; (2) gaged records with historical information; (3) all available data including gaged records, historical flows, paleofloods, and perception thresholds; and (4) the same as the third scenario, but "top fitting" the distribution using only the largest 50 percent of gaged peak flows. The PeakfqSA model is most consistent with procedures adopted by most Federal agencies for flood-frequency analysis and thus was (1) used for comparisons among results for study reaches, and (2) considered by the authors as most appropriate for general applications of estimating low-probability flood recurrence. The detailed paleoflood investigations indicated that in the last 2,000 years all study reaches have had multiple large floods substantially larger than in gaged records.
For Spring Creek, stratigraphic records preserved a chronology of at least five paleofloods in approximately (~) 1,000 years approaching or exceeding the 1972 flow of 21,800 cubic feet per second (ft3/s). The largest was ~700 years ago with a flow range of 29,300-58,600 ft3/s, which reflects the uncertainty regarding flood-magnitude estimates that was incorporated in the flood-frequency analyses. In the lower reach of Rapid Creek (downstream from Pactola Dam), two paleofloods in ~1,000 years exceeded the 1972 flow of 31,200 ft3/s. Those occurred ~440 and 1,000 years ago, with flows of 128,000-256,000 and 64,000-128,000 ft3/s, respectively. Five smaller paleofloods of 9,500-19,000 ft3/s occurred between ~200 and 400 years ago. In the upper reach of Rapid Creek (above Pactola Reservoir), the largest recorded floods are substantially smaller than for lower Rapid Creek and all other study reaches. Paleofloods of ~12,900 and 12,000 ft3/s occurred ~1,000 and 1,500 years ago. One additional paleoflood (~800 years ago) was similar in magnitude to the largest gaged flow of 2,460 ft3/s. Boxelder Creek was treated as having two subreaches because of two tributaries that affect peak flows. During the last ~1,000 years, paleofloods of ~39,000-78,000 ft3/s and 40,000-80,000 ft3/s in the upstream subreach have exceeded the 1972 peak flow of 30,800 ft3/s. One other paleoflood was similar to the second largest gaged flow (16,400 ft3/s in 1907). For the downstream subreach, paleofloods of 61,300-123,000 ft3/s and 52,500-105,000 ft3/s in the last ~1,000 years have substantially exceeded the 1972 flood (50,500 ft3/s). Four additional paleofloods had flows between 14,200 and 33,800 ft3/s. The 1972 flow on Elk Creek (10,400 ft3/s) has been substantially exceeded at least five times in the last 1,900 years. The largest paleoflood (41,500-124,000 ft3/s) was ~900 years ago. Three other paleofloods between 37,500 and 120,000 ft3/s occurred between 1,100 and 1,800 years ago. A fifth paleoflood of 25,500-76,500 ft3/s was ~750 years ago. Considering analyses for all available data (PeakfqSA model) for all six study reaches, the 95-percent confidence intervals about the low-probability quantile estimates (100-, 200-, and 500-year recurrence intervals) were reduced by at least 78 percent relative to those for the gaged records only. In some cases, 95-percent uncertainty intervals were reduced by 99 percent or more. For all study reaches except the two Boxelder Creek subreaches, quantile estimates for these long-term analyses were larger than for the short-term analyses. The 1972 flow for the Spring Creek study reach (21,800 ft3/s) corresponds with a recurrence interval of ~400 years. Recurrence intervals are ~500 years for the 1972 flood magnitudes along the lower Rapid Creek reach and the upstream subreach of Boxelder Creek. For the downstream subreach of Boxelder Creek, the large 1972 flood magnitude (50,500 ft3/s) exceeds the 500-year quantile estimate by about 35 percent. The recurrence interval of ~100 years for 1972 flooding along the Elk Creek study reach is small relative to other study reaches along the eastern margin of the Black Hills. All of the paleofloods plot within the bounds of a national envelope curve, indicating that the national curve represents exceedingly rare floods for the Black Hills area. Elk Creek, lower Rapid Creek, and the downstream subreach of Boxelder Creek all have paleofloods that plot above a regional envelope curve; in the case of Elk Creek, by a factor of nearly two.
The Black Hills paleofloods represent some of the largest known floods, relative to drainage area, for the United States. Many of the other largest known United States floods are in areas with physiographic and climatologic conditions broadly similar to the Black Hills: semiarid and rugged landscapes that intercept and focus heavy precipitation from convective storm systems. The 1972 precipitation and runoff patterns, previous analyses of peak-flow records, and the paleoflood investigations of this study support a hypothesis of distinct differences in flood generation within the central Black Hills study area. The eastern Black Hills are susceptible to intense orographic lifting associated with convective storm systems and also have high relief, thin soils, and narrow and steep canyons, factors favoring generation of exceptionally heavy rain-producing thunderstorms and promoting runoff and rapid concentration of flow into stream channels. In contrast, storm potential is smaller in and near the Limestone Plateau area, and storm runoff is further reduced by substantial infiltration into the limestone, gentle topography, and extensive floodplain storage. Results of the paleoflood investigations are directly applicable only to the specific study reaches and, in the case of Rapid Creek, only to pre-regulation conditions. Thus, approaches for broader applications were developed from inferences of overall flood-generation processes, and appropriate domains for application of results were described. Example applications were provided by estimating flood quantiles for selected streamgages, which also allowed direct comparison with results of at-site flood-frequency analyses from a previous study. Several broad issues and uncertainties were examined, including potential biases associated with stratigraphic records that inherently are not always complete, uncertainties regarding statistical approaches, and the unknown applicability of paleoflood records to future watershed conditions. The results of the paleoflood investigations, however, provide much better physically based information on low-probability floods than has been available previously, substantially improving estimates of the magnitude and frequency of large floods in these basins and reducing associated uncertainty.
Improving flash flood frequency analyses by using non-systematic dendrogeomorphic data
NASA Astrophysics Data System (ADS)
Mediero, Luis; María Bodoque, Jose; Garrote, Julio; Ballesteros-Cánovas, Juan Antonio; Aroca-Jimenez, Estefania
2017-04-01
Flash floods have a rapid hydrological response in catchments with short lag times, characterized by "peaky" hydrographs. The peak flows are reached within a few hours, giving little or no advance warning to prevent and mitigate flood damage. As a result, flash floods may pose a high social risk, as shown for instance by the 1997 Biescas disaster in Spain. The analysis and management of flood risk are clearly conditioned by data availability, especially in the mountain areas where flash floods usually occur. In mountain basins, however, the available data series are often too short to yield statistically reliable estimates. In addition, even when flow data are available, annual maximum values are generally less reliable than average flow values, since conventional stream-gauge stations may fail to record extreme floods, leading to gaps in the time series. Dendrogeomorphology has been shown to be especially useful for improving flood frequency analyses in catchments where short flood series limit the use of conventional hydrological methods. This study presents the pros and cons of using a given probability distribution function, such as the Generalized Extreme Value (GEV), together with Bayesian Markov Chain Monte Carlo (MCMC) methods to account for non-systematic data provided by dendrogeomorphic techniques, in order to assess the accuracy of flood quantile estimates. To this end, we consider a set of locations in Central Spain where the systematic flow record available at a gauging site can be extended with non-systematic data obtained from dendrogeomorphic techniques.
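A rough sketch of the Bayesian idea described above, with synthetic numbers and our own sampler: a Metropolis MCMC for GEV parameters whose likelihood combines a short systematic record with censored non-systematic information ("k exceedances of a perception threshold x0 in h pre-gauging years"), the kind of evidence dendrogeomorphic techniques provide.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(4)
sys_data = rng.gumbel(50.0, 15.0, size=30)   # 30 years of gauged annual maxima
x0, h, k = 120.0, 200, 3                     # threshold, historic years, exceedances

def logpost(theta):
    mu, logsig, shape = theta
    sig = np.exp(logsig)
    ll = genextreme.logpdf(sys_data, shape, loc=mu, scale=sig).sum()
    F0 = genextreme.cdf(x0, shape, loc=mu, scale=sig)
    ll += (h - k) * np.log(F0) + k * np.log1p(-F0)   # censored historic years
    return ll if np.isfinite(ll) else -np.inf        # flat (improper) priors

theta = np.array([50.0, np.log(15.0), 0.0])
lp = logpost(theta)
samples = []
for it in range(20000):                              # Metropolis random walk
    prop = theta + rng.normal(0, [1.0, 0.05, 0.02])
    lp_prop = logpost(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if it > 5000 and it % 10 == 0:
        samples.append(theta.copy())

samples = np.asarray(samples)
q100 = genextreme.ppf(0.99, samples[:, 2],
                      loc=samples[:, 0], scale=np.exp(samples[:, 1]))
print("posterior median 100-yr flood:", np.median(q100))
print("90% credible interval:", np.percentile(q100, [5, 95]))
```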
Assessment of the spatial scaling behaviour of floods in the United Kingdom
NASA Astrophysics Data System (ADS)
Formetta, Giuseppe; Stewart, Elizabeth; Bell, Victoria
2017-04-01
Floods are among the most dangerous natural hazards, causing loss of life and significant damage to private and public property. Regional flood-frequency analysis (FFA) methods are essential tools to assess flood hazard and plan interventions for its mitigation. FFA methods are often based on the well-known index-flood method, which assumes the invariance of the coefficient of variation of floods with drainage area. This assumption is equivalent to the simple scaling or self-similarity assumption for peak floods, i.e. that their spatial structure remains similar, in a particular and relatively simple way, over a range of scales. The spatial scaling of floods has been evaluated at the national scale for countries such as Canada, the USA, and Australia. To our knowledge, such a study has not been conducted for the United Kingdom, even though the standard FFA method there is based on the index-flood assumption. In this work we present an integrated approach to assess the spatial scaling behaviour of floods in the United Kingdom using three different methods: product moments (PM), probability weighted moments (PWM), and quantile analysis (QA). We analyse both instantaneous and daily annual observed maximum floods and perform our analysis both across the entire country and in its sub-climatic regions as defined in the Flood Studies Report (NERC, 1975). To evaluate the relationship between the k-th moments or quantiles and the drainage area we use both regression on area alone and multiple regression with other explanatory variables that account for the geomorphology, amount of rainfall, and soil type of the catchments. The latter multiple-regression approach was only recently demonstrated to be more robust than the traditional regression on area alone, which can lead to biased estimates of scaling exponents and misinterpretation of spatial scaling behaviour. We tested our framework on almost 600 rural catchments in the UK, considered both as a single region and split into 11 sub-regions with 50 catchments per region on average. Preliminary results from the three spatial scaling methods are generally in agreement and indicate that: i) only some of the peak flow variability is explained by area alone (approximately 50% for the entire country, and ranging between 40% and 70% for the sub-regions); ii) this percentage increases to 90% for the entire country and ranges between 80% and 95% for the sub-regions when multiple regression is used; iii) the simple scaling hypothesis holds in all sub-regions with the exception of weak multiscaling found in regions 2 (North), and 5 and 6 (South East). We hypothesize that these deviations can be explained by heterogeneity in large-scale precipitation and by the influence of soil type (predominantly chalk) on the flood formation process in regions 5 and 6.
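A minimal sketch of the quantile-analysis step described above, on synthetic catchments built to satisfy simple scaling: at-site flood quantiles are regressed on log drainage area, and the scaling exponents are compared across quantile levels.

```python
import numpy as np

rng = np.random.default_rng(6)
n_sites, n_years = 200, 40
log_area = rng.uniform(np.log(10), np.log(5000), n_sites)
# Synthetic annual maxima: Q = A^0.75 x Gumbel noise (simple scaling by design).
q = np.exp(0.75 * log_area)[:, None] * rng.gumbel(1.0, 0.3, (n_sites, n_years))

for tau in (0.5, 0.9):
    log_qtau = np.log(np.quantile(q, tau, axis=1))   # at-site quantiles
    slope, _ = np.polyfit(log_area, log_qtau, 1)
    print(f"quantile {tau}: scaling exponent = {slope:.2f}")
# Under simple scaling the exponents coincide across quantile levels;
# a systematic drift of the exponent with tau would indicate multiscaling.
```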
Quantile uncertainty and value-at-risk model risk.
Alexander, Carol; Sarabia, José María
2012-08-01
This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
Estimating effects of limiting factors with regression quantiles
Cade, B.S.; Terrell, J.W.; Schroeder, R.L.
1999-01-01
In a recent Concepts paper in Ecology, Thomson et al. emphasized that assumptions of conventional correlation and regression analyses fundamentally conflict with the ecological concept of limiting factors, and they called for new statistical procedures to address this problem. The analytical issue is that unmeasured factors may be the active limiting constraint and may induce a pattern of unequal variation in the biological response variable through an interaction with the measured factors. Consequently, changes near the maxima, rather than at the center of response distributions, are better estimates of the effects expected when the observed factor is the active limiting constraint. Regression quantiles provide estimates for linear models fit to any part of a response distribution, including near the upper bounds, and require minimal assumptions about the form of the error distribution. Regression quantiles extend the concept of one-sample quantiles to the linear model by solving an optimization problem of minimizing an asymmetric function of absolute errors. Rank-score tests for regression quantiles provide tests of hypotheses and confidence intervals for parameters in linear models with heteroscedastic errors, conditions likely to occur in models of limiting ecological relations. We used selected regression quantiles (e.g., 5th, 10th, ..., 95th) and confidence intervals to test hypotheses that parameters equal zero for estimated changes in average annual acorn biomass due to forest canopy cover of oak (Quercus spp.) and oak species diversity. Regression quantiles also were used to estimate changes in glacier lily (Erythronium grandiflorum) seedling numbers as a function of lily flower numbers, rockiness, and pocket gopher (Thomomys talpoides fossor) activity, data that motivated the query by Thomson et al. for new statistical procedures. Both example applications showed that effects of limiting factors estimated by changes in some upper regression quantile (e.g., 90-95th) were greater than if effects were estimated by changes in the means from standard linear model procedures. Estimating a range of regression quantiles (e.g., 5-95th) provides a comprehensive description of biological response patterns for exploratory and inferential analyses in observational studies of limiting factors, especially when sampling large spatial and temporal scales.
Ermolieva, T; Filatova, T; Ermoliev, Y; Obersteiner, M; de Bruijn, K M; Jeuken, A
2017-01-01
As flood risks grow worldwide, a well-designed insurance program engaging various stakeholders becomes a vital instrument in flood risk management. The main challenge concerns the applicability of standard approaches for calculating insurance premiums of rare catastrophic losses. This article focuses on the design of a flood-loss-sharing program involving private insurance based on location-specific exposures. The analysis is guided by an integrated catastrophe risk management (ICRM) model consisting of a GIS-based flood model and a stochastic optimization procedure with respect to location-specific risk exposures. To achieve the stability and robustness of the program towards floods with various recurrences, the ICRM uses a stochastic optimization procedure that relies on quantile-related risk functions of systemic insolvency involving overpayments and underpayments of the stakeholders. Two alternative ways of calculating insurance premiums are compared: the robust premiums derived with the ICRM and the traditional average annual loss approach. The applicability of the proposed model is illustrated in a case study of an area of Rotterdam outside the main flood protection system in the Netherlands. Our numerical experiments demonstrate essential advantages of the robust premiums, namely, that they: (1) guarantee the program's solvency under all relevant flood scenarios rather than one average event; (2) establish a tradeoff between the security of the program and the welfare of locations; and (3) decrease the need for other risk transfer and risk reduction measures. © 2016 Society for Risk Analysis.
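Illustrative numbers only (single location, no pooling, our own toy scenario generator): the contrast between an average-annual-loss premium and a quantile-based "robust" premium that keeps the probability of insolvency below a target, in the spirit of the quantile-related risk functions described above.

```python
import numpy as np

rng = np.random.default_rng(8)
n_scen = 100_000
# Synthetic annual flood losses: mostly zero, occasionally catastrophic.
losses = np.where(rng.random(n_scen) < 0.02,
                  rng.lognormal(4.0, 1.0, n_scen), 0.0)

premium_aal = losses.mean()                    # traditional average annual loss
eps = 0.01                                     # accepted insolvency probability
premium_robust = np.quantile(losses, 1 - eps)  # fund survives 99% of years

print(f"AAL premium:    {premium_aal:8.2f}")
print(f"robust premium: {premium_robust:8.2f}")
print("insolvency prob under AAL premium:", np.mean(losses > premium_aal))
```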
Quantile regression applied to spectral distance decay
Rocchini, D.; Cade, B.S.
2008-01-01
Remotely sensed imagery has long been recognized as a powerful support for characterizing and estimating biodiversity. Spectral distance among sites has proven to be a powerful approach for detecting species composition variability. Regression analysis of species similarity versus spectral distance allows us to quantitatively estimate the amount of turnover in species composition with respect to spectral and ecological variability. In classical regression analysis, the residual sum of squares is minimized for the mean of the dependent variable distribution. However, many ecological data sets are characterized by a high number of zeroes that add noise to the regression model. Quantile regressions can be used to evaluate trend in the upper quantiles rather than a mean trend across the whole distribution of the dependent variable. In this letter, we used ordinary least squares (OLS) and quantile regressions to estimate the decay of species similarity versus spectral distance. The achieved decay rates were statistically nonzero (p < 0.01), considering both OLS and quantile regressions. Nonetheless, the OLS regression estimate of the mean decay rate was only half the decay rate indicated by the upper quantiles. Moreover, the intercept value, representing the similarity reached when the spectral distance approaches zero, was very low compared with the intercepts of the upper quantiles, which detected high species similarity when habitats are more similar. In this letter, we demonstrated the power of using quantile regressions applied to spectral distance decay to reveal species diversity patterns otherwise lost or underestimated by OLS regression. © 2008 IEEE.
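A small sketch of the contrast described above, with zero-inflated synthetic data of the kind the letter discusses: an OLS fit of similarity versus spectral distance against an upper-quantile (90th) regression fit.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
n = 400
dist = rng.uniform(0, 1, n)                     # spectral distance (scaled)
sim = np.clip(0.8 - 0.6 * dist + rng.normal(0, 0.15, n), 0, 1)
sim[rng.random(n) < 0.3] = 0.0                  # excess zeros add noise

Xc = sm.add_constant(dist)
ols = sm.OLS(sim, Xc).fit()
q90 = sm.QuantReg(sim, Xc).fit(q=0.90)
print(f"OLS decay rate:           {ols.params[1]:.2f}")
print(f"90th-quantile decay rate: {q90.params[1]:.2f}")
```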
Spectral distance decay: Assessing species beta-diversity by quantile regression
Rocchini, D.; Nagendra, H.; Ghate, R.; Cade, B.S.
2009-01-01
Remotely sensed data represents key information for characterizing and estimating biodiversity. Spectral distance among sites has proven to be a powerful approach for detecting species composition variability. Regression analysis of species similarity versus spectral distance may allow us to quantitatively estimate how beta-diversity in species changes with respect to spectral and ecological variability. In classical regression analysis, the residual sum of squares is minimized for the mean of the dependent variable distribution. However, many ecological datasets are characterized by a high number of zeroes that can add noise to the regression model. Quantile regression can be used to evaluate trend in the upper quantiles rather than a mean trend across the whole distribution of the dependent variable. In this paper, we used ordinary least squares (OLS) and quantile regression to estimate the decay of species similarity versus spectral distance. The achieved decay rates were statistically nonzero (p < 0.05) considering both OLS and quantile regression. Nonetheless, the OLS regression estimate of mean decay rate was only half the decay rate indicated by the upper quantiles. Moreover, the intercept value, representing the similarity reached when spectral distance approaches zero, was very low compared with the intercepts of upper quantiles, which detected high species similarity when habitats are more similar. In this paper we demonstrated the power of using quantile regression applied to spectral distance decay in order to reveal species diversity patterns otherwise lost or underestimated by OLS regression. © 2009 American Society for Photogrammetry and Remote Sensing.
Regularized quantile regression for SNP marker estimation of pig growth curves.
Barroso, L M A; Nascimento, M; Nascimento, A C C; Silva, F F; Serão, N V L; Cruz, C D; Resende, M D V; Silva, F L; Azevedo, C F; Lopes, P S; Guimarães, S E F
2017-01-01
Genomic growth curves are generally defined only in terms of population mean; an alternative approach that has not yet been exploited in genomic analyses of growth curves is the Quantile Regression (QR). This methodology allows for the estimation of marker effects at different levels of the variable of interest. We aimed to propose and evaluate a regularized quantile regression for SNP marker effect estimation of pig growth curves, as well as to identify the chromosome regions of the most relevant markers and to estimate the genetic individual weight trajectory over time (genomic growth curve) under different quantiles (levels). The regularized quantile regression (RQR) enabled the discovery, at different levels of interest (quantiles), of the most relevant markers allowing for the identification of QTL regions. We found the same relevant markers simultaneously affecting different growth curve parameters (mature weight and maturity rate): two (ALGA0096701 and ALGA0029483) for RQR(0.2), one (ALGA0096701) for RQR(0.5), and one (ALGA0003761) for RQR(0.8). Three average genomic growth curves were obtained and the behavior was explained by the curve in quantile 0.2, which differed from the others. RQR allowed for the construction of genomic growth curves, which is the key to identifying and selecting the most desirable animals for breeding purposes. Furthermore, the proposed model enabled us to find, at different levels of interest (quantiles), the most relevant markers for each trait (growth curve parameter estimates) and their respective chromosomal positions (identification of new QTL regions for growth curves in pigs). These markers can be exploited under the context of marker assisted selection while aiming to change the shape of pig growth curves.
Large-scale derived flood frequency analysis based on continuous simulation
NASA Astrophysics Data System (ADS)
Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno
2016-04-01
There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the inherent spatial heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables, and are used as input to the catchment models. A long-term simulation of this combined system makes it possible to derive very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observations at 528 stations covering not only the whole of Germany but also parts of France, Switzerland, the Czech Republic and Austria, for a total area of 443,931 km2. From it, 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Danube and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large-scale approach overcomes several drawbacks of traditional approaches to derived flood frequency analysis and is therefore recommended for large-scale flood risk case studies.
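A sketch of the derived flood frequency step, with a placeholder series standing in for the weather-generator / rainfall-runoff chain: once a very long synthetic discharge series exists, flood quantiles follow from empirical annual maxima without distributional assumptions.

```python
import numpy as np

rng = np.random.default_rng(10)
years = 10_000
daily = rng.gamma(0.8, 30.0, size=(years, 365))   # placeholder daily discharge
annual_max = daily.max(axis=1)

for T in (10, 100, 1000):
    q = np.quantile(annual_max, 1 - 1.0 / T)      # empirical T-year flood
    print(f"{T}-year flood: {q:.0f} m3/s")
# Note: with 10,000 simulated years the 1000-year quantile rests on only
# about ten exceedances, so its sampling uncertainty is still substantial.
```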
Quantile regression models of animal habitat relationships
Cade, Brian S.
2003-01-01
Typically, all factors that limit an organism are not measured and included in statistical models used to investigate relationships with their environment. If important unmeasured variables interact multiplicatively with the measured variables, the statistical models often will have heterogeneous response distributions with unequal variances. Quantile regression is an approach for estimating the conditional quantiles of a response variable distribution in the linear model, providing a more complete view of possible causal relationships between variables in ecological processes. Chapter 1 introduces quantile regression and discusses the ordering characteristics, interval nature, sampling variation, weighting, and interpretation of estimates for homogeneous and heterogeneous regression models. Chapter 2 evaluates performance of quantile rankscore tests used for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1). A permutation F test maintained better Type I errors than the Chi-square T test for models with smaller n, greater number of parameters p, and more extreme quantiles τ. Both versions of the test required weighting to maintain correct Type I errors when there was heterogeneity under the alternative model. An example application related trout densities to stream channel width:depth. Chapter 3 evaluates a drop in dispersion, F-ratio like permutation test for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1). Chapter 4 simulates from a large (N = 10,000) finite population representing grid areas on a landscape to demonstrate various forms of hidden bias that might occur when the effect of a measured habitat variable on some animal was confounded with the effect of another unmeasured variable (spatially and not spatially structured). Depending on whether interactions of the measured habitat and unmeasured variable were negative (interference interactions) or positive (facilitation interactions), either upper (τ > 0.5) or lower (τ < 0.5) quantile regression parameters were less biased than mean rate parameters. Sampling (n = 20 - 300) simulations demonstrated that confidence intervals constructed by inverting rankscore tests provided valid coverage of these biased parameters. Quantile regression was used to estimate effects of physical habitat resources on a bivalve mussel (Macomona liliana) in a New Zealand harbor by modeling the spatial trend surface as a cubic polynomial of location coordinates.
NASA Astrophysics Data System (ADS)
Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.
2013-12-01
In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods are historically preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The object of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), the Agregee method (Margoum, 1992) and the Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013; Garavaglia et al., 2010) and the Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimates than standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with return period, while staying relatively moderate up to 100-year return levels. Results and discussions are illustrated throughout with the example of five watersheds located in the South of France. References: O. CAYLA: Probability calculation of design floods and inflows - SPEED. Waterpower 1995, San Francisco, California, 1995. CFGB: Design flood determination by the gradex method. Bulletin du Comité Français des Grands Barrages, News 96, 18th congress CIGB-ICOLD, no. 2, nov:108, 1994. F. GARAVAGLIA et al.: Introducing a rainfall compound distribution model based on weather patterns subsampling. Hydrology and Earth System Sciences, 14, 951-964, 2010. J. LAVABRE et al.: SHYREG: une méthode pour l'estimation régionale des débits de crue, application aux régions méditerranéennes françaises. Ingénierie EAT, 97-111, 2003. M. MARGOUM: Estimation des crues rares et extrêmes: le modèle AGREGEE. Conception et premières validations. PhD thesis, Ecole des Mines de Paris, 1992. R. NAULET et al.: Flood frequency analysis on the Ardèche river using French documentary sources from the two last centuries. Journal of Hydrology, 313:58-78, 2005. E. PAQUET et al.: The SCHADEX method: A semi-continuous rainfall-runoff simulation for extreme flood estimation. Journal of Hydrology, 495, 23-37, 2013.
Shrinkage Estimation of Varying Covariate Effects Based On Quantile Regression
Peng, Limin; Xu, Jinfeng; Kutner, Nancy
2013-01-01
Varying covariate effects often manifest meaningful heterogeneity in covariate-response associations. In this paper, we adopt a quantile regression model that assumes linearity at a continuous range of quantile levels as a tool to explore such data dynamics. The consideration of potential non-constancy of covariate effects necessitates a new perspective for variable selection, which, under the assumed quantile regression model, is to retain variables that have effects on all quantiles of interest as well as those that influence only part of quantiles considered. Current work on l1-penalized quantile regression either does not concern varying covariate effects or may not produce consistent variable selection in the presence of covariates with partial effects, a practical scenario of interest. In this work, we propose a shrinkage approach by adopting a novel uniform adaptive LASSO penalty. The new approach enjoys easy implementation without requiring smoothing. Moreover, it can consistently identify the true model (uniformly across quantiles) and achieve the oracle estimation efficiency. We further extend the proposed shrinkage method to the case where responses are subject to random right censoring. Numerical studies confirm the theoretical results and support the utility of our proposals. PMID:25332515
Development of Hydrological Model of Klang River Valley for flood forecasting
NASA Astrophysics Data System (ADS)
Mohammad, M.; Andras, B.
2012-12-01
This study reviews the impact of climate change and land use on flooding along the Klang River and compares the changes in the existing river system in the Klang River Basin with the Stormwater Management and Road Tunnel (SMART), which is already operating in the city centre of Kuala Lumpur. The Klang River Basin is the most urbanized region in Malaysia. More than half of the basin has been urbanized on land that is prone to flooding. Numerous flood mitigation projects and studies have been carried out to enhance the existing flood forecasting and mitigation schemes. The objective of this study is to develop a hydrological model for flood forecasting in the Klang Basin, Malaysia. Hydrological modelling generally requires a large set of input data, which is often a challenge for a developing country. Because of this limitation, rainfall measurements from the Tropical Rainfall Measuring Mission (TRMM), initiated by the US space agency NASA and the Japanese space agency JAXA, were used in this study. TRMM data were transformed and bias-corrected by a quantile-to-quantile transformation. However, transforming the data based on ground measurements did not yield any significant improvement, and the statistical comparison shows only a 10% difference. The conceptual HYMOD model was used in this study and calibrated using the ROPE algorithm, but calibration on the whole observed time series for this area resulted in insufficient performance. The depth function used in the ROPE algorithm was therefore applied to identify unusual events, and the model was recalibrated using only these events in order to assess the improvement in model efficiency.
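A hedged sketch of the quantile-to-quantile correction mentioned above: satellite (TRMM-like) rainfall is mapped onto the gauge climatology by matching empirical CDFs. The two series are synthetic placeholders, not the Klang Basin data.

```python
import numpy as np

rng = np.random.default_rng(12)
gauge = rng.gamma(0.6, 12.0, size=3000)        # ground-measured daily rain (mm)
trmm = rng.gamma(0.9, 7.0, size=3000)          # satellite estimate, biased

probs = np.linspace(0.01, 0.99, 99)
trmm_q = np.quantile(trmm, probs)              # satellite quantiles
gauge_q = np.quantile(gauge, probs)            # gauge quantiles

def qq_correct(x):
    """Map satellite values onto gauge quantiles (linear interp in the CDF)."""
    return np.interp(x, trmm_q, gauge_q)

corrected = qq_correct(trmm)
print("means  gauge / raw / corrected:",
      round(gauge.mean(), 2), round(trmm.mean(), 2), round(corrected.mean(), 2))
```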
A holistic approach for large-scale derived flood frequency analysis
NASA Astrophysics Data System (ADS)
Dung Nguyen, Viet; Apel, Heiko; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno
2017-04-01
Spatial consistency, which has usually been disregarded because of methodological difficulties, is increasingly demanded in regional flood hazard (and risk) assessments. This study aims at developing a holistic approach for consistently deriving flood frequencies at large scales. A large-scale two-component model has been established to simulate very long-term, multisite synthetic meteorological fields and flood flows at many gauged and ungauged locations, hence reflecting the inherent spatial heterogeneity. The model has been applied to a region of nearly half a million km2 comprising Germany and parts of neighbouring countries. The model performance has been examined multi-objectively, with a focus on extremes. With this continuous simulation approach, flood quantiles for the studied region have been derived successfully, providing useful input for a comprehensive flood risk study.
A Study on Regional Frequency Analysis using Artificial Neural Network - the Sumjin River Basin
NASA Astrophysics Data System (ADS)
Jeong, C.; Ahn, J.; Ahn, H.; Heo, J. H.
2017-12-01
Regional frequency analysis compensates for the main shortcoming of at-site frequency analysis, the lack of sample size, through the regional concept. Regional rainfall quantiles depend on the identification of hydrologically homogeneous regions, so regional classification based on the homogeneity assumption is very important. For the regional clustering of rainfall, multidimensional variables related to geographical and meteorological characteristics are considered, such as mean annual precipitation, the number of days with precipitation in a year, and the average maximum daily precipitation in a month. The Self-Organizing Feature Map (SOM), an unsupervised artificial neural network algorithm, solves N-dimensional, nonlinear problems and presents the results simply, acting as a data visualization technique. In this study, cluster analysis was performed for the Sumjin River basin in South Korea based on the SOM method, using high-dimensional geographical features and meteorological factors as input data. The homogeneity of the resulting regions was then evaluated using the L-moment-based discordancy and heterogeneity measures. Rainfall quantiles were estimated with the index-flood method, one of the standard approaches of regional rainfall frequency analysis. The SOM-based clustering and the consequent variation in rainfall quantiles were analyzed. This research was supported by a grant (2017-MPSS31-001) from the Supporting Technology Development Program for Disaster Management funded by the Ministry of Public Safety and Security (MPSS) of the Korean government.
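A rough sketch of the SOM clustering step described above, using synthetic station attributes and the third-party minisom package (its MiniSom class and train_random/winner methods): stations are mapped onto a small SOM grid, and the winning node serves as the cluster label to be checked afterwards with L-moment discordancy and heterogeneity measures.

```python
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(13)
# Columns: mean annual precip, wet days per year, max daily precip (standardized).
stations = rng.normal(size=(60, 3))

som = MiniSom(2, 2, input_len=3, sigma=0.8, learning_rate=0.5, random_seed=1)
som.train_random(stations, 2000)

labels = [som.winner(s) for s in stations]     # (row, col) node per station
for node in sorted(set(labels)):
    print(node, "->", labels.count(node), "stations")
```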
Assaad, Houssein I; Choudhary, Pankaj K
2013-01-01
The L-statistics form an important class of estimators in nonparametric statistics. Their members include trimmed means and sample quantiles and functions thereof. This article is devoted to the theory and applications of L-statistics for repeated measurements data, wherein the measurements on the same subject are dependent and the measurements from different subjects are independent. This article has three main goals: (a) show that the L-statistics are asymptotically normal for repeated measurements data; (b) present three statistical applications of this result, namely, location estimation using trimmed means, quantile estimation, and construction of tolerance intervals; (c) obtain a Bahadur representation for sample quantiles. These results are generalizations of similar results for independently and identically distributed data. Their practical usefulness is illustrated by analyzing a real data set involving measurement of systolic blood pressure. The properties of the proposed point and interval estimators are examined via simulation.
Distributional changes in rainfall and river flow in Sarawak, Malaysia
NASA Astrophysics Data System (ADS)
Sa'adi, Zulfaqar; Shahid, Shamsuddin; Ismail, Tarmizi; Chung, Eun-Sung; Wang, Xiao-Jun
2017-11-01
Climate change may not change the rainfall mean, but rather its variability and extremes. It is therefore necessary to explore the possible distributional changes of rainfall characteristics over time. The objective of the present study is to assess the distributional changes in annual and northeast monsoon rainfall (November-January) and river flow in Sarawak, where small changes in rainfall or river flow variability/distribution may have severe implications for ecology and agriculture. A quantile regression-based approach was used to assess the changes in scale and location of the empirical probability density function over the period 1980-2014 at 31 observation stations. The results indicate that diverse variation patterns exist at all stations for annual rainfall, but mainly increasing quantile trends at the lower and higher quantiles for the months of January and December. Significant increases in annual rainfall are found mostly in the north and central-coastal regions, and in monsoon-month rainfall in the interior and north of Sarawak. Trends in the river flow data show that changes in the rainfall distribution have affected the higher quantiles of river flow in monsoon months at some of the basins, implying more flooding. The study reveals that quantile trends can provide more information on rainfall change, which may be useful for climate change mitigation and adaptation planning.
Optimal regionalization of extreme value distributions for flood estimation
NASA Astrophysics Data System (ADS)
Asadi, Peiman; Engelke, Sebastian; Davison, Anthony C.
2018-01-01
Regionalization methods have long been used to estimate high return levels of river discharges at ungauged locations on a river network. In these methods, discharge measurements from a homogeneous group of similar, gauged, stations are used to estimate high quantiles at a target location that has no observations. The similarity of this group to the ungauged location is measured in terms of a hydrological distance measuring differences in physical and meteorological catchment attributes. We develop a statistical method for estimation of high return levels based on regionalizing the parameters of a generalized extreme value distribution. The group of stations is chosen by optimizing over the attribute weights of the hydrological distance, ensuring similarity and in-group homogeneity. Our method is applied to discharge data from the Rhine basin in Switzerland, and its performance at ungauged locations is compared to that of other regionalization methods. For gauged locations we show how our approach improves the estimation uncertainty for long return periods by combining local measurements with those from the chosen group.
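A minimal sketch of the pooling idea described above, with synthetic standardized attributes: a weighted Euclidean distance in catchment-attribute space selects the most similar gauged stations for an ungauged target. The attribute weights, fixed here, are precisely what the paper optimizes.

```python
import numpy as np

rng = np.random.default_rng(14)
# Standardized attributes: log area, mean elevation, mean annual rainfall.
atts = rng.normal(size=(100, 3))               # 100 gauged catchments
target = rng.normal(size=3)                    # ungauged target catchment

w = np.array([0.5, 0.2, 0.3])                  # attribute weights (to optimize)

def hydro_dist(a, b, w):
    """Weighted Euclidean hydrological distance between attribute vectors."""
    return np.sqrt(np.sum(w * (a - b) ** 2))

d = np.array([hydro_dist(a, target, w) for a in atts])
pool = np.argsort(d)[:15]                      # 15 most similar stations
print("pooling group indices:", pool)
```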
A gentle introduction to quantile regression for ecologists
Cade, B.S.; Noon, B.R.
2003-01-01
Quantile regression is a way to estimate the conditional quantiles of a response variable distribution in the linear model that provides a more complete view of possible causal relationships between variables in ecological processes. Typically, all the factors that affect ecological processes are not measured and included in the statistical models used to investigate relationships between variables associated with those processes. As a consequence, there may be a weak or no predictive relationship between the mean of the response variable (y) distribution and the measured predictive factors (X). Yet there may be stronger, useful predictive relationships with other parts of the response variable distribution. This primer relates quantile regression estimates to prediction intervals in parametric error distribution regression models (e.g., least squares), and discusses the ordering characteristics, interval nature, sampling variation, weighting, and interpretation of the estimates for homogeneous and heterogeneous regression models.
Boosting structured additive quantile regression for longitudinal childhood obesity data.
Fenske, Nora; Fahrmeir, Ludwig; Hothorn, Torsten; Rzehak, Peter; Höhle, Michael
2013-07-25
Childhood obesity and the investigation of its risk factors has become an important public health issue. Our work is based on and motivated by a German longitudinal study including 2,226 children with up to ten measurements on their body mass index (BMI) and risk factors from birth to the age of 10 years. We introduce boosting of structured additive quantile regression as a novel distribution-free approach for longitudinal quantile regression. The quantile-specific predictors of our model include conventional linear population effects, smooth nonlinear functional effects, varying-coefficient terms, and individual-specific effects, such as intercepts and slopes. Estimation is based on boosting, a computer intensive inference method for highly complex models. We propose a component-wise functional gradient descent boosting algorithm that allows for penalized estimation of the large variety of different effects, particularly leading to individual-specific effects shrunken toward zero. This concept allows us to flexibly estimate the nonlinear age curves of upper quantiles of the BMI distribution, both on population and on individual-specific level, adjusted for further risk factors and to detect age-varying effects of categorical risk factors. Our model approach can be regarded as the quantile regression analog of Gaussian additive mixed models (or structured additive mean regression models), and we compare both model classes with respect to our obesity data.
Zheng, Qi; Peng, Limin
2016-01-01
Quantile regression provides a flexible platform for evaluating covariate effects on different segments of the conditional distribution of response. As the effects of covariates may change with quantile level, contemporaneously examining a spectrum of quantiles is expected to have a better capacity to identify variables with either partial or full effects on the response distribution, as compared to focusing on a single quantile. Under this motivation, we study a general adaptively weighted LASSO penalization strategy in the quantile regression setting, where a continuum of quantile index is considered and coefficients are allowed to vary with quantile index. We establish the oracle properties of the resulting estimator of coefficient function. Furthermore, we formally investigate a BIC-type uniform tuning parameter selector and show that it can ensure consistent model selection. Our numerical studies confirm the theoretical findings and illustrate an application of the new variable selection procedure.
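A hedged sketch of the basic ingredient, L1-penalized quantile regression at several quantile levels, using scikit-learn's QuantileRegressor (pinball loss plus an L1 penalty); the paper's adaptive weights and uniform-in-quantile theory are not reproduced, and the simulated design is illustrative only.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(2)
n, p = 300, 10
X = rng.normal(size=(n, p))
# only x0 and x1 matter; the x0 effect varies across quantiles via the noise
y = 1.0 * X[:, 0] + 0.5 * X[:, 1] + (1 + 0.5 * np.abs(X[:, 0])) * rng.normal(size=n)

for tau in (0.25, 0.50, 0.75, 0.90):
    coef = QuantileRegressor(quantile=tau, alpha=0.05).fit(X, y).coef_
    print(f"tau={tau:.2f}: selected indices {np.flatnonzero(np.abs(coef) > 1e-8)}")
```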
NASA Astrophysics Data System (ADS)
Bliefernicht, Jan; Seidel, Jochen; Salack, Seyni; Waongo, Moussa; Laux, Patrick; Kunstmann, Harald
2017-04-01
Seasonal precipitation forecasts are a crucial source of information for early warning of hydro-meteorological extremes in West Africa. However, the current seasonal forecasting system used by the West African weather services in the framework of the West African Climate Outlook Forum (PRESAO) is limited to probabilistic precipitation forecasts of 1-month lead time. To improve this provision, we use an ensemble-based quantile-quantile transformation for bias correction of precipitation forecasts provided by a global seasonal ensemble prediction system, the Climate Forecast System Version 2 (CFS2). The statistical technique eliminates systematic differences between global forecasts and observations, with the potential to preserve the signal from the model. The technique also has the advantage that it can be easily implemented at national weather services with low capacities. It is used to generate probabilistic forecasts of monthly and seasonal precipitation amounts and other precipitation indices useful for early warning of large-scale droughts and floods in West Africa. The evaluation of the statistical technique is done using CFS hindcasts (1982 to 2009) in a cross-validation mode to determine the performance of the precipitation forecasts for several lead times, focusing on drought and flood events depicted over the Volta and Niger basins. In addition, operational forecasts provided by PRESAO are analyzed from 1998 to 2015. The precipitation forecasts are compared to low-skill reference forecasts generated from gridded observations (i.e., GPCC, CHIRPS) and a novel in-situ gauge database from national observation networks (see Poster EGU2017-10271). The forecasts are evaluated using state-of-the-art verification techniques to determine specific quality attributes of probabilistic forecasts such as reliability, accuracy, and skill. In addition, cost-loss approaches are used to determine the value of probabilistic forecasts for multiple users in warning situations. The outcomes of the hindcast experiment for the Volta basin illustrate that the statistical technique can clearly improve the CFS precipitation forecasts, with the potential to provide skillful and valuable early precipitation warnings for large-scale drought and flood situations several months ahead. In this presentation we give a detailed overview of the ensemble-based quantile-quantile transformation, its validation and verification, and the possibilities of this technique to complement PRESAO. We also highlight the performance of this technique for extremes such as the Sahel drought in the 1980s and in comparison to the various reference data sets (e.g., CFS2, PRESAO, observational data sets) used in this study.
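A minimal, assumption-laden sketch of an empirical quantile-quantile transformation: each forecast value is mapped through the model climatology's quantiles onto the observed climatology's quantiles. The ensemble-specific details of the study are not reproduced; arrays are placeholders for hindcast and gauge monthly totals.

```python
import numpy as np

def quantile_map(fcst, fcst_clim, obs_clim):
    """Replace each forecast value by the observed quantile of the same rank."""
    probs = np.linspace(0.01, 0.99, 99)
    fcst_q = np.quantile(fcst_clim, probs)   # model climatology quantiles
    obs_q = np.quantile(obs_clim, probs)     # observed climatology quantiles
    return np.interp(fcst, fcst_q, obs_q)    # piecewise-linear transfer function

rng = np.random.default_rng(3)
obs_clim = rng.gamma(2.0, 60.0, 28 * 30)     # "observed" monthly totals, mm
fcst_clim = 0.6 * obs_clim + 25.0            # biased model climatology
print(quantile_map(np.array([80.0, 150.0, 240.0]), fcst_clim, obs_clim))
```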
Ruckert, Kelsey L; Oddo, Perry C; Keller, Klaus
2017-01-01
Rising sea levels increase the probability of future coastal flooding. Many decision-makers use risk analyses to inform the design of sea-level rise (SLR) adaptation strategies. These analyses are often silent on potentially relevant uncertainties. For example, some previous risk analyses use the expected, best, or large quantile (i.e., 90%) estimate of future SLR. Here, we use a case study to quantify and illustrate how neglecting SLR uncertainties can bias risk projections. Specifically, we focus on the future 100-yr (1% annual exceedance probability) coastal flood height (storm surge including SLR) in the year 2100 in the San Francisco Bay area. We find that accounting for uncertainty in future SLR increases the return level (the height associated with a probability of occurrence) by half a meter from roughly 2.2 to 2.7 m, compared to using the mean sea-level projection. Accounting for this uncertainty also changes the shape of the relationship between the return period (the inverse probability that an event of interest will occur) and the return level. For instance, incorporating uncertainties shortens the return period associated with the 2.2 m return level from a 100-yr to roughly a 7-yr return period (∼15% probability). Additionally, accounting for this uncertainty doubles the area at risk of flooding (the area to be flooded under a certain height; e.g., the 100-yr flood height) in San Francisco. These results indicate that the method of accounting for future SLR can have considerable impacts on the design of flood risk management strategies.
ERIC Educational Resources Information Center
Chen, Sheng-Tung; Kuo, Hsiao-I.; Chen, Chi-Chung
2012-01-01
The two-stage least squares approach together with quantile regression analysis is adopted here to estimate the educational production function. Such a methodology is able to capture the extreme behaviors of the two tails of students' performance and the estimation outcomes have important policy implications. Our empirical study is applied to the…
Non-stationary hydrologic frequency analysis using B-spline quantile regression
NASA Astrophysics Data System (ADS)
Nasri, B.; Bouezmarni, T.; St-Hilaire, A.; Ouarda, T. B. M. J.
2017-11-01
Hydrologic frequency analysis is commonly used by engineers and hydrologists to provide the basic information for planning, design, and management of hydraulic and water resources systems under the assumption of stationarity. However, with increasing evidence of climate change, the assumption of stationarity, which is a prerequisite for traditional frequency analysis, may no longer hold, and hence the results of conventional analyses become questionable. In this study, we consider a framework for frequency analysis of extremes based on B-spline quantile regression, which allows data to be modelled in the presence of non-stationarity and/or linear and non-linear dependence on covariates. A Markov Chain Monte Carlo (MCMC) algorithm was used to estimate quantiles and their posterior distributions. A coefficient of determination and the Bayesian information criterion (BIC) for quantile regression are used to select the best model, i.e., for each quantile we choose the degree and number of knots of the adequate B-spline quantile regression model. The method is applied to annual maximum and minimum streamflow records in Ontario, Canada. Climate indices are considered to describe the non-stationarity in the variable of interest and to estimate the quantiles in this case. The results show large differences between the non-stationary quantiles and their stationary equivalents for annual maximum and minimum discharges with high annual non-exceedance probabilities.
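A frequentist stand-in for this approach (the paper estimates the model by MCMC): a cubic B-spline basis in a covariate combined with statsmodels QuantReg at a high non-exceedance probability. Variable names (`soi`, `flow`) and the synthetic data are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm
from patsy import dmatrix, build_design_matrices

rng = np.random.default_rng(4)
soi = rng.normal(0, 1, 300)                              # climate-index covariate
flow = 200 + 80 * np.tanh(soi) + rng.gumbel(0, 40, 300)  # synthetic annual maxima

design = dmatrix("bs(soi, df=6, degree=3)", {"soi": soi}, return_type="dataframe")
fit_q95 = sm.QuantReg(flow, design).fit(q=0.95)

# Evaluate the fitted 0.95-quantile curve on new covariate values,
# reusing the training spline knots via the stored design information.
new = build_design_matrices([design.design_info],
                            {"soi": np.linspace(-2, 2, 5)})[0]
print(fit_q95.predict(np.asarray(new)))
```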
A quantile regression model for failure-time data with time-dependent covariates
Gorfine, Malka; Goldberg, Yair; Ritov, Ya’acov
2017-01-01
Since survival data occur over time, often important covariates that we wish to consider also change over time. Such covariates are referred to as time-dependent covariates. Quantile regression offers flexible modeling of survival data by allowing the covariates to vary with quantiles. This article provides a novel quantile regression model accommodating time-dependent covariates, for analyzing survival data subject to right censoring. Our simple estimation technique assumes the existence of instrumental variables. In addition, we present a doubly-robust estimator in the sense of Robins and Rotnitzky (1992, Recovery of information and adjustment for dependent censoring using surrogate markers. In: Jewell, N. P., Dietz, K. and Farewell, V. T. (editors), AIDS Epidemiology. Boston: Birkhäuser, pp. 297–331). The asymptotic properties of the estimators are rigorously studied. Finite-sample properties are demonstrated by a simulation study. The utility of the proposed methodology is demonstrated using the Stanford heart transplant dataset.
NASA Astrophysics Data System (ADS)
Girinoto, Sadik, Kusman; Indahwati
2017-03-01
The National Socio-Economic Survey samples are designed to produce estimates of parameters for planned domains (provinces and districts). For unplanned domains (sub-districts and villages), reliable direct estimates are difficult to obtain. One possible solution to this problem is to employ small area estimation techniques. The popular choice for small area estimation is based on linear mixed models. However, such models need strong distributional assumptions and do not easily allow for outlier-robust estimation. An alternative approach is M-quantile regression for small area estimation, based on modeling specific M-quantile coefficients of the conditional distribution of the study variable given auxiliary covariates. It yields outlier-robust estimates through an influence function of M-estimator type and does not require strong distributional assumptions. In this paper, the aim is to estimate the poverty indicator at sub-district level in Bogor District, West Java, using M-quantile models for small area estimation. Using data taken from the National Socioeconomic Survey and Village Potential Statistics, the results provide a detailed description of the pattern of incidence and intensity of poverty within Bogor district. We also compare the results with direct estimates. The results show that the framework may be preferable when the direct estimate indicates no incidence of poverty at all in a small area.
Quantile Regression in the Study of Developmental Sciences
Petscher, Yaacov; Logan, Jessica A. R.
2014-01-01
Linear regression analysis is one of the most common techniques applied in developmental research, but only allows for an estimate of the average relations between the predictor(s) and the outcome. This study describes quantile regression, which provides estimates of the relations between the predictor(s) and outcome, but across multiple points of the outcome’s distribution. Using data from the High School and Beyond and U.S. Sustained Effects Study databases, quantile regression is demonstrated and contrasted with linear regression when considering models with: (a) one continuous predictor, (b) one dichotomous predictor, (c) a continuous and a dichotomous predictor, and (d) a longitudinal application. Results from each example exhibited the differential inferences which may be drawn using linear or quantile regression.
Robust and efficient estimation with weighted composite quantile regression
NASA Astrophysics Data System (ADS)
Jiang, Xuejun; Li, Jingzhi; Xia, Tian; Yan, Wanfeng
2016-09-01
In this paper we introduce a weighted composite quantile regression (CQR) estimation approach and study its application in nonlinear models such as exponential models and ARCH-type models. The weighted CQR is augmented by using a data-driven weighting scheme. With the error distribution unspecified, the proposed estimators share robustness from quantile regression and achieve nearly the same efficiency as the oracle maximum likelihood estimator (MLE) for a variety of error distributions including the normal, mixed-normal, Student's t, Cauchy distributions, etc. We also suggest an algorithm for the fast implementation of the proposed methodology. Simulations are carried out to compare the performance of different estimators, and the proposed approach is used to analyze the daily S&P 500 Composite index, which verifies the effectiveness and efficiency of our theoretical results.
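A bare-bones illustration of unweighted composite quantile regression for a linear model: one shared slope, one intercept per quantile level, fitted by numerically minimizing the summed check losses. The paper's data-driven weights and nonlinear models are omitted, and the heavy-tailed simulated data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
x = rng.normal(0, 1, 200)
y = 1.5 * x + rng.standard_t(df=3, size=200)      # heavy-tailed errors
taus = np.array([0.1, 0.3, 0.5, 0.7, 0.9])

def check_loss(u, tau):
    # pinball loss: tau*u for u >= 0, (1 - tau)*|u| for u < 0
    return np.sum(u * (tau - (u < 0)))

def cqr_objective(theta):
    intercepts, slope = theta[:-1], theta[-1]      # shared slope across quantiles
    return sum(check_loss(y - b - slope * x, t) for b, t in zip(intercepts, taus))

theta0 = np.concatenate([np.quantile(y, taus), [0.0]])
res = minimize(cqr_objective, theta0, method="Nelder-Mead",
               options={"maxiter": 20000, "maxfev": 20000})
print("CQR slope estimate:", res.x[-1])            # robust alternative to OLS slope
```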
The Applicability of Confidence Intervals of Quantiles for the Generalized Logistic Distribution
NASA Astrophysics Data System (ADS)
Shin, H.; Heo, J.; Kim, T.; Jung, Y.
2007-12-01
The generalized logistic (GL) distribution has been widely used for frequency analysis. However, there are few studies of the confidence intervals, which indicate the prediction accuracy of the fitted distribution, for the GL distribution. In this paper, the estimation of confidence intervals of quantiles for the GL distribution is presented based on the method of moments (MOM), maximum likelihood (ML), and probability weighted moments (PWM), and the asymptotic variances of each quantile estimator are derived as functions of the sample sizes, return periods, and parameters. Monte Carlo simulation experiments are also performed to verify the applicability of the derived confidence intervals of quantiles. The results show that the relative bias (RBIAS) and relative root mean square error (RRMSE) of the confidence intervals generally increase as the return period increases and decrease as the sample size increases. PWM performs better than the other methods in terms of RRMSE when the data are nearly symmetric, while ML shows the smallest RBIAS and RRMSE when the data are more skewed and the sample size is moderately large. The GL model was applied to fit the distribution of annual maximum rainfall data. The results show little difference in the estimated quantiles between ML and PWM, but distinct differences for MOM.
Quantifying alteration of river flow regime by large reservoirs in France
NASA Astrophysics Data System (ADS)
Cipriani, Thomas; Sauquet, Eric
2017-04-01
Reservoirs may strongly modify river flow regimes. Knowing the alterations is important to better understand the biological and physical patterns along the river network. However, data are not necessarily available to carry out an analysis of modifications at a national scale, e.g., due to industrial interests or to a lack of measurements. The objective of this study is to quantify the changes in a set of hydrological indices due to large reservoirs in France by combining different data sources. The analysis is based on a comparison between influenced discharges (observed discharges) and natural discharges available from: (i) gauging stations upstream of the dam, (ii) regionalization procedures (Sauquet et al., 2008; Sauquet and Catalogne, 2011; Cipriani et al., 2012), or (iii) historical data free from human influence close to the dam location. The impact of large reservoirs is assessed considering different facets of the river flow regime, including flood quantiles, low flow characteristics, quantiles from the flow duration curve, and the twelve mean monthly discharges. The departures from the index values representative of natural conditions quantify the effect of reservoir management on the river flow regime. The analysis is based on 62 study cases. Results show a large spread in terms of impact depending on the purposes of the reservoirs and the season of interest. Results also point out inconsistencies in the data (the water balance between inflow and outflow downstream of the dam is not ensured) due to uncertainties in mean monthly discharges and to imperfect knowledge of inflows and outflows. Lastly, we suggest a typology of hydrological alterations based on the purposes of the reservoirs. Cipriani T., Toilliez T., Sauquet E. (2012). Estimating 10 year return period peak flows and flood durations at ungauged locations in France. La Houille Blanche, 4-5: 5-13, doi: 10.1051/lhb/2012024. Sauquet E., Catalogne C. (2011). Comparison of catchment grouping methods for flow duration curve estimation at ungauged sites in France. Hydrology and Earth System Sciences, 15: 2421-2435, doi: 10.5194/hess-15-2421-2011. Sauquet E., Gottschalk L., Krasovskaïa I. (2008). Estimating mean monthly runoff at ungauged locations: an application to France. Hydrology Research, 39(5-6): 403-423.
An application of quantile random forests for predictive mapping of forest attributes
E.A. Freeman; G.G. Moisen
2015-01-01
Increasingly, random forest models are used in predictive mapping of forest attributes. Traditional random forests output the mean prediction from the random trees. Quantile regression forests (QRF) is an extension of random forests developed by Nicolai Meinshausen that provides non-parametric estimates of the median predicted value as well as prediction quantiles. It...
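A rough, hedged sketch of the idea behind QRF: grow an ordinary random forest, then form prediction quantiles from the training responses that share leaves with the query point. This approximates, rather than exactly reproduces, Meinshausen's leaf-weighting scheme; a full implementation would weight observations by leaf co-membership frequency. All data are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)
X = rng.uniform(0, 1, (1000, 3))
y = 10 * X[:, 0] + rng.normal(0, 1 + 3 * X[:, 1])    # heteroscedastic target

rf = RandomForestRegressor(n_estimators=100, min_samples_leaf=20,
                           random_state=0).fit(X, y)
train_leaves = rf.apply(X)                            # leaf index per (sample, tree)

def predict_quantiles(x_new, qs=(0.1, 0.5, 0.9)):
    leaves = rf.apply(x_new.reshape(1, -1))[0]
    # pool training responses that fall in the same leaf, tree by tree
    pooled = np.concatenate([y[train_leaves[:, t] == leaves[t]]
                             for t in range(rf.n_estimators)])
    return np.quantile(pooled, qs)

print(predict_quantiles(np.array([0.5, 0.9, 0.5])))   # wide interval: noisy region
```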
Comparing least-squares and quantile regression approaches to analyzing median hospital charges.
Olsen, Cody S; Clark, Amy E; Thomas, Andrea M; Cook, Lawrence J
2012-07-01
Emergency department (ED) and hospital charges obtained from administrative data sets are useful descriptors of injury severity and the burden to EDs and the health care system. However, charges are typically positively skewed due to costly procedures, long hospital stays, and complicated or prolonged treatment for few patients. The median is not affected by extreme observations and is useful in describing and comparing distributions of hospital charges. A least-squares analysis employing a log transformation is one approach for estimating median hospital charges, corresponding confidence intervals (CIs), and differences between groups; however, this method requires certain distributional properties. An alternate method is quantile regression, which allows estimation and inference related to the median without making distributional assumptions. The objective was to compare the log-transformation least-squares method to the quantile regression approach for estimating median hospital charges, differences in median charges between groups, and associated CIs. The authors performed simulations using repeated sampling of observed statewide ED and hospital charges and charges randomly generated from a hypothetical lognormal distribution. The median and 95% CI and the multiplicative difference between the median charges of two groups were estimated using both least-squares and quantile regression methods. Performance of the two methods was evaluated. In contrast to least squares, quantile regression produced estimates that were unbiased and had smaller mean square errors in simulations of observed ED and hospital charges. Both methods performed well in simulations of hypothetical charges that met least-squares method assumptions. When the data did not follow the assumed distribution, least-squares estimates were often biased, and the associated CIs had lower than expected coverage as sample size increased. Quantile regression analyses of hospital charges provide unbiased estimates even when lognormal and equal variance assumptions are violated. These methods may be particularly useful in describing and analyzing hospital charges from administrative data sets.
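A small numeric illustration of the comparison at the heart of the study: the log-scale least-squares estimate of the median, exp(mean(log y)), versus the direct sample median (an intercept-only quantile regression), on lognormal data with and without contamination. The numbers are simulated, not hospital data.

```python
import numpy as np

rng = np.random.default_rng(7)
charges = rng.lognormal(mean=8.0, sigma=1.0, size=5000)
outliers = rng.uniform(2e5, 1e6, 250)                # a few very costly cases

for label, data in [("lognormal", charges),
                    ("contaminated", np.concatenate([charges, outliers]))]:
    log_ls = np.exp(np.mean(np.log(data)))           # least squares on log scale
    q_med = np.quantile(data, 0.5)                   # direct median (0.5 quantile)
    print(f"{label:>12}: log-LS {log_ls:8.0f}   quantile {q_med:8.0f}")
```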
NASA Astrophysics Data System (ADS)
Eldardiry, H. A.; Habib, E. H.
2014-12-01
Radar-based technologies have made spatially and temporally distributed quantitative precipitation estimates (QPE) available in an operational environment, complementing rain gauges. The floods identified through flash flood monitoring and prediction systems are subject to at least three sources of uncertainty: (a) rainfall estimation errors, (b) streamflow prediction errors due to model structural issues, and (c) errors in defining a flood event. The current study focuses on the first source of uncertainty and its effect on deriving important climatological characteristics of extreme rainfall statistics. Examples of such characteristics are rainfall amounts with certain Average Recurrence Intervals (ARI) or Annual Exceedance Probabilities (AEP), which are highly valuable for hydrologic and civil engineering design purposes. Gauge-based precipitation frequency estimates (PFE) have been maturely developed and widely used over the last several decades. More recently, there has been a growing interest in the research community in exploring the use of radar-based rainfall products for developing PFE and understanding the associated uncertainties. This study uses radar-based multi-sensor precipitation estimates (MPE) for 11 years to derive PFEs corresponding to various return periods over a spatial domain that covers the state of Louisiana in the southern USA. The PFE estimation approach used in this study is based on fitting a generalized extreme value distribution to hydrologic extreme rainfall data based on annual maximum series (AMS). Estimation problems that may arise from fitting GEV distributions at each radar pixel are the large variance and serious bias of quantile estimators. Hence, a regional frequency analysis (RFA) approach is applied. The RFA involves the use of data from different pixels surrounding each pixel within a defined homogeneous region. In this study, the region-of-influence approach along with the index flood technique is used in the RFA. A bootstrap procedure is carried out to account for the uncertainty in the distribution parameters and to construct 90% confidence intervals (i.e., 5% and 95% confidence limits) on AMS-based precipitation frequency curves.
Climate Change Impact on Variability of Rainfall Intensity in Upper Blue Nile Basin, Ethiopia
NASA Astrophysics Data System (ADS)
Worku, L. Y.
2015-12-01
Extreme rainfall events are a major problem in Ethiopia; the resulting floods can cause significant damage to agriculture, ecology, and infrastructure, disruption of human activities, loss of property and lives, and disease outbreaks. The aim of this study was to explore likely changes in precipitation extremes due to future climate change. The study specifically focuses on understanding the impact of future climate change on the variability of rainfall intensity-duration-frequency in the Upper Blue Nile basin. Precipitation data from two Global Climate Models (GCMs), HadCM3 and CGCM3, were used in the study. Rainfall frequency analysis was carried out to estimate quantiles for different return periods. The Probability Weighted Moments (PWM) method was selected for parameter estimation, and L-Moment Ratio Diagrams (LMRDs) were used to find the best parent distribution for each station. The parent distributions derived from the frequency analysis are the Generalized Logistic (GLOG), Generalized Extreme Value (GEV), and Gamma and Pearson III (P3) distributions. After the quantiles were estimated, a simple disaggregation model was applied to derive sub-daily rainfall data. Finally, IDF curves were fitted to the disaggregated rainfall; the results show that rainfall intensity is expected to increase in the future in most parts of the basin. Based on the two GCM outputs, the study indicates a likely increase of precipitation extremes over the Blue Nile basin due to the changing climate. The results should be interpreted with caution, as GCM outputs in this part of the world carry large uncertainty.
Soong, David T.; Straub, Timothy D.; Murphy, Elizabeth A.
2006-01-01
Results of hydrologic model, flood-frequency, hydraulic model, and flood-hazard analysis of the Blackberry Creek watershed in Kane County, Illinois, indicate that the 100-year and 500-year flood plains range from approximately 25 acres in the tributary F watershed (a headwater subbasin at the northeastern corner of the watershed) to almost 1,800 acres in Blackberry Creek main stem. Based on 1996 land-cover data, most of the land in the 100-year and 500-year flood plains was cropland, forested and wooded land, and grassland. A relatively small percentage of urban land was in the flood plains. The Blackberry Creek watershed has undergone rapid urbanization in recent decades. The population and urbanized lands in the watershed are projected to double from the 1990 condition by 2020. Recently, flood-induced damage has occurred more frequently in urbanized areas of the watershed. There are concerns about the effect of urbanization on flood peaks and volumes, future flood-mitigation plans, and potential effects on the water quality and stream habitats. This report describes the procedures used in developing the hydrologic models, estimating the flood-peak discharge magnitudes and recurrence intervals for flood-hazard analysis, developing the hydraulic model, and the results of the analysis in graphical and tabular form. The hydrologic model, Hydrological Simulation Program-FORTRAN (HSPF), was used to perform the simulation of continuous water movements through various patterns of land uses in the watershed. Flood-frequency analysis was applied to an annual maximum series to determine flood quantiles in subbasins for flood-hazard analysis. The Hydrologic Engineering Center-River Analysis System (HEC-RAS) hydraulic model was used to determine the 100-year and 500-year flood elevations, and to determine the 100-year floodway. The hydraulic model was calibrated and verified using high water marks and observed inundation maps for the July 17-18, 1996, flood event. Digital maps of the 100-year and 500-year flood plains and the 100-year floodway for each tributary and the main stem of Blackberry Creek were compiled.
NASA Astrophysics Data System (ADS)
Javelle, Pierre; Organde, Didier; Demargne, Julie; de Saint-Aubin, Céline; Garandeau, Léa; Janet, Bruno; Saint-Martin, Clotilde; Fouchier, Catherine
2016-04-01
Developing a national flash flood (FF) warning system is an ambitious and difficult task. On the one hand, it raises high expectations from exposed populations and authorities, since induced damages are considerable (i.e., 20 casualties in the recent October 2015 flood on the French Riviera). On the other hand, many practical and scientific issues have to be addressed, and limitations should be clearly stated. The FF warning system to be implemented by 2016 in France by the SCHAPI (French national service in charge of flood forecasting) will be based on a discharge-threshold flood warning method called AIGA (Javelle et al., 2014). The AIGA method has been tested in real time in the south of France in the RHYTMME project (http://rhytmme.irstea.fr). It consists of comparing discharges generated by a simple conceptual hourly hydrologic model run at a 1-km² resolution with reference flood quantiles of different return periods, at any point along the river network. The hydrologic model ingests operational rainfall radar-gauge products from Météo-France. Model calibration was based on ~700 hydrometric stations over the 2002-2015 period, and hourly discharges were then computed at ~76,000 catchment outlets, with areas ranging from 10 to 3,500 km², over the last 19 years. This product makes it possible to calculate reference flood quantiles at each outlet. The on-going evaluation of the FF warnings is currently carried out at two levels: in a 'classical' way, using discharges available at the hydrometric stations, and in a more 'exploratory' way, by comparing past flood reports with warnings issued by the system over the 76,000 catchment outlets. The interest of the latter method is that it better fits the system's objectives, since the system is designed to monitor small ungauged catchments. Javelle, P., Demargne, J., Defrance, D., Pansu, J., Arnaud, P. (2014). Evaluating flash-flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, 59(7), 1390-1402. doi: 10.1080/02626667.2014.923970
Lin, Lawrence; Pan, Yi; Hedayat, A S; Barnhart, Huiman X; Haber, Michael
2016-01-01
Total deviation index (TDI) captures a prespecified quantile of the absolute deviation of paired observations from raters, observers, methods, assays, instruments, etc. We compare the performance of TDI estimated using nonparametric quantile regression with that of the TDI assuming normality (Lin, 2000). This simulation study considers three distributions: normal, Poisson, and uniform, at quantile levels of 0.8 and 0.9, for cases with and without contamination. Study endpoints include the bias of TDI estimates (compared with their respective theoretical values), the standard error of TDI estimates (compared with their true simulated standard errors), test size (compared with 0.05), and power. Nonparametric TDI using quantile regression, although it slightly underestimates and delivers slightly less power for data without contamination, works satisfactorily under all simulated cases even for moderate (say, ≥40) sample sizes. The performance of the TDI based on a quantile of 0.8 is in general superior to that of 0.9. The performances of the nonparametric and parametric TDI methods are compared with a real data example. Nonparametric TDI can be very useful when the underlying distribution of the difference is not normal, especially when it has a heavy tail.
DOT National Transportation Integrated Search
2016-06-01
This report provides two sets of equations for estimating peak discharge quantiles at annual exceedance probabilities (AEPs) of 0.50, 0.20, 0.10, : 0.04, 0.02, 0.01, 0.005, and 0.002 (recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years,...
The quantile regression approach to efficiency measurement: insights from Monte Carlo simulations.
Liu, Chunping; Laporte, Audrey; Ferguson, Brian S
2008-09-01
In the health economics literature there is an ongoing debate over approaches used to estimate the efficiency of health systems at various levels, from the level of the individual hospital - or nursing home - up to that of the health system as a whole. The two most widely used approaches to evaluating the efficiency with which various units deliver care are non-parametric data envelopment analysis (DEA) and parametric stochastic frontier analysis (SFA). Productivity researchers tend to have very strong preferences over which methodology to use for efficiency estimation. In this paper, we use Monte Carlo simulation to compare the performance of DEA and SFA in terms of their ability to accurately estimate efficiency. We also evaluate quantile regression as a potential alternative approach. A Cobb-Douglas production function, random error terms and a technical inefficiency term with different distributions are used to calculate the observed output. The results, based on these experiments, suggest that neither DEA nor SFA can be regarded as clearly dominant, and that, depending on the quantile estimated, the quantile regression approach may be a useful addition to the armamentarium of methods for estimating technical efficiency.
Linking flood peak, flood volume and inundation extent: a DEM-based approach
NASA Astrophysics Data System (ADS)
Rebolho, Cédric; Furusho-Percot, Carina; Blaquière, Simon; Brettschneider, Marco; Andréassian, Vazken
2017-04-01
Traditionally, flood inundation maps are computed based on the Shallow Water Equations (SWE) in one or two dimensions, with various simplifications that have proved to give good results. However, the complexity of the SWEs often requires a numerical solution that can demand long computing times, as well as detailed cross-section data; this often restricts these models to rather small areas with abundant high-quality data. This, along with the necessity for fast inundation mapping, is the reason why rapid inundation models are being designed, working for (almost) any river with a minimum amount of data and, above all, easily available data. Our model tries to follow this path by using a 100-m DEM over France from which a drainage network and the associated drainage areas are extracted. It is based on two pre-existing methods: (1) SHYREG (Arnaud et al., 2013), a regionalized approach used to calculate the 2-year and 10-year flood quantiles (used as the approximate bankfull flow and maximum discharge, respectively) for each river pixel of the DEM (below a 10 000 km2 drainage area), and (2) SOCOSE (Mailhol, 1980), which gives, amongst other things, an empirical formula for a characteristic flood duration (for each pixel) based on catchment area, average precipitation, and temperature. An overflow volume for each river pixel is extracted from a triangular synthetic hydrograph designed with the SHYREG quantiles and the SOCOSE flood duration. The volume is then spread from downstream to upstream, one river pixel at a time. When the entire hydrographic network is processed, the model stops and generates a map of the potential inundation area associated with the 10-year flood quantile. Our model can also be calibrated using past-event inundation maps by adjusting two parameters: one that modifies the overflow duration, and the other, a minimum drainage area for river pixels to be flooded. In calibration on a sample of 42 basins, the first draft of the model showed a 0.51 median Fit (intersection of simulated and observed areas divided by the union of the two; Bates and De Roo, 2000) and a 0.74 maximum. Obviously, this approach is quite rough and would require testing on events of homogeneous return periods (which is not the case for now). The next steps in the testing and development of our method include the use of the AIGA distributed model to simulate past-event hydrographs, the search for a new way to automatically approximate bankfull flow, and the integration of the results in our model to build dynamic maps of the flood. References Arnaud, P., Eglin, Y., Janet, B., and Payrastre, O. (2013). Notice utilisateur : bases de données SHYREG-Débit. Méthode - Performances - Limites. Bates, P. D. and De Roo, A. P. J. (2000). A simple raster-based model for flood inundation simulation. Journal of Hydrology, 236(1-2): 54-77. Mailhol, J. (1980). Pour une approche plus réaliste du temps caractéristique de crues des bassins versants. In Actes du Colloque d'Oxford, volume 129, pages 229-237, Oxford. IAHS-AISH.
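A back-of-the-envelope version of the overflow-volume step, assuming the triangular synthetic hydrograph described above: peak at the 10-year quantile, base equal to the characteristic duration, and bankfull flow at the 2-year quantile, so that similar triangles give the volume spilled above bankfull. The function and the numbers are illustrative, not actual SHYREG/SOCOSE values.

```python
def overflow_volume(q2_m3s, q10_m3s, duration_s):
    """Volume above bankfull for a triangular hydrograph peaking at q10."""
    if q10_m3s <= q2_m3s:
        return 0.0
    time_above = duration_s * (1.0 - q2_m3s / q10_m3s)  # base of the upper triangle
    return 0.5 * time_above * (q10_m3s - q2_m3s)        # area above the threshold

# e.g. Q2 = 40 m3/s, Q10 = 90 m3/s, characteristic duration of 12 hours
print(f"{overflow_volume(40.0, 90.0, 12 * 3600):,.0f} m3")
```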
NASA Astrophysics Data System (ADS)
Spellman, P.; Griffis, V. W.; LaFond, K.
2013-12-01
A changing climate brings about new challenges for flood risk analysis and water resources planning and management. Current methods for estimating flood risk in the US involve fitting the Pearson Type III (P3) probability distribution to the logarithms of the annual maximum flood (AMF) series using the method of moments. These methods are employed under the premise of stationarity, which assumes that the fitted distribution is time invariant and variables affecting stream flow such as climate do not fluctuate. However, climate change would bring about shifts in meteorological forcings which can alter the summary statistics (mean, variance, skew) of flood series used for P3 parameter estimation, resulting in erroneous flood risk projections. To ascertain the degree to which future risk may be misrepresented by current techniques, we use climate scenarios generated from global climate models (GCMs) as input to a hydrological model to explore how relative changes to current climate affect flood response for watersheds in the northeastern United States. The watersheds were calibrated and run on a daily time step using the continuous, semi-distributed, process based Soil and Water Assessment Tool (SWAT). Nash Sutcliffe Efficiency (NSE), RMSE to Standard Deviation ratio (RSR) and Percent Bias (PBIAS) were all used to assess model performance. Eight climate scenarios were chosen from GCM output based on relative precipitation and temperature changes from the current climate of the watershed and then further bias-corrected. Four of the scenarios were selected to represent warm-wet, warm-dry, cool-wet and cool-dry future climates, and the other four were chosen to represent more extreme, albeit possible, changes in precipitation and temperature. We quantify changes in response by comparing the differences in total mass balance and summary statistics of the logarithms of the AMF series from historical baseline values. We then compare forecasts of flood quantiles from fitting a P3 distribution to the logs of historical AMF data to that of generated AMF series.
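For context, a hedged sketch of the current US practice the abstract refers to: fitting log-Pearson Type III to an annual maximum flood (AMF) series by the method of moments on the log-transformed flows and reading off flood quantiles. Regional skew weighting and other Bulletin 17 refinements are omitted, and `ams` is a simulated placeholder record.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
ams = rng.lognormal(mean=6.0, sigma=0.5, size=50)   # placeholder AMF series, m3/s

logs = np.log10(ams)
m, s = logs.mean(), logs.std(ddof=1)
g = stats.skew(logs, bias=False)                    # station skew of the logs

for T in (10, 100, 500):
    k = stats.pearson3.ppf(1 - 1 / T, g)            # frequency factor (mean 0, sd 1)
    print(f"{T:>3}-yr flood: {10 ** (m + k * s):,.0f} m3/s")
```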
Quantile regression via vector generalized additive models.
Yee, Thomas W
2004-07-30
One of the most popular methods for quantile regression is the LMS method of Cole and Green. The method naturally falls within a penalized likelihood framework, and consequently allows for considerable flexibility because all three parameters may be modelled by cubic smoothing splines. The model is also very understandable: for a given value of the covariate, the LMS method applies a Box-Cox transformation to the response in order to transform it to standard normality; to obtain the quantiles, an inverse Box-Cox transformation is applied to the quantiles of the standard normal distribution. The purposes of this article are three-fold. Firstly, LMS quantile regression is presented within the framework of the class of vector generalized additive models. This confers a number of advantages such as a unifying theory and estimation process. Secondly, a new LMS method based on the Yeo-Johnson transformation is proposed, which has the advantage that the response is not restricted to be positive. Lastly, this paper describes a software implementation of three LMS quantile regression methods in the S language. This includes the LMS-Yeo-Johnson method, which is estimated efficiently by a new numerical integration scheme. The LMS-Yeo-Johnson method is illustrated by way of a large cross-sectional data set from a New Zealand working population.
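The back-transformation at the core of the LMS method can be written as a standalone function: given fitted lambda (L), median (M), and coefficient of variation (S) at a covariate value, the tau-th quantile is an inverse Box-Cox transform of the standard normal quantile. The parameter values below are purely illustrative.

```python
import numpy as np
from scipy.stats import norm

def lms_quantile(tau, L, M, S):
    """tau-th quantile from LMS parameters via the inverse Box-Cox transform."""
    z = norm.ppf(tau)
    if abs(L) < 1e-12:              # Box-Cox limit as lambda -> 0
        return M * np.exp(S * z)
    return M * (1.0 + L * S * z) ** (1.0 / L)

# illustrative centiles at a single covariate value (L, M, S are made up)
for tau in (0.05, 0.50, 0.95):
    print(f"tau={tau:.2f}: {lms_quantile(tau, L=-0.3, M=24.0, S=0.12):.2f}")
```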
Comparison of different hydrological similarity measures to estimate flow quantiles
NASA Astrophysics Data System (ADS)
Rianna, M.; Ridolfi, E.; Napolitano, F.
2017-07-01
This paper aims to evaluate the influence of hydrological similarity measures on the definition of homogeneous regions. To this end, several attribute sets have been analyzed in the context of the Region of Influence (ROI) procedure. Several combinations of geomorphological, climatological, and geographical characteristics are also used to cluster potentially homogeneous regions. To verify the goodness of the resulting pooled sites, homogeneity tests are carried out. Through a Monte Carlo simulation and a jack-knife procedure, flow quantiles are estimated for the regions effectively resulting as homogeneous. The analysis is performed in both the so-called gauged and ungauged scenarios to analyze the effect of hydrological similarity measures on flow quantile estimation.
Rainfall frequency analysis for ungauged sites using satellite precipitation products
NASA Astrophysics Data System (ADS)
Gado, Tamer A.; Hsu, Kuolin; Sorooshian, Soroosh
2017-11-01
The occurrence of extreme rainfall events and their impacts on hydrologic systems and society are critical considerations in the design and management of a large number of water resources projects. As precipitation records are often limited or unavailable at many sites, it is essential to develop better methods for regional estimation of extreme rainfall at these partially gauged or ungauged sites. In this study, an innovative method for regional rainfall frequency analysis for ungauged sites is presented. The new method (hereafter called RRFA-S) is based on corrected annual maximum series obtained from a satellite precipitation product (e.g., PERSIANN-CDR). The probability matching method (PMM) is used here for bias correction, to match the CDF of satellite-based precipitation data with that of the gauged data. The RRFA-S method was assessed through a comparative study with the traditional index flood method using the available annual maximum series of daily rainfall in two different regions in the USA (11 sites in Colorado and 18 sites in California). The leave-one-out cross-validation technique was used to represent the ungauged site condition. The results of this numerical application show that the quantile estimates obtained from the new approach are more accurate and more robust than those given by the traditional index flood method.
The effect of multiple stressors on salt marsh end-of-season biomass
Visser, J.M.; Sasser, C.E.; Cade, B.S.
2006-01-01
It is becoming more apparent that commonly used statistical methods (e.g., analysis of variance and regression) are not the best methods for estimating limiting relationships or stressor effects. A major challenge of estimating the effects associated with a measured subset of limiting factors is to account for the effects of unmeasured factors in an ecologically realistic manner. We used quantile regression to elucidate multiple stressor effects on end-of-season biomass data from two salt marsh sites in coastal Louisiana collected for 18 yr. Stressor effects evaluated based on available data were flooding, salinity, air temperature, cloud cover, precipitation deficit, grazing by muskrat, and surface water nitrogen and phosphorus. Precipitation deficit combined with surface water nitrogen provided the best two-parameter model to explain variation in the peak biomass, with different slopes and intercepts for the two study sites. Precipitation deficit, cloud cover, and temperature were significantly correlated with each other. Surface water nitrogen was significantly correlated with surface water phosphorus and muskrat density. The site with the longer duration of flooding showed reduced peak biomass even when cloud cover and surface water nitrogen were optimal. Variation in the relatively low salinity occurring in our study area did not explain any of the variation in Spartina alterniflora biomass.
Multi-element stochastic spectral projection for high quantile estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ko, Jordan, E-mail: jordan.ko@mac.com; Garnier, Josselin
2013-06-15
We investigate quantile estimation by a multi-element generalized Polynomial Chaos (gPC) metamodel, where the exact numerical model is approximated by complementary metamodels in overlapping domains that mimic the model's exact response. The gPC metamodel is constructed by the non-intrusive stochastic spectral projection approach, and function evaluation on the gPC metamodel can be considered essentially free. Thus, a large number of Monte Carlo samples from the metamodel can be used to estimate the α-quantile, for moderate values of α. As the gPC metamodel is an expansion about the means of the inputs, its accuracy may worsen away from these mean values, where the extreme events may occur. By increasing the approximation accuracy of the metamodel, we may eventually improve the accuracy of quantile estimation, but this is very expensive. A multi-element approach is therefore proposed by combining a global metamodel in the standard normal space with supplementary local metamodels constructed in bounded domains about the design points corresponding to the extreme events. To improve the accuracy and to minimize the sampling cost, sparse-tensor and anisotropic-tensor quadratures are tested in addition to the full-tensor Gauss quadrature in the construction of local metamodels; different bounds of the gPC expansion are also examined. The global and local metamodels are combined in the multi-element gPC (MEgPC) approach, and it is shown that MEgPC can be more accurate than Monte Carlo or importance sampling methods for high quantile estimation for input dimensions roughly below N=8, a limit that is very much case- and α-dependent.
Quantile rank maps: a new tool for understanding individual brain development.
Chen, Huaihou; Kelly, Clare; Castellanos, F Xavier; He, Ye; Zuo, Xi-Nian; Reiss, Philip T
2015-05-01
We propose a novel method for neurodevelopmental brain mapping that displays how an individual's values for a quantity of interest compare with age-specific norms. By estimating smoothly age-varying distributions at a set of brain regions of interest, we derive age-dependent region-wise quantile ranks for a given individual, which can be presented in the form of a brain map. Such quantile rank maps could potentially be used for clinical screening. Bootstrap-based confidence intervals are proposed for the quantile rank estimates. We also propose a recalibrated Kolmogorov-Smirnov test for detecting group differences in the age-varying distribution. This test is shown to be more robust to model misspecification than a linear regression-based test. The proposed methods are applied to brain imaging data from the Nathan Kline Institute Rockland Sample and from the Autism Brain Imaging Data Exchange (ABIDE) sample.
Kim, Tae Hyun; Lee, Eui-Kyung; Han, Euna
Overweight/obesity is a growing health risk in Korea. The impact of overweight/obesity on pharmaceutical expenditure can be larger if individuals have multiple risk factors and multiple comorbidities. The current study estimated the combined effects of overweight/obesity and other unhealthy behaviors on pharmaceutical expenditure. An instrumental variable quantile regression model was estimated using Korea Health Panel Study data. The current study extracted data from 3 waves (2009, 2010, and 2011). The final sample included 7148 person-year observations for adults aged 20 years or older. Overweight/obese individuals had higher pharmaceutical expenditure than their non-obese counterparts only at the upper quantiles of the conditional distribution of pharmaceutical expenditure (by 119% at the 90th quantile and 115% at the 95th). The current study found a stronger association at the upper quantiles among men (152%, 144%, and 150% at the 75th, 90th, and 95th quantiles, respectively) than among women (152%, 150%, and 148% at the 75th, 90th, and 95th quantiles, respectively). The association at the upper quantiles was stronger when overweight/obesity was combined with moderate to heavy drinking and no regular physical check-up, particularly among males. The current study confirms that the association with pharmaceutical expenditure is larger for overweight/obesity combined with modifiable unhealthy behaviors than for overweight/obesity alone. Assessing the effect of overweight/obesity together with lifestyle risk factors can help target groups for public health intervention programs.
NASA Astrophysics Data System (ADS)
Tan, Xuezhi; Gan, Thian Yew; Chen, Shu; Liu, Bingjun
2018-05-01
Climate change and large-scale climate patterns may result in changes in probability distributions of climate variables that are associated with changes in the mean and variability, and severity of extreme climate events. In this paper, we applied a flexible framework based on the Bayesian spatiotemporal quantile (BSTQR) model to identify climate changes at different quantile levels and their teleconnections to large-scale climate patterns such as El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), North Atlantic Oscillation (NAO) and Pacific-North American (PNA). Using the BSTQR model with time (year) as a covariate, we estimated changes in Canadian winter precipitation and their uncertainties at different quantile levels. There were some stations in eastern Canada showing distributional changes in winter precipitation such as an increase in low quantiles but a decrease in high quantiles. Because quantile functions in the BSTQR model vary with space and time and assimilate spatiotemporal precipitation data, the BSTQR model produced much spatially smoother and less uncertain quantile changes than the classic regression without considering spatiotemporal correlations. Using the BSTQR model with five teleconnection indices (i.e., SOI, PDO, PNA, NP and NAO) as covariates, we investigated effects of large-scale climate patterns on Canadian winter precipitation at different quantile levels. Winter precipitation responses to these five teleconnections were found to occur differently at different quantile levels. Effects of five teleconnections on Canadian winter precipitation were stronger at low and high than at medium quantile levels.
A quantile count model of water depth constraints on Cape Sable seaside sparrows
Cade, B.S.; Dong, Q.
2008-01-01
1. A quantile regression model for counts of breeding Cape Sable seaside sparrows Ammodramus maritimus mirabilis (L.) as a function of water depth and previous year abundance was developed based on extensive surveys, 1992-2005, in the Florida Everglades. The quantile count model extends linear quantile regression methods to discrete response variables, providing a flexible alternative to discrete parametric distributional models, e.g. Poisson, negative binomial and their zero-inflated counterparts. 2. Estimates from our multiplicative model demonstrated that negative effects of increasing water depth in breeding habitat on sparrow numbers were dependent on recent occupation history. Upper 10th percentiles of counts (one to three sparrows) decreased with increasing water depth from 0 to 30 cm when sites were not occupied in previous years. However, upper 40th percentiles of counts (one to six sparrows) decreased with increasing water depth for sites occupied in previous years. 3. The greatest decreases (-50% to -83%) in upper quantiles of sparrow counts occurred as water depths increased from 0 to 15 cm when previous year counts were 1, but a small proportion of sites (5-10%) held at least one sparrow even as water depths increased to 20 or 30 cm. 4. A zero-inflated Poisson regression model provided estimates of conditional means that also decreased with increasing water depth, but rates of change were lower and decreased with increasing previous year counts compared to the quantile count model. Quantiles computed for the zero-inflated Poisson model enhanced interpretation of this model but showed greater lack-of-fit for water depths > 0 cm and previous year counts of 1, conditions where the negative effect of water depth was readily apparent and fitted better by the quantile count model.
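A simplified, hedged sketch of the jittering idea underlying quantile regression for counts (in the spirit of Machado and Santos Silva, on which quantile count models build): add uniform noise to the counts to smooth them, fit a linear quantile regression, average over jitters, and map back to the count scale. The data and the linear-in-depth specification are illustrative assumptions, not the paper's multiplicative model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
depth = rng.uniform(0, 30, 400)                    # water depth, cm
counts = rng.poisson(np.exp(1.2 - 0.05 * depth))   # synthetic sparrow-like counts
X = sm.add_constant(depth)

tau, params = 0.9, []
for _ in range(20):                                # average over several jitters
    z = counts + rng.uniform(0, 1, counts.size)    # smooth the discrete response
    params.append(sm.QuantReg(z, X).fit(q=tau).params)
beta = np.mean(params, axis=0)

q_z = beta[0] + beta[1] * np.array([0.0, 15.0, 30.0])
q_y = np.clip(np.ceil(q_z) - 1, 0, None)           # back-map to the count scale
print("0.9 count quantile at depth 0/15/30 cm:", q_y)
```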
Estimating equivalence with quantile regression
Cade, B.S.
2011-01-01
Equivalence testing and corresponding confidence interval estimates are used to provide more enlightened statistical statements about parameter estimates by relating them to intervals of effect sizes deemed to be of scientific or practical importance rather than just to an effect size of zero. Equivalence tests and confidence interval estimates are based on a null hypothesis that a parameter estimate is either outside (inequivalence hypothesis) or inside (equivalence hypothesis) an equivalence region, depending on the question of interest and assignment of risk. The former approach, often referred to as bioequivalence testing, is often used in regulatory settings because it reverses the burden of proof compared to a standard test of significance, following a precautionary principle for environmental protection. Unfortunately, many applications of equivalence testing focus on establishing average equivalence by estimating differences in means of distributions that do not have homogeneous variances. I discuss how to compare equivalence across quantiles of distributions using confidence intervals on quantile regression estimates that detect differences in heterogeneous distributions missed by focusing on means. I used one-tailed confidence intervals based on inequivalence hypotheses in a two-group treatment-control design for estimating bioequivalence of arsenic concentrations in soils at an old ammunition testing site and bioequivalence of vegetation biomass at a reclaimed mining site. Two-tailed confidence intervals based both on inequivalence and equivalence hypotheses were used to examine quantile equivalence for negligible trends over time for a continuous exponential model of amphibian abundance.
Qu, Pengfei; Mi, Baibing; Wang, Duolao; Zhang, Ruo; Yang, Jiaomei; Liu, Danmeng; Dang, Shaonong; Yan, Hong
2017-01-01
The objective of this study was to determine the relationship between the quality of feeding practices and children's nutritional status in rural western China. A sample of 12,146 pairs of 6- to 35-month-old children and their mothers were recruited using stratified multistage cluster random sampling in rural western China. Quantile regression was used to analyze the relationship between the Infant and Child Feeding Index (ICFI) and children's nutritional status. In rural western China, 24.37% of all infants and young children suffer from malnutrition; 19.57%, 8.74% and 4.63% of infants and children are classified as stunting, underweight and wasting, respectively. After adjusting for covariates, the quantile regression results suggested that a qualified ICFI (ICFI > 13.8) was associated with all length and HAZ quantiles (P<0.05) and had a greater effect on poor length and HAZ: the β-estimates for length ranged from 0.76 cm (95% CI: 0.53 to 0.99 cm) to 0.34 cm (95% CI: 0.09 to 0.59 cm), and the β-estimates for HAZ from 0.17 (95% CI: 0.10 to 0.24) to 0.11 (95% CI: 0.04 to 0.19). A qualified ICFI was also associated with most weight quantiles (P<0.05 except the 80th and 90th quantiles) and with poor and intermediate WAZ quantiles (P<0.05 for the 10th, 20th, 30th and 40th quantiles). Additionally, a qualified ICFI had a greater effect on poor weight and WAZ quantiles, with β-estimates for weight from 0.20 kg (95% CI: 0.14 to 0.26 kg) to 0.06 kg (95% CI: 0.00 to 0.12 kg) and β-estimates for WAZ from 0.14 (95% CI: 0.08 to 0.21) to 0.05 (95% CI: 0.01 to 0.10). Feeding practices were associated with the physical development of infants and young children, and proper feeding practices had a greater effect on poor physical development. For mothers in rural western China, proper guidelines and messaging on complementary feeding practices are necessary.
Rank score and permutation testing alternatives for regression quantile estimates
Cade, B.S.; Richards, J.D.; Mielke, P.W.
2006-01-01
Performance of quantile rank score tests used for hypothesis testing and constructing confidence intervals for linear quantile regression estimates (0 ≤ τ ≤ 1) was evaluated by simulation for models with p = 2 and 6 predictors, moderate collinearity among predictors, homogeneous and heterogeneous errors, small to moderate samples (n = 20–300), and central to upper quantiles (0.50–0.99). Test statistics evaluated were the conventional quantile rank score T statistic, distributed as a χ² random variable with q degrees of freedom (where q parameters are constrained by H0), and an F statistic with its sampling distribution approximated by permutation. The permutation F-test maintained better Type I errors than the T-test for homogeneous error models with smaller n and more extreme quantiles τ. An F distributional approximation of the F statistic provided some improvement in Type I errors over the T-test for models with more than 2 parameters, smaller n, and more extreme quantiles, but not as much improvement as the permutation approximation. Both rank score tests required weighting to maintain correct Type I errors when heterogeneity under the alternative model increased to 5 standard deviations across the domain of X. A double permutation procedure was developed to provide valid Type I errors for the permutation F-test when null models were forced through the origin. Power was similar for conditions where both T- and F-tests maintained correct Type I errors, but the F-test provided some power at smaller n and extreme quantiles when the T-test had no power because of excessively conservative Type I errors. When the double permutation scheme was required for the permutation F-test to maintain valid Type I errors, power was less than for the T-test with decreasing sample size and increasing quantiles. Confidence intervals on parameters and tolerance intervals for future predictions were constructed based on test inversion for an example application relating trout densities to stream channel width:depth.
Technical note: Combining quantile forecasts and predictive distributions of streamflows
NASA Astrophysics Data System (ADS)
Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano
2017-11-01
The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this wealth of information. In particular, the use of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it even more complicated for decision makers to sift out the relevant information. In this study, multiple sources of streamflow forecast information are aggregated based on several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistics (EMOS) technique, and a novel method called Beta-transformed linear pooling (BLP) are applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, with about 5 years of forecast data, are compared, and the differences between the raw and optimally combined forecasts are highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
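The quantile score used above is the mean pinball loss over forecast cases. Below is a minimal sketch, with synthetic flows and an equal-weight pooling of two quantile forecasts standing in for the fitted BMA/NGR/BLP combination weights; all data and names are illustrative assumptions, not the study's setup.

```python
import numpy as np

def quantile_score(y_obs, q_pred, tau):
    """Mean pinball loss of quantile forecasts q_pred at probability level tau."""
    u = y_obs - q_pred
    return np.mean(np.where(u >= 0, tau * u, (tau - 1) * u))

rng = np.random.default_rng(0)
y = rng.gamma(2.0, 50.0, size=1000)              # synthetic "observed" flows
qA = np.full_like(y, np.quantile(y, 0.9))        # system A: climatological 90% quantile
qB = qA * rng.uniform(0.8, 1.2, size=y.size)     # system B: a noisier alternative
q_comb = 0.5 * qA + 0.5 * qB                     # simple equal-weight combination

for name, q in [("A", qA), ("B", qB), ("combined", q_comb)]:
    print(name, round(quantile_score(y, q, 0.9), 2))
```

A lower score indicates a better calibrated and sharper quantile forecast, which is how raw and combined forecasts can be ranked against each other.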
NASA Astrophysics Data System (ADS)
Raza, Syed Ali; Zaighum, Isma; Shah, Nida
2018-02-01
This paper examines the relationship between economic policy uncertainty and the equity premium in G7 countries using monthly data from January 1989 to December 2015 and a novel technique, namely QQ regression, proposed by Sim and Zhou (2015). Based on the QQ approach, we estimate how the quantiles of economic policy uncertainty affect the quantiles of the equity premium. This provides a more comprehensive insight into the overall dependence structure between the equity premium and economic policy uncertainty than traditional techniques like OLS or quantile regression. Overall, our empirical evidence suggests the existence of a negative association between the equity premium and EPU predominantly in all G7 countries, especially in the extreme low and extreme high tails. However, differences exist among countries and across different quantiles of EPU and the equity premium within each country. This heterogeneity among countries is due to differences in dependency on economic policy, on other stock markets, and in the linkages with other countries' equity markets.
NASA Astrophysics Data System (ADS)
Cannon, Alex
2017-04-01
Estimating historical trends in short-duration rainfall extremes at regional and local scales is challenging due to low signal-to-noise ratios and the limited availability of homogenized observational data. In addition to being of scientific interest, trends in rainfall extremes are of practical importance, as their presence calls into question the stationarity assumptions that underpin traditional engineering and infrastructure design practice. Even with these fundamental challenges, increasingly complex questions are being asked about time series of extremes. For instance, users may not only want to know whether or not rainfall extremes have changed over time; they may also want information on the modulation of trends by large-scale climate modes or on the nonstationarity of trends (e.g., identifying hiatus periods or periods of accelerating positive trends). Efforts have thus been devoted to the development and application of more robust and powerful statistical estimators for regional and local scale trends. While a standard nonparametric method like the regional Mann-Kendall test, which tests for the presence of monotonic trends (i.e., strictly non-decreasing or non-increasing changes), makes fewer assumptions than parametric methods and pools information from stations within a region, it is not designed to visualize detected trends, include information from covariates, or answer questions about the rate of change in trends. As a remedy, monotone quantile regression (MQR) has been developed as a nonparametric alternative that can be used to estimate a common monotonic trend in extremes at multiple stations. Quantile regression makes efficient use of data by directly estimating conditional quantiles based on information from all rainfall data in a region, i.e., without having to precompute the sample quantiles. The MQR method is also flexible and can be used to visualize and analyze the nonlinearity of the detected trend. However, it is fundamentally a univariate technique and cannot incorporate information from additional covariates, for example ENSO state or physiographic controls on extreme rainfall within a region. Here, the univariate MQR model is extended to allow the use of multiple covariates. Multivariate monotone quantile regression (MMQR) is based on a single hidden-layer feedforward network with the quantile regression error function and partial monotonicity constraints. The MMQR model is demonstrated via Monte Carlo simulations and the estimation and visualization of regional trends in moderate rainfall extremes based on homogenized sub-daily precipitation data at stations in Canada.
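The quantile regression error function named here is the check (pinball) loss. As a minimal sketch of that ingredient alone, the following fits an illustrative linear trend in synthetic extremes at the 95th percentile by subgradient descent; the network architecture and monotonicity constraints of MMQR are omitted, and all data are invented.

```python
import numpy as np

def linear_quantile_fit(x, y, tau, lr=5e-3, steps=20000):
    """Fit y ~ a + b*x at quantile tau by subgradient descent on the check loss."""
    a, b = float(np.median(y)), 0.0
    for _ in range(steps):
        u = y - (a + b * x)
        g = np.where(u >= 0, -tau, 1.0 - tau)   # subgradient of the check loss w.r.t. the fit
        a -= lr * g.mean()
        b -= lr * (g * x).mean()
    return a, b

rng = np.random.default_rng(1)
year = np.linspace(0.0, 1.0, 500)                  # normalized time covariate
rain = 20 + 5 * year + rng.gumbel(0.0, 3.0, 500)   # synthetic extremes with a trend
print(linear_quantile_fit(year, rain, tau=0.95))   # (intercept, slope) at the 95th percentile
```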
Fonseca, Maria de Jesus Mendes da; Juvanhol, Leidjaira Lopes; Rotenberg, Lúcia; Nobre, Aline Araújo; Griep, Rosane Härter; Alves, Márcia Guimarães de Mello; Cardoso, Letícia de Oliveira; Giatti, Luana; Nunes, Maria Angélica; Aquino, Estela M L; Chor, Dóra
2017-11-17
This paper explores the association between job strain and adiposity, using two statistical analysis approaches and considering the role of gender. The research evaluated 11,960 active baseline participants (2008-2010) in the ELSA-Brasil study. Job strain was evaluated through a demand-control questionnaire, while body mass index (BMI) and waist circumference (WC) were evaluated in continuous form. The associations were estimated using gamma regression models with an identity link function. Quantile regression models were also estimated from the final set of covariates established by gamma regression. The relationship that was found varied by analytical approach and gender. Among the women, no association was observed between job strain and adiposity in the fitted gamma models. In the quantile models, a pattern of increasing effects of high strain was observed at higher BMI and WC distribution quantiles. Among the men, high strain was associated with adiposity in the gamma regression models. However, when quantile regression was used, that association was found not to be homogeneous across outcome distributions. In addition, in the quantile models an association was observed between active jobs and BMI. Our results point to an association between job strain and adiposity that follows a heterogeneous pattern. Modelling strategies can produce different results and should, accordingly, be used to complement one another.
NASA Astrophysics Data System (ADS)
Solari, Sebastián.; Egüen, Marta; Polo, María. José; Losada, Miguel A.
2017-04-01
Threshold estimation in the Peaks Over Threshold (POT) method and the impact of the estimation method on the calculation of high return period quantiles and their uncertainty (or confidence intervals) are issues that are still unresolved. In the past, methods based on goodness of fit tests and EDF-statistics have yielded satisfactory results, but their use has not yet been systematized. This paper proposes a methodology for automatic threshold estimation, based on the Anderson-Darling EDF-statistic and goodness of fit test. When combined with bootstrapping techniques, this methodology can be used to quantify both the uncertainty of threshold estimation and its impact on the uncertainty of high return period quantiles. This methodology was applied to several simulated series and to four precipitation/river flow data series. The results obtained confirmed its robustness. For the measured series, the estimated thresholds corresponded to those obtained by nonautomatic methods. Moreover, even though the uncertainty of the threshold estimation was high, this did not have a significant effect on the width of the confidence intervals of high return period quantiles.
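A hedged sketch of the core loop of such an automatic threshold search follows: scan candidate thresholds, fit a generalized Pareto distribution (GPD) to the exceedances, and compute the Anderson-Darling statistic of the fit. The selection rule, synthetic data, and candidate grid are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

def anderson_darling(exc, c, loc, scale):
    """Anderson-Darling statistic of a fitted GPD on the exceedances."""
    z = np.sort(stats.genpareto.cdf(exc, c, loc=loc, scale=scale))
    z = np.clip(z, 1e-10, 1 - 1e-10)
    n = z.size
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(z) + np.log(1 - z[::-1])))

rng = np.random.default_rng(2)
x = rng.gamma(2.0, 10.0, size=5000)                      # synthetic daily series
for u in np.quantile(x, np.linspace(0.80, 0.98, 10)):    # candidate thresholds
    exc = x[x > u] - u
    c, loc, scale = stats.genpareto.fit(exc, floc=0.0)   # fit GPD to exceedances
    print(f"u={u:7.2f}  n={exc.size:4d}  A2={anderson_darling(exc, c, loc, scale):6.3f}")
```

In practice the lowest threshold with an acceptable statistic (or bootstrap p-value) would be retained, and bootstrap resampling of the exceedances would propagate the threshold uncertainty into the high return period quantiles.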
NASA Astrophysics Data System (ADS)
Mercogliano, P.; Rianna, G.
2017-12-01
Eminent works have highlighted that available observations display ongoing increases in extreme rainfall events, and climate models project further increases in the future. Although constraints in rainfall observation networks and uncertainties in climate modelling significantly affect such investigations, the huge impacts potentially induced by climate change (CC) suggest adopting effective adaptation measures in order to take proper precautions. In this regard, design storms are used by engineers to size hydraulic infrastructures potentially affected by the direct (e.g. pluvial/urban flooding) and indirect (e.g. river flooding) effects of extreme rainfall events. Usually they are expressed as IDF curves, mathematical relationships between rainfall Intensity, Duration, and return period (Frequency, F). They are estimated by interpreting past rainfall records through extreme value statistical theory under the assumption of stationarity, and are therefore unsuitable under climate change. In this work, a methodology to estimate future variations in IDF curves is presented and carried out for the city of Naples (Southern Italy). The Equidistance Quantile Matching approach proposed by Srivastav et al. (2014) is adopted. According to it, daily-subdaily maximum precipitation observations [a] and the analogous daily data provided by climate projections for current [b] and future time spans [c] are interpreted in IDF terms through the Generalized Extreme Value (GEV) approach. Then, a quantile-based mapping approach is used to establish statistical relationships between the cumulative distribution functions resulting from the GEV fits of [a] and [b] (spatial downscaling) and of [b] and [c] (temporal downscaling). Coupling the resulting relations permits generating IDF curves under the CC assumption. To account for uncertainties in future projections, all climate simulations available for the area in the Euro-CORDEX multimodel ensemble at 0.11° (about 12 km) are considered under three concentration scenarios (RCP2.6, RCP4.5 and RCP8.5). The results are largely influenced by the models, RCPs and time horizon of interest; nevertheless, clear indications of increases are detectable, although with different magnitudes for the different precipitation durations.
NASA Astrophysics Data System (ADS)
Mulyani, Sri; Andriyana, Yudhie; Sudartianto
2017-03-01
Mean regression is a statistical method to explain the relationship between a response variable and predictor variables based on the central tendency (mean) of the response variable. Parameter estimation in mean regression (with ordinary least squares, OLS) is problematic when applied to data that are asymmetric, fat-tailed, or contain outliers. Hence, an alternative method is necessary for that kind of data, for example quantile regression. Quantile regression is robust to outliers. This model can explain the relationship between the response variable and the predictor variable not only at the central tendency of the data (the median) but also at various quantiles, in order to obtain complete information about that relationship. In this study, a quantile regression is developed with a nonparametric approach, namely smoothing splines. A nonparametric approach is used when the model is difficult to prespecify, i.e., the relation between the two variables follows an unknown function. We apply the proposed method to poverty data, estimating the Percentage of Poor People as the response variable with the Human Development Index (HDI) as the predictor variable.
Dunham, J.B.; Cade, B.S.; Terrell, J.W.
2002-01-01
We used regression quantiles to model potentially limiting relationships between the standing crop of cutthroat trout Oncorhynchus clarki and measures of stream channel morphology. Regression quantile models indicated that variation in fish density was inversely related to the width:depth ratio of streams but not to stream width or depth alone. The spatial and temporal stability of model predictions were examined across years and streams, respectively. Variation in fish density with width:depth ratio (10th-90th regression quantiles) modeled for streams sampled in 1993-1997 predicted the variation observed in 1998-1999, indicating similar habitat relationships across years. Both linear and nonlinear models described the limiting relationships well, the latter performing slightly better. Although estimated relationships were transferable in time, results were strongly dependent on the influence of spatial variation in fish density among streams. Density changes with width:depth ratio in a single stream were responsible for the significant (P < 0.10) negative slopes estimated for the higher quantiles (>80th). This suggests that stream-scale factors other than width:depth ratio play a more direct role in determining population density. Much of the variation in densities of cutthroat trout among streams was attributed to the occurrence of nonnative brook trout Salvelinus fontinalis (a possible competitor) or connectivity to migratory habitats. Regression quantiles can be useful for estimating the effects of limiting factors when ecological responses are highly variable, but our results indicate that spatiotemporal variability in the data should be explicitly considered. In this study, data from individual streams and stream-specific characteristics (e.g., the occurrence of nonnative species and habitat connectivity) strongly affected our interpretation of the relationship between width:depth ratio and fish density.
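As an illustration of fitting regression quantiles of this kind, the sketch below estimates several upper quantiles of a synthetic density response against a width:depth predictor with statsmodels; the data, functional form, and quantile levels are invented for demonstration and do not reproduce the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
wd_ratio = rng.uniform(5, 60, 200)                              # width:depth ratio
density = rng.uniform(0, 1, 200) * np.exp(3 - 0.03 * wd_ratio)  # heterogeneous, limited response

X = sm.add_constant(wd_ratio)
for tau in (0.50, 0.80, 0.90):
    fit = sm.QuantReg(np.log(density + 0.01), X).fit(q=tau)
    print(f"tau={tau:.2f}  slope={fit.params[1]: .4f}")          # upper quantiles track the limiting boundary
```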
NASA Astrophysics Data System (ADS)
Paquet, Emmanuel; Lawrence, Deborah
2013-04-01
The SCHADEX method for extreme flood estimation was developed by Paquet et al. (2006, 2013), and since 2008 it has been the reference method used by Electricité de France (EDF) for dam spillway design. SCHADEX is a so-called "semi-continuous" stochastic simulation method in that flood events are simulated on an event basis and are superimposed on a continuous simulation of the catchment saturation hazard using rainfall-runoff modelling. The MORDOR hydrological model (Garçon, 1999) has thus far been used for the rainfall-runoff modelling. MORDOR is a conceptual, lumped, reservoir model with daily areal rainfall and air temperature as the driving input data. The principal hydrological processes represented are evapotranspiration, direct and indirect runoff, ground water, snow accumulation and melt, and routing. The model has been used intensively at EDF for more than 15 years, in particular for inflow forecasts for French mountainous catchments. SCHADEX has now also been applied to the Atnasjø catchment (463 km²), a well-documented inland catchment in south-central Norway dominated by snowmelt flooding during spring/early summer. To support this application, a weather pattern classification based on extreme rainfall was first established for Norway (Fleig, 2012). This classification scheme was then used to build a Multi-Exponential Weather Pattern (MEWP) distribution, as introduced by Garavaglia et al. (2010) for extreme rainfall estimation. The MORDOR model was then calibrated against daily discharge data for Atnasjø. Finally, a SCHADEX simulation was run to build a daily discharge distribution with a sufficient number of simulations for assessing the extreme quantiles. Detailed results are used to illustrate how SCHADEX handles the complex and interacting hydrological processes driving flood generation in this snow-driven catchment. Seasonal and monthly distributions, statistics for several thousand simulated events reaching a 1000-year return level, and an assessment of the role of snowmelt in extreme floods are presented. This study illustrates the complexity of extreme flood estimation in snow-driven catchments and the need for a good representation of snow accumulation and melting processes in simulations for design flood estimation. In particular, the SCHADEX method is able to represent a range of possible catchment conditions (representing both soil moisture and snowmelt) in which extreme flood events can occur. This study is part of a collaboration between NVE and EDF, initiated within the FloodFreq COST Action (http://www.cost-floodfreq.eu/). References: Fleig, A. (2012), Scientific Report of the Short Term Scientific Mission of Anne Fleig visiting Électricité de France, FloodFreq COST Action STSM report. Garavaglia, F., Gailhard, J., Paquet, E., Lang, M., Garçon, R., and Bernardara, P. (2010), Introducing a rainfall compound distribution model based on weather patterns sub-sampling, Hydrol. Earth Syst. Sci., 14, 951-964, doi:10.5194/hess-14-951-2010. Garçon, R. (1999), Modèle global pluie-débit pour la prévision et la prédétermination des crues, La Houille Blanche, 7-8, 88-95, doi:10.1051/lhb/1999088. Paquet, E., Gailhard, J. and Garçon, R. (2006), Evolution of the GRADEX method: improvement by atmospheric circulation classification and hydrological modeling, La Houille Blanche, 5, 80-90, doi:10.1051/lhb/2006091. Paquet, E., Garavaglia, F., Garçon, R. and Gailhard, J. (2012), The SCHADEX method: a semi-continuous rainfall-runoff simulation for extreme flood estimation, Journal of Hydrology, under revision.
Quantile Regression for Recurrent Gap Time Data
Luo, Xianghua; Huang, Chiung-Yu; Wang, Lan
2014-01-01
Evaluating covariate effects on gap times between successive recurrent events is of interest in many medical and public health studies. While most existing methods for recurrent gap time analysis focus on modeling the hazard function of gap times, a direct interpretation of the covariate effects on the gap times is not available through these methods. In this article, we consider quantile regression that can provide direct assessment of covariate effects on the quantiles of the gap time distribution. Following the spirit of the weighted risk-set method by Luo and Huang (2011, Statistics in Medicine 30, 301–311), we extend the martingale-based estimating equation method considered by Peng and Huang (2008, Journal of the American Statistical Association 103, 637–649) for univariate survival data to analyze recurrent gap time data. The proposed estimation procedure can be easily implemented in existing software for univariate censored quantile regression. Uniform consistency and weak convergence of the proposed estimators are established. Monte Carlo studies demonstrate the effectiveness of the proposed method. An application to data from the Danish Psychiatric Central Register is presented to illustrate the methods developed in this article. PMID:23489055
Cade, Brian S.; Noon, Barry R.; Scherer, Rick D.; Keane, John J.
2017-01-01
Counts of avian fledglings, nestlings, or clutch size that are bounded below by zero and above by some small integer form a discrete random variable distribution that is not approximated well by conventional parametric count distributions such as the Poisson or negative binomial. We developed a logistic quantile regression model to provide estimates of the empirical conditional distribution of a bounded discrete random variable. The logistic quantile regression model requires that counts are randomly jittered to a continuous random variable, logit transformed to bound them between specified lower and upper values, then estimated in conventional linear quantile regression, repeating the 3 steps and averaging estimates. Back-transformation to the original discrete scale relies on the fact that quantiles are equivariant to monotonic transformations. We demonstrate this statistical procedure by modeling 20 years of California Spotted Owl fledgling production (0−3 per territory) on the Lassen National Forest, California, USA, as related to climate, demographic, and landscape habitat characteristics at territories. Spotted Owl fledgling counts increased nonlinearly with decreasing precipitation in the early nesting period, in the winter prior to nesting, and in the prior growing season; with increasing minimum temperatures in the early nesting period; with adult compared to subadult parents; when there was no fledgling production in the prior year; and when percentage of the landscape surrounding nesting sites (202 ha) with trees ≥25 m height increased. Changes in production were primarily driven by changes in the proportion of territories with 2 or 3 fledglings. Average variances of the discrete cumulative distributions of the estimated fledgling counts indicated that temporal changes in climate and parent age class explained 18% of the annual variance in owl fledgling production, which was 34% of the total variance. Prior fledgling production explained as much of the variance in the fledgling counts as climate, parent age class, and landscape habitat predictors. Our logistic quantile regression model can be used for any discrete response variables with fixed upper and lower bounds.
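The jitter-logit-average procedure described above can be sketched directly. The following is a minimal illustration with hypothetical counts bounded on [0, 3] and a single invented covariate; the bounds chosen for the jittered variable and the number of repetitions are my assumptions, not the authors' settings.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 300
covar = rng.normal(0, 1, n)
counts = np.clip(rng.poisson(np.exp(0.3 - 0.4 * covar)), 0, 3)   # fake 0-3 counts

L, U = 0.0, 4.0                       # jittered counts lie strictly inside (L, U)
X = sm.add_constant(covar)
tau, reps = 0.75, 50
betas = []
for _ in range(reps):
    y_jit = counts + rng.uniform(1e-9, 1 - 1e-9, n)    # step 1: jitter to a continuous variable
    z = np.log((y_jit - L) / (U - y_jit))              # step 2: logit transform to an unbounded scale
    betas.append(sm.QuantReg(z, X).fit(q=tau).params)  # step 3: linear quantile regression
beta = np.mean(betas, axis=0)                          # average estimates over jitters

z_hat = X @ beta                                       # back-transform via equivariance of quantiles
y_hat = (L + U * np.exp(z_hat)) / (1 + np.exp(z_hat))  # to monotonic transformations
print(beta, y_hat[:5])
```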
Advances in the regionalization approach: geostatistical techniques for estimating flood quantiles
NASA Astrophysics Data System (ADS)
Chiarello, Valentina; Caporali, Enrica; Matthies, Hermann G.
2015-04-01
The knowledge of peak flow discharges and associated floods is of primary importance in engineering practice for the planning of water resources and risk assessment. Streamflow characteristics are usually estimated starting from measurements of river discharge at stream gauging stations. However, the lack of observations at sites of interest, as well as measurement inaccuracies, inevitably lead to the necessity of developing predictive models. Regional analysis is a classical approach to estimate river flow characteristics at sites where little or no data exist. Specific techniques are needed to regionalize the hydrological variables over the considered area. Top-kriging, or topological kriging, is a kriging interpolation procedure that takes into account the geometric organization and structure of the hydrographic network, the catchment area and the nested nature of catchments. The continuous processes in space defined for the point variables are represented by a variogram. In Top-kriging, the measurements are not point values but are defined over a non-zero catchment area. Top-kriging is applied here over the geographical space of the Tuscany Region in Central Italy. The analysis is carried out on the discharge data of 57 consistent runoff gauges recorded from 1923 to 2014. Top-kriging also gives an estimate of the prediction uncertainty in addition to the prediction itself. The results are validated using a cross-validation procedure implemented in the package rtop of the open-source statistical environment R and are compared through different error measurement methods. Top-kriging seems to perform better in nested and larger-scale catchments, but not for headwater catchments or where there is high variability among neighbouring catchments.
Matching a Distribution by Matching Quantiles Estimation
Sgouropoulos, Nikolaos; Yao, Qiwei; Yastremiz, Claudia
2015-01-01
Motivated by the problem of selecting representative portfolios for backtesting counterparty credit risks, we propose a matching quantiles estimation (MQE) method for matching a target distribution by that of a linear combination of a set of random variables. An iterative procedure based on ordinary least-squares estimation (OLS) is proposed to compute MQE. MQE can be easily modified by adding a LASSO penalty term if a sparse representation is desired, or by restricting the matching to a certain range of quantiles to match a part of the target distribution. The convergence of the algorithm and the asymptotic properties of the estimation, both with and without LASSO, are established. A measure and an associated statistical test are proposed to assess the goodness-of-match. The finite sample properties are illustrated by simulation. An application in selecting a counterparty representative portfolio with a real dataset is reported. The proposed MQE also finds applications in portfolio tracking, which demonstrates the usefulness of combining MQE with LASSO. PMID:26692592
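One way to read the iterative OLS procedure is: reorder the target sample so that its ranks match those of the current fitted linear combination, then refit by OLS, and repeat until the coefficients stabilize. The sketch below implements that reading on synthetic data; the data, tolerance, and iteration cap are illustrative assumptions rather than the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(5)
n, p = 2000, 4
X = rng.normal(size=(n, p))                      # candidate variables (e.g. portfolio components)
y_target = rng.normal(1.0, 2.0, n)               # sample from the target distribution

beta = np.linalg.lstsq(X, y_target, rcond=None)[0]
for _ in range(100):
    ranks = np.argsort(np.argsort(X @ beta))     # rank of each current fitted value
    y_matched = np.sort(y_target)[ranks]         # reorder target to share those ranks
    beta_new = np.linalg.lstsq(X, y_matched, rcond=None)[0]
    if np.max(np.abs(beta_new - beta)) < 1e-8:
        break
    beta = beta_new

print(beta)   # linear combination whose quantiles approximate the target's
```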
Censored quantile regression with recursive partitioning-based weights
Wey, Andrew; Wang, Lan; Rudser, Kyle
2014-01-01
Censored quantile regression provides a useful alternative to the Cox proportional hazards model for analyzing survival data. It directly models the conditional quantile of the survival time and hence is easy to interpret. Moreover, it relaxes the proportionality constraint on the hazard function associated with the popular Cox model and is natural for modeling heterogeneity of the data. Recently, Wang and Wang (2009. Locally weighted censored quantile regression. Journal of the American Statistical Association 103, 1117–1128) proposed a locally weighted censored quantile regression approach that allows for covariate-dependent censoring and is less restrictive than other censored quantile regression methods. However, their kernel smoothing-based weighting scheme requires all covariates to be continuous and encounters practical difficulty with even a moderate number of covariates. We propose a new weighting approach that uses recursive partitioning, e.g. survival trees, that offers greater flexibility in handling covariate-dependent censoring in moderately high dimensions and can incorporate both continuous and discrete covariates. We prove that this new weighting scheme leads to consistent estimation of the quantile regression coefficients and demonstrate its effectiveness via Monte Carlo simulations. We also illustrate the new method using a widely recognized data set from a clinical trial on primary biliary cirrhosis. PMID:23975800
Kowalski, Amanda
2016-01-02
Efforts to control medical care costs depend critically on how individuals respond to prices. I estimate the price elasticity of expenditure on medical care using a censored quantile instrumental variable (CQIV) estimator. CQIV allows estimates to vary across the conditional expenditure distribution, relaxes traditional censored model assumptions, and addresses endogeneity with an instrumental variable. My instrumental variable strategy uses a family member's injury to induce variation in an individual's own price. Across the conditional deciles of the expenditure distribution, I find elasticities that vary from -0.76 to -1.49, which are an order of magnitude larger than previous estimates.
NASA Astrophysics Data System (ADS)
Shao, Yuehong; Wu, Junmei; Ye, Jinyin; Liu, Yonghe
2015-08-01
This study investigates frequency analysis and the spatiotemporal characteristics of precipitation extremes based on annual maximum daily precipitation (AMP) data from 753 observation stations in China during the period 1951-2010. Several statistical methods, including L-moments, the Mann-Kendall test (MK test), Student's t test and analysis of variance (F test), are used to study different statistical properties related to the frequency and spatiotemporal characteristics of precipitation extremes. The results indicate that the AMP series of most sites have no linear trend at the 90% confidence level, but there is a distinctive decreasing trend in the Beijing-Tianjin-Tangshan region. The analysis of abrupt changes shows that there are no significant changes at most sites, and no distinctive regional patterns among the mutation sites either. An important innovation relative to previous studies is that shifts in the mean and the variance are also studied here, in order to further analyze changes in strong and weak precipitation extreme events. The shift analysis shows that more attention should be paid to drought in North China and to flood control and drought in South China, especially in regions that have no clear trend but a significant shift in the variance. More importantly, this study conducts a comprehensive analysis of a complete set of quantile estimates and their spatiotemporal characteristics in China. The spatial distribution of quantile estimates based on the AMP series demonstrates that values gradually increase from the Northwest to the Southeast with increasing duration and return period, while the increase in the estimates is smooth in the arid and semiarid regions and rapid in the humid regions. Frequency estimates for the 50-year return period agree with the maximum observations of the AMP series at most stations, which can provide a more quantitative and scientific basis for decision making.
NASA Astrophysics Data System (ADS)
Passow, Christian; Donner, Reik
2017-04-01
Quantile mapping (QM) is an established concept that allows the correction of systematic biases in multiple quantiles of the distribution of a climatic observable. It shows remarkable results in correcting biases in historical simulations against observational data and outperforms simpler correction methods that relate only to the mean or variance. Since it has been shown that bias correction of future predictions or scenario runs with basic QM can result in misleading trends in the projections, adjusted, trend-preserving versions of QM were introduced in the form of detrended quantile mapping (DQM) and quantile delta mapping (QDM) (Cannon, 2015, 2016). Still, all previous versions and applications of QM-based bias correction rely on the assumption of time-independent quantiles over the investigated period, which can be misleading in the context of a changing climate. Here, we propose a novel combination of linear quantile regression (QR) with the classical QM method to introduce a consistent, time-dependent and trend-preserving approach to bias correction for historical and future projections. Since QR is a regression method, it is possible to estimate quantiles at the same resolution as the given data and to include trends or other dependencies. We demonstrate the performance of the new method of linear regression quantile mapping (RQM) in correcting biases of temperature and precipitation products from historical runs (1959-2005) of the COSMO model in climate mode (CCLM) from the Euro-CORDEX ensemble relative to gridded E-OBS data of the same spatial and temporal resolution. A thorough comparison with established bias correction methods highlights the strengths and potential weaknesses of the new RQM approach. References: Cannon, A.J., Sobie, S.R., Murdock, T.Q.: Bias correction of GCM precipitation by quantile mapping: how well do methods preserve changes in quantiles and extremes? Journal of Climate, 28, 6938, 2015. Cannon, A.J.: Multivariate bias correction of climate model outputs: matching marginal distributions and inter-variable dependence structure. Journal of Climate, 29, 7045, 2016.
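For reference, basic empirical QM, the baseline that DQM, QDM, and the proposed RQM refine, can be sketched in a few lines; the gamma-distributed "observations" and "model" values below are synthetic stand-ins.

```python
import numpy as np

def quantile_map(model_fut, model_hist, obs_hist, n_q=99):
    """Map values through the historical model-to-observation quantile transfer function."""
    probs = np.linspace(0.01, 0.99, n_q)
    q_mod = np.quantile(model_hist, probs)
    q_obs = np.quantile(obs_hist, probs)
    return np.interp(model_fut, q_mod, q_obs)   # piecewise-linear transfer, clamped at the ends

rng = np.random.default_rng(6)
obs = rng.gamma(2.0, 2.0, 5000)                 # "observed" historical precipitation
mod = rng.gamma(2.0, 2.6, 5000) + 0.5           # biased "model" historical run
fut = rng.gamma(2.0, 2.9, 5000) + 0.5           # biased future run
corrected = quantile_map(fut, mod, obs)
print(np.quantile(fut, 0.9), np.quantile(corrected, 0.9))
```

Because this transfer function is frozen in time, trends in the corrected series can be distorted, which is exactly the limitation that trend-preserving and time-dependent variants such as RQM are designed to address.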
Hospital charges associated with motorcycle crash factors: a quantile regression analysis.
Olsen, Cody S; Thomas, Andrea M; Cook, Lawrence J
2014-08-01
Previous studies of motorcycle crash (MC) related hospital charges use trauma registries and hospital records, and do not adjust for the number of motorcyclists not requiring medical attention. This may lead to conservative estimates of helmet use effectiveness. MC records were probabilistically linked with emergency department and hospital records to obtain total hospital charges. Missing data were imputed. Multivariable quantile regression estimated reductions in hospital charges associated with helmet use and other crash factors. Motorcycle helmets were associated with reduced median hospital charges of $256 (42% reduction) and a reduced 98th percentile of $32,390 (33% reduction). After adjusting for other factors, helmets were associated with reductions in charges in all upper percentiles studied. Quantile regression models described homogeneous and heterogeneous associations between other crash factors and charges. Quantile regression comprehensively describes associations between crash factors and hospital charges. Helmet use among motorcyclists is associated with decreased hospital charges. Published by the BMJ Publishing Group Limited.
NASA Astrophysics Data System (ADS)
Paquet, E.
2015-12-01
The SCHADEX method aims at estimating the distribution of peak and daily discharges up to extreme quantiles. It couples a precipitation probabilistic model based on weather patterns with a stochastic rainfall-runoff simulation process using a conceptual lumped model. It allows exploring an exhaustive set of hydrological conditions and watershed responses to intense rainfall events. Since 2006, it has been widely applied in France to about one hundred watersheds for dam spillway design, and also abroad (Norway, Canada and central Europe, among others). However, its application to large watersheds (above 10 000 km²) faces some significant issues, spatial heterogeneity of rainfall and hydrological processes and flood peak damping due to hydraulic effects (flood plains, natural or man-made embankments) being the most important. This led to the development of an extreme flood simulation framework for large and heterogeneous watersheds, based on the SCHADEX method. Its main features are: division of the large (main) watershed into several smaller sub-watersheds, where the spatial homogeneity of the hydro-meteorological processes can reasonably be assumed and where the hydraulic effects can be neglected; identification of pilot watersheds where discharge data are available and rainfall-runoff models can thus be calibrated, which then serve as parameter donors to ungauged watersheds; spatially coherent stochastic simulations for all the sub-watersheds at the daily time step; selection of simulated events for a given return period (according to the distribution of runoff volumes at the scale of the main watershed); generation of the complete hourly hydrographs at each of the sub-watershed outlets; and routing to the main outlet with 1D or 2D hydraulic models. The presentation is illustrated with the case study of the Isère watershed (9981 km²), a French snow-driven watershed. The main novelties of the method are underlined, as well as its perspectives and future improvements.
Asymptotics of nonparametric L-1 regression models with dependent data
ZHAO, ZHIBIAO; WEI, YING; LIN, DENNIS K.J.
2013-01-01
We investigate asymptotic properties of least-absolute-deviation or median quantile estimates of the location and scale functions in nonparametric regression models with dependent data from multiple subjects. Under a general dependence structure that allows for longitudinal data and some spatially correlated data, we establish uniform Bahadur representations for the proposed median quantile estimates. The obtained Bahadur representations provide deep insights into the asymptotic behavior of the estimates. Our main theoretical development is based on studying the modulus of continuity of kernel weighted empirical process through a coupling argument. Progesterone data is used for an illustration. PMID:24955016
Spatial Scaling of Floods in Atlantic Coastal Watersheds
NASA Astrophysics Data System (ADS)
Plank, C.
2013-12-01
Climate and land use changes are altering global, regional and local hydrologic cycles. As a result, past events may not accurately represent the events that will occur in the future. Methods for hydrologic prediction, both statistical and deterministic, require adequate data for calibration. Streamflow gauges tend to be located on large rivers; as a result, statistical flood frequency analysis, which relies on gauge data, is biased towards large watersheds. Conversely, the complexity of parameterizing watershed processes in deterministic hydrological models limits these to small watersheds. Spatial scaling relationships between drainage basin area and discharge can be used to bridge these two methodologies and provide new approaches to hydrologic prediction. The relationship of discharge (Q) to drainage basin area (A) can be expressed as a power function: Q = αA^θ. This study compares scaling exponents (θ) and coefficients (α) for floods of varying magnitude across a selection of major Atlantic Coast watersheds. Comparisons are made by normalizing flood discharges to a reference-area bankfull discharge for each watershed. These watersheds capture the geologic and geomorphic transitions along the Atlantic Coast from narrow bedrock-dominated river valleys to wide coastal plain watersheds. Additionally, there is a range of hydrometeorological events that cause major floods in these basins, including tropical storms, thunderstorm systems and winter-spring storms. The mix of flood-producing events changes along a gradient as well, with tropical storms and hurricanes increasing in dominance from north to south as a significant cause of major floods. Scaling exponents and coefficients were determined for both flood quantile estimates (e.g. 1.5-, 10-, 100-year floods) and selected hydrometeorological events (e.g. hurricanes, summer thunderstorms, winter-spring storms). Initial results indicate that southern coastal plain watersheds have lower scaling exponents (θ) than northern watersheds. However, the relative magnitudes of 100-year and other large floods are higher in the coastal plain rivers. In the transition zone between northern and southern watersheds, basins like the Potomac in the Mid-Atlantic region have scaling exponents similar to those of northern river basins, but relative flood magnitudes comparable to the southern coastal plain watersheds. These differences reflect variations in both geologic/geomorphic and climatic settings. Understanding these variations is important for appropriately using these relationships to improve flood risk models and analyses.
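Given flood quantiles for a set of basins, the scaling parameters in Q = αA^θ are usually estimated by ordinary least squares on log-transformed values; the sketch below shows the arithmetic with invented drainage areas and 100-year discharges.

```python
import numpy as np

area = np.array([120.0, 450.0, 1800.0, 5200.0, 12000.0])   # drainage areas, km^2 (hypothetical)
q100 = np.array([95.0, 260.0, 800.0, 1900.0, 3600.0])      # 100-year floods, m^3/s (hypothetical)

theta, log_alpha = np.polyfit(np.log(area), np.log(q100), 1)   # slope of the log-log fit is theta
alpha = np.exp(log_alpha)
print(f"Q = {alpha:.3f} * A^{theta:.3f}")
```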
Quantile Regression Models for Current Status Data
Ou, Fang-Shu; Zeng, Donglin; Cai, Jianwen
2016-01-01
Current status data arise frequently in demography, epidemiology, and econometrics where the exact failure time cannot be determined but is only known to have occurred before or after a known observation time. We propose a quantile regression model to analyze current status data, because it does not require distributional assumptions and the coefficients can be interpreted as direct regression effects on the distribution of failure time in the original time scale. Our model assumes that the conditional quantile of failure time is a linear function of covariates. We assume conditional independence between the failure time and observation time. An M-estimator is developed for parameter estimation which is computed using the concave-convex procedure and its confidence intervals are constructed using a subsampling method. Asymptotic properties for the estimator are derived and proven using modern empirical process theory. The small sample performance of the proposed method is demonstrated via simulation studies. Finally, we apply the proposed method to analyze data from the Mayo Clinic Study of Aging. PMID:27994307
Record Balkan floods of 2014 linked to planetary wave resonance.
Stadtherr, Lisa; Coumou, Dim; Petoukhov, Vladimir; Petri, Stefan; Rahmstorf, Stefan
2016-04-01
In May 2014, the Balkans were hit by a Vb-type cyclone that brought disastrous flooding and severe damage to Bosnia and Herzegovina, Serbia, and Croatia. Vb cyclones migrate from the Mediterranean, where they absorb warm and moist air, to the north, often causing flooding in central/eastern Europe. Extreme rainfall events are increasing on a global scale, and both thermodynamic and dynamical mechanisms play a role. While thermodynamic aspects are generally well understood, there is large uncertainty associated with current and future changes in dynamics. We study the climatic and meteorological factors that influenced the catastrophic flooding in the Balkans, with a focus on large-scale circulation. We show that the Vb cyclone was unusually stationary, bringing extreme rainfall for several consecutive days, and that this situation was likely linked to a quasi-stationary circumglobal Rossby wave train. We provide evidence that this quasi-stationary wave was amplified by wave resonance. Statistical analysis of daily spring rainfall over the Balkan region reveals significant upward trends over 1950-2014, especially in the high quantiles relevant for flooding events. These changes cannot be explained by simple thermodynamic arguments, and we thus argue that dynamical processes likely played a role in increasing flood risks over the Balkans.
2018-01-01
Natural hazards (events that may cause actual disasters) are established in the literature as major causes of various massive and destructive problems worldwide. The occurrences of earthquakes, floods and heat waves affect millions of people through several impacts, including hospitalisation, loss of lives and economic challenges. The focus of this study was on reducing the risk of disasters that occur because of extremely high temperatures and heat waves. Modelling average maximum daily temperature (AMDT) guards against the disaster risk and may also help countries prepare for extreme heat. This study discusses the use of the r largest order statistics approach of extreme value theory for modelling AMDT over a period of 11 years, that is, 2000–2010. A generalised extreme value distribution for r largest order statistics is fitted to the annual maxima in an effort to study the behaviour of the r largest order statistics. The method of maximum likelihood is used to estimate the target parameters, and the frequency of occurrence of the hottest days is assessed. The study presents a case study of South Africa in which data for the non-winter season (September–April of each year) are used. The meteorological data used are the AMDT collected by the South African Weather Service and provided by Eskom. The estimate of the shape parameter reveals evidence of a Weibull class as an appropriate distribution for modelling AMDT in South Africa. The extreme quantiles for specified return periods are estimated using the quantile function, and the best model is chosen through the use of the deviance statistic with the support of graphical diagnostic tools. The Entropy Difference Test (EDT) is used as a specification test for diagnosing the fit of the models to the data.
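As a simplified companion to the approach above, the sketch below fits a standard block-maxima GEV to a short synthetic series of annual maxima and reads extreme quantiles off the fitted quantile function; note that scipy fits the ordinary GEV likelihood, not the r largest order statistics likelihood used in the study, and the data are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
annual_max_temp = 32 + rng.gumbel(0.0, 1.5, size=11)   # 11 years of synthetic AMDT (deg C)

shape, loc, scale = stats.genextreme.fit(annual_max_temp)   # note scipy's sign convention for shape
for T in (10, 20, 50):
    p = 1 - 1 / T                                           # non-exceedance probability
    level = stats.genextreme.ppf(p, shape, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.2f}")
```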
Fuzzifying historical peak water levels: case study of the river Rhine at Basel
NASA Astrophysics Data System (ADS)
Salinas, Jose Luis; Kiss, Andrea; Blöschl, Günter
2016-04-01
Hydrological information comes from a variety of sources, which in some cases may be imprecise. In particular, this is an important issue for the available information on water stages during historical floods. An accurate estimation of the water level profile, together with an elevation model of the riverbed and floodplain areas, is fundamental for the hydraulic reconstruction of historical flood events, allowing the back-calculation of flood peak discharges, velocity and erosion fields, and damages, among others. For the greatest floods during the last 1700 years, Wetter et al. (2011) reconstructed the water levels and historical discharges at different locations in the old city centre of Basel from a variety of historical sources (stone marks, official documents, paintings, etc.). This work presents a model for the inherent imprecision of these historical water levels, described through the arithmetic of fuzzy numbers and their membership functions, in a similar fashion as the probability density function describes the uncertainty of a random variable. In addition to the water stages collected in situ from floodmarks, other documentary evidence (e.g. preserved in narratives and newspaper flood reports) is also amenable to fuzzy modelling. This study presents the use of fuzzy logic to transform historical information from different sources, in this case flood water stages, into membership functions. These values may then be introduced into the mathematical framework of fuzzy Bayesian inference to perform statistical analyses with the rules of fuzzy number algebra. The results of this flood frequency analysis, as in the traditional non-fuzzy case, link discharges with exceedance probabilities or return periods. The main difference is that the modelled discharge quantiles are not precise values but fuzzy numbers, represented by their membership functions, explicitly including the imprecision of the historical information used. Wetter, O., Pfister, C., Weingartner, R., Luterbacher, J., Reist, T., & Trösch, J. (2011) The largest floods in the High Rhine basin since 1268 assessed from documentary and instrumental evidence. Hydrol. Sci. J. 56(5), 733-758.
Assessing the quality of rainfall data when aiming to achieve flood resilience
NASA Astrophysics Data System (ADS)
Hoang, C. T.; Tchiguirinskaia, I.; Schertzer, D.; Lovejoy, S.
2012-04-01
A new EU Floods Directive entered into force five years ago. This Directive requires Member States to coordinate adequate measures to reduce flood risk. European flood management systems require reliable rainfall statistics, e.g. Intensity-Duration-Frequency curves for shorter and shorter durations and for a larger and larger range of return periods. Preliminary studies showed that the number of floods was lower when estimated using low time resolution data of high intensity rainfall events, compared to estimates obtained with higher time resolution data. These facts suggest that particular attention should be paid to rainfall data quality in order to adequately investigate flood risk when aiming to achieve flood resilience. The potential consequences of changes in measuring and recording techniques have been somewhat discussed in the literature with respect to the possible introduction of artificial inhomogeneities in time series. In this paper, we discuss how to detect another artificiality: most rainfall time series have a lower recording frequency than is assumed, and furthermore the effective high-frequency limit often depends on the recording year due to algorithm changes. This question is particularly important for operational hydrology, because an error in the effective recording frequency introduces biases in the corresponding statistics. In this direction, we developed a first version of a SERQUAL procedure to automatically detect the effective time resolution of highly mixed data. Applied to 166 rainfall time series in France, the SERQUAL procedure detected that most of them have an effective hourly resolution rather than a 5-minute resolution. Furthermore, series having an overall 5-minute resolution do not have it for all years. These results raise serious concerns about how to benchmark stochastic rainfall models at a sub-hourly resolution, which is particularly desirable for operational hydrology. Therefore, database quality must be checked before use. Because the multiple scales and possible scaling behaviour of hydrological data are particularly important for many applications, including flood resilience research, this paper first investigates the sensitivity of the scaling estimates and methods to the deficit of short-duration rainfall data, and consequently proposes a few simple criteria for a reliable evaluation of data quality. We then show that our SERQUAL procedure enables us to extract high-quality sub-series from longer time series that are much more reliable for calibrating and/or validating short-duration quantiles and hydrological models.
Using nonlinear quantile regression to estimate the self-thinning boundary curve
Quang V. Cao; Thomas J. Dean
2015-01-01
The relationship between tree size (quadratic mean diameter) and tree density (number of trees per unit area) has been a topic of research and discussion for many decades. Starting with Reineke in 1933, the maximum size-density relationship, on a log-log scale, has been assumed to be linear. Several techniques, including linear quantile regression, have been employed...
Mi, Baibing; Dang, Shaonong; Li, Qiang; Zhao, Yaling; Yang, Ruihai; Wang, Duolao; Yan, Hong
2015-07-01
Hypertensive patients have more complex health care needs and are more likely to have poorer health-related quality of life than normotensive people. Awareness of hypertension could be related to reduced health-related quality of life. We propose the use of quantile regression to explore in more detail the relationship between awareness of hypertension and health-related quality of life. In a cross-sectional, population-based study, 2737 participants (including 1035 hypertensive patients and 1702 normotensive participants) completed the Short-Form Health Survey. A quantile regression model was employed to investigate the association of physical component summary scores and mental component summary scores with awareness of hypertension and to evaluate the associated factors. Patients who were aware of hypertension (N = 554) had lower scores than patients who were unaware of hypertension (N = 481). The median (IQR) physical component summary score was 48.20 (13.88) versus 53.27 (10.79), P < 0.01; the median (IQR) mental component summary score was 50.68 (15.09) versus 51.70 (10.65), P = 0.03. After adjusting for covariates, the quantile regression results suggest that awareness of hypertension was associated with most physical component summary score quantiles (P < 0.05 except the 10th and 20th quantiles), with β-estimates from -2.14 (95% CI: -3.80 to -0.48) to -1.45 (95% CI: -2.42 to -0.47), and with the same significant trend for some poorer mental component summary score quantiles, with β-estimates from -3.47 (95% CI: -6.65 to -0.39) to -2.18 (95% CI: -4.30 to -0.06). Awareness of hypertension had a greater effect on those with intermediate physical component summary status: the β-estimate was -2.04 (95% CI: -3.51 to -0.57, P < 0.05) at the 40th quantile and decreased further to -1.45 (95% CI: -2.42 to -0.47, P < 0.01) at the 90th quantile. Awareness of hypertension was negatively related to health-related quality of life in hypertensive patients in rural western China, with a greater effect on mental component summary scores at poorer status and on physical component summary scores at intermediate status.
Effects of export concentration on CO2 emissions in developed countries: an empirical analysis.
Apergis, Nicholas; Can, Muhlis; Gozgor, Giray; Lau, Chi Keung Marco
2018-03-08
This paper provides evidence on the short- and long-run effects of export product concentration on the level of CO2 emissions in 19 developed (high-income) economies, spanning the period 1962-2010. To this end, the paper makes use of nonlinear panel unit root and cointegration tests with multiple endogenous structural breaks. It also considers mean group estimations, the autoregressive distributed lag model, and panel quantile regression estimations. The findings illustrate that the environmental Kuznets curve (EKC) hypothesis is valid in the panel dataset of 19 developed economies. In addition, it documents that a higher level of export product concentration leads to lower CO2 emissions. The results from the panel quantile regressions also indicate that the effect of export product concentration upon per capita CO2 emissions is relatively high at the higher quantiles.
NASA Astrophysics Data System (ADS)
Farsadnia, F.; Rostami Kamrood, M.; Moghaddam Nia, A.; Modarres, R.; Bray, M. T.; Han, D.; Sadatinejad, J.
2014-02-01
One of several methods for estimating flood quantiles in ungauged or data-scarce watersheds is regional frequency analysis. Among the approaches to regional frequency analysis, different clustering techniques have been proposed in the literature to determine hydrologically homogeneous regions. Recently, the Self-Organizing Feature Map (SOM), a modern hydroinformatic tool, has been applied in several studies for clustering watersheds. However, further studies are still needed on the interpretation of the SOM output map for identifying hydrologically homogeneous regions. In this study, a two-level SOM and three clustering methods (fuzzy c-means, K-means, and Ward's agglomerative hierarchical clustering) are applied in an effort to identify hydrologically homogeneous regions in the watersheds of Mazandaran province in northern Iran, and their results are compared with each other. First, the SOM is used to form a two-dimensional feature map. Next, the output nodes of the SOM are clustered using the unified distance matrix algorithm and the three clustering methods to form regions for flood frequency analysis. The heterogeneity test indicates that the four regions achieved by the two-level SOM and Ward approach are, after adjustments, sufficiently homogeneous. The results suggest that the combination of SOM and Ward is much better than the combination of either SOM and FCM or SOM and K-means.
Robust neural network with applications to credit portfolio data analysis.
Feng, Yijia; Li, Runze; Sudjianto, Agus; Zhang, Yiyun
2010-01-01
In this article, we study nonparametric conditional quantile estimation via a neural network structure. We propose an estimation method that combines quantile regression and neural networks (robust neural network, RNN). It provides good smoothing performance in the presence of outliers and can be used to construct prediction bands. A majorization-minimization (MM) algorithm was developed for optimization. A Monte Carlo simulation study is conducted to assess the performance of the RNN. Comparison with other nonparametric regression methods (e.g., local linear regression and regression splines) in a real data application demonstrates the advantage of the newly proposed procedure.
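The paper's MM algorithm is not reproduced here; as a loose illustration of combining a neural network with the quantile check loss, the sketch below trains a small network by plain gradient descent instead, on toy data:

```python
# Sketch: a tiny network minimizing the quantile ("check") loss, shown with
# plain gradient descent rather than the paper's MM algorithm.
import torch

tau = 0.9
x = torch.linspace(-3, 3, 400).unsqueeze(1)
y = torch.sin(x) + 0.3 * torch.randn_like(x)          # toy noisy data

net = torch.nn.Sequential(
    torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1)
)
opt = torch.optim.Adam(net.parameters(), lr=0.01)

for step in range(2000):
    err = y - net(x)
    # Check loss: tau * err for under-predictions, (tau - 1) * err otherwise.
    loss = torch.mean(torch.maximum(tau * err, (tau - 1) * err))
    opt.zero_grad()
    loss.backward()
    opt.step()

print(float(loss))   # fitted curve net(x) estimates the conditional 0.9-quantile
```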
Murphy, Elizabeth A.; Straub, Timothy D.; Soong, David T.; Hamblen, Christopher S.
2007-01-01
Results of the hydrologic model, flood-frequency, hydraulic model, and flood-hazard analyses of the Blackberry Creek watershed in Kendall County, Illinois, indicate that the 100-year and 500-year flood plains cover approximately 3,699 and 3,762 acres of land, respectively. On the basis of land-cover data for 2003, most of the land in the flood plains was cropland and residential land. Although many acres of residential land were included in the flood plain, this land was mostly lawns, with 25 homes within the 100-year flood plain and 41 homes within the 500-year flood plain in the 2003 aerial photograph. This report describes the data-collection activities undertaken to refine the hydrologic and hydraulic models used in an earlier study of the Kane County part of the Blackberry Creek watershed and to extend the flood-frequency analysis through water year 2003. The results of the flood-hazard analysis are presented in graphical and tabular form. The hydrologic model, Hydrological Simulation Program - FORTRAN (HSPF), was used to simulate continuous water movement through various land-use patterns in the watershed. Flood-frequency analysis was applied to an annual maximum series to determine flood quantiles in subbasins for the flood-hazard analysis. The Hydrologic Engineering Center River Analysis System (HEC-RAS) hydraulic model was used to determine the 100-year and 500-year flood elevations and the 100-year floodway. The hydraulic model was calibrated and verified using observations from three storms at two crest-stage gages and the U.S. Geological Survey streamflow-gaging station near Yorkville. Digital maps of the 100-year and 500-year flood plains and the 100-year floodway for each tributary and the main stem of Blackberry Creek were compiled.
Statistical downscaling modeling with quantile regression using lasso to estimate extreme rainfall
NASA Astrophysics Data System (ADS)
Santri, Dewi; Wigena, Aji Hamim; Djuraidah, Anik
2016-02-01
Rainfall is one of the climatic elements with high variability, and extreme rainfall in particular has many negative impacts. Methods are therefore needed to minimize the damage that may occur. So far, general circulation models (GCMs) are the best available tool for projecting global climate change, including extreme rainfall. Statistical downscaling (SD) is a technique for developing a relationship between GCM output, as global-scale independent variables, and rainfall, as a local-scale response variable. Using GCM output directly is difficult because it is high dimensional and its variables are multicollinear. Common methods for handling this problem are principal component analysis (PCA) and partial least squares regression. A newer alternative is the lasso, which has the advantage of simultaneously controlling the variance of the fitted coefficients and performing automatic variable selection. Quantile regression is a method that can be used to detect extreme rainfall at both the dry and wet extremes. The objective of this study is to model SD using quantile regression with the lasso to predict extreme rainfall in Indramayu. The results showed that extreme rainfall (extreme wet in January, February, and December) in Indramayu could be predicted properly by the model at the 90th quantile.
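For illustration, scikit-learn's QuantileRegressor pairs the check loss with an L1 (lasso) penalty; the sketch below applies it at the 90th quantile to simulated, collinear stand-in predictors (not the study's GCM data):

```python
# Sketch: L1-penalized (lasso) quantile regression at the 90th quantile,
# in the spirit of statistical downscaling with many collinear predictors.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(7)
n, p = 300, 40                       # 40 correlated stand-in GCM predictors
base = rng.normal(size=(n, 5))
X = base @ rng.normal(size=(5, p)) + 0.1 * rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.0 * X[:, 3] + rng.gumbel(0, 2, n)   # heavy right tail

model = QuantileRegressor(quantile=0.9, alpha=0.05, solver="highs")
model.fit(X, y)
print("nonzero coefficients:", np.sum(model.coef_ != 0))
```

The L1 penalty (`alpha`) drives most of the 40 collinear coefficients to exactly zero, which is the automatic variable selection the abstract attributes to the lasso.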
Parameter Heterogeneity In Breast Cancer Cost Regressions – Evidence From Five European Countries
Banks, Helen; Campbell, Harry; Douglas, Anne; Fletcher, Eilidh; McCallum, Alison; Moger, Tron Anders; Peltola, Mikko; Sveréus, Sofia; Wild, Sarah; Williams, Linda J.; Forbes, John
2015-01-01
Abstract We investigate parameter heterogeneity in breast cancer 1-year cumulative hospital costs across five European countries as part of the EuroHOPE project. The paper aims to explore whether conditional mean effects provide a suitable representation of the national variation in hospital costs. A cohort of patients with a primary diagnosis of invasive breast cancer (ICD-9 code 174 and ICD-10 C50 codes) is derived using routinely collected individual breast cancer data from Finland, the metropolitan area of Turin (Italy), Norway, Scotland and Sweden. Conditional mean effects are estimated by ordinary least squares for each country, and quantile regressions are used to explore heterogeneity across the conditional quantile distribution. Point estimates based on conditional mean effects provide a good approximation of treatment response for some key demographic and diagnostic-specific variables (e.g. age and ICD-10 diagnosis) across the conditional quantile distribution. For many policy variables of interest, however, there is considerable evidence of parameter heterogeneity that is concealed if decisions are based solely on conditional mean results. The use of quantile regression methods reinforces the need to look beyond an average effect, given the greater recognition that breast cancer is a complex disease reflecting patient heterogeneity. © 2015 The Authors. Health Economics Published by John Wiley & Sons Ltd. PMID:26633866
Quantile equivalence to evaluate compliance with habitat management objectives
Cade, Brian S.; Johnson, Pamela R.
2011-01-01
Equivalence estimated with linear quantile regression was used to evaluate compliance with habitat management objectives at Arapaho National Wildlife Refuge based on monitoring data collected in upland (5,781 ha; n = 511 transects) and riparian and meadow (2,856 ha; n = 389 transects) habitats from 2005 to 2008. Quantiles were used because the management objectives specified proportions of the habitat area that needed to comply with vegetation criteria. The linear model was used to obtain estimates that were averaged across 4 y. The equivalence testing framework allowed us to interpret confidence intervals for estimated proportions with respect to intervals of vegetative criteria (equivalence regions) in either a liberal, benefit-of-doubt or conservative, fail-safe approach associated with minimizing alternative risks. Simple Boolean conditional arguments were used to combine the quantile equivalence results for individual vegetation components into a joint statement for the multivariable management objectives. For example, management objective 2A required at least 809 ha of upland habitat with a shrub composition ≥0.70 sagebrush (Artemisia spp.), 20–30% canopy cover of sagebrush ≥25 cm in height, ≥20% canopy cover of grasses, and ≥10% canopy cover of forbs on average over 4 y. Shrub composition and canopy cover of grass each were readily met on >3,000 ha under either conservative or liberal interpretations of sampling variability. However, there were only 809–1,214 ha (conservative to liberal) with ≥10% forb canopy cover and 405–1,098 ha with 20–30% canopy cover of sagebrush ≥25 cm in height. Only 91–180 ha of uplands simultaneously met criteria for all four components, primarily because canopy cover of sagebrush and forbs was inversely related when considered at the spatial scale (30 m) of a sample transect. We demonstrate how the quantile equivalence analyses also can help refine the numerical specification of habitat objectives and explore specification of spatial scales for objectives with respect to sampling scales used to evaluate those objectives.
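As a loose sketch of the equivalence idea (not the authors' exact procedure), one can estimate a quantile of a vegetation attribute with a confidence interval and compare the interval to an equivalence region; the bounds and data below are hypothetical:

```python
# Sketch: equivalence-style check for a habitat criterion using an
# intercept-only quantile regression. The median of a simulated canopy-cover
# variable is estimated with a CI and compared to a 20-30% equivalence region.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
cover = rng.normal(25, 6, 200).clip(0, 100)   # simulated % canopy cover

X = np.ones((len(cover), 1))                  # intercept-only design
res = sm.QuantReg(cover, X).fit(q=0.5)
lo, hi = res.conf_int()[0]                    # 95% CI for the median

region = (20.0, 30.0)
# Conservative ("fail-safe") reading: require the whole CI inside the region.
conservative_pass = region[0] <= lo and hi <= region[1]
# Liberal ("benefit-of-doubt") reading: require the CI to overlap the region.
liberal_pass = hi >= region[0] and lo <= region[1]
print(res.params[0], (lo, hi), conservative_pass, liberal_pass)
```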
Wang, Wen-Cheng; Cho, Wen-Chien; Chen, Yin-Jen
2014-01-01
It is estimated that mainland Chinese tourists travelling to Taiwan can bring annual revenues of 400 billion NTD to the Taiwan economy. Thus, how the Taiwanese Government formulates relevant measures to satisfy both sides is the focus of much concern. Taiwan must improve the facilities and service quality of its tourism industry so as to attract more mainland tourists. This paper conducted a questionnaire survey of mainland tourists and used grey relational analysis in grey mathematics to analyze the satisfaction performance of all satisfaction question items. The first eight satisfaction items were used as independent variables, and the overall satisfaction performance was used as the dependent variable in a quantile regression analysis of the relationship between the dependent variable and the independent variables at different quantiles. Finally, this study discussed the predictive accuracy of the least-squares (mean) regression model and each quantile regression model, as a reference for research personnel. The analysis results showed that, in addition to occupation and age, other variables could also affect the overall satisfaction performance of mainland tourists. The overall predictive accuracy of quantile regression model Q0.25 was higher than that of the other three models. PMID:24574916
Smooth conditional distribution function and quantiles under random censorship.
Leconte, Eve; Poiraud-Casanova, Sandrine; Thomas-Agnan, Christine
2002-09-01
We consider a nonparametric random-design regression model in which the response variable is possibly right censored. The aim of this paper is to estimate the conditional distribution function and the conditional alpha-quantile of the response variable. We restrict attention to the case where the response variable and the explanatory variable are both unidimensional and continuous. We propose and discuss two classes of estimators that are smooth with respect to the response variable as well as to the covariate. Simulations demonstrate that the new methods have better mean squared error performance than the generalized Kaplan-Meier estimator introduced by Beran (1981) and considered in the literature by Dabrowska (1989, 1992) and Gonzalez-Manteiga and Cadarso-Suarez (1994).
Liu, Zun-Lei; Yuan, Xing-Wei; Yan, Li-Ping; Yang, Lin-Lin; Cheng, Jia-Hua
2013-09-01
Using 2008-2010 survey data on the body condition of small yellow croaker in the offshore waters of the southern Yellow Sea (SYS), the open waters of the northern East China Sea (NECS), and the offshore waters of the middle East China Sea (MECS), this paper analyzed the spatial heterogeneity of the body length-body mass relationships of juvenile and adult small yellow croakers using mean regression and quantile regression models. The results showed that the residual standard errors from the analysis of covariance (ANCOVA) and the linear mixed-effects model were similar, and those from the simple linear regression were the highest. For the juvenile small yellow croakers, the mean body mass in SYS and NECS estimated by the mixed-effects mean regression model was higher than the overall average mass across the three regions, while the mean body mass in MECS was below the overall average. For the adult small yellow croakers, the mean body mass in NECS was higher than the overall average, while the mean body mass in SYS and MECS was below the overall average. The results from quantile regression indicated substantial differences in the allometric relationships of juvenile small yellow croakers between SYS, NECS, and MECS, with the estimated mean exponent of the allometric relationship in SYS being 2.85 and the interquartile range being from 2.63 to 2.96, indicating heterogeneity of body form. The results from ANCOVA showed that the allometric body length-body mass relationships were significantly different between the 25th and 75th percentile exponent values (F=6.38, df=1737, P<0.01) and between the 25th percentile and median exponent values (F=2.35, df=1737, P=0.039). The relationship was marginally different between the median and 75th percentile exponent values (F=2.21, df=1737, P=0.051). The estimated body length-body mass exponent of adult small yellow croakers in SYS was 3.01 (10th and 95th percentiles = 2.77 and 3.1, respectively). The estimated body length-body mass relationships were significantly different between the lower and upper quantiles of the exponent (F=3.31, df=2793, P=0.01) and between the median and upper quantiles (F=3.56, df=2793, P<0.01), while no significant difference was observed between the lower and median quantiles (F=0.98, df=2793, P=0.43).
Solvency supervision based on a total balance sheet approach
NASA Astrophysics Data System (ADS)
Pitselis, Georgios
2009-11-01
In this paper we investigate the adequacy of the own funds a company requires in order to remain healthy and avoid insolvency. Two methods are applied here: the quantile regression method and the method of mixed-effects models. Quantile regression is capable of providing a more complete statistical analysis of the stochastic relationship among random variables than least squares estimation. The estimated mixed-effects line can be considered an internal industry equation (norm), which describes a systematic relation between a dependent variable (such as own funds) and independent variables (e.g. financial characteristics, such as assets, provisions, etc.). The two methods are implemented with two data sets.
Shwartz, Michael; Peköz, Erol A; Burgess, James F; Christiansen, Cindy L; Rosen, Amy K; Berlowitz, Dan
2014-12-01
Two approaches are commonly used for identifying high-performing facilities on a performance measure: one, that the facility is in a top quantile (eg, quintile or quartile); and two, that a confidence interval is below (or above) the average of the measure for all facilities. This type of yes/no designation often does not do well in distinguishing high-performing from average-performing facilities. Our objective was to illustrate an alternative continuous-valued metric for profiling facilities, the probability that a facility is in a top quantile, and to show the implications of using this metric for profiling and pay-for-performance. We created a composite measure of quality from fiscal year 2007 data based on 28 quality indicators from 112 Veterans Health Administration nursing homes. A Bayesian hierarchical multivariate normal-binomial model was used to estimate shrunken rates of the 28 quality indicators, which were combined into a composite measure using opportunity-based weights. Rates were estimated using Markov chain Monte Carlo methods as implemented in WinBUGS. The probability metric was calculated from the simulation replications. Our probability metric allowed better discrimination of high performers than the point or interval estimate of the composite score. In a pay-for-performance program, a smaller top quantile (eg, a quintile) resulted in more resources being allocated to the highest performers, whereas a larger top quantile (eg, being above the median) distinguished less among high performers and allocated more resources to average performers. The probability metric has potential but needs to be evaluated by stakeholders in different types of delivery systems.
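A toy version of the probability metric, with simulated normal draws standing in for the paper's MCMC output:

```python
# Sketch: "probability of being in the top quintile" computed from posterior
# draws of facility-level composite scores. Draws are simulated from
# arbitrary normal distributions, standing in for real posterior samples.
import numpy as np

rng = np.random.default_rng(11)
n_fac, n_draws = 112, 4000
true_means = rng.normal(0, 1, n_fac)
draws = rng.normal(true_means, 0.5, size=(n_draws, n_fac))

cutoff = int(np.ceil(0.2 * n_fac))            # top quintile size
# For each draw, rank facilities; count how often each lands in the top 20%.
ranks = np.argsort(np.argsort(-draws, axis=1), axis=1)   # 0 = best
p_top = (ranks < cutoff).mean(axis=0)

print(np.round(np.sort(p_top)[-5:], 3))       # five most probable top performers
```

Unlike a yes/no quintile flag, `p_top` grades facilities continuously, so two facilities just above and below a quintile boundary get similar values rather than opposite designations.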
Project Lifespan-based Nonstationary Hydrologic Design Methods for Changing Environment
NASA Astrophysics Data System (ADS)
Xiong, L.
2017-12-01
Under a changing environment, design floods must be associated with the design life period of a project to ensure that the hydrologic design is really relevant to the operation of the project, because the design value for a given exceedance probability over the project life period can differ significantly from that over other time periods of the same length, due to the nonstationarity of probability distributions. Several hydrologic design methods that take the design life period of projects into account have been proposed in recent years: the expected number of exceedances (ENE), design life level (DLL), equivalent reliability (ER), and average design life level (ADLL). Among the four methods compared, the ENE and ER methods are return period-based, while DLL and ADLL are risk/reliability-based methods that estimate design values for given probability values of risk or reliability. However, the four methods can be unified under a general framework through a relationship transforming the so-called representative reliability (RRE) into the return period, m = 1/(1 - RRE); for example, RRE = 0.99 corresponds to m = 100 years. The nonstationary design quantiles and associated confidence intervals calculated by ENE, ER and ADLL were very similar, since ENE or ER is a special case of, or has a similar expression form to, ADLL. In particular, the design quantiles calculated by ENE and ADLL were the same when the return period was equal to the length of the design life. In addition, DLL can yield similar design values if the relationship between DLL and ER/ADLL return periods is considered. Furthermore, ENE, ER and ADLL had good adaptability to either increasing or decreasing situations, yielding design quantiles that are neither too large nor too small. This is important for applications of nonstationary hydrologic design methods in actual practice, given the concern over choosing the emerging nonstationary methods versus the traditional stationary methods. There is still a long way to go in the conceptual transition from stationarity to nonstationarity in hydrologic design.
NASA Astrophysics Data System (ADS)
Rianna, G.; Mercogliano, P.
2017-12-01
Urbanization increases flood risk through heightened vulnerability, stemming from population concentration, and heightened hazard, because soil sealing covers much of the urban settlement and reduces the concentration time of the affected basins. Furthermore, current and future hazards are exacerbated by expected increases in extreme rainfall events due to climate change (CC), rendering inadequate urban drainage infrastructures designed under the assumption of stationary conditions. In this work, we present a modeling chain to assess the potential increase in pluvial flood hazard that is able to take CC forcing into account. The adopted simulation chain relies on three main elements. First, the regional climate model COSMO_CLM, dynamically downscaling the GCM CMCC_CM (Scoccimarro et al., 2011) and optimized at high resolution (about 8 km) over Italy by Bucchignani et al. (2015), provides precipitation projections up to 2100 under two concentration scenarios (RCP4.5 and RCP8.5). Second, these projections are used in the equidistant quantile mapping (EQM) approach, developed by Srivastav et al. (2014), to estimate expected variations in IDF (intensity-duration-frequency) curves calculated through a generalized extreme value (GEV) approach on the basis of available rainfall data; observations for 1971-2000 are used as the reference. Finally, a coupled 1-D/2-D urban drainage/flooding model forced by the IDF curves (current and projected) is used to simulate storm-sewer surcharge and surface inundation and to establish the variations in urban flooding risk. The city center of Naples (southern Italy) is considered as the test case. Its sewage and urban drainage network is highly complex owing to the historical and subsequent transformations of the city. Under such constraints, the reliability of the results may be deeply conditioned by uncertainties, though this does not undermine the illustrative purpose of the work. Briefly, EQM returns a remarkable increase in extreme precipitation; the increase is driven by the concentration scenario (higher for RCP8.5) and the investigated time horizon (more significant for the 2071-2100 time span). Furthermore, the results provided by the hydraulic models clearly highlight the inadequacy of the actual drainage system, especially under an RCP8.5-driven scenario, showing large portions of the city center flooded.
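EQM is sketched below in one common additive CDF-matching form (an assumption about the method's usual formulation, not code from the paper), using simulated gamma-distributed rainfall:

```python
# Sketch: additive equidistant quantile mapping for precipitation-like data.
# obs_hist: observations for the reference period; mod_hist / mod_fut:
# model output for the reference and future periods (all simulated here).
import numpy as np

rng = np.random.default_rng(5)
obs_hist = rng.gamma(2.0, 6.0, 3000)
mod_hist = rng.gamma(2.0, 5.0, 3000)          # model biased low
mod_fut = rng.gamma(2.0, 5.5, 3000)           # future run, still biased

probs = np.linspace(0.01, 0.99, 99)
q_obs = np.quantile(obs_hist, probs)
q_hist = np.quantile(mod_hist, probs)

# CDF position of each future value within the future simulation itself.
p_fut = np.searchsorted(np.sort(mod_fut), mod_fut) / len(mod_fut)
p_fut = np.clip(p_fut, probs[0], probs[-1])

# Adjust by the obs-minus-model quantile difference at that position.
corrected = mod_fut + np.interp(p_fut, probs, q_obs - q_hist)
print(np.quantile(mod_fut, 0.99), np.quantile(corrected, 0.99))
```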
2014-12-01
[Fragmentary record; only parts of an abbreviations list and abstract excerpts survive.] Abbreviations include: … Primary Military Occupational Specialty; PRO, Proficiency; Q-Q, Quantile-Quantile; RSS, Residual Sum of Squares; SI, Shop Information; T&R, Training and … The study constructs multivariate linear regression models to estimate Marines' Computed Tier Score and time to achieve E-4 based on their individual personal … ASVAB General Science (GS) score, ASVAB Mathematics Knowledge (MK) score, ASVAB Paragraph Comprehension (PC) score, weight, and whether a Marine receives a weight …
Lee, Seung-Mi; Choi, In-Sun; Han, Euna; Suh, David; Shin, Eun-Kyung; Je, Seyunghe; Lee, Sung Su; Suh, Dong-Churl
2018-01-01
This study aimed to estimate treatment costs attributable to overweight and obesity in patients with diabetes who were less than 65 years of age in the United States. This study used data from the Medical Expenditure Panel Survey from 2001 to 2013. Patients with diabetes were identified by using the International Classification of Diseases, Ninth Revision, Clinical Modification code (250), clinical classification codes (049 and 050), or self-reported physician diagnoses. Total treatment costs attributable to overweight and obesity were calculated as the differences in the adjusted costs compared with individuals with diabetes and normal weight. Adjusted costs were estimated by using generalized linear models or unconditional quantile regression models. The mean annual treatment costs attributable to obesity were $1,852 higher than those attributable to normal weight, while costs attributable to overweight were $133 higher. The unconditional quantile regression results indicated that the impact of obesity on total treatment costs gradually became more significant as treatment costs approached the upper quantile. Among patients with diabetes who were less than 65 years of age, patients with diabetes and obesity have significantly higher treatment costs than patients with diabetes and normal weight. The economic burden of diabetes to society will continue to increase unless more proactive preventive measures are taken to effectively treat patients with overweight or obesity. © 2017 The Obesity Society.
Helbich, Marco; Klein, Nadja; Roberts, Hannah; Hagedoorn, Paulien; Groenewegen, Peter P
2018-06-20
Exposure to green space seems to be beneficial for self-reported mental health. In this study we used an objective health indicator, namely antidepressant prescription rates. Current studies rely exclusively upon mean regression models assuming linear associations. It is, however, plausible that the presence of green space is non-linearly related with different quantiles of the outcome, antidepressant prescription rates, and these restrictions may contribute to inconsistent findings. Our aim was: a) to assess antidepressant prescription rates in relation to green space, and b) to analyze how the relationship varies non-linearly across different quantiles of antidepressant prescription rates. We used cross-sectional data for the year 2014 at the municipality level in the Netherlands. Ecological Bayesian geoadditive quantile regressions were fitted for the 15%, 50%, and 85% quantiles to estimate green space-prescription rate correlations, controlling for physical activity levels, socio-demographics, urbanicity, etc. The results suggested that green space was overall inversely and non-linearly associated with antidepressant prescription rates. More importantly, the associations differed across the quantiles, although the variation was modest. Significant non-linearities were apparent: the associations were slightly positive in the lower quantile and strongly negative in the upper one. Our findings imply that an increased availability of green space within a municipality may contribute to a reduction in the number of antidepressant prescriptions dispensed. Green space is thus a central health and community asset, whilst a minimum level of 28% needs to be established for health gains. The highest effectiveness occurred at a municipality surface percentage higher than 79%. This inverse dose-dependent relation has important implications for setting future community-level health and planning policies. Copyright © 2018 Elsevier Inc. All rights reserved.
Estimating risks to aquatic life using quantile regression
Schmidt, Travis S.; Clements, William H.; Cade, Brian S.
2012-01-01
One of the primary goals of biological assessment is to assess whether contaminants or other stressors limit the ecological potential of running waters. It is important to interpret responses to contaminants relative to other environmental factors, but necessity or convenience limit quantification of all factors that influence ecological potential. In these situations, the concept of limiting factors is useful for data interpretation. We used quantile regression to measure risks to aquatic life exposed to metals by including all regression quantiles (τ = 0.05–0.95, by increments of 0.05), not just the upper limit of density (e.g., 90th quantile). We measured population densities (individuals/0.1 m2) of 2 mayflies (Rhithrogena spp., Drunella spp.) and a caddisfly (Arctopsyche grandis), aqueous metal mixtures (Cd, Cu, Zn), and other limiting factors (basin area, site elevation, discharge, temperature) at 125 streams in Colorado. We used a model selection procedure to test which factor was most limiting to density. Arctopsyche grandis was limited by other factors, whereas metals limited most quantiles of density for the 2 mayflies. Metals reduced mayfly densities most at sites where other factors were not limiting. Where other factors were limiting, low mayfly densities were observed despite metal concentrations. Metals affected mayfly densities most at quantiles above the mean and not just at the upper limit of density. Risk models developed from quantile regression showed that mayfly densities observed at background metal concentrations are improbable when metal mixtures are at US Environmental Protection Agency criterion continuous concentrations. We conclude that metals limit potential density, not realized average density. The most obvious effects on mayfly populations were at upper quantiles and not mean density. Therefore, we suggest that policy developed from mean-based measures of effects may not be as useful as policy based on the concept of limiting factors.
Variability of daily UV index in Jokioinen, Finland, in 1995-2015
NASA Astrophysics Data System (ADS)
Heikkilä, A.; Uusitalo, K.; Kärhä, P.; Vaskuri, A.; Lakkala, K.; Koskela, T.
2017-02-01
The UV index is a measure of UV radiation harmful to human skin, developed and used to promote sun awareness and protection. Monitoring programs conducted around the world have produced a number of long-term time series of UV irradiance. One of the longest time series of solar spectral UV irradiance in Europe comes from the continuous measurements of the Brewer #107 spectrophotometer in Jokioinen (lat. 60°44'N, lon. 23°30'E), Finland, over the years 1995-2015. We have used descriptive statistics and estimates of cumulative distribution functions, quantiles, and probability density functions to analyze the time series of daily UV index maxima. Seasonal differences are found in the estimated distributions and in the trends of the estimated quantiles.
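A minimal sketch of such a distributional summary, using simulated stand-in data for the daily maxima:

```python
# Sketch: empirical CDF, quantiles, and a kernel density estimate for a
# series of daily UV index maxima (simulated stand-in data).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(9)
uvi = np.clip(rng.gamma(3.0, 1.2, 2000), 0, None)   # toy daily UVI maxima

qs = np.quantile(uvi, [0.05, 0.5, 0.95])
print("5th/50th/95th quantiles:", np.round(qs, 2))

x = np.sort(uvi)
ecdf = np.arange(1, len(x) + 1) / len(x)            # empirical CDF at x
pdf = gaussian_kde(uvi)                              # smooth density estimate
print("P(UVI <= 6) ~", float(np.interp(6.0, x, ecdf)), "density at 6:", pdf(6.0)[0])
```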
Alternative Statistical Frameworks for Student Growth Percentile Estimation
ERIC Educational Resources Information Center
Lockwood, J. R.; Castellano, Katherine E.
2015-01-01
This article suggests two alternative statistical approaches for estimating student growth percentiles (SGP). The first is to estimate percentile ranks of current test scores conditional on past test scores directly, by modeling the conditional cumulative distribution functions, rather than indirectly through quantile regressions. This would…
Cade, B.S.; Terrell, J.W.; Neely, B.C.
2011-01-01
Increasing our understanding of how environmental factors affect fish body condition and improving its utility as a metric of aquatic system health require reliable estimates of spatial variation in condition (weight at length). We used three statistical approaches that varied in how they accounted for heterogeneity in allometric growth to estimate differences in body condition of blue suckers Cycleptus elongatus across 19 large-river locations in the central USA. Quantile regression of an expanded allometric growth model provided the most comprehensive estimates, including variation in exponents within and among locations (range = 2.88–4.24). Blue suckers from more-southerly locations had the largest exponents. Mixed-effects mean regression of a similar expanded allometric growth model allowed exponents to vary among locations (range = 3.03–3.60). Mean relative weights compared across selected intervals of total length (TL = 510–594 and 594–692 mm) in a multiplicative model involved the implicit assumption that allometric exponents within and among locations were similar to the exponent (3.46) for the standard weight equation. Proportionate differences in the quantiles of weight at length for adult blue suckers (TL = 510, 594, 644, and 692 mm) compared with their average across locations ranged from 1.08 to 1.30 for southern locations (Texas, Mississippi) and from 0.84 to 1.00 for northern locations (Montana, North Dakota); proportionate differences for mean weight ranged from 1.13 to 1.17 and from 0.87 to 0.95, respectively, and those for mean relative weight ranged from 1.10 to 1.18 and from 0.86 to 0.98, respectively. Weights for fish at longer lengths varied by 600–700 g within a location and by as much as 2,000 g among southern and northern locations. Estimates for the Wabash River, Indiana (0.96–1.07 times the average; greatest increases for lower weights at shorter TLs), and for the Missouri River from Blair, Nebraska, to Sioux City, Iowa (0.90–1.00 times the average; greatest decreases for lower weights at longer TLs), were examined in detail to explain the additional information provided by quantile estimates.
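As a generic illustration (not the authors' expanded model), the allometric exponent can be estimated at several quantiles by regressing log weight on log length, since in W = aL^b the log-log slope is b:

```python
# Sketch: quantile regression estimates of the allometric exponent b in
# W = a * L**b, via log(W) = log(a) + b * log(L). Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(13)
L = rng.uniform(400, 800, 500)                       # total length, mm
logW = -11.5 + 3.2 * np.log(L) + rng.normal(0, 0.12, 500)

X = sm.add_constant(np.log(L))
for q in [0.25, 0.5, 0.75]:
    b = sm.QuantReg(logW, X).fit(q=q).params[1]      # slope = exponent b
    print(f"tau={q:.2f}: b = {b:.3f}")
```

Variation in the fitted b across quantiles is the kind of within-location heterogeneity in allometric growth that the study's expanded model quantifies.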
A comparison of regional flood frequency analysis approaches in a simulation framework
NASA Astrophysics Data System (ADS)
Ganora, D.; Laio, F.
2016-07-01
Regional frequency analysis (RFA) is a well-established methodology for estimating the flood frequency curve at ungauged (or scarcely gauged) sites. Different RFA approaches exist, depending on how information is transferred to the site of interest, but it is not clear from the literature whether a specific method systematically outperforms the others. The aim of this study is to provide a framework in which to carry out the intercomparison, by building a virtual environment based on synthetically generated data. The considered regional approaches include: (i) a unique regional curve for the whole region; (ii) a multiple-region model where homogeneous subregions are determined through cluster analysis; (iii) a region-of-influence model which defines a homogeneous subregion for each site; and (iv) a spatially smooth estimation procedure where the parameters of the regional model vary continuously in space. Virtual environments are generated considering different patterns of heterogeneity, including step changes and smooth variations. If the region is heterogeneous, with the parent distribution changing continuously within the region, the spatially smooth regional approach outperforms the others, with overall errors 10-50% lower than the other methods. In the case of a step change, the spatially smooth and clustering procedures perform similarly when the heterogeneity is moderate, while clustering procedures work better when the step change is severe. To extend our findings, an extensive sensitivity analysis was performed to investigate the effects of sample length, number of virtual stations, return period of the predicted quantile, variability of the scale parameter of the parent distribution, number of predictor variables, and choice of parent distribution. Overall, the spatially smooth approach appears to be the most robust, as its performance is more stable across different patterns of heterogeneity, especially when short records are considered.
Greenville, Aaron C; Wardle, Glenda M; Dickman, Chris R
2012-01-01
Extreme climatic events, such as flooding rains, extended decadal droughts and heat waves, have been identified increasingly as important regulators of natural populations. Climate models predict that global warming will drive changes in rainfall and increase the frequency and severity of extreme events. Consequently, to anticipate how organisms will respond, we need to document how changes in extremes of temperature and rainfall compare to trends in the mean values of these variables, and over what spatial scales the patterns are consistent. Using the longest historical weather records available for central Australia (100 years) and quantile regression methods, we investigate whether extreme climate events have changed at rates similar to those of median events, whether annual rainfall has increased in variability, and whether the frequency of large rainfall events has increased over this period. Specifically, we compared local (individual weather stations) and regional (Simpson Desert) spatial scales, and quantified trends in median (50th quantile) and extreme weather values (5th, 10th, 90th, and 95th quantiles). We found that median and extreme annual minimum and maximum temperatures have increased at both spatial scales over the past century. Rainfall changes have been inconsistent across the Simpson Desert; individual weather stations showed increases in annual rainfall, increased frequency of large rainfall events or more prolonged droughts, depending on the location. In contrast to our prediction, we found no evidence that intra-annual rainfall had become more variable over time. Using long-term live-trapping records (22 years) of desert small mammals as a case study, we demonstrate that irruptive events are driven by extreme rainfalls (>95th quantile) and that increases in the magnitude and frequency of extreme rainfall events are likely to drive changes in the populations of these species through direct and indirect changes in predation pressure and wildfires. PMID:23170202
Interpolating Non-Parametric Distributions of Hourly Rainfall Intensities Using Random Mixing
NASA Astrophysics Data System (ADS)
Mosthaf, Tobias; Bárdossy, András; Hörning, Sebastian
2015-04-01
The correct spatial interpolation of hourly rainfall intensity distributions is of great importance for stochastic rainfall models. Poorly interpolated distributions may lead to over- or underestimation of rainfall and consequently to erroneous estimates in subsequent applications, such as hydrological or hydraulic models. By analyzing the spatial relation of empirical rainfall distribution functions, a persistent order of the quantile values over a wide range of non-exceedance probabilities is observed. As the order remains similar, the interpolation weights of quantile values for one non-exceedance probability can be applied to the other probabilities. This assumption enables the use of kernel-smoothed distribution functions for interpolation purposes. Comparing the order of hourly quantile values across gauges with the order of their daily quantile values at equal probabilities results in high correlations. The hourly quantile values also show high correlations with elevation. The incorporation of these two covariates into the interpolation is therefore tested. As only positive interpolation weights for the quantile values assure a monotonically increasing distribution function, the use of geostatistical methods like kriging is problematic, and kriging with external drift cannot be employed to incorporate the secondary information. Nonetheless, it would be fruitful to make use of covariates. To overcome this shortcoming, a new random mixing approach for spatial random fields is applied. Within the mixing process, hourly quantile values are treated as equality constraints, and correlations with elevation are included as relationship constraints. To profit from the dependence on daily quantile values, distribution functions of daily gauges are used to set up lower-equal and greater-equal constraints at their locations. In this way, the denser daily gauge network can be included in the interpolation of the hourly distribution functions. The applicability of this new interpolation procedure is shown for around 250 hourly rainfall gauges in the German federal state of Baden-Württemberg. The performance of the random mixing technique is compared to applicable kriging methods, and the interpolation of kernel-smoothed distribution functions is compared with the interpolation of fitted parametric distributions.
Etchevers, Anne; Le Tertre, Alain; Lucas, Jean-Paul; Bretin, Philippe; Oulhote, Youssef; Le Bot, Barbara; Glorennec, Philippe
2015-01-01
Blood lead levels (BLLs) have substantially decreased in recent decades in children in France. However, further reducing exposure is a public health goal because there is no clear toxicological threshold. The identification of the environmental determinants of BLLs, as well as of risk factors associated with high BLLs, is important for updating prevention strategies. We aimed to estimate the contribution of environmental sources of lead to different BLLs in children in France. We enrolled 484 children aged from 6 months to 6 years in a nationwide cross-sectional survey in 2008-2009. We measured lead concentrations in blood and environmental samples (water, soils, household settled dust, paints, cosmetics and traditional cookware). We fitted two models: a multivariate generalized additive model of the geometric mean (GM), and a quantile regression model of the 10th, 25th, 50th, 75th and 90th quantiles of BLLs. The GM of BLLs was 13.8μg/L (=1.38μg/dL) (95% confidence interval (CI): 12.7-14.9) and the 90th quantile was 25.7μg/L (CI: 24.2-29.5). Household and common-area dust, tap water, interior paint, ceramic cookware, traditional cosmetics, playground soil and dust, and environmental tobacco smoke were associated with the GM of BLLs. Household dust and tap water made the largest contributions to both the GM and the 90th quantile of BLLs. The concentration of lead in dust was positively correlated with all quantiles of BLLs, even at low concentrations. Lead concentrations in tap water above 5μg/L were also positively correlated with the GM, 75th and 90th quantiles of BLLs in children drinking tap water. Preventive actions must target household settled dust and tap water to reduce the BLLs of children in France. The use of traditional cosmetics should be avoided, whereas ceramic cookware should be limited to decorative purposes. Copyright © 2014 Elsevier Ltd. All rights reserved.
Nonparametric methods for drought severity estimation at ungauged sites
NASA Astrophysics Data System (ADS)
Sadri, S.; Burn, D. H.
2012-12-01
The objective of frequency analysis is to estimate, for extreme events such as drought severity or duration, the relationship between event magnitude and the associated return period at a catchment. Neural networks and other artificial intelligence approaches to function estimation and regression analysis are relatively new techniques in engineering, providing an attractive alternative to traditional statistical models. There are, however, few applications of neural networks and support vector machines in the area of severity quantile estimation for drought frequency analysis. In this paper, we compare three methods for this task: multiple linear regression, radial basis function neural networks, and least squares support vector regression (LS-SVR). The area selected for this study includes 32 catchments in the Canadian Prairies. From each catchment, drought severities are extracted and fitted to a Pearson type III distribution, and the fitted quantiles act as observed values. For each method-duration pair, we use a jackknife algorithm to produce estimated values at each site. The results from these three approaches are compared and analyzed, and it is found that LS-SVR provides the best quantile estimates and extrapolating capacity.
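scikit-learn has no LS-SVR; the sketch below substitutes KernelRidge, which solves a closely related kernelized least-squares problem, inside a leave-one-out jackknife loop on simulated data:

```python
# Sketch: jackknife evaluation of a kernelized regression predicting a
# drought-severity quantile from catchment attributes. KernelRidge stands
# in for LS-SVR; data are simulated.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(21)
n = 32                                   # catchments
X = rng.normal(size=(n, 4))              # standardized catchment attributes
y = 3 + X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 0.3, n)   # "observed" quantiles

preds = np.empty(n)
for i in range(n):                       # leave-one-out jackknife
    mask = np.arange(n) != i
    model = KernelRidge(kernel="rbf", alpha=0.5, gamma=0.2)
    model.fit(X[mask], y[mask])
    preds[i] = model.predict(X[i:i + 1])[0]

rmse = np.sqrt(np.mean((preds - y) ** 2))
print("jackknife RMSE:", round(float(rmse), 3))
```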
An Investigation of Factors Influencing Nurses' Clinical Decision-Making Skills.
Wu, Min; Yang, Jinqiu; Liu, Lingying; Ye, Benlan
2016-08-01
This study aims to investigate the factors influencing nurses' clinical decision-making (CDM) skills. A cross-sectional, nonexperimental study was conducted in the medical, surgical, and emergency departments of two university hospitals between May and June 2014. We used a quantile regression method to identify the influencing factors across different quantiles of the CDM skills distribution and compared the results with the corresponding ordinary least squares (OLS) estimates. Our findings revealed that nurses were best at the skills of managing oneself. Educational level, experience, and total structural empowerment had significant positive impacts on nurses' CDM skills, while the nurse-patient relationship, patient care and interaction, formal empowerment, and information empowerment were negatively correlated with nurses' CDM skills. These variables explained no more than 30% of the variance in nurses' CDM skills and mainly explained the lower quantiles of the CDM skills distribution. © The Author(s) 2016.
Heterogeneity in Smokers' Responses to Tobacco Control Policies.
Nesson, Erik
2017-02-01
This paper uses unconditional quantile regression to estimate whether smokers' responses to tobacco control policies change across the distribution of smoking levels. I measure smoking behavior with the number of cigarettes smoked per day and also with serum cotinine levels, a continuous biomarker of nicotine exposure, using individual-level repeated cross-section data from the National Health and Nutrition Examination Surveys. I find that cigarette taxes lead to reductions in both the number of cigarettes smoked per day and in smokers' cotinine levels. These reductions are most pronounced in the middle quantiles of both distributions in terms of marginal effects, but most pronounced in the lower quantiles in terms of tax elasticities. I do not find that higher cigarette taxes lead to statistically significant changes in the amount of nicotine smokers ingest from each cigarette. Copyright © 2015 John Wiley & Sons, Ltd.
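Unconditional quantile regression is typically implemented via recentered influence functions (RIF); the sketch below shows that generic RIF-OLS recipe on simulated data (an assumption about the standard approach, not this paper's code):

```python
# Sketch: unconditional quantile regression via the recentered influence
# function, RIF(y; q_tau) = q_tau + (tau - 1{y <= q_tau}) / f(q_tau),
# followed by OLS of the RIF on covariates. Data are simulated.
import numpy as np
import statsmodels.api as sm
from scipy.stats import gaussian_kde

rng = np.random.default_rng(17)
n = 2000
tax = rng.uniform(0.5, 3.0, n)                          # stand-in policy variable
y = np.exp(2.5 - 0.15 * tax + rng.normal(0, 0.6, n))    # cigarettes/day-like outcome

tau = 0.75
q = np.quantile(y, tau)
f_q = gaussian_kde(y)(q)[0]                             # density at the quantile
rif = q + (tau - (y <= q).astype(float)) / f_q

X = sm.add_constant(tax)
res = sm.OLS(rif, X).fit(cov_type="HC1")
print("effect of tax on the 75th unconditional quantile:", round(res.params[1], 3))
```

Repeating this for a grid of tau values traces out how the policy effect varies across the smoking-level distribution, which is the comparison the abstract reports.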
Simulating Quantile Models with Applications to Economics and Management
NASA Astrophysics Data System (ADS)
Machado, José A. F.
2010-05-01
The massive increase in the speed of computers over the past forty years has changed the way that social scientists, applied economists, and statisticians approach their trades, and also the very nature of the problems they can feasibly tackle. The new methods that make intensive use of computer power go by the names of "computer-intensive" or "simulation" methods. My lecture will start with a bird's-eye view of the uses of simulation in economics and statistics. Then I will turn to my own research on uses of computer-intensive methods. From a methodological point of view, the question I address is how to infer marginal distributions having estimated a conditional quantile process ("Counterfactual Decomposition of Changes in Wage Distributions Using Quantile Regression," Journal of Applied Econometrics 20, 2005). Illustrations will be provided of the use of the method to perform counterfactual analysis in several different areas of knowledge.
Measuring racial/ethnic disparities across the distribution of health care expenditures.
Cook, Benjamin Lê; Manning, Willard G
2009-10-01
To assess whether black-white and Hispanic-white disparities increase or abate in the upper quantiles of total health care expenditure, conditional on covariates. Nationally representative adult population of non-Hispanic whites, African Americans, and Hispanics from the 2001-2005 Medical Expenditure Panel Surveys. We examine unadjusted racial/ethnic differences across the distribution of expenditures. We apply quantile regression to measure disparities at the median, 75th, 90th, and 95th quantiles, testing for differences over the distribution of health care expenditures and across income and education categories. We test the sensitivity of the results to comparisons based only on health status and estimate a two-part model to ensure that results are not driven by an extremely skewed distribution of expenditures with a large zero mass. Black-white and Hispanic-white disparities diminish in the upper quantiles of expenditure, but expenditures for blacks and Hispanics remain significantly lower than for whites throughout the distribution. For most education and income categories, disparities exist at the median and decline, but remain significant even with increased education and income. Blacks and Hispanics receive significantly disparate care at high expenditure levels, suggesting prioritization of improved access to quality care among minorities with critical health issues.
Zhang, Hanze; Huang, Yangxin; Wang, Wei; Chen, Henian; Langland-Orban, Barbara
2017-01-01
In longitudinal AIDS studies, it is of interest to investigate the relationship between HIV viral load and CD4 cell counts, as well as the complicated time effect. Most common models for analyzing such complex longitudinal data are based on mean regression, which fails to provide efficient estimates in the presence of outliers and/or heavy tails. Quantile regression-based partially linear mixed-effects models, a special case of semiparametric models enjoying the benefits of both parametric and nonparametric models, have the flexibility to monitor the viral dynamics nonparametrically and to detect the varying CD4 effects parametrically at different quantiles of viral load. Meanwhile, it is critical to consider various features of the repeated measurements, including left-censoring due to a limit of detection, covariate measurement error, and asymmetric distributions. In this research, we first establish a Bayesian joint model that accounts for all these data features simultaneously within the framework of quantile regression-based partially linear mixed-effects models. The proposed model is applied to analyze data from the Multicenter AIDS Cohort Study (MACS). Simulation studies are also conducted to assess the performance of the proposed methods under different scenarios.
Use of historical information in extreme storm surges frequency analysis
NASA Astrophysics Data System (ADS)
Hamdi, Yasser; Duluc, Claire-Marie; Deville, Yves; Bardet, Lise; Rebour, Vincent
2013-04-01
The prevention of storm surge flood risks is critical for the protection and design of coastal facilities to very low probabilities of failure. Effective protection requires a statistical analysis approach with a solid theoretical motivation. Relating extreme storm surges to their frequency of occurrence using probability distributions has been a common issue since the 1950s. The engineer needs to determine the storm surge of a given return period, i.e., the storm surge quantile or design storm surge. Traditional methods for determining such a quantile have generally been based on data from the systematic record alone. However, the statistical extrapolation used to estimate storm surges corresponding to high return periods is seriously contaminated by sampling and model uncertainty if data are available for only a relatively limited period. This has motivated the development of approaches to enlarge the sample of extreme values beyond the systematic period. Nonsystematic data from before the systematic period are called historical information. During the last three decades, the value of using historical information as nonsystematic data in frequency analysis has been recognized by several authors. The basic hypothesis in statistical modeling of historical information is that a perception threshold exists and that, during a given historical period preceding the period of tide gauging, all exceedances of this threshold have been recorded. Historical information prior to the systematic records may arise from high-sea water marks left by extreme surges on coastal areas. It can also be retrieved from archives, old books, early newspapers, damage reports, unpublished written records and interviews with local residents. A plotting position formula for computing empirical probabilities based on systematic and historical data is used in this paper. The objective of the present work is to examine the potential gain in estimation accuracy from the use of historical information, applied to the Brest tide gauge on the French Atlantic coast. In addition, the present work contributes to addressing the problem of the presence of outliers in data sets. Historical data are generally imprecise, and their inaccuracy should be properly accounted for in the analysis. However, as several authors believe, even with substantial uncertainty in the data, the use of historical information is a viable means of improving estimates of rare events related to extreme environmental conditions. The preliminary results of this study suggest that the use of historical information increases the representativity of an outlier in the systematic data. It is also shown that the use of historical information, specifically the perception sea water level, can be considered a reliable solution for the optimal planning and design of facilities to withstand the extreme environmental conditions that will occur during their lifetime, with an appropriate optimum risk level. The findings are of practical relevance for applications in storm surge risk analysis and flood management.
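As a simplified illustration of threshold-based plotting positions (in the spirit of Hirsch-type formulas, not the paper's exact formula), exceedances of the perception threshold are ranked over the combined historical-plus-systematic period, while below-threshold systematic values share the remaining probability mass:

```python
# Sketch: plotting positions combining a systematic record with historical
# exceedances of a perception threshold (simplified Hirsch-type scheme;
# all numbers are illustrative).
import numpy as np

sys_rec = np.array([0.61, 0.55, 0.72, 0.48, 0.66, 0.90, 0.52, 0.58])  # m, 8 yrs
hist_exc = np.array([1.10, 0.98])   # surges above threshold in historical period
threshold = 0.95
n_hist, n_sys = 120, len(sys_rec)   # years covered by each record
n_total = n_hist + n_sys

# All known exceedances of the threshold, over the whole n_total years.
above = np.sort(np.concatenate([hist_exc, sys_rec[sys_rec > threshold]]))[::-1]
k = len(above)
p_above = np.arange(1, k + 1) / (n_total + 1)            # exceedance probs

# Below-threshold systematic values share the remaining probability mass.
below = np.sort(sys_rec[sys_rec <= threshold])[::-1]
p_below = k / (n_total + 1) + (1 - k / (n_total + 1)) * \
          np.arange(1, len(below) + 1) / (len(below) + 1)

for v, p in zip(np.concatenate([above, below]), np.concatenate([p_above, p_below])):
    print(f"surge {v:.2f} m: exceedance prob {p:.4f}, return period {1/p:.0f} yr")
```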
Student Growth Percentiles Based on MIRT: Implications of Calibrated Projection. CRESST Report 842
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li; Choi, Kilchan
2014-01-01
This research concerns a new proposal for calculating student growth percentiles (SGP, Betebenner, 2009). In Betebenner (2009), quantile regression (QR) is used to estimate the SGPs. However, measurement error in the score estimates, which always exists in practice, leads to bias in the QR-based estimates (Shang, 2012). One way to address this…
Estimation of Return Values of Wave Height: Consequences of Missing Observations
ERIC Educational Resources Information Center
Ryden, Jesper
2008-01-01
Extreme-value statistics is often used to estimate so-called return values (actually related to quantiles) for environmental quantities like wind speed or wave height. A basic method for estimation is the method of block maxima which consists in partitioning observations in blocks, where maxima from each block could be considered independent.…
Local Composite Quantile Regression Smoothing for Harris Recurrent Markov Processes
Li, Degui; Li, Runze
2016-01-01
In this paper, we study the local polynomial composite quantile regression (CQR) smoothing method for nonlinear and nonparametric models under the Harris recurrent Markov chain framework. The local polynomial CQR method is a robust alternative to the widely used local polynomial method and has been well studied for stationary time series. In this paper, we relax the stationarity restriction on the model and allow the regressors to be generated by a general Harris recurrent Markov process, which includes both the stationary (positive recurrent) and nonstationary (null recurrent) cases. Under some mild conditions, we establish the asymptotic theory for the proposed local polynomial CQR estimator of the mean regression function, and show that the convergence rate of the estimator in the nonstationary case is slower than that in the stationary case. Furthermore, a weighted local polynomial CQR estimator is provided to improve estimation efficiency, and a data-driven bandwidth selection is introduced to choose the optimal bandwidth involved in the nonparametric estimators. Finally, we present some numerical studies examining the finite-sample performance of the developed methodology and theory. PMID:27667894
Payande, Abolfazl; Tabesh, Hamed; Shakeri, Mohammad Taghi; Saki, Azadeh; Safarian, Mohammad
2013-01-14
Growth charts are widely used to assess children's growth status and can provide a trajectory of growth during the early, important months of life. The objectives of this study are to construct growth charts and normal values of weight-for-age for children aged 0 to 5 years using a powerful and applicable methodology, and to compare the results with the World Health Organization (WHO) references and the semi-parametric LMS method of Cole and Green. A total of 70737 apparently healthy boys and girls aged 0 to 5 years were recruited in July 2004, over 20 days, from those attending community clinics for routine health checks as part of a national survey. Anthropometric measurements were made by trained health staff using WHO methodology. A nonparametric quantile regression method, based on local constant kernel estimation of conditional quantile curves, was used to estimate the curves and normal values. The weight-for-age growth curves for boys and girls aged from 0 to 5 years were derived from a population of children living in the northeast of Iran. The results were similar to those obtained by the semi-parametric LMS method on the same data. In all age groups from 0 to 5 years, the median weights of children living in the northeast of Iran were lower than the corresponding values in the WHO reference data. The weight curves of boys were higher than those of girls in all age groups. The differences between the growth patterns of children living in the northeast of Iran and international ones necessitate the use of local and regional growth charts, as international normal values may not properly identify the populations at risk for growth problems among Iranian children. Quantile regression (QR), a flexible method that does not require restrictive assumptions, is proposed for estimating reference curves and normal values. PMID:23618470
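A rough sketch of local constant kernel estimation of a conditional quantile curve, the generic technique named above (simulated weight-for-age data, arbitrary Gaussian kernel and bandwidth):

```python
# Sketch: local constant kernel estimate of a conditional quantile curve
# (e.g., median weight-for-age). At each age, a kernel-weighted sample
# quantile of weight is computed. Data and bandwidth are illustrative.
import numpy as np

def weighted_quantile(y, w, tau):
    order = np.argsort(y)
    y, w = y[order], w[order]
    cw = np.cumsum(w) / np.sum(w)
    return y[np.searchsorted(cw, tau)]

rng = np.random.default_rng(19)
age = rng.uniform(0, 60, 3000)                        # months
weight = 3.3 + 2.3 * np.sqrt(age) + rng.normal(0, 1.0 + 0.02 * age, 3000)

h = 3.0                                               # bandwidth, months
for a in np.arange(0, 61, 6):
    w = np.exp(-0.5 * ((age - a) / h) ** 2)           # Gaussian kernel weights
    med = weighted_quantile(weight, w, 0.5)
    print(f"age {a:2.0f} mo: median weight ~ {med:.1f} kg")
```

Repeating the loop for tau = 0.03, 0.1, ..., 0.97 yields the family of centile curves that makes up a growth chart.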
Hydrologic Design in the Anthropocene
NASA Astrophysics Data System (ADS)
Vogel, R. M.; Farmer, W. H.; Read, L.
2014-12-01
In an era dubbed the Anthropocene, the natural world is being transformed by a myriad of human influences. As anthropogenic impacts permeate hydrologic systems, hydrologists are challenged to fully account for such changes and develop new methods of hydrologic design. Deterministic watershed models (DWMs), which can account for the impacts of changes in land use, climate, and infrastructure, are becoming increasingly popular for the design of flood and/or drought protection measures. As with all models calibrated to existing datasets, DWMs are subject to model error, or uncertainty. In practice, the model error component of DWM predictions is typically ignored, yet simulations that ignore model error produce output that cannot reproduce the statistical properties of the observations it is intended to replicate. In the context of hydrologic design, we demonstrate how ignoring model error can lead to systematic downward bias in flood quantiles, upward bias in drought quantiles, and upward bias in water supply yields. By reincorporating model error, we document how DWMs can be used to generate results that mimic actual observations and preserve their statistical behavior. In addition to the use of DWMs for improved predictions in a changing world, improved communication of risk and reliability is also needed. Traditional statements of risk and reliability in hydrologic design have been characterized by return periods, but such statements often assume that the annual probability of experiencing a design event remains constant throughout the project horizon. We document the general impact of nonstationarity on the average return period and reliability in the context of hydrologic design. Our analyses reveal that return periods do not provide meaningful expressions of the likelihood of future hydrologic events. Instead, knowledge of system reliability over future planning horizons can more effectively prepare society and communicate the likelihood of future hydrologic events of interest.
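The downward bias in flood quantiles described above can be illustrated with a few lines of simulation: a calibrated model that reproduces only the conditional mean has less variance than the observations, so its upper quantiles sit too low until a model-error term is re-added. The numbers below (lognormal flows, 80% explained variance) are illustrative assumptions, not the authors' experiment.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Stand-in "observed" annual peak flows: lognormal in log space
log_obs = rng.normal(loc=5.0, scale=0.6, size=n)

# A calibrated model reproduces the conditional mean but only part of
# the variance; the residual (model error) carries the rest
r2 = 0.8
log_model = 5.0 + np.sqrt(r2) * (log_obs - 5.0)           # smoother than obs
log_with_err = log_model + rng.normal(0, np.sqrt(1 - r2) * 0.6, n)

q99 = lambda z: np.quantile(np.exp(z), 0.99)
print(f"99th pct, observations:         {q99(log_obs):9.0f}")
print(f"99th pct, model, error dropped: {q99(log_model):9.0f}  (biased low)")
print(f"99th pct, model + error term:   {q99(log_with_err):9.0f}")
```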
Modelling the effectiveness of grass buffer strips in managing muddy floods under a changing climate
NASA Astrophysics Data System (ADS)
Mullan, Donal; Vandaele, Karel; Boardman, John; Meneely, John; Crossley, Laura H.
2016-10-01
Muddy floods occur when rainfall generates runoff on agricultural land, detaching and transporting sediment into the surrounding natural and built environment. In the Belgian Loess Belt, muddy floods occur regularly and lead to considerable economic costs associated with damage to property and infrastructure. Mitigation measures designed to manage the problem have been tested in a pilot area within Flanders and were found to be cost-effective within three years. This study assesses whether these mitigation measures will remain effective under a changing climate. To test this, the Water Erosion Prediction Project (WEPP) model was used to examine muddy flooding diagnostics (precipitation, runoff, soil loss and sediment yield) for a case-study hillslope in Flanders where grass buffer strips are currently used as a mitigation measure. The model was run for present-day conditions and then under 33 future site-specific climate scenarios. These future scenarios were generated from three earth system models driven by four representative concentration pathways and downscaled using quantile mapping and the weather generator CLIGEN. Results reveal that under the majority of future scenarios, muddy flooding diagnostics are projected to increase, mostly as a consequence of large precipitation events rather than changes in the mean. The magnitude of muddy flood events for a given return period is also generally projected to increase. These findings indicate that present-day mitigation measures may have a reduced capacity to manage muddy flooding given the changes imposed by a warming climate with an enhanced hydrological cycle. Revisions to the design of existing mitigation measures within existing policy frameworks are considered the most effective way to account for the impacts of climate change in future mitigation planning.
Global Climate Model Simulated Hydrologic Droughts and Floods in the Nelson-Churchill Watershed
NASA Astrophysics Data System (ADS)
Vieira, M. J. F.; Stadnyk, T. A.; Koenig, K. A.
2014-12-01
There is uncertainty surrounding the duration, magnitude and frequency of historical hydroclimatic extremes such as hydrologic droughts and floods prior to the observed record. In regions where paleoclimatic studies are less reliable, Global Climate Models (GCMs) can provide useful information about past hydroclimatic conditions. This study evaluates the use of Coupled Model Intercomparison Project phase 5 (CMIP5) GCMs to enhance the understanding of historical droughts and floods across the Canadian Prairie region in the Nelson-Churchill Watershed (NCW). The NCW is approximately 1.4 million km2 in size and drains into Hudson Bay in Northern Manitoba, Canada. One hundred years of observed hydrologic records show extended dry and wet periods in this region; however, paleoclimatic studies suggest that longer, more severe droughts have occurred in the past. In Manitoba, where hydropower is the primary source of electricity, droughts are of particular interest as they are important for future resource planning. Twenty-three GCMs with daily runoff are evaluated using 16 metrics for skill in reproducing historic annual runoff patterns. A common 56-year historic period of 1950-2005 is used for this evaluation to capture wet and dry periods. GCM runoff is then routed at a grid resolution of 0.25° using the WATFLOOD hydrological model storage-routing algorithm to develop streamflow scenarios. Reservoir operation is naturalized and a consistent temperature scenario is used to determine ice-on and ice-off conditions. These streamflow simulations are compared with the historic record to remove bias using quantile mapping of empirical distribution functions. GCM runoff data from pre-industrial and future projection experiments are also bias corrected to obtain extended streamflow simulations. GCM streamflow simulations of more than 650 years include a stationary (pre-industrial) period and future periods forced by radiative forcing scenarios. Quantile mapping adjusts for magnitude only while maintaining the GCM's sequencing of events, allowing for the examination of differences in historic and future hydroclimatic extremes. These bias corrected streamflow scenarios provide an alternative to stochastic simulations for hydrologic data analysis and can aid future resource planning and environmental studies.
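A minimal sketch of the bias-correction step described above — quantile mapping with empirical distribution functions, which adjusts magnitudes while preserving the simulation's sequencing of events (the mapping is monotone). The gamma-distributed flows and sample sizes are illustrative stand-ins, not the study's data.

```python
import numpy as np

def quantile_map(sim, obs_ref, sim_ref):
    """Empirical quantile mapping: look up each simulated value's
    nonexceedance probability in the simulated reference period, then
    return the observed quantile at that probability."""
    probs = np.interp(sim, np.sort(sim_ref), np.linspace(0, 1, len(sim_ref)))
    return np.quantile(obs_ref, probs)

rng = np.random.default_rng(1)
obs_ref = rng.gamma(2.0, 500.0, 2000)    # observed reference flows
sim_ref = rng.gamma(2.0, 400.0, 2000)    # biased simulated flows, same period
sim_fut = rng.gamma(2.0, 450.0, 2000)    # future simulation to correct

corrected = quantile_map(sim_fut, obs_ref, sim_ref)
# Magnitudes shift toward the observed distribution; because the mapping is
# monotone, the ordering (sequencing) of simulated events is preserved
print(f"median raw: {np.median(sim_fut):.0f}, "
      f"median corrected: {np.median(corrected):.0f}")
```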
NASA Astrophysics Data System (ADS)
Kacprzak, T.; Herbel, J.; Amara, A.; Réfrégier, A.
2018-02-01
Approximate Bayesian Computation (ABC) is a method to obtain a posterior distribution without a likelihood function, using simulations and a set of distance metrics. For that reason, it has recently been gaining popularity as an analysis tool in cosmology and astrophysics. Its drawback, however, is a slow convergence rate. We propose a novel method, which we call qABC, to accelerate ABC with Quantile Regression. In this method, we create a model of the quantiles of the distance measure as a function of the input parameters. This model is trained on a small number of simulations and estimates which regions of the prior space are likely to be accepted into the posterior. Other regions are then immediately rejected. This procedure is repeated as more simulations become available. We apply it to the practical problem of estimating the redshift distribution of cosmological samples, using forward modelling developed in previous work. The qABC method converges to nearly the same posterior as basic ABC. It uses, however, only 20% of the number of simulations required by basic ABC, achieving a fivefold gain in execution time for our problem. For other problems the acceleration rate may vary; it depends on how close the prior is to the final posterior. We discuss possible improvements and extensions of this method.
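The following sketch mimics the core idea under strong simplifications: fit a quantile regression of the ABC distance on the parameter, then skip simulations in prior regions whose optimistic (low-quantile) predicted distance already exceeds the acceptance threshold. The one-dimensional toy simulator, quantile level, and threshold are illustrative assumptions; the published method iterates this as more simulations accrue.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

def simulate_distance(theta):
    """Toy simulator: distance between simulated and 'observed' summary."""
    return np.abs(theta - 2.0) + rng.exponential(0.3, size=theta.shape)

# Stage 1: a small pilot set of simulations across the prior [0, 10]
theta_pilot = rng.uniform(0, 10, 200)
d_pilot = simulate_distance(theta_pilot)

# Model the 10th percentile of the distance as a function of theta
qmodel = GradientBoostingRegressor(loss="quantile", alpha=0.10)
qmodel.fit(theta_pilot[:, None], d_pilot)

# Stage 2: reject prior regions whose optimistic (10th-pct) distance already
# exceeds the ABC acceptance threshold; simulate only the remaining region
eps = 0.5
theta_cand = rng.uniform(0, 10, 5000)
keep = qmodel.predict(theta_cand[:, None]) <= eps
theta_kept = theta_cand[keep]
accepted = theta_kept[simulate_distance(theta_kept) <= eps]
print(f"simulated {keep.sum()} of 5000 candidates; accepted {accepted.size}")
```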
Estimating tree crown widths for the primary Acadian species in Maine
Matthew B. Russell; Aaron R. Weiskittel
2012-01-01
In this analysis, data for seven conifer and eight hardwood species were gathered from across the state of Maine for estimating tree crown widths. Maximum and largest crown width equations were developed using tree diameter at breast height as the primary predicting variable. Quantile regression techniques were used to estimate the maximum crown width and a constrained...
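A hedged illustration of the quantile-regression step mentioned above: fitting an upper conditional quantile of crown width given diameter at breast height as a proxy for the maximum crown width attainable at a given diameter. The synthetic data and the 95th-percentile choice are assumptions for demonstration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: crown width (m) vs diameter at breast height (cm)
rng = np.random.default_rng(7)
dbh = rng.uniform(5, 60, 500)
cw = 1.0 + 0.18 * dbh + rng.gamma(2.0, 0.5, 500)
df = pd.DataFrame({"dbh": dbh, "cw": cw})

# An upper quantile (here the 95th) approximates the maximum crown width
# envelope; the median gives a typical crown width at the same diameter
fit95 = smf.quantreg("cw ~ dbh", df).fit(q=0.95)
fit50 = smf.quantreg("cw ~ dbh", df).fit(q=0.50)
print(fit95.params, fit50.params, sep="\n")
```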
The 2011 heat wave in Greater Houston: Effects of land use on temperature.
Zhou, Weihe; Ji, Shuang; Chen, Tsun-Hsuan; Hou, Yi; Zhang, Kai
2014-11-01
Effects of land use on temperatures during severe heat waves have rarely been studied. This paper examines land use-temperature associations during the 2011 heat wave in Greater Houston. We obtained high-resolution satellite-derived land use data from the US National Land Cover Database, and temperature observations at 138 weather stations from Weather Underground, Inc. (WU) during August 2011, which was the hottest month in Houston since 1889. Land use regression and quantile regression methods were applied to the monthly averages of daily maximum/mean/minimum temperatures and 114 land use-related predictors. Although the selected variables vary with the temperature metric, distance to the coastline consistently appears in all models. Other variables are generally related to high-intensity development, open water, or wetlands. In addition, our quantile regression analysis shows that distance to the coastline and high-intensity developed areas have larger impacts on daily average temperatures at higher quantiles, and open water area has greater impacts on daily minimum temperatures at lower quantiles. By utilizing both land use regression and quantile regression on a recent heat wave in one of the largest US metropolitan areas, this paper provides a new perspective on the impacts of land use on temperatures. Our models can provide estimates of heat exposure for epidemiological studies, and our findings can be combined with demographic variables, air conditioning and relevant disease information to identify 'hot spots' of population vulnerability for public health interventions to reduce heat-related health effects during heat waves. Copyright © 2014 Elsevier Inc. All rights reserved.
Modeling energy expenditure in children and adolescents using quantile regression
Yang, Yunwen; Adolph, Anne L.; Puyau, Maurice R.; Vohra, Firoz A.; Zakeri, Issa F.
2013-01-01
Advanced mathematical models have the potential to capture the complex metabolic and physiological processes that result in energy expenditure (EE). The study objective is to apply quantile regression (QR) to predict EE and determine quantile-dependent variation in covariate effects in nonobese and obese children. First, QR models will be developed to predict minute-by-minute awake EE at different quantile levels based on heart rate (HR) and physical activity (PA) accelerometry counts, and child characteristics of age, sex, weight, and height. Second, the QR models will be used to evaluate the covariate effects of weight, PA, and HR across the conditional EE distribution. QR and ordinary least squares (OLS) regressions are estimated in 109 children, aged 5–18 yr. QR modeling of EE outperformed OLS regression for both nonobese and obese populations. Average prediction errors for QR compared with OLS were not only smaller at the median τ = 0.5 (18.6 vs. 21.4%), but also substantially smaller at the tails of the distribution (10.2 vs. 39.2% at τ = 0.1 and 8.7 vs. 19.8% at τ = 0.9). Covariate effects of weight, PA, and HR on EE for the nonobese and obese children differed across quantiles (P < 0.05). The associations (linear and quadratic) between PA and HR with EE were stronger for the obese than the nonobese population (P < 0.05). In conclusion, QR provided more accurate predictions of EE compared with conventional OLS regression, especially at the tails of the distribution, and revealed substantially different covariate effects of weight, PA, and HR on EE in nonobese and obese children. PMID:23640591
Multi-catchment rainfall-runoff simulation for extreme flood estimation
NASA Astrophysics Data System (ADS)
Paquet, Emmanuel
2017-04-01
The SCHADEX method (Paquet et al., 2013) is a reference method in France for the estimation of extreme floods for dam design. The method is based on a semi-continuous rainfall-runoff simulation process: hundreds of different rainy events, randomly drawn up to extreme values, are simulated independently in the hydrological conditions of each day on which a rainy event was actually observed. This allows an exhaustive set of crossings between precipitation and soil-saturation hazards to be generated, and a complete distribution of flood discharges to be built up to extreme quantiles. The hydrological model used within SCHADEX, the MORDOR model (Garçon, 1996), is a lumped model, which implies that hydrological processes, e.g. rainfall and soil saturation, are assumed to be homogeneous throughout the catchment. Snow processes are nevertheless represented in relation to altitude. This hypothesis of homogeneity is questionable, especially as the size of the catchment increases, or in areas of highly contrasted climatology (like mountainous areas). Conversely, modeling the catchment with a fully distributed approach would cause different problems, in particular distributing the rainfall-runoff model parameters through space and, within the SCHADEX stochastic framework, generating extreme rain fields with credible spatio-temporal features. An intermediate solution is presented here. It provides a better representation of the hydro-climatic diversity of the studied catchment (especially regarding flood processes) while keeping the SCHADEX simulation framework. It consists of dividing the catchment into several more homogeneous sub-catchments. Rainfall-runoff models are parameterized individually for each of them, using local discharge data if available. A first SCHADEX simulation is done at the global scale, which allows a probability to be assigned to each simulated event, mainly based on the global areal rainfall drawn for the event (see Paquet et al., 2013 for details). Then the rainfall of each event is distributed across the different sub-catchments using the spatial patterns calculated in the SPAZM precipitation reanalysis (Gottardi et al., 2012) for comparable situations of the 1948-2005 period. Corresponding runoffs are calculated with the hydrological models and aggregated to compute the discharge at the outlet of the main catchment. A complete distribution of flood discharges is finally computed. This method is illustrated with the example of the Durance at Serre-Ponçon catchment (southern French Alps, 3600 km2), which has been divided into four sub-catchments. The proposed approach is compared with the "classical" SCHADEX approach applied to the whole catchment. References: Garçon, R. (1996). Prévision opérationnelle des apports de la Durance à Serre-Ponçon à l'aide du modèle MORDOR. Bilan de l'année 1994-1995. La Houille Blanche, (5), 71-76. Gottardi, F., Obled, C., Gailhard, J., & Paquet, E. (2012). Statistical reanalysis of precipitation fields based on ground network data and weather patterns: Application over French mountains. Journal of Hydrology, 432, 154-167. Paquet, E., Garavaglia, F., Garçon, R., & Gailhard, J. (2013). The SCHADEX method: A semi-continuous rainfall-runoff simulation for extreme flood estimation. Journal of Hydrology, 495, 23-37.
Comparability of a short food frequency questionnaire to assess diet quality: the DISCOVER study.
Dehghan, Mahshid; Ge, Yipeng; El Sheikh, Wala; Bawor, Monica; Rangarajan, Sumathy; Dennis, Brittany; Vair, Judith; Sholer, Heather; Hutchinson, Nichole; Iordan, Elizabeth; Mackie, Pam; Samaan, Zainab
2017-09-01
This study aims to assess the comparability of a short food frequency questionnaire (SFFQ) used in the Determinants of Suicide: Conventional and Emergent Risk Study (DISCOVER Study) with a validated comprehensive FFQ (CFFQ). A total of 127 individuals completed the SFFQ and the CFFQ. Healthy eating was measured using the Healthy Eating Score (HES). Estimated food intake and healthy eating assessed by the SFFQ were compared with the CFFQ. For most food groups and the HES, Spearman's rank correlation coefficients between the two FFQs were greater than .60. For macronutrients, the correlations exceeded 0.4. Cross-classification quantile analysis showed that between 46% and 81% of participants were classified into the exact same quantile, while 10% or less were misclassified into opposite quantiles. Bland-Altman plots showed an acceptable level of agreement between the two dietary measurement methods. The SFFQ can be used for Canadians with psychiatric disorders to rank them by dietary intake.
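A small sketch of the cross-classification check reported above: classify each participant into quantiles under both instruments, then count exact agreement and opposite-quantile misclassification. The simulated intakes and the quartile split are illustrative assumptions.

```python
import numpy as np
import pandas as pd

def cross_classification(a, b, n_quantiles=4):
    """Share of subjects placed into the same / opposite quantile
    by two dietary instruments."""
    qa = pd.qcut(a, n_quantiles, labels=False)
    qb = pd.qcut(b, n_quantiles, labels=False)
    same = np.mean(qa == qb)
    opposite = np.mean(np.abs(qa - qb) == n_quantiles - 1)
    return same, opposite

rng = np.random.default_rng(3)
cffq = rng.gamma(3.0, 50.0, 127)             # comprehensive FFQ intakes
sffq = cffq * rng.normal(1.0, 0.25, 127)     # short FFQ, with noise

same, opp = cross_classification(sffq, cffq)
print(f"same quantile: {same:.0%}, opposite quantile: {opp:.0%}")
```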
A Bayesian beta distribution model for estimating rainfall IDF curves in a changing climate
NASA Astrophysics Data System (ADS)
Lima, Carlos H. R.; Kwon, Hyun-Han; Kim, Jin-Young
2016-09-01
The estimation of intensity-duration-frequency (IDF) curves for rainfall data is a classical task in hydrology, supporting a variety of water resources projects, including urban drainage and the design of flood control structures. In a changing climate, however, traditional approaches based on historical records of rainfall and on the stationarity assumption can be inadequate and lead to poor estimates of rainfall intensity quantiles. Climate change scenarios built on General Circulation Models offer a way to assess and estimate future changes in spatial and temporal rainfall patterns at the daily scale at best, which is not as fine a temporal resolution as required (e.g., hours) to directly estimate IDF curves. In this paper we propose a novel methodology based on a four-parameter beta distribution to estimate IDF curves conditioned on the observed (or simulated) daily rainfall, which becomes the time-varying upper bound of the updated nonstationary beta distribution. The inference is conducted in a Bayesian framework, which provides a better way to account for the uncertainty in the model parameters when building the IDF curves. The proposed model is tested using rainfall data from four stations located in South Korea and projected climate change Representative Concentration Pathway (RCP) scenarios 6 and 8.5 from the Met Office Hadley Centre HadGEM3-RA model. The results show that the developed model fits the historical data as well as the traditional Generalized Extreme Value (GEV) distribution but is able to produce future IDF curves that differ significantly from the historically based IDF curves. For the stations and RCP scenarios analysed in this work, the proposed model predicts an increase in the intensity of extreme rainfalls of short duration and long return periods.
Relationship between Urbanization and Cancer Incidence in Iran Using Quantile Regression.
Momenyan, Somayeh; Sadeghifar, Majid; Sarvi, Fatemeh; Khodadost, Mahmoud; Mosavi-Jarrahi, Alireza; Ghaffari, Mohammad Ebrahim; Sekhavati, Eghbal
2016-01-01
Quantile regression is an efficient method for predicting and estimating the relationship between explanatory variables and percentile points of the response distribution, particularly for extreme percentiles of the distribution. To study the relationship between urbanization and cancer morbidity, we here applied quantile regression. This cross-sectional study was conducted for 9 cancers in 345 cities in 2007 in Iran. Data were obtained from the Ministry of Health and Medical Education, and the relationship between urbanization and cancer morbidity was investigated using quantile regression and least squares regression. Fitted models were compared using the AIC criterion. R (3.0.1) software and the quantreg package were used for statistical analysis. With the quantile regression model, all percentiles for breast, colorectal, prostate, lung and pancreas cancers demonstrated increasing incidence rates with urbanization. The maximum increase for breast cancer was in the 90th percentile (β=0.13, p-value<0.001), for colorectal cancer in the 75th percentile (β=0.048, p-value<0.001), for prostate cancer in the 95th percentile (β=0.55, p-value<0.001), for lung cancer in the 95th percentile (β=0.52, p-value=0.006), and for pancreas cancer in the 10th percentile (β=0.011, p-value<0.001). For gastric, esophageal and skin cancers, the incidence rate decreased with increasing urbanization. The maximum decrease for gastric cancer was in the 90th percentile (β=0.003, p-value<0.001), for esophageal cancer in the 95th (β=0.04, p-value=0.4), and for skin cancer also in the 95th (β=0.145, p-value=0.071). The AIC showed that for upper percentiles, the fit of quantile regression was better than that of least squares regression. According to the results of this study, the significant impact of urbanization on cancer morbidity requires more effort and planning by policymakers and administrators in order to reduce risk factors such as pollution in urban areas and ensure proper nutrition recommendations are made.
NASA Astrophysics Data System (ADS)
Fink, G.; Koch, M.
2010-12-01
An important aspect of water resources and hydrological engineering is the assessment of hydrological risk due to the occurrence of extreme events, e.g. droughts or floods. When dealing with the latter - as is the focus here - the classical methods of flood frequency analysis (FFA) are usually used for the proper dimensioning of a hydraulic structure, for the purpose of bringing the flood risk down to an acceptable level. FFA is based on extreme value statistics theory. Despite the progress of methods in this scientific branch, the development, choice, and fitting of an appropriate distribution function still remains a challenge, particularly when certain underlying assumptions of the theory are not met in real applications. This is, for example, the case when the stationarity condition for a random flood time series is no longer satisfied, as may be the situation when long-term hydrological impacts of future climate change are to be considered. The objective here is to verify the applicability of classical (stationary) FFA to predicted flood time series in the Fulda catchment in central Germany, as they may occur in the wake of climate change during the 21st century. These discharge time series at the outlet of the Fulda basin have been simulated with a distributed hydrological model (SWAT) forced by predicted climate variables of a regional climate model for Germany (REMO). From the simulated future daily time series, annual maximum (extreme) values are computed and analyzed for the purpose of risk evaluation. Although the 21st-century estimated extreme flood series of the Fulda river turn out to be only mildly non-stationary, alleviating the need for further action and concern at first sight, a more detailed analysis of the risk, as quantified, for example, by the return period, shows non-negligible differences in the calculated risk levels. This could be verified by employing a new method, the so-called flood series maximum analysis (FSMA) method, which consists of the stochastic simulation of numerous trajectories of a stochastic process with a given GEV distribution over a certain length of time (larger than a desired return period). The maximum value of each trajectory is then computed, and all of these maxima are used to determine the empirical distribution of the maximum series. Through graphical inversion of this distribution function, the size of the design flood for a given risk (quantile) and given life duration can be inferred. The results of numerous simulations show that for stationary flood series, the new FSMA method results, as expected, in nearly identical risk values to the classical FFA approach. However, once the flood time series becomes slightly non-stationary - for reasons as discussed - and regardless of whether the trend is increasing or decreasing, large differences in the computed risk values for a given design flood occur. In other words, for the same risk, the new FSMA method would lead to different values of the design flood for a hydraulic structure than the classical FFA method. This, in turn, could lead to some cost savings in the realization of a hydraulic project.
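A minimal sketch of the FSMA idea as described above, for the stationary case: simulate many GEV trajectories over a project life, take each trajectory's maximum, and invert the empirical distribution of those maxima to read off a design flood at a given lifetime risk. The GEV parameters and project life are illustrative; a non-stationary variant would add a trend to the location parameter before simulating.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(11)
shape, loc, scale = -0.1, 300.0, 80.0   # assumed GEV of annual maxima
life = 100                               # project life (years)
n_traj = 20_000                          # number of simulated trajectories

# Maximum flood experienced over each simulated project life
annual = genextreme.rvs(shape, loc=loc, scale=scale,
                        size=(n_traj, life), random_state=rng)
life_max = annual.max(axis=1)

# Design flood for a 10% risk of exceedance during the project life
design = np.quantile(life_max, 0.90)
print(f"FSMA design flood (10% lifetime risk): {design:.0f}")

# Stationary cross-check: P(lifetime max <= q) = F(q)^life for iid years
p = genextreme.cdf(design, shape, loc=loc, scale=scale) ** life
print(f"analytical non-exceedance of that flood: {p:.3f}")   # ~0.90
```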
Fenske, Nora; Müller, Manfred J.; Plachta-Danielzik, Sandra; Keil, Thomas; Grabenhenrich, Linus; von Kries, Rüdiger
2014-01-01
Background: Children of mothers who smoked during pregnancy have a lower birth weight but have a higher chance of becoming overweight during childhood. Objectives: We followed children longitudinally to assess the age at which higher body mass index (BMI) z-scores became evident in the children of mothers who smoked during pregnancy, and to evaluate the trajectory of changes until adolescence. Methods: We pooled data from two German cohort studies that included repeated anthropometric measurements until 14 years of age and information on smoking during pregnancy and other risk factors for overweight. We used longitudinal quantile regression to estimate age- and sex-specific associations between maternal smoking and the 10th, 25th, 50th, 75th, and 90th quantiles of the BMI z-score distribution in study participants from birth through 14 years of age, adjusted for potential confounders. We used additive mixed models to estimate associations with mean BMI z-scores. Results: Mean and median (50th quantile) BMI z-scores at birth were smaller in the children of mothers who smoked during pregnancy compared with children of nonsmoking mothers, but BMI z-scores were significantly associated with maternal smoking beginning at the age of 4–5 years, and differences increased over time. For example, the difference in the median BMI z-score between the daughters of smokers versus nonsmokers was 0.12 (95% CI: 0.01, 0.21) at 5 years, and 0.30 (95% CI: 0.08, 0.39) at 14 years of age. For lower BMI z-score quantiles, the association with smoking was more pronounced in girls, whereas in boys the association was more pronounced for higher BMI z-score quantiles. Conclusions: A clear difference in BMI z-score (mean and median) between children of smoking and nonsmoking mothers emerged at 4–5 years of age. The shape and size of age-specific effect estimates for maternal smoking during pregnancy varied by age and sex across the BMI z-score distribution. Citation: Riedel C, Fenske N, Müller MJ, Plachta-Danielzik S, Keil T, Grabenhenrich L, von Kries R. 2014. Differences in BMI z-scores between offspring of smoking and nonsmoking mothers: a longitudinal study of German children from birth through 14 years of age. Environ Health Perspect 122:761–767; http://dx.doi.org/10.1289/ehp.1307139 PMID:24695368
Extreme Quantile Estimation in Binary Response Models
1990-03-01
in Cancer Research," Biometria , VoL 66, pp. 307-316. Hsi, B.P. [1969], ’The Multiple Sample Up-and-Down Method in Bioassay," Journal of the American...New Method of Estimation," Biometria , VoL 53, pp. 439-454. Wetherill, G.B. [1976], Sequential Methods in Statistics, London: Chapman and Hall. Wu, C.FJ
Effect of uncertainties on probabilistic-based design capacity of hydrosystems
NASA Astrophysics Data System (ADS)
Tung, Yeou-Koung
2018-02-01
Hydrosystems engineering designs involve analysis of hydrometric data (e.g., rainfall, floods) and use of hydrologic/hydraulic models, all of which contribute various degrees of uncertainty to the design process. Uncertainties in hydrosystem designs can generally be categorized into aleatory and epistemic types. The former arises from the natural randomness of hydrologic processes, whereas the latter is due to knowledge deficiency in model formulation and model parameter specification. This study shows that the presence of epistemic uncertainties induces uncertainty in determining the design capacity. Hence, the designer needs to quantify the uncertainty features of the design capacity to determine the capacity with a stipulated performance reliability under the design condition. Using detention basin design as an example, the study illustrates a methodological framework by considering aleatory uncertainty from rainfall and epistemic uncertainties from the runoff coefficient, curve number, and sampling error in the design rainfall magnitude. The effects of including different items of uncertainty and of performance reliability on the design detention capacity are examined. A numerical example shows that the mean value of the design capacity of the detention basin increases with the design return period, and this relation is found to be practically the same regardless of the uncertainty types considered. The standard deviation associated with the design capacity, when subject to epistemic uncertainty, increases with both design frequency and the items of epistemic uncertainty involved. It is found that the epistemic uncertainty due to sampling error in rainfall quantiles should not be ignored. Even with a sample size of 80 (relatively large for a hydrologic application), the inclusion of sampling error in rainfall quantiles resulted in a standard deviation about 2.5 times higher than that considering only the uncertainty of the runoff coefficient and curve number. Furthermore, the presence of epistemic uncertainties in the design would result in underestimation of the annual failure probability of the hydrosystem and has a discounting effect on the anticipated design return period.
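The framework described above can be caricatured in a few lines of Monte Carlo: sample the epistemic quantities (runoff coefficient, sampling error in the design-rainfall quantile) and read design-capacity quantiles off the resulting distribution. The sizing relation below is a deliberately simplistic stand-in (capacity proportional to runoff coefficient, rainfall depth, and area), not the study's detention-basin model; all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# 100-yr design rainfall estimated from a finite record, so the estimate
# itself carries (epistemic) sampling error around its nominal value
rain_mm = rng.normal(150.0, 12.0, n)                # assumed sampling error
runoff_coeff = rng.triangular(0.5, 0.65, 0.8, n)    # epistemic uncertainty
area_m2 = 2.5e6                                     # 2.5 km2 catchment

# Simplistic stand-in sizing: volume = C * depth * area, in 10^3 m3
capacity = runoff_coeff * (rain_mm * 1e-3) * area_m2 / 1e3

print(f"mean design capacity: {capacity.mean():6.0f} 10^3 m3")
print(f"standard deviation:   {capacity.std():6.0f} 10^3 m3")
# Capacity that meets the design condition with 90% performance reliability
print(f"90%-reliability capacity: {np.quantile(capacity, 0.90):6.0f} 10^3 m3")
```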
The importance of hydrological uncertainty assessment methods in climate change impact studies
NASA Astrophysics Data System (ADS)
Honti, M.; Scheidegger, A.; Stamm, C.
2014-08-01
Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a recent boost after the publication of the IPCC AR4 report. From hundreds of impact studies a quasi-standard methodology has emerged, to a large extent shaped by the growing public demand for predicting how water resources management or flood protection should change in the coming decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this, we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time series error model able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The choice of uncertainty assessment method actually determined which sources of uncertainty could be identified at all. This demonstrates that one can arrive at rather different conclusions about the causes behind predictive uncertainty for the same hydrological model and calibration data when considering different objective functions for calibration.
Socioeconomic and ethnic inequalities in exposure to air and noise pollution in London.
Tonne, Cathryn; Milà, Carles; Fecht, Daniela; Alvarez, Mar; Gulliver, John; Smith, James; Beevers, Sean; Ross Anderson, H; Kelly, Frank
2018-06-01
Transport-related air and noise pollution, exposures linked to adverse health outcomes, varies within cities, potentially resulting in exposure inequalities. Relatively little is known regarding inequalities in personal exposure to air pollution or transport-related noise. Our objectives were to quantify socioeconomic and ethnic inequalities in London in 1) air pollution exposure at residence compared to personal exposure; and 2) transport-related noise at residence from different sources. We used individual-level data from the London Travel Demand Survey (n = 45,079) between 2006 and 2010. We modeled residential (CMAQ-urban) and personal (London Hybrid Exposure Model) particulate matter <2.5 μm and nitrogen dioxide (NO2), road-traffic noise at residence (TRANEX), and identified those within 50 dB noise contours of railways and Heathrow airport. We analyzed relationships between household income, area-level income deprivation and ethnicity with air and noise pollution using quantile and logistic regression. We observed inverse patterns in inequalities in air pollution when estimated at residence versus personal exposure with respect to household income (categorical, 8 groups). Compared to the lowest income group (<£10,000), the highest group (>£75,000) had lower residential NO2 (-1.3 (95% CI -2.1, -0.6) μg/m3 in the 95th exposure quantile) but higher personal NO2 exposure (1.9 (95% CI 1.6, 2.3) μg/m3 in the 95th quantile), which was driven largely by transport mode and duration. Inequalities in residential exposure to NO2 with respect to area-level deprivation were larger at lower exposure quantiles (e.g., estimate for NO2 5.1 (95% CI 4.6, 5.5) at quantile 0.15 versus 1.9 (95% CI 1.1, 2.6) at quantile 0.95), reflecting low-deprivation, high residential NO2 areas in the city centre. Air pollution exposure at residence consistently overestimated personal exposure; this overestimation varied with age, household income, and area-level income deprivation. Inequalities in road traffic noise were generally small. In logistic regression models, the odds of living within a 50 dB contour of aircraft noise were highest in individuals with the highest household income, white ethnicity, and the lowest area-level income deprivation. Odds of living within a 50 dB contour of rail noise were 19% (95% CI 3, 37) higher for black compared to white individuals. Socioeconomic inequalities in air pollution exposure were different for modeled residential versus personal exposure, which has important implications for environmental justice and confounding in epidemiology studies. Exposure misclassification was dependent on several factors related to health, a potential source of bias in epidemiological studies. Quantile regression revealed that socioeconomic and ethnic inequalities in air pollution are often not uniform across the exposure distribution. Copyright © 2018 Elsevier Ltd. All rights reserved.
Cooper, Jennifer N; Lodwick, Daniel L; Adler, Brent; Lee, Choonsik; Minneci, Peter C; Deans, Katherine J
2017-06-01
Computed tomography (CT) is a widely used diagnostic tool in pediatric medicine. However, due to concerns regarding radiation exposure, it is essential to identify patient characteristics associated with higher radiation burden from CT imaging, in order to more effectively target efforts towards dose reduction. Our objective was to identify the effects of various demographic and clinical patient characteristics on radiation exposure from single abdomen/pelvis CT scans in children. CT scans performed at our institution between January 2013 and August 2015 in patients under 16 years of age were processed using a software tool that estimates patient-specific organ and effective doses and merges these estimates with data from the electronic health record and billing record. Quantile regression models at the 50th, 75th, and 90th percentiles were used to estimate the effects of patients' demographic and clinical characteristics on effective dose. A total of 2390 abdomen/pelvis CT scans (median effective dose 1.52 mSv) were included. Of all characteristics examined, only older age, female gender, higher BMI, and whether the scan was a multiphase exam or an exam that required repeating for movement were significant predictors of higher effective dose at each quantile examined (all p<0.05). The effects of obesity and multiphase or repeat scanning on effective dose were magnified in higher dose scans. Older age, female gender, obesity, and multiphase or repeat scanning are all associated with increased effective dose from abdomen/pelvis CT. Targeted efforts to reduce dose from abdominal CT in these groups should be undertaken. Copyright © 2017 Elsevier Ltd. All rights reserved.
Quantile Functions, Convergence in Quantile, and Extreme Value Distribution Theory.
1980-11-01
Gnanadesikan (1968). Quantile functions are advocated by Parzen (1979) as providing an approach to probability-based data analysis. Quantile functions are... Gnanadesikan, R. (1968). Probability Plotting Methods for the Analysis of Data, Biometrika, 55, 1-17.
Russell, Brook T; Wang, Dewei; McMahan, Christopher S
2017-08-01
Fine particulate matter (PM2.5) poses a significant risk to human health, with long-term exposure being linked to conditions such as asthma, chronic bronchitis, lung cancer, atherosclerosis, etc. In order to improve current pollution control strategies and to better shape public policy, the development of a more comprehensive understanding of this air pollutant is necessary. To this end, this work attempts to quantify the relationship between certain meteorological drivers and the levels of PM2.5. It is expected that the set of important meteorological drivers will vary both spatially and within the conditional distribution of PM2.5 levels. To account for these characteristics, a new local linear penalized quantile regression methodology is developed. The proposed estimator uniquely selects the set of important drivers at every spatial location and for each quantile of the conditional distribution of PM2.5 levels. The performance of the proposed methodology is illustrated through simulation, and it is then used to determine the association between several meteorological drivers and PM2.5 over the Eastern United States (US). This analysis suggests that the primary drivers throughout much of the Eastern US tend to differ based on season and geographic location, with similarities existing between "typical" and "high" PM2.5 levels.
Enns, Murray W; Bernstein, Charles N; Kroeker, Kristine; Graff, Lesley; Walker, John R; Lix, Lisa M; Hitchon, Carol A; El-Gabalawy, Renée; Fisk, John D; Marrie, Ruth Ann
2018-01-01
Impairment in work function is a frequent outcome in patients with chronic conditions such as immune-mediated inflammatory diseases (IMID), depression and anxiety disorders. The personal and economic costs of work impairment in these disorders are immense. Symptoms of pain, fatigue, depression and anxiety are potentially remediable forms of distress that may contribute to work impairment in chronic health conditions such as IMID. The present study evaluated the association between pain [Medical Outcomes Study Pain Effects Scale], fatigue [Daily Fatigue Impact Scale], depression and anxiety [Hospital Anxiety and Depression Scale] and work impairment [Work Productivity and Activity Impairment Scale] in four patient populations: multiple sclerosis (n = 255), inflammatory bowel disease (n = 248), rheumatoid arthritis (n = 154) and a depression and anxiety group (n = 307), using quantile regression, controlling for the effects of sociodemographic factors, physical disability, and cognitive deficits. Each of pain, depression symptoms, anxiety symptoms, and fatigue individually showed significant associations with work absenteeism, presenteeism, and general activity impairment (quantile regression standardized estimates ranging from 0.3 to 1.0). When the distress variables were entered concurrently into the regression models, fatigue was a significant predictor of work and activity impairment in all models (quantile regression standardized estimates ranging from 0.2 to 0.5). These findings have important clinical implications for understanding the determinants of work impairment and for improving work-related outcomes in chronic disease.
NASA Astrophysics Data System (ADS)
Lombardo, Luigi; Saia, Sergio; Schillaci, Calogero; Mai, P. Martin; Huser, Raphaël
2018-05-01
Soil Organic Carbon (SOC) estimation is crucial to manage both natural and anthropic ecosystems and has recently come under the magnifying glass after the 2016 Paris Agreement due to its relationship with greenhouse gases. Statistical applications have dominated SOC stock mapping at the regional scale so far. However, the community has hardly ever attempted to implement Quantile Regression (QR) to spatially predict the SOC distribution. In this contribution, we test QR to estimate SOC stock (0-30 cm depth) in the agricultural areas of a highly variable semi-arid region (Sicily, Italy, around 25,000 km2) using topographic and remotely sensed predictors. We also compare the results with those from available SOC stock measurements. The QR models produced robust performances and allowed us to recognize dominant effects among the predictors with respect to the considered quantile. This information, currently lacking, suggests that QR can discern predictor influences on SOC stock within specific sub-domains of each predictor. In this work, the predictive map generated at the median shows lower errors than the Joint Research Centre and International Soil Reference and Information Centre benchmarks. The results support the use of QR as a comprehensive and effective method to map SOC using legacy data in agro-ecosystems. The R code scripted in this study for QR is included.
NASA Astrophysics Data System (ADS)
Hoss, F.; Fischbeck, P. S.
2014-10-01
This study further develops the method of quantile regression (QR) to predict exceedance probabilities of flood stages by post-processing forecasts. Using data from the 82 river gages for which the National Weather Service's North Central River Forecast Center issues forecasts daily, this is the first QR application to US river gages. Archived forecasts for lead times up to six days from 2001-2013 were analyzed. Earlier implementations of QR used the forecast itself as the only independent variable (Weerts et al., 2011; López López et al., 2014). This study adds the rise rate of the river stage over the last 24 and 48 h and the forecast errors from 24 and 48 h earlier to the QR model. Including those four variables significantly improved the forecasts, as measured by the Brier Skill Score (BSS). Mainly, the resolution increases, as the original QR implementation already delivered high reliability. Combining the forecast with the other four variables results in much less favorable BSSs. Lastly, forecast performance does not depend on the size of the training dataset, but on the year, river gage, lead time, and event threshold being forecast. We find that each event threshold requires a separate model configuration, or at least calibration.
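A hedged sketch of the post-processing model described above: one quantile regression per quantile level, with the forecast, the 24 h and 48 h rise rates, and the 24 h and 48 h past forecast errors as predictors. The synthetic archive below stands in for the NWS forecast data; variable names are illustrative.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for an archive of stage forecasts and observations
rng = np.random.default_rng(2)
n = 3000
fcst = rng.uniform(1, 10, n)           # issued stage forecast
rise24 = rng.normal(0, 0.5, n)         # stage rise over the last 24 h
rise48 = rng.normal(0, 0.8, n)         # ... over the last 48 h
err24 = rng.normal(0, 0.3, n)          # forecast error 24 h ago
err48 = rng.normal(0, 0.4, n)          # forecast error 48 h ago
obs = fcst + 0.6 * err24 + 0.4 * rise24 + rng.normal(0, 0.4, n)
df = pd.DataFrame(dict(obs=obs, fcst=fcst, rise24=rise24,
                       rise48=rise48, err24=err24, err48=err48))

# One QR model per quantile; exceedance probabilities of a flood stage
# then follow from the fitted conditional quantiles
taus = [0.05, 0.25, 0.50, 0.75, 0.95]
models = {t: smf.quantreg("obs ~ fcst + rise24 + rise48 + err24 + err48",
                          df).fit(q=t) for t in taus}
print(models[0.95].params)
```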
NASA Astrophysics Data System (ADS)
Peña, Luis E.; Barrios, Miguel; Francés, Félix
2016-10-01
Changes in land use within a catchment are among the causes of non-stationarity in the flood regime, as they modify the upper soil physical structure and its runoff production capacity. This paper analyzes how variations in the upper soil hydraulic properties caused by changes in land use affect the magnitude of peak flows, by: (1) incorporating fractal scaling properties to relate the effect of the static storage capacity (the sum of capillary water storage capacity in the root zone, canopy interception and surface puddles) and the upper soil vertical saturated hydraulic conductivity on the flood regime; (2) describing the effect of the spatial organization of the upper soil hydraulic properties at catchment scale; and (3) examining the scale properties of the parameters of the Generalized Extreme Value (GEV) probability distribution function in relation to the upper soil hydraulic properties. This study considered the historical changes of land use in the Combeima River catchment in South America between 1991 and 2007, using distributed hydrological modeling of daily discharges to describe the hydrological response. Through simulation of land cover scenarios, it was demonstrated that the magnitude of peak flows under land cover change can be quantified through its Wide-Sense Simple Scaling with the upper soil hydraulic properties.
Moisture fluxes towards Switzerland: investigating future changes in CMIP5 climate models
NASA Astrophysics Data System (ADS)
Fazan, Valerie; Martius, Olivia; Martynov, Andrey; Panziera, Luca
2017-04-01
High integrated vapor transport (IVT) in the atmosphere directed perpendicular to the orography is an important proxy for flood-related precipitation in many mountainous areas around the world. Here we focus on flood-related IVT and its changes in a warmer climate in Switzerland, where most high-impact flood events in the past 30 years were connected to exceptional IVT upstream of the mountains. Our study aims to investigate how these critical IVT values are projected to evolve in a changing climate. The IVT is computed from 15 CMIP5 climate models for the past (1950-2005) and the future (2006-2100) under the RCP 8.5 ("business as usual") scenario. In order to check the accuracy of the models and the effect of the varying resolution, present-day IVT from the CMIP5 models is compared with ERA-Interim reanalysis data (period 1979-2015). A quantile mapping technique is then used to correct biases. The same bias corrections are applied to the future (2006-2100) IVT data. Finally, future changes in extreme IVT are investigated. This includes an analysis of changes in the magnitude and direction of the moisture flux in the different seasons for different regions of Switzerland.
A random walk rule for phase I clinical trials.
Durham, S D; Flournoy, N; Rosenberger, W F
1997-06-01
We describe a family of random walk rules for the sequential allocation of dose levels to patients in a dose-response study, or phase I clinical trial. Patients are sequentially assigned the next higher, same, or next lower dose level according to some probability distribution, which may be determined by ethical considerations as well as the patient's response. It is shown that one can choose these probabilities in order to center dose level assignments unimodally around any target quantile of interest. Estimation of the quantile is discussed; the maximum likelihood estimator and its variance are derived under a two-parameter logistic distribution, and the maximum likelihood estimator is compared with other nonparametric estimators. Random walk rules have clear advantages: they are simple to implement, and finite and asymptotic distribution theory is completely worked out. For a specific random walk rule, we compute finite and asymptotic properties and give examples of its use in planning studies. Having the finite distribution theory available and tractable obviates the need for elaborate simulation studies to analyze the properties of the design. The small sample properties of our rule, as determined by exact theory, compare favorably to those of the continual reassessment method, determined by simulation.
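A brief simulation of a random walk rule of the kind described above, in its biased-coin form: step down after a toxicity and step up with probability Γ/(1−Γ) after a non-toxicity, which centers assignments near the dose whose toxicity probability equals the target quantile Γ (valid for Γ ≤ 0.5). The logistic curve parameters, dose grid, and starting level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)

def tox_prob(dose, a=-6.0, b=1.0):
    """Assumed two-parameter logistic dose-toxicity curve."""
    return 1.0 / (1.0 + np.exp(-(a + b * dose)))

def biased_coin_walk(target=1/3, n_patients=2000, doses=np.arange(1, 11)):
    """Step down after a toxicity; step up with probability
    target / (1 - target) after a non-toxicity, else stay."""
    p_up = target / (1.0 - target)
    level, visits = 4, np.zeros(len(doses))
    for _ in range(n_patients):
        visits[level] += 1
        if rng.random() < tox_prob(doses[level]):
            level = max(level - 1, 0)                  # toxicity: move down
        elif rng.random() < p_up:
            level = min(level + 1, len(doses) - 1)     # move up w.p. p_up
    return visits / visits.sum()

freq = biased_coin_walk()
print(np.round(freq, 3))               # assignments pile up near the dose
print(np.round(tox_prob(np.arange(1, 11)), 3))  # ... with tox prob ~ 1/3
```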
Fine-tuning satellite-based rainfall estimates
NASA Astrophysics Data System (ADS)
Harsa, Hastuadi; Buono, Agus; Hidayat, Rahmat; Achyar, Jaumil; Noviati, Sri; Kurniawan, Roni; Praja, Alfan S.
2018-05-01
Rainfall datasets are available from various sources, including satellite estimates and ground observations. Ground observation locations are sparsely scattered. Therefore, the use of satellite estimates is advantageous, because they can provide data in places where ground observations are not present. In general, however, satellite estimates contain bias, since they are the product of algorithms that transform sensor responses into rainfall values. Another source of bias may be the number of ground observations used by the algorithms as the reference in determining rainfall values. This paper describes the application of a bias correction method that modifies the satellite-based dataset by adding a number of ground observation locations that had not previously been used by the algorithm. The bias correction was performed using a Quantile Mapping procedure between ground observation data and satellite estimates. Since Quantile Mapping requires the mean and standard deviation of both the reference data and the data being corrected, an Inverse Distance Weighting scheme was first applied to the mean and standard deviation of the observation data to provide a spatial composite of these originally scattered values, making it possible to provide a reference data point at the same location as each satellite estimate. The results show that the new dataset represents the rainfall values recorded by ground observations statistically better than the previous dataset.
Asquith, William H.; Kiang, Julie E.; Cohn, Timothy A.
2017-07-17
The U.S. Geological Survey (USGS), in cooperation with the U.S. Nuclear Regulatory Commission, has investigated statistical methods for probabilistic flood hazard assessment to provide guidance on very low annual exceedance probability (AEP) estimation of peak-streamflow frequency and the quantification of corresponding uncertainties using streamgage-specific data. The term "very low AEP" implies exceptionally rare events defined as those having AEPs less than about 0.001 (or 1 × 10^-3 in scientific notation, or 10^-3 for brevity). Such low AEPs are of great interest to those involved with peak-streamflow frequency analyses for critical infrastructure, such as nuclear power plants. Flood frequency analyses at streamgages are most commonly based on annual instantaneous peak streamflow data and a probability distribution fit to these data. The fitted distribution provides a means to extrapolate to very low AEPs. Within the United States, the Pearson type III probability distribution, when fit to the base-10 logarithms of streamflow, is widely used, but other distribution choices exist. The USGS-PeakFQ software, implementing the Pearson type III within the Federal agency guidelines of Bulletin 17B (method of moments) and updates to the expected moments algorithm (EMA), was specially adapted for an "Extended Output" user option to provide estimates at selected AEPs from 10^-3 to 10^-6. Parameter estimation methods, in addition to product moments and EMA, include L-moments, maximum likelihood, and maximum product of spacings (maximum spacing estimation). This study comprehensively investigates multiple distributions and parameter estimation methods for two USGS streamgages (01400500 Raritan River at Manville, New Jersey, and 01638500 Potomac River at Point of Rocks, Maryland). The results of this study specifically involve the four methods for parameter estimation and up to nine probability distributions, including the generalized extreme value, generalized log-normal, generalized Pareto, and Weibull. Uncertainties in streamflow estimates for corresponding AEP are depicted and quantified as two primary forms: quantile (aleatoric [random sampling] uncertainty) and distribution-choice (epistemic [model] uncertainty). Sampling uncertainties of a given distribution are relatively straightforward to compute from analytical or Monte Carlo-based approaches. Distribution-choice uncertainty stems from choices of potentially applicable probability distributions for which divergence among the choices increases as AEP decreases. Conventional goodness-of-fit statistics, such as Cramér-von Mises, and L-moment ratio diagrams are demonstrated in order to hone distribution choice. The results generally show that distribution-choice uncertainty is larger than sampling uncertainty for very low AEP values.
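The distribution-choice uncertainty quantified above can be illustrated by fitting several candidate distributions to the same (here synthetic) log-transformed peak record and comparing the quantiles each extrapolates at very low AEPs; the spread across candidates widens as AEP decreases. The distribution set, parameter values, and all-MLE fitting are simplifying assumptions (the report also uses EMA, L-moments, and maximum product of spacings).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)
# Synthetic 100-year record of annual peaks, in log10 units (stand-in data)
peaks = stats.pearson3.rvs(skew=0.8, loc=4.0, scale=0.4, size=100,
                           random_state=rng)

candidates = {
    "GEV":         stats.genextreme,
    "normal":      stats.norm,         # i.e. log-normal in real units
    "Pearson III": stats.pearson3,
    "Weibull":     stats.weibull_min,
}

for aep in (1e-3, 1e-4, 1e-5, 1e-6):
    row = []
    for name, dist in candidates.items():
        params = dist.fit(peaks)                 # maximum likelihood fit
        q = dist.ppf(1 - aep, *params)           # quantile in log10 space
        row.append(f"{name}: {10**q:10.0f}")
    print(f"AEP {aep:g} -> " + " | ".join(row))  # divergence grows with rarity
```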
Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh
NASA Astrophysics Data System (ADS)
Mortuza, M. R.; Demissie, Y.; Li, H. Y.
2014-12-01
The increased frequency of extreme precipitation events, especially those with multiday durations, is responsible for recent urban floods and the associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimates of the frequency of occurrence of such extreme precipitation events are thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day and 5-day annual maximum precipitation series because of its advantages over at-site estimation. The regional frequency approach pools information from climatologically similar sites to make reliable quantile estimates, provided that the pooling group is homogeneous and of reasonable size. We used the region-of-influence (ROI) approach, along with a homogeneity measure based on L-moments, to identify homogeneous pooling groups for each site. Five 3-parameter distributions (Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type III, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and to the historical data are quantified using Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as to explore spatio-temporal variations of extreme precipitation and associated risk.
Bayesian quantitative precipitation forecasts in terms of quantiles
NASA Astrophysics Data System (ADS)
Bentzien, Sabrina; Friederichs, Petra
2014-05-01
Ensemble prediction systems (EPS) for numerical weather prediction on the mesoscale are developed particularly to obtain probabilistic guidance for high-impact weather. An EPS issues not only a deterministic future state of the atmosphere but a sample of possible future states. Ensemble postprocessing then translates such a sample of forecasts into probabilistic measures. This study focuses on probabilistic quantitative precipitation forecasts in terms of quantiles. Quantiles are particularly suitable for describing precipitation at various locations, since no assumption is required about the distribution of precipitation. The focus is on prediction during high-impact events, and the work is related to the Volkswagen Stiftung-funded project WEX-MOP (Mesoscale Weather Extremes - Theory, Spatial Modeling and Prediction). Quantile forecasts are derived from the raw ensemble and via quantile regression. Neighborhood methods and time-lagging are effective tools to inexpensively increase the ensemble spread, which results in more reliable forecasts, especially for extreme precipitation events. Since an EPS provides a large number of potentially informative predictors, variable selection is required in order to obtain a stable statistical model. A Bayesian formulation of quantile regression allows for inference about the selection of predictive covariates through the use of appropriate prior distributions. Moreover, the implementation of an additional process layer for the regression parameters accounts for spatial variation of the parameters. Bayesian quantile regression and its spatially adaptive extension are illustrated for the German-focused mesoscale weather prediction ensemble COSMO-DE-EPS, which has run (pre)operationally since December 2010 at the German Meteorological Service (DWD). Objective out-of-sample verification uses the quantile score (QS), a weighted absolute error between quantile forecasts and observations. The QS is a proper scoring function and can be decomposed into reliability, resolution and uncertainty parts. A quantile reliability plot gives detailed insight into the predictive performance of the quantile forecasts.
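The quantile score mentioned above is the mean check (pinball) loss, which is minimized in expectation by the true quantile. A minimal implementation, with synthetic verifying observations:

```python
import numpy as np

def quantile_score(q_fcst, y_obs, tau):
    """Mean pinball (check) loss rho_tau(y - q): a proper score for the
    tau-th quantile; lower is better."""
    u = np.asarray(y_obs) - np.asarray(q_fcst)
    return np.mean(u * (tau - (u < 0)))

rng = np.random.default_rng(4)
y = rng.gamma(2.0, 3.0, 10_000)                 # verifying observations
true_q90 = np.quantile(y, 0.9)

print(quantile_score(true_q90, y, 0.9))         # near-minimal score
print(quantile_score(true_q90 * 1.5, y, 0.9))   # miscalibrated: worse score
```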
Reinforcing flood-risk estimation.
Reed, Duncan W
2002-07-15
Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin?
NASA Astrophysics Data System (ADS)
Penot, David; Paquet, Emmanuel; Lang, Michel
2014-05-01
SCHADEX is a probabilistic method for extreme flood estimation, developed and applied since 2006 at Electricité de France (EDF) for dam spillway design [Paquet et al., 2013]. SCHADEX is based on a semi-continuous rainfall-runoff simulation process. The method has been built around two models: a Multi-Exponential Weather Pattern (MEWP) distribution for rainfall probability estimation [Garavaglia et al., 2010] and the MORDOR hydrological model. To use SCHADEX in an ungauged context, the rainfall distribution and the hydrological model must be regionalized. The regionalization of the MEWP rainfall distribution can be managed with SPAZM, a daily rainfall interpolator [Gottardi et al., 2012] which provides reasonable estimates of point and areal rainfall up to high quantiles. The main issue remains regionalizing MORDOR, which is heavily parametrized. A much simpler model has been considered: the SCS model. It is a well-known model for event simulation [USDA SCS, 1985; Beven, 2003] and relies on only one parameter. The idea is then to use the SCS model instead of MORDOR within a simplified stochastic simulation scheme to produce a distribution of flood volume from an exhaustive crossing between rainy events and catchment saturation hazards. The presentation details this process and its capacity to generate a runoff distribution based on the catchment areal rainfall distribution. The simulation method depends on a unique parameter Smax, the maximum initial loss of the catchment. An initial loss S (between zero and Smax) can then be drawn to account for the variability of catchment state (between dry and saturated). The distribution of initial loss (or conversely, of catchment saturation, as modeled by MORDOR) seems closely linked to the catchment's regime, and therefore easy to regionalize. The simulation takes into account a snow contribution for snow-driven catchments, and an antecedent runoff. The presentation shows the results of this stochastic procedure applied to 80 French catchments and its capacity to represent the asymptotic behaviour of the runoff distribution. References: K. J. Beven. Rainfall-Runoff Modelling: The Primer, British Library, 2003. F. Garavaglia, J. Gailhard, E. Paquet, M. Lang, R. Garçon, and P. Bernardara. Introducing a rainfall compound distribution model based on weather patterns sub-sampling. Hydrology and Earth System Sciences, 14(6):951-964, 2010. F. Gottardi, C. Obled, J. Gailhard, and E. Paquet. Statistical reanalysis of precipitation fields based on ground network data and weather patterns: application over French mountains. Journal of Hydrology, 432-433:154-167, 2012. ISSN 0022-1694. E. Paquet, F. Garavaglia, R. Garçon, and J. Gailhard. The SCHADEX method: a semi-continuous rainfall-runoff simulation for extreme flood estimation. Journal of Hydrology, 2013. USDA SCS, National Engineering Handbook, Supplement A, Section 4, Chapter 10. Washington, DC, 1985.
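The event-runoff core of such a simplified scheme can be sketched with the textbook SCS relation; the coupling below (uniform draw of the saturation state, synthetic gamma rainfall sample) is a schematic assumption for illustration, not the EDF implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def scs_runoff(p_mm, s_mm, ia_ratio=0.2):
    """Textbook SCS curve-number runoff (mm) for event rainfall p_mm and
    potential maximum retention s_mm, with initial abstraction Ia = 0.2*S."""
    ia = ia_ratio * s_mm
    pe = p_mm - ia
    return np.where(pe > 0.0, pe * pe / (pe + s_mm), 0.0)

# Schematic stochastic crossing: draw the retention uniformly in [0, s_max]
# to represent catchment saturation state, cross with sampled rain events.
s_max = 120.0                                          # hypothetical, mm
rain = rng.gamma(shape=2.0, scale=25.0, size=100_000)  # synthetic events, mm
s = rng.uniform(0.0, s_max, size=rain.size)
runoff = scs_runoff(rain, s)
print(np.quantile(runoff, [0.5, 0.99, 0.999]))         # runoff distribution tail
```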
Constructing inverse probability weights for continuous exposures: a comparison of methods.
Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S
2014-03-01
Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
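As one concrete variant among the six compared, stabilized weights under a homoscedastic normal exposure model can be sketched as follows (names and the simple OLS exposure model are illustrative assumptions):

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

def stabilized_weights(a, X):
    """Stabilized inverse probability weights for a continuous exposure a
    given covariates X, assuming a normal exposure density with constant
    variance (one of the approaches compared in the simulation)."""
    Xc = sm.add_constant(X)
    fit = sm.OLS(a, Xc).fit()                 # conditional-mean exposure model
    sd_cond = np.sqrt(fit.mse_resid)          # homoscedastic residual SD
    dens_cond = norm.pdf(a, loc=fit.fittedvalues, scale=sd_cond)
    dens_marg = norm.pdf(a, loc=a.mean(), scale=a.std(ddof=1))
    return dens_marg / dens_cond              # weight = f(A) / f(A | X)
```

The quantile binning alternative replaces these densities with marginal and conditional probabilities of membership in exposure categories (e.g., deciles), which is what made it robust to the heteroscedastic, outlier-prone exposure.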
NASA Astrophysics Data System (ADS)
Khan, Firdos; Pilz, Jürgen
2016-04-01
South Asia is under severe impacts of changing climate and global warming. The last two decades showed that climate change and global warming are happening, and the first decade of the 21st century was the warmest on record over Pakistan, with temperatures reaching 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation are adversely affected, causing floods, cyclones, and hurricanes in the region, with further impacts on agriculture, water, and health. To cope with this situation, it is important to conduct impact assessment studies and to adopt adaptation and mitigation measures. For impact assessment studies, climate variables are needed at higher resolution. Downscaling techniques are used to produce climate variables at higher resolution; these techniques are broadly divided into two types, statistical downscaling and dynamical downscaling. The target location of this study is the monsoon-dominated region of Pakistan. One reason for choosing this area is that monsoon rains contribute more than 80% of its total rainfall. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of the input data and deal with multicollinearity problems, empirical orthogonal functions will be used. Advantages of this new method are: (1) it is more robust to outliers than ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites; and (3) it can be used to combine different types of distributions. This is important in our case because we are dealing with climatic variables having different distributions over different meteorological stations. The proposed model will be calibrated using National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) predictors for the period 1960-1990 and validated for 1990-2000. To investigate the efficiency of the proposed model, it will be compared with the multivariate multiple regression model and with dynamical downscaling climate models by using different climate indices that describe the frequency, intensity and duration of the variables of interest. KEY WORDS: Climate change, Copula, Monsoon, Quantile regression, Spatio-temporal distribution.
Koltun, G.F.; Kula, Stephanie P.
2013-01-01
This report presents the results of a study to develop methods for estimating selected low-flow statistics and for determining annual flow-duration statistics for Ohio streams. Regression techniques were used to develop equations for estimating 10-year recurrence-interval (10-percent annual-nonexceedance probability) low-flow yields, in cubic feet per second per square mile, with averaging periods of 1, 7, 30, and 90 days, and for estimating the yield corresponding to the long-term 80-percent duration flow. These equations, which estimate low-flow yields as a function of a streamflow-variability index, are based on previously published low-flow statistics for 79 long-term continuous-record streamgages with at least 10 years of data collected through water year 1997. When applied to the calibration dataset, average absolute percent errors for the regression equations ranged from 15.8 to 42.0 percent. The regression results have been incorporated into the U.S. Geological Survey (USGS) StreamStats application for Ohio (http://water.usgs.gov/osw/streamstats/ohio.html) in the form of a yield grid to facilitate estimation of the corresponding streamflow statistics in cubic feet per second. Logistic-regression equations also were developed and incorporated into the USGS StreamStats application for Ohio for selected low-flow statistics to help identify occurrences of zero-valued statistics. Quantiles of daily and 7-day mean streamflows were determined for annual and annual-seasonal (September–November) periods for each complete climatic year of streamflow-gaging station record for 110 selected streamflow-gaging stations with 20 or more years of record. The quantiles determined for each climatic year were the 99-, 98-, 95-, 90-, 80-, 75-, 70-, 60-, 50-, 40-, 30-, 25-, 20-, 10-, 5-, 2-, and 1-percent exceedance streamflows. Selected exceedance percentiles of the annual-exceedance percentiles were subsequently computed and tabulated to help facilitate consideration of the annual risk of exceedance or nonexceedance of annual and annual-seasonal-period flow-duration values. The quantiles are based on streamflow data collected through climatic year 2008.
NASA Astrophysics Data System (ADS)
Tumanov, Sergiu
A test of goodness of fit based on rank statistics was applied to prove the applicability of the Eggenberger-Polya discrete probability law to hourly SO2 concentrations measured in the vicinity of single sources. With this end in view, the pollutant concentration was considered an integral quantity which may be accepted if one properly chooses the unit of measurement (in this case μg m-3) and if account is taken of the limited accuracy of measurements. The results of the test being satisfactory, even in the range of upper quantiles, the Eggenberger-Polya law was used in association with numerical modelling to estimate statistical parameters, e.g. quantiles, cumulative probabilities of threshold concentrations to be exceeded, and so on, in the grid points of a network covering the area of interest. This only needs accurate estimations of means and variances of the concentration series, which can readily be obtained through routine air pollution dispersion modelling.
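The Eggenberger-Polya law coincides with the negative binomial distribution, so the grid-point calculation from a mean and variance can be sketched directly (the threshold and moment values below are made up for illustration):

```python
from scipy.stats import nbinom

def polya_exceedance(mean, var, threshold):
    """P(C >= threshold) for the Eggenberger-Polya (negative binomial) law
    parameterized by its mean and variance (requires var > mean)."""
    p = mean / var                     # success probability
    r = mean * mean / (var - mean)     # (possibly non-integer) shape
    return nbinom.sf(threshold - 1, r, p)   # sf(k) = P(X > k)

# e.g. probability that an hourly SO2 concentration of 200 ug/m3 is exceeded
# at a grid point where dispersion modelling gives mean 40 and variance 1600:
print(polya_exceedance(40.0, 1600.0, 200))
```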
Uncertainty in flood damage estimates and its potential effect on investment decisions
NASA Astrophysics Data System (ADS)
Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; de Moel, H.
2016-01-01
This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage functions and maximum damages can have large effects on flood damage estimates. This explanation is then used to quantify the uncertainty in the damage estimates with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from seven different flood damage models. The paper shows that the resulting uncertainties in estimated damages are on the order of a factor of 2 to 5. The uncertainty is typically larger for flood events with small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
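A toy version of such Monte Carlo propagation, with invented stand-in damage functions rather than the actual 272-function library:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical stand-ins for the damage function library: each maps
# water depth (m) to a damage fraction of a maximum damage value.
library = [lambda d, k=k: np.clip(d / (2.0 + k), 0.0, 1.0) for k in range(7)]
max_damage = rng.normal(1.0e6, 2.0e5, size=1000).clip(min=0)  # EUR, assumed

depth = 0.8                                     # flood depth at a building, m
draws = [library[rng.integers(len(library))](depth) * m for m in max_damage]
print(np.quantile(draws, [0.05, 0.5, 0.95]))    # spread across assumptions
```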
Evaluation of design flood estimates with respect to sample size
NASA Astrophysics Data System (ADS)
Kobierska, Florian; Engeland, Kolbjorn
2016-04-01
Estimation of design floods forms the basis for hazard management related to flood risk and is a legal obligation when building infrastructure such as dams, bridges and roads close to water bodies. Flood inundation maps used for land use planning are also produced based on design flood estimates. In Norway, the current guidelines for design flood estimates give recommendations on which data, probability distribution, and method to use depending on the length of the local record. If less than 30 years of local data are available, an index flood approach is recommended, where the local observations are used for estimating the index flood and regional data are used for estimating the growth curve. For 30-50 years of data, a 2-parameter distribution is recommended, and for more than 50 years of data, a 3-parameter distribution should be used. Many countries have national guidelines for flood frequency estimation, and recommended distributions include the log-Pearson type III, generalized logistic and generalized extreme value distributions. For estimating distribution parameters, ordinary moments, linear moments (L-moments), maximum likelihood and Bayesian methods are used. The aim of this study is to re-evaluate the guidelines for local flood frequency estimation. In particular, we wanted to answer the following questions: (i) Which distribution gives the best fit to the data? (ii) Which estimation method provides the best fit to the data? (iii) Do the answers to (i) and (ii) depend on local data availability? To answer these questions we set up a test bench for local flood frequency analysis using data-based cross-validation methods. The criteria were based on indices describing stability and reliability of design flood estimates. Stability is used as a criterion since design flood estimates should not depend excessively on the data sample. The reliability indices describe the degree to which design flood predictions can be trusted.
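A minimal sketch of one cell of such a test bench: maximum-likelihood fits of a 2-parameter (Gumbel) versus 3-parameter (GEV) distribution and the resulting design flood; the distribution choice, synthetic sample, and return period are illustrative assumptions:

```python
import numpy as np
from scipy.stats import genextreme, gumbel_r

def design_flood(annual_max, return_period, use_gev=True):
    """Design flood (return level) from an annual-maximum sample via ML,
    using either a 3-parameter GEV or a 2-parameter Gumbel fit, echoing
    the record-length-dependent choice in the guidelines."""
    p = 1.0 - 1.0 / return_period             # non-exceedance probability
    if use_gev:
        c, loc, scale = genextreme.fit(annual_max)
        return genextreme.ppf(p, c, loc=loc, scale=scale)
    loc, scale = gumbel_r.fit(annual_max)
    return gumbel_r.ppf(p, loc=loc, scale=scale)

rng = np.random.default_rng(1)
sample = gumbel_r.rvs(loc=300.0, scale=80.0, size=40, random_state=rng)
print(design_flood(sample, 200, use_gev=False))   # Q200 from 40 years of data
```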
NASA Astrophysics Data System (ADS)
Koshinchanov, Georgy; Dimitrov, Dobri
2008-11-01
The characteristics of rainfall intensity are important for many purposes, including design of sewage and drainage systems, tuning flood warning procedures, etc. Those estimates are usually statistical estimates of the intensity of precipitation realized over a certain period of time (e.g. 5, 10 min., etc.) with different return periods (e.g. 20, 100 years, etc.). The traditional approach to evaluating the mentioned precipitation intensities is to process the pluviometer records and fit a probability distribution to samples of intensities valid for certain locations or regions. Those estimates further become part of the state regulations to be used for various economic activities. Two problems occur using the mentioned approach: 1. Due to various factors the climate conditions have changed and the precipitation intensity estimates need regular updates; 2. As the extremes of the probability distribution are of particular importance for practice, the methodology of the distribution fitting needs specific attention to those parts of the distribution. The aim of this paper is to review the existing methodologies for processing intensive rainfalls and to refresh some of the statistical estimates for the studied areas. The methodologies used in Bulgaria for analyzing intensive rainfalls and producing the relevant statistical estimates are: the method of the maximum intensity, used in the National Institute of Meteorology and Hydrology to process and decode the pluviometer records, followed by distribution fitting for each precipitation duration period; the same method, but with separate modeling of the probability distribution for the middle and high probability quantiles; a method similar to the first one, but with an intensity threshold of 0.36 mm/min; another method, proposed by the Russian hydrologist G. A. Aleksiev for regionalization of estimates over a territory, improved and adapted for Bulgaria by S. Gerasimov; and a method considering only the intensive rainfalls (if any) during the day with the maximal annual daily precipitation total for a given year. Conclusions are drawn on the relevance and adequacy of the applied methods.
A non-stationary cost-benefit based bivariate extreme flood estimation approach
NASA Astrophysics Data System (ADS)
Qi, Wei; Liu, Junguo
2018-02-01
Cost-benefit analysis and flood frequency analysis have been integrated into a comprehensive framework to estimate cost-effective design values. However, previous cost-benefit based extreme flood estimation relies on stationary assumptions and analyzes dependent flood variables separately. A Non-Stationary Cost-Benefit based bivariate design flood estimation (NSCOBE) approach is developed in this study to investigate the influence of non-stationarities in both the dependence of flood variables and the marginal distributions on extreme flood estimation. The dependence is modeled utilizing copula functions. Previous design flood selection criteria are not suitable for NSCOBE since they ignore the time-changing dependence of flood variables. Therefore, a risk calculation approach is proposed based on non-stationarities in both marginal probability distributions and copula functions. A case study with 54 years of observed data is utilized to illustrate the application of NSCOBE. Results show that NSCOBE can effectively integrate non-stationarities in both copula functions and marginal distributions into cost-benefit based design flood estimation. It is also found that there is a trade-off between the maximum probabilities of exceedance calculated from copula functions and from marginal distributions. This study for the first time provides a new approach towards a better understanding of the influence of non-stationarities in both copula functions and marginal distributions on extreme flood estimation, and could be beneficial to cost-benefit based non-stationary bivariate design flood estimation across the world.
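For the bivariate building block, the joint "AND" exceedance probability under a Gumbel-Hougaard copula follows from the survival-copula identity; in a non-stationary setting the dependence parameter theta and the marginal quantiles would themselves vary in time (theta = 2.0 below is an arbitrary illustration):

```python
import numpy as np

def gumbel_copula_cdf(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence."""
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

def joint_and_exceedance(u, v, theta):
    """P(U > u, V > v) for, e.g., flood peak and volume with marginal
    non-exceedance probabilities u and v: 1 - u - v + C(u, v)."""
    return 1.0 - u - v + gumbel_copula_cdf(u, v, theta)

# e.g. both peak and volume exceed their marginal 100-year levels:
p_and = joint_and_exceedance(0.99, 0.99, theta=2.0)
print(p_and, 1.0 / p_and)   # joint probability and "AND" return period
```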
NASA Astrophysics Data System (ADS)
Qi, Wei
2017-11-01
Cost-benefit analysis is commonly used for engineering planning and design problems in practice. However, previous cost-benefit based design flood estimation rests on a stationarity assumption. This study develops a non-stationary cost-benefit based design flood estimation approach. The approach integrates a non-stationary probability distribution function into cost-benefit analysis, so that the influence of non-stationarity on the expected total cost (including flood damage and construction costs) and on design flood estimation can be quantified. To facilitate design flood selection, a 'Risk-Cost' analysis approach is developed, which reveals the nexus of extreme flood risk, expected total cost and design life periods. Two basins, with 54-year and 104-year flood records respectively, are utilized to illustrate the application. It is found that the developed approach can effectively reveal changes of expected total cost and extreme floods in different design life periods. In addition, trade-offs are found between extreme flood risk and expected total cost, which reflect increases in cost to mitigate risk. Compared with stationary approaches, which generate only one expected total cost curve and therefore a single design flood estimate, the proposed approach generates design flood estimation intervals, and the 'Risk-Cost' approach selects a design flood value from these intervals based on the trade-offs between extreme flood risk and expected total cost. This study provides a new approach towards a better understanding of the influence of non-stationarity on expected total cost and design floods, and could be beneficial to cost-benefit based non-stationary design flood estimation across the world.
Composite marginal quantile regression analysis for longitudinal adolescent body mass index data.
Yang, Chi-Chuan; Chen, Yi-Hau; Chang, Hsing-Yi
2017-09-20
Childhood and adolescent overweight or obesity, which may be quantified through the body mass index (BMI), is strongly associated with adult obesity and other health problems. Motivated by the Child and Adolescent Behaviors in Long-term Evolution (CABLE) study, we are interested in individual, family, and school factors associated with marginal quantiles of longitudinal adolescent BMI values. We propose a new method for composite marginal quantile regression analysis for longitudinal outcome data, which performs marginal quantile regressions at multiple quantile levels simultaneously. The proposed method extends the quantile regression coefficient modeling method introduced by Frumento and Bottai (Biometrics 2016; 72:74-84) to longitudinal data, suitably accounting for the correlation structure in longitudinal observations. A goodness-of-fit test for the proposed modeling is also developed. Simulation results show that the proposed method can be much more efficient than analysis that does not take correlation into account and than analysis performing separate quantile regressions at different quantile levels. The application to the longitudinal adolescent BMI data from the CABLE study demonstrates the practical utility of our proposal. Copyright © 2017 John Wiley & Sons, Ltd.
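The separate-fits baseline that the composite method improves on is easy to state: an independent quantile regression at each level tau, with no sharing of information across levels or within subjects. A sketch on synthetic data (the ages, coefficients, and statsmodels-based workflow are illustrative, not from the CABLE analysis):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"age": rng.uniform(11, 18, 500)})
# Synthetic BMI with age-dependent spread, so quantile slopes differ by tau.
df["bmi"] = 16 + 0.4 * df["age"] + rng.standard_normal(500) * (1 + 0.1 * df["age"])

# One independent fit per quantile level -- the baseline that a composite
# (simultaneous) fit improves on by borrowing strength across levels.
fits = {tau: smf.quantreg("bmi ~ age", df).fit(q=tau)
        for tau in (0.1, 0.25, 0.5, 0.75, 0.9)}
for tau, fit in fits.items():
    print(f"tau={tau}: age effect = {fit.params['age']:.3f}")
```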
Smith, Erik A.; Sanocki, Chris A.; Lorenz, David L.; Jacobsen, Katrin E.
2017-12-27
Streamflow distribution maps for the Cannon River and St. Louis River drainage basins were developed by the U.S. Geological Survey, in cooperation with the Legislative-Citizen Commission on Minnesota Resources, to illustrate relative and cumulative streamflow distributions. The Cannon River was selected to provide baseline data to assess the effects of potential surficial sand mining, and the St. Louis River was selected to determine the effects of ongoing Mesabi Iron Range mining. Each drainage basin (Cannon, St. Louis) was subdivided into nested drainage basins: the Cannon River was subdivided into 152 nested drainage basins, and the St. Louis River was subdivided into 353 nested drainage basins. For each smaller drainage basin, the estimated volumes of groundwater discharge (as base flow) and surface runoff flowing into all surface-water features were displayed under the following conditions: (1) extreme low-flow conditions, comparable to an exceedance-probability quantile of 0.95; (2) low-flow conditions, comparable to an exceedance-probability quantile of 0.90; (3) a median condition, comparable to an exceedance-probability quantile of 0.50; and (4) a high-flow condition, comparable to an exceedance-probability quantile of 0.02. Streamflow distribution maps were developed using flow-duration curve exceedance-probability quantiles in conjunction with Soil-Water-Balance model outputs; both the flow-duration curve and Soil-Water-Balance models were built upon previously published U.S. Geological Survey reports. The selected streamflow distribution maps provide a proactive water management tool for State cooperators by illustrating flow rates during a range of hydraulic conditions. Furthermore, after the nested drainage basins are highlighted in terms of surface-water flows, the streamflows can be evaluated in the context of meeting specific ecological flows under different flow regimes and potentially assist with decisions regarding groundwater and surface-water appropriations. Presented streamflow distribution maps are foundational work intended to support the development of additional streamflow distribution maps that include statistical constraints on the selected flow conditions.
NASA Astrophysics Data System (ADS)
Chapman, Sandra; Stainforth, David; Watkins, Nick
2014-05-01
Estimates of how our climate is changing are needed locally in order to inform adaptation planning decisions. This requires quantifying the geographical patterns in changes at specific quantiles in distributions of variables such as daily temperature or precipitation. Here we focus on these local changes and on a method to transform daily observations of precipitation into patterns of local climate change. We develop a method[1] for analysing local climatic timeseries to assess which quantiles of the local climatic distribution show the greatest and most robust changes, to specifically address the challenges presented by daily precipitation data. We extract from the data quantities that characterize the changes in time of the likelihood of daily precipitation above a threshold and of the relative amount of precipitation in those days. Our method is a simple mathematical deconstruction of how the difference between two observations from two different time periods can be assigned to the combination of natural statistical variability and/or the consequences of secular climate change. This deconstruction facilitates an assessment of how fast different quantiles of precipitation distributions are changing. This involves determining both which quantiles and geographical locations show the greatest change and also those at which any change is highly uncertain. We demonstrate this approach using E-OBS gridded data[2] timeseries of local daily precipitation from specific locations across Europe over the last 60 years. We treat geographical location and precipitation as independent variables and thus obtain as outputs the pattern of change at a given threshold of precipitation and with geographical location. This is model-independent, thus providing data of direct value in model calibration and assessment. Our results show regionally consistent patterns of systematic increase in precipitation on the wettest days, and of drying across all days, which is of potential value in adaptation planning. [1] S C Chapman, D A Stainforth, N W Watkins, 2013, On Estimating Local Long Term Climate Trends, Phil. Trans. R. Soc. A, 371 20120287; D. A. Stainforth, S. C. Chapman, N. W. Watkins, 2013, Mapping climate change in European temperature distributions, Environ. Res. Lett. 8, 034031 [2] Haylock, M.R., N. Hofstra, A.M.G. Klein Tank, E.J. Klok, P.D. Jones and M. New. 2008: A European daily high-resolution gridded dataset of surface temperature and precipitation. J. Geophys. Res (Atmospheres), 113, D20119
NASA Astrophysics Data System (ADS)
Sardet, Laure; Patilea, Valentin
When pricing a specific insurance premium, the actuary needs to evaluate the claims cost distribution for the warranty. Traditional actuarial methods use parametric specifications to model the claims distribution, like lognormal, Weibull and Pareto laws. Mixtures of such distributions make it possible to improve the flexibility of the parametric approach and seem well adapted to capture the skewness, the long tails as well as the unobserved heterogeneity among the claims. In this paper, instead of looking for a finely tuned mixture with many components, we choose a parsimonious mixture modeling, typically a two- or three-component mixture. Next, we use the mixture cumulative distribution function (CDF) to transform the data into the unit interval, where we apply a beta-kernel smoothing procedure. A bandwidth rule adapted to our methodology is proposed. Finally, the beta-kernel density estimate is back-transformed to recover an estimate of the original claims density. The beta-kernel smoothing provides an automatic fine-tuning of the parsimonious mixture and thus avoids inference in more complex mixture models with many parameters. We investigate the empirical performance of the new method in the estimation of quantiles with simulated nonnegative data and of the quantiles of the individual claims distribution in a non-life insurance application.
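The transform-smooth-back-transform pipeline can be sketched compactly. Everything below (the mixture weights and parameters, the Chen-style beta kernel, the fixed bandwidth) is an illustrative assumption, not the paper's fitted model or its bandwidth rule:

```python
import numpy as np
from scipy.stats import beta, lognorm

# Hypothetical two-component lognormal mixture, assumed fitted beforehand.
w = np.array([0.7, 0.3])            # mixture weights
s = np.array([0.6, 1.2])            # lognormal shape parameters
sc = np.array([1_000.0, 8_000.0])   # lognormal scales

def mix_cdf(x):
    return sum(wi * lognorm.cdf(x, si, scale=sci) for wi, si, sci in zip(w, s, sc))

def mix_pdf(x):
    return sum(wi * lognorm.pdf(x, si, scale=sci) for wi, si, sci in zip(w, s, sc))

def beta_kernel_pdf(u, data_u, b=0.05):
    """Chen-type beta-kernel density estimate on the unit interval."""
    return beta.pdf(data_u, u / b + 1.0, (1.0 - u) / b + 1.0).mean()

def claims_density(x, claims):
    """Back-transformed claims density: g(F(x)) * f(x)."""
    u = mix_cdf(np.asarray(claims))        # claims mapped to (0, 1)
    return beta_kernel_pdf(mix_cdf(x), u) * mix_pdf(x)
```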
A Study of Alternative Quantile Estimation Methods in Newsboy-Type Problems
1980-03-01
[Abstract recovered only in fragments.] The newsboy cost equation may be formulated as a two-piece continuous linear function of the stock level S that the decision maker selects to have on hand. With a limited number of observations, some approximations are possible: three points near each other can be assumed to lie on a line, suggesting estimators that interpolate among neighboring order statistics. Define the value r as r = [nq + 0.5], where n is the sample size, q the target quantile, and [x] denotes the largest integer not exceeding x; the quantile estimate is then formed as a linear combination of order statistics around rank r.
Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.
2016-02-16
Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
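A schematic of the two ingredients, quantile-based rescaling of model output toward the observations and the risk ratio itself; the empirical quantile mapping below is a simple stand-in for the paper's adjustment, and the threshold convention is assumed:

```python
import numpy as np

def quantile_rescale(model, obs):
    """Map each model value onto the observational distribution by
    matching empirical quantiles (a stand-in for the quantile-based
    bias adjustment described in the abstract)."""
    ranks = np.searchsorted(np.sort(model), model, side="right") / model.size
    return np.quantile(obs, np.clip(ranks, 0.0, 1.0))

def risk_ratio(actual, counterfactual, threshold):
    """RR = P(exceed | all forcings) / P(exceed | natural only)."""
    p1 = np.mean(actual >= threshold)
    p0 = np.mean(counterfactual >= threshold)
    return np.inf if p0 == 0 else p1 / p0   # infinite RR motivates the
                                            # one-sided lower bound
```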
Examining the Reliability of Student Growth Percentiles Using Multidimensional IRT
ERIC Educational Resources Information Center
Monroe, Scott; Cai, Li
2015-01-01
Student growth percentiles (SGPs, Betebenner, 2009) are used to locate a student's current score in a conditional distribution based on the student's past scores. Currently, following Betebenner (2009), quantile regression (QR) is most often used operationally to estimate the SGPs. Alternatively, multidimensional item response theory (MIRT) may…
Technique for estimating depth of floods in Tennessee
Gamble, C.R.
1983-01-01
Estimates of flood depths are needed for design of roadways across flood plains and for other types of construction along streams. Equations for estimating flood depths in Tennessee were derived using data for 150 gaging stations. The equations are based on drainage basin size and can be used to estimate depths of the 10-year and 100-year floods for four hydrologic areas. A method also was developed for estimating depth of floods having recurrence intervals between 10 and 100 years. Standard errors range from 22 to 30 percent for the 10-year depth equations and from 23 to 30 percent for the 100-year depth equations. (USGS)
NASA Astrophysics Data System (ADS)
Yang, Z.; Burn, D. H.
2017-12-01
Extreme rainfall events can have devastating impacts on society. To quantify the associated risk, the IDF curve has been used to provide the essential rainfall-related information for urban planning. However, recent changes in rainfall climatology caused by climate change and urbanization have made the estimates provided by the traditional regional IDF approach increasingly inaccurate. This inaccuracy is mainly caused by two problems: 1) the ineffective choice of similarity indicators for the formation of a homogeneous group in different regions; and 2) an inadequate number of stations in the pooling group that does not reflect the optimal balance between group size and group homogeneity or achieve the lowest uncertainty in the rainfall quantile estimates. For the first issue, to consider the temporal differences among meteorological and topographic indicators, a three-layer design is proposed based on three stages in extreme rainfall formation: cloud formation, rainfall generation, and change of rainfall intensity above the urban surface. During the process, the impacts of climate change and urbanization are considered through the inclusion of potentially relevant features at each layer. Then, to consider the spatial differences of similarity indicators for homogeneous group formation in various regions, an automatic feature selection and weighting algorithm, specifically a hybrid searching algorithm combining Tabu search, Lagrange multipliers and Fuzzy C-means clustering, is used to select the optimal combination of features for forming potential optimal homogeneous groups at a specific region. For the second issue, to compare the uncertainty of rainfall quantile estimates among potential groups, a two-sample Kolmogorov-Smirnov-test-based sample ranking process is used. During the process, linear programming is used to rank these groups based on the confidence intervals of the quantile estimates. The proposed methodology fills the gap of including urbanization impacts during pooling group formation, and challenges the traditional assumption that the same set of similarity indicators can be equally effective in generating the optimal homogeneous group for regions with different geographic and meteorological characteristics.
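The group-comparison step can be illustrated with scipy's two-sample Kolmogorov-Smirnov test; ranking candidate groups by the KS statistic below is a schematic stand-in for the paper's confidence-interval-based linear-programming ranking:

```python
from scipy.stats import ks_2samp

def group_distance(site_sample, pooled_sample):
    """Two-sample KS statistic D between the target site's (standardized)
    rainfall maxima and a candidate pooling group's; smaller D suggests
    a more homogeneous group."""
    return ks_2samp(site_sample, pooled_sample).statistic

# Schematic ranking of candidate pooling groups for one target site:
# best_group = min(candidate_groups, key=lambda g: group_distance(site, g))
```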
Wildfire Selectivity for Land Cover Type: Does Size Matter?
Barros, Ana M. G.; Pereira, José M. C.
2014-01-01
Previous research has shown that fires burn certain land cover types disproportionately to their abundance. We used quantile regression to study land cover proneness to fire as a function of fire size, under the hypothesis that they are inversely related, for all land cover types. Using five years of fire perimeters, we estimated conditional quantile functions for lower (avoidance) and upper (preference) quantiles of fire selectivity for five land cover types - annual crops, evergreen oak woodlands, eucalypt forests, pine forests and shrublands. The slope of significant regression quantiles describes the rate of change in fire selectivity (avoidance or preference) as a function of fire size. We used Monte-Carlo methods to randomly permutate fires in order to obtain a distribution of fire selectivity due to chance. This distribution was used to test the null hypotheses that 1) mean fire selectivity does not differ from that obtained by randomly relocating observed fire perimeters; 2) that land cover proneness to fire does not vary with fire size. Our results show that land cover proneness to fire is higher for shrublands and pine forests than for annual crops and evergreen oak woodlands. As fire size increases, selectivity decreases for all land cover types tested. Moreover, the rate of change in selectivity with fire size is higher for preference than for avoidance. Comparison between observed and randomized data led us to reject both null hypotheses tested (α = 0.05) and to conclude it is very unlikely the observed values of fire selectivity and change in selectivity with fire size are due to chance. PMID:24454747
Boudaghpour, Siamak; Bagheri, Majid; Bagheri, Zahra
2014-01-01
High flood occurrences with large environmental damages have a growing trend in Iran. Dynamic movements of water during a flood cause different environmental damages in geographical areas with different characteristics, such as topographic conditions. In general, environmental effects and damages caused by a flood in an area can be investigated from different points of view. The current essay aims to detect the environmental effects of flood occurrences in the Halilrood catchment area of Kerman province in Iran using flood zone mapping techniques. The intended flood zone map was produced in four steps: steps 1 to 3 calculate and estimate the flood zone map of the study area, while step 4 estimates the environmental effects of flood occurrence. Based on our studies, a wide range of accuracies for estimating the environmental effects of flood occurrence was obtained using flood zone mapping techniques. Moreover, it was identified that the Jiroft dam in the study area can decrease the flood zone from 260 hectares to 225 hectares and can also decrease flood peak intensity by 20%. As a result, 14% of the flood zone in the study area can be protected environmentally.
Uncertainty in flood damage estimates and its potential effect on investment decisions
NASA Astrophysics Data System (ADS)
Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans
2015-04-01
This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. This uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
Liu, Sze Yan; Kawachi, Ichiro; Glymour, M Maria
2012-09-01
Concerns have been raised that education may have greater benefits for persons at high risk of coronary heart disease (CHD) than for those at low risk. We estimated the association of education (less than high school, high school, or college graduates) with 10-year CHD risk and body mass index (BMI), using linear and quantile regression models, in the following two nationally representative datasets: the 2006 wave of the Health and Retirement Survey and the 2003-2008 National Health and Nutrition Examination Survey (NHANES). Higher educational attainment was associated with lower 10-year CHD risk for all groups. However, the magnitude of this association varied considerably across quantiles for some subgroups. For example, among women in NHANES, a high school degree was associated with 4% (95% confidence interval = -9% to 1%) and 17% (-24% to -8%) lower CHD risk in the 10th and 90th percentiles, respectively. For BMI, a college degree was associated with uniform decreases across the distribution for women, but with varying increases for men. Compared with those who had not completed high school, male college graduates in the NHANES sample had a BMI that was 6% greater (2% to 11%) at the 10th percentile of the BMI distribution and 7% lower (-10% to -3%) at the 90th percentile (ie, overweight/obese). Estimates from the Health and Retirement Survey sample and the marginal quantile regression models showed similar patterns. Conventional regression methods may mask important variations in the associations between education and CHD risk.
Kennedy, Jeffrey R.; Paretti, Nicholas V.
2014-01-01
Flooding in urban areas routinely causes severe damage to property and often results in loss of life. To investigate the effect of urbanization on the magnitude and frequency of flood peaks, a flood frequency analysis was carried out using data from urbanized streamgaging stations in Phoenix and Tucson, Arizona. Flood peaks at each station were predicted using the log-Pearson Type III distribution, fitted using the expected moments algorithm and the multiple Grubbs-Beck low outlier test. The station estimates were then compared to flood peaks estimated by rural-regression equations for Arizona, and to flood peaks adjusted for urbanization using a previously developed procedure for adjusting U.S. Geological Survey rural regression peak discharges in an urban setting. Only smaller, more common flood peaks at the 50-, 20-, 10-, and 4-percent annual exceedance probabilities (AEPs) demonstrate any increase in magnitude as a result of urbanization; the 1-, 0.5-, and 0.2-percent AEP flood estimates are predicted without bias by the rural-regression equations. Percent imperviousness was determined not to account for the difference in estimated flood peaks between stations, either when adjusting the rural-regression equations or when deriving urban-regression equations to predict flood peaks directly from basin characteristics. Comparison with urban adjustment equations indicates that flood peaks are systematically overestimated if the rural-regression-estimated flood peaks are adjusted upward to account for urbanization. At nearly every streamgaging station in the analysis, adjusted rural-regression estimates were greater than the estimates derived using station data. One likely reason for the lack of increase in flood peaks with urbanization is the presence of significant stormwater retention and detention structures within the watershed used in the study.
Time Series Model Identification by Estimating Information, Memory, and Quantiles.
1983-07-01
NASA Astrophysics Data System (ADS)
Chapman, S. C.; Stainforth, D. A.; Watkins, N. W.
2014-12-01
Estimates of how our climate is changing are needed locally in order to inform adaptation planning decisions. This requires quantifying the geographical patterns in changes at specific quantiles or thresholds in distributions of variables such as daily temperature or precipitation. We develop a method[1] for analysing local climatic timeseries to assess which quantiles of the local climatic distribution show the greatest and most robust changes, to specifically address the challenges presented by 'heavy-tailed' variables such as daily precipitation. We extract from the data quantities that characterize the changes in time of the likelihood of daily precipitation above a threshold and of the relative amount of precipitation in those extreme precipitation days. Our method is a simple mathematical deconstruction of how the difference between two observations from two different time periods can be assigned to the combination of natural statistical variability and/or the consequences of secular climate change. This deconstruction facilitates an assessment of how fast different quantiles of precipitation distributions are changing. This involves determining both which quantiles and geographical locations show the greatest change and also those at which any change is highly uncertain. We demonstrate this approach using E-OBS gridded data[2] timeseries of local daily precipitation from specific locations across Europe over the last 60 years. We treat geographical location and precipitation as independent variables and thus obtain as outputs the pattern of change at a given threshold of precipitation and with geographical location. This is model-independent, thus providing data of direct value in model calibration and assessment. Our results identify regionally consistent patterns which, depending on location, show systematic increase in precipitation on the wettest days, shifts in precipitation patterns to less moderate days and more heavy days, and drying across all days, which is of potential value in adaptation planning. [1] S C Chapman, D A Stainforth, N W Watkins, 2013 Phil. Trans. R. Soc. A, 371 20120287; D. A. Stainforth, S. C. Chapman, N. W. Watkins, 2013 Environ. Res. Lett. 8, 034031 [2] Haylock et al. 2008 J. Geophys. Res (Atmospheres), 113, D20119
Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis
2009-01-01
For more than 50 years, the U.S. Geological Survey (USGS) has been developing regional regression equations that can be used to estimate flood magnitude and frequency at ungaged sites. Flood magnitude relates to the volume of flow that occurs over some period of time and usually is presented in cubic feet per second. Flood frequency relates to the probability of occurrence of a flood; that is, on average, what is the likelihood that a flood with a specified magnitude will occur in any given year (1 percent chance, 10 percent chance, 50 percent chance, and so on). Such flood estimates are needed for the efficient design of bridges, highway embankments, levees, and other structures near streams. In addition, these estimates are needed for the effective planning and management of land and water resources, to protect lives and property in flood-prone areas, and to determine flood-insurance rates.
NASA Astrophysics Data System (ADS)
Braud, Isabelle; Breil, Pascal; Javelle, Pierre; Pejakovic, Nikola; Guérin, Stéphane
2017-04-01
The Yzeron periurban catchment (150 km2) is prone to flash floods leading to overflow in the downstream part of the catchment. A prevention and management plan has been approved, and the set-up of a flood forecasting system is planned. This study presents a comparison of several solutions for flood forecasting in the catchment. It is based on an extensive data collection (rain gauges, radar/rain gauge reanalyses, discharge and water level data) from this experimental catchment. A set of rainfall-runoff events leading to floods (problematic and non-problematic floods) was extracted and formed the basis for the definition of a first forecasting method, based on data analysis and the identification of explanatory factors among the following: rainfall amount, intensity, antecedent rainfall, and initial discharge. Several statistical methods, including Factorial Analysis of Mixed Data and Classification and Regression Trees, were used for this purpose. They showed that several classes of problematic floods can be identified. The first one is related to wet conditions characterized by high initial discharge and antecedent rainfall. The second class is driven by rainfall amount, initial discharge and rainfall intensity. Thresholds of these variables can be identified to provide a first warning. The second forecasting method assessed in the study is the system that will be operational in France in 2017, based on the AIGA method (Javelle et al., 2016). For this purpose, an 18-year discharge simulation using the hydrological model of the AIGA method, forced with the radar/rain gauge reanalysis, was available at 44 locations within the catchment. The dates on which quantiles of a given return period were exceeded were identified and compared with the list of problematic events. The AIGA method was found relevant for identifying the most problematic events, but the lead time needs further investigation in order to assess its usefulness for population warning. References: Pierre Javelle, Didier Organde, Julie Demargne, Clotilde Saint-Martin, Céline de Saint-Aubin, Léa Garandeau and Bruno Janet (2016). Setting up a French national flash flood warning system for ungauged catchments based on the AIGA method. E3S Web of Conferences 7, 18010 (2016), 3rd European Conference on Flood Risk Management (FLOODrisk 2016), http://dx.doi.org/10.1051/e3sconf/20160718010
ERIC Educational Resources Information Center
Bitler, Marianne; Domina, Thurston; Penner, Emily; Hoynes, Hilary
2015-01-01
We use quantile treatment effects estimation to examine the consequences of the random-assignment New York City School Choice Scholarship Program across the distribution of student achievement. Our analyses suggest that the program had negligible and statistically insignificant effects across the skill distribution. In addition to contributing to…
Public health impacts of ecosystem change in the Brazilian Amazon
Bauch, Simone C.; Birkenbach, Anna M.; Pattanayak, Subhrendu K.; Sills, Erin O.
2015-01-01
The claim that nature delivers health benefits rests on a thin empirical evidence base. Even less evidence exists on how specific conservation policies affect multiple health outcomes. We address these gaps in knowledge by combining municipal-level panel data on diseases, public health services, climatic factors, demographics, conservation policies, and other drivers of land-use change in the Brazilian Amazon. To fully exploit this dataset, we estimate random-effects and quantile regression models of disease incidence. We find that malaria, acute respiratory infection (ARI), and diarrhea incidence are significantly and negatively correlated with the area under strict environmental protection. Results vary by disease for other types of protected areas (PAs), roads, and mining. The relationships between diseases and land-use change drivers also vary by quantile of the disease distribution. Conservation scenarios based on estimated regression results suggest that malaria, ARI, and diarrhea incidence would be reduced by expanding strict PAs, and malaria could be further reduced by restricting roads and mining. Although these relationships are complex, we conclude that interventions to preserve natural capital can deliver cobenefits by also increasing human (health) capital. PMID:26082548
Future extreme water levels and floodplains in Gironde Estuary considering climate change
NASA Astrophysics Data System (ADS)
Laborie, V.; Hissel, F.; Sergent, P.
2012-04-01
Within the THESEUS European project, an overflowing model of the Gironde Estuary has been used to evaluate future surge levels at Le Verdon and future water levels at six specific sites of the estuary: Le Verdon, Richard, Laména, Pauillac, Le Marquis and Bordeaux. It was then used to study the evolution of the location and area of floodplains towards 2100 in the entire estuary. In this study, no breaching and no modification of the elevation of the dikes were considered. The model was fed by several data sources: wind fields at Royan and Mérignac interpolated from the grid of the European climatological model CLM/SGA, a tide signal at Le Verdon, and the discharges of the Garonne (at La Réole), the Dordogne (at Pessac) and the Isle (at Libourne). A simplified mathematical model of surge levels was adjusted at Le Verdon with 10 surge storms, using wind and pressure fields given by CLM/SGA. The adjustment was performed so that the statistical analysis of the global signal at Le Verdon gives the same quantiles as the same analysis applied to tide-gauge observations for the period [1960; 2000]. The assumption used for sea level rise was the pessimistic one of the French national institute for climate change: 60 cm by 2100. The model was then used to study the evolution of extreme water levels towards 2100. The analysis of surge levels at Le Verdon shows a decrease in quantiles which is coherent with the analysis of the climatological fields. The analysis of water levels shows that the increase in mean water level quantiles represents only a part of sea level rise in the Gironde Estuary; moreover, this effect seems to decrease from the maritime limit of the model towards upstream. Concerning floodplains, those corresponding to return periods from 2 to 100 years for present conditions and for the three periods [2010; 2039], [2040; 2069] and [2070; 2099] have been mapped for three areas of the Gironde Estuary: around Le Verdon, at the confluence between the Garonne and the Dordogne, and near Bordeaux. Taking into account IPCC scenario A1B, under the same assumptions, it appears that the impact of climate change on the quantiles of water levels in floodplains depends on the sea level rise over the period considered ([2010; 2039], [2040; 2069], [2070; 2099]) and that areas which are not flooded today for short return periods become submerged towards 2100. The neighborhood of Le Verdon undergoes a negative impact only in the medium and long term; for the period [2010; 2039], a small reduction of floodplains can be observed in the quantiles of water levels for all return periods. Under these assumptions, in the area of Bordeaux, significant effects would be felt along the road RN230 towards 2100. The effects of the discharges and of dike breaching will have to be studied in order to refine these results.
Current and future pluvial flood hazard analysis for the city of Antwerp
NASA Astrophysics Data System (ADS)
Willems, Patrick; Tabari, Hossein; De Niel, Jan; Van Uytven, Els; Lambrechts, Griet; Wellens, Geert
2016-04-01
For the city of Antwerp in Belgium, higher rainfall extremes were observed in comparison with surrounding areas. The differences were found statistically significant for some areas and may be the result of the heat island effect in combination with higher concentrations of aerosols. A network of 19 rain gauges with varying record lengths (the longest since the 1960s) and 10 years of continuous radar data were combined to map the spatial variability of rainfall extremes over the city at durations from 15 minutes to 1 day, together with the uncertainty. The improved spatial rainfall information was used as input to the sewer system model of the city to analyze the frequency of urban pluvial floods. Comparison with historical flood observations from various sources (fire brigade and media) confirmed that the improved spatial rainfall information also improved sewer impact results for both the magnitude and the frequency of sewer floods. In addition to these improved urban flood impact results for recent and current climatological conditions, the new insights on the local rainfall microclimate were also helpful for enhancing future projections of rainfall extremes and pluvial floods in the city. This was done by improved statistical downscaling of all available CMIP5 global climate model runs (160 runs) for the 4 RCP scenarios, as well as the available EURO-CORDEX regional climate model runs. Two types of statistical downscaling methods were applied for that purpose (a weather-typing based method and a quantile perturbation approach), making use of the microclimate results and their dependency on specific weather types. Changes in extreme rainfall intensities were analyzed and mapped as a function of the RCP scenario, together with the uncertainty, decomposed into the uncertainties related to the climate models, the climate model initialization or limited length of the 30-year time series (natural climate variability), and the statistical downscaling (albeit limited to two types of methods). These were finally transferred into future pluvial flash flood hazard maps for the city, together with the uncertainties, and are considered a basis for spatial planning and adaptation.
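The quantile perturbation idea can be sketched in a few lines: change factors estimated per quantile from climate-model control and scenario runs are applied to the observed series, so wet-extreme quantiles can be perturbed more strongly than the bulk. The factors and synthetic series below are invented for illustration:

```python
import numpy as np

def quantile_perturbation(obs_series, change_factors, probs):
    """Perturb an observed rainfall series by quantile-dependent change
    factors (schematic version: factors given at probabilities `probs`
    are linearly interpolated across the empirical ranks)."""
    ranks = (np.argsort(np.argsort(obs_series)) + 0.5) / obs_series.size
    factors = np.interp(ranks, probs, change_factors)
    return obs_series * factors

rng = np.random.default_rng(3)
rain = rng.gamma(0.8, 8.0, size=10_000)             # synthetic daily rainfall
future = quantile_perturbation(rain, [1.00, 1.05, 1.25], [0.5, 0.9, 0.999])
print(np.quantile(rain, 0.999), np.quantile(future, 0.999))
```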
Modeling energy expenditure in children and adolescents using quantile regression
USDA-ARS?s Scientific Manuscript database
Advanced mathematical models have the potential to capture the complex metabolic and physiological processes that result in energy expenditure (EE). Study objective is to apply quantile regression (QR) to predict EE and determine quantile-dependent variation in covariate effects in nonobese and obes...
BORAH, BIJAN J.; BASU, ANIRBAN
2014-01-01
The quantile regression (QR) framework provides a pragmatic approach to understanding the differential impacts of covariates along the distribution of an outcome. However, the QR framework that has pervaded the applied economics literature is based on the conditional quantile regression method. It is used to assess the impact of a covariate on a quantile of the outcome conditional on specific values of other covariates. In most cases, conditional quantile regression may generate results that are often not generalizable or interpretable in a policy or population context. In contrast, the unconditional quantile regression method provides more interpretable results as it marginalizes the effect over the distributions of other covariates in the model. In this paper, the differences between these two regression frameworks are highlighted, both conceptually and econometrically. Additionally, using real-world claims data from a large US health insurer, alternative QR frameworks are implemented to assess the differential impacts of covariates along the distribution of medication adherence among elderly patients with Alzheimer's disease. PMID:23616446
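The contrast between the two frameworks can be sketched in a few lines. The illustration below assumes the recentered influence function (RIF) formulation of unconditional quantile regression (Firpo et al., 2009) and uses invented covariates and outcome, not the authors' claims data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({"age": rng.uniform(65, 90, n),
                   "comorbid": rng.integers(0, 2, n)})
# synthetic adherence outcome with covariate effects that vary across the tail
df["adherence"] = (40 + 0.3 * df["age"] - 5 * df["comorbid"]
                   + rng.normal(0, 8, n) * (1 + 0.5 * df["comorbid"]))

tau = 0.75

# conditional QR: covariate effect at the 75th percentile of y given x
cqr = smf.quantreg("adherence ~ age + comorbid", df).fit(q=tau)

# unconditional QR via the RIF: RIF = q + (tau - 1{y <= q}) / f(q),
# then an OLS regression of the RIF on the covariates
q = df["adherence"].quantile(tau)
fq = gaussian_kde(df["adherence"])(q)[0]      # density at the quantile
df["rif"] = q + (tau - (df["adherence"] <= q)) / fq
uqr = smf.ols("rif ~ age + comorbid", df).fit()

print(cqr.params, uqr.params, sep="\n")
```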
Estimated flood-inundation maps for Cowskin Creek in western Wichita, Kansas
Studley, Seth E.
2003-01-01
The October 31, 1998, flood on Cowskin Creek in western Wichita, Kansas, caused millions of dollars in damages. Emergency management personnel and flood mitigation teams had difficulty in efficiently identifying areas affected by the flooding, and no warning was given to residents because flood-inundation information was not available. To provide detailed information about future flooding on Cowskin Creek, high-resolution estimated flood-inundation maps were developed using geographic information system technology and advanced hydraulic analysis. Two-foot-interval land-surface elevation data from a 1996 flood insurance study were used to create a three-dimensional topographic representation of the study area for hydraulic analysis. The data computed from the hydraulic analyses were converted into geographic information system format with software from the U.S. Army Corps of Engineers' Hydrologic Engineering Center. The results were overlaid on the three-dimensional topographic representation of the study area to produce maps of estimated flood-inundation areas and estimated depths of water in the inundated areas for 1-foot increments on the basis of stream stage at an index streamflow-gaging station. A Web site (http://ks.water.usgs.gov/Kansas/cowskin.floodwatch) was developed to provide the public with information pertaining to flooding in the study area. The Web site shows graphs of the real-time streamflow data for U.S. Geological Survey gaging stations in the area and monitors the National Weather Service Arkansas-Red Basin River Forecast Center for Cowskin Creek flood-forecast information. When a flood is forecast for the Cowskin Creek Basin, an estimated flood-inundation map is displayed for the stream stage closest to the National Weather Service's forecasted peak stage. Users of the Web site are able to view the estimated flood-inundation maps for selected stages at any time and to access information about this report and about flooding in general. Flood recovery teams also have the ability to view the estimated flood-inundation map pertaining to the most recent flood. The availability of these maps and the ability to monitor the real-time stream stage through the U.S. Geological Survey Web site provide emergency management personnel and residents with information that is critical for evacuation and rescue efforts in the event of a flood as well as for post-flood recovery efforts.
NASA Astrophysics Data System (ADS)
Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren
2017-11-01
Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. The direct variance method was adopted to analyze the manner in which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the most predominant uncertainty source in the projections of extreme high flow, and it contributes a considerable percentage of the uncertainty in monthly streamflow projections in July-September; its effects in other cases are negligible. HM is a non-negligible uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than it does in extreme low flow estimates. Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and a noticeable increase in drought risk in the Xijiang River basin would be expected. Thus, the necessity of employing effective water-saving techniques and adaptive water resources management strategies for drought disaster mitigation should be addressed.
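A variance decomposition of this kind can be illustrated with a toy ensemble. The sketch below is an assumption-laden, first-order version (main effects only, ignoring interactions, which the full direct variance method also accounts for): projections are arranged on an ES x CM x SD x HM grid, and each source's share is the variance of its marginal means.

```python
import numpy as np

# Hypothetical ensemble: a projected change (%) indexed by (ES, CM, SD, HM).
rng = np.random.default_rng(2)
levels = {"ES": 3, "CM": 4, "SD": 4, "HM": 4}
grid = rng.normal(0, 1, tuple(levels.values()))   # stand-in for real projections

total_var = grid.var()
for axis, name in enumerate(levels):
    # main-effect variance of one source: variance of the means taken
    # after averaging over all of the other sources
    other_axes = tuple(a for a in range(grid.ndim) if a != axis)
    main_effect = grid.mean(axis=other_axes)
    print(f"{name}: {100 * main_effect.var() / total_var:.1f}% of total variance")
```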
Kim, Moon H.; Morlock, Scott E.; Arihood, Leslie D.; Kiesler, James L.
2011-01-01
Near-real-time and forecast flood-inundation mapping products resulted from a pilot study for an 11-mile reach of the White River in Indianapolis. The study was done by the U.S. Geological Survey (USGS), Indiana Silver Jackets hazard mitigation taskforce members, the National Weather Service (NWS), the Polis Center, and Indiana University, in cooperation with the City of Indianapolis, the Indianapolis Museum of Art, the Indiana Department of Homeland Security, and the Indiana Department of Natural Resources, Division of Water. The pilot project showed that it is technically feasible to create a flood-inundation map library by means of a two-dimensional hydraulic model, use a map from the library to quickly complete a moderately detailed local flood-loss estimate, and automatically run the hydraulic model during a flood event to provide the maps and flood-damage information through a Web graphical user interface. A library of static digital flood-inundation maps was created by means of a calibrated two-dimensional hydraulic model. Estimated water-surface elevations were developed for a range of river stages referenced to a USGS streamgage and NWS flood forecast point colocated within the study reach. These maps were made available through the Internet in several formats, including geographic information system, Keyhole Markup Language, and Portable Document Format. A flood-loss estimate was completed for part of the study reach by using one of the flood-inundation maps from the static library. The Federal Emergency Management Agency natural disaster-loss estimation program HAZUS-MH, in conjunction with local building information, was used to complete a level 2 analysis of flood-loss estimation. A Service-Oriented Architecture-based dynamic flood-inundation application was developed and was designed to start automatically during a flood, obtain near real-time and forecast data (from the colocated USGS streamgage and NWS flood forecast point within the study reach), run the two-dimensional hydraulic model, and produce flood-inundation maps. The application used local building data and depth-damage curves to estimate flood losses based on the maps, and it served inundation maps and flood-loss estimates through a Web-based graphical user interface.
The effect of smoking habit changes on body weight: Evidence from the UK.
Pieroni, Luca; Salmasi, Luca
2016-03-01
This paper evaluates the causal relationship between smoking and body weight through two waves (2004-2006) of the British Household Panel Survey. We model the effect of changes in smoking habits, such as quitting or reducing, and account for the heterogeneous responses of individuals located at different points of the body mass distribution by quantile regression. We test our results by means of a large set of control groups and investigate their robustness by using the changes-in-changes estimator and accounting for different thresholds to define smoking reductions. Our results reveal the positive effect of quitting smoking on weight changes, which is also found to increase in the highest quantiles, whereas the decision to reduce smoking does not affect body weight. Copyright © 2015 Elsevier B.V. All rights reserved.
Methods for estimating magnitude and frequency of floods in Montana based on data through 1983
Omang, R.J.; Parrett, Charles; Hull, J.A.
1986-01-01
Equations are presented for estimating flood magnitudes for ungaged sites in Montana based on data through 1983. The State was divided into eight regions based on hydrologic conditions, and separate multiple regression equations were developed for each region. These equations relate annual flood magnitudes and frequencies to basin characteristics and are applicable only to streams with natural flow. In three of the regions, equations also were developed relating flood magnitudes and frequencies to basin characteristics and channel geometry measurements. The standard errors of estimate for an exceedance probability of 1% ranged from 39% to 87%. Techniques are described for estimating annual flood magnitude and flood frequency information at ungaged sites based on data from gaged sites on the same stream. Included are curves relating flood frequency information to drainage area for eight major streams in the State. Maximum known flood magnitudes in Montana are compared with estimated 1%-chance flood magnitudes and with maximum known floods in the United States. Values of flood magnitudes for selected exceedance probabilities and values of significant basin characteristics and channel geometry measurements for all gaging stations used in the analysis are tabulated. Included are 375 stations in Montana and 28 nearby stations in Canada and adjoining States.
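Regional regression equations of this kind are usually power-law relations fitted by ordinary least squares after log transformation. A minimal sketch on synthetic data follows; the predictor set, coefficients, and units are illustrative assumptions, not the report's equations.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 60
area = rng.lognormal(4, 1, n)          # drainage area, sq mi
precip = rng.uniform(10, 40, n)        # mean annual precipitation, in.
# synthetic 1%-chance flood following Q = 120 * A^0.7 * P^0.5, with noise
q100 = 120 * area**0.7 * precip**0.5 * rng.lognormal(0, 0.3, n)

# log-linear least squares: log Q = log a + b log A + c log P
X = np.column_stack([np.ones(n), np.log(area), np.log(precip)])
coef, *_ = np.linalg.lstsq(X, np.log(q100), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"Q100 ~= {a:.1f} * A^{b:.2f} * P^{c:.2f}")
```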
McDonald, S; Ortaglia, A; Supino, C; Kacka, M; Clenin, M; Bottai, M
2017-06-01
This study comprehensively explores racial/ethnic disparities in waist circumference (WC) after adjusting for cardiorespiratory fitness (CRF), among both adult and adolescent women, across WC percentiles. The analysis was conducted using data from the 1999 to 2004 National Health and Nutrition Examination Survey. Female participants (n = 3,977) aged 12-49 years with complete data on CRF, height, weight, and WC were included. Quantile regression models, stratified by age group (12-15, 16-19, and 20-49 years), were used to assess the association between WC and race/ethnicity adjusting for CRF, height, and age across WC percentiles (10th, 25th, 50th, 75th, 90th, and 95th). For non-Hispanic (NH) Black women, in both the 16-19 and 20-49 years age groups, estimated WC was significantly greater than for NH White women across percentiles above the median, with estimates ranging from 5.2 to 11.5 cm. For Mexican American women, in all age groups, estimated WC tended to be significantly greater than for NH White women, particularly for middle percentiles (50th and 75th), with point estimates ranging from 1.9 to 8.4 cm. Significant disparities in WC between NH Black and Mexican American women, as compared to NH White women, remain even after adjustment for CRF. The magnitude of the disparities associated with race/ethnicity differs across WC percentiles and age groups.
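A quantile regression of this form can be reproduced schematically with statsmodels. The sketch below fits the model at several percentiles of synthetic waist-circumference data; the variable names, groups, and effect sizes are invented for illustration, not taken from the survey.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 1500
df = pd.DataFrame({
    "crf": rng.normal(35, 8, n),                 # cardiorespiratory fitness
    "height": rng.normal(163, 7, n),
    "age": rng.uniform(20, 49, n),
    "group": rng.choice(["NHWhite", "NHBlack", "MexAm"], n),
})
# synthetic waist circumference with group differences growing in the upper tail
base = 90 - 0.4 * (df["crf"] - 35) + 0.1 * (df["height"] - 163)
shift = df["group"].map({"NHWhite": 0.0, "NHBlack": 4.0, "MexAm": 2.5})
df["wc"] = base + shift * rng.uniform(0.5, 2.0, n) + rng.normal(0, 10, n)

# fit the same specification at several quantile levels
for tau in (0.10, 0.50, 0.90):
    fit = smf.quantreg("wc ~ crf + height + age + C(group)", df).fit(q=tau)
    print(f"tau={tau}:", fit.params.filter(like="group").round(2).to_dict())
```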
Prenatal Lead Exposure and Fetal Growth: Smaller Infants Have Heightened Susceptibility
Rodosthenous, Rodosthenis S.; Burris, Heather H.; Svensson, Katherine; Amarasiriwardena, Chitra J.; Cantoral, Alejandra; Schnaas, Lourdes; Mercado-García, Adriana; Coull, Brent A.; Wright, Robert O.; Téllez-Rojo, Martha M.; Baccarelli, Andrea A.
2016-01-01
Background: As population lead levels decrease, the toxic effects of lead may be distributed to more sensitive populations, such as infants with poor fetal growth. Objectives: To determine the association of prenatal lead exposure and fetal growth; and to evaluate whether infants with poor fetal growth are more susceptible to lead toxicity than those with normal fetal growth. Methods: We examined the association of second trimester maternal blood lead levels (BLL) with birthweight-for-gestational age (BWGA) z-score in 944 mother-infant participants of the PROGRESS cohort. We determined the association between maternal BLL and BWGA z-score by using both linear and quantile regression. We estimated odds ratios for small-for-gestational age (SGA) infants between maternal BLL quartiles using logistic regression. Maternal age, body mass index, socioeconomic status, parity, household smoking exposure, hemoglobin levels, and infant sex were included as confounders. Results: While linear regression showed a negative association between maternal BLL and BWGA z-score (β=−0.06 z-score units per log2 BLL increase; 95% CI: −0.13, 0.003; P=0.06), quantile regression revealed larger magnitudes of this association in the <30th percentiles of BWGA z-score (β range [−0.08, −0.13] z-score units per log2 BLL increase; all P values <0.05). Mothers in the highest BLL quartile had an odds ratio of 1.62 (95% CI: 0.99–2.65) for having a SGA infant compared to the lowest BLL quartile. Conclusions: While both linear and quantile regression showed a negative association between prenatal lead exposure and birthweight, quantile regression revealed that smaller infants may represent a more susceptible subpopulation. PMID:27923585
Prigent, Amélie; Kamendje-Tchokobou, Blaise; Chevreul, Karine
2017-11-01
Health-related quality of life (HRQoL) is a widely used concept in the assessment of health care. Some generic HRQoL instruments, based on specific algorithms, can generate utility scores which reflect the preferences of the general population for the different health states described by the instrument. This study aimed to investigate the relationships between utility scores and potentially associated factors in patients with mental disorders followed in inpatient and/or outpatient care settings, using two statistical methods. Patients were recruited in four psychiatric sectors in France. Patient responses to the SF-36 generic HRQoL instrument were used to calculate SF-6D utility scores. The relationships between utility scores and patient socio-demographic characteristics, clinical characteristics, and mental health care utilization, considered as potentially associated factors, were studied using OLS and quantile regressions. One hundred seventy-six patients were included. Women, severely ill patients, and those hospitalized full-time tended to report lower utility scores, whereas psychotic disorders (as opposed to mood disorders) and part-time care were associated with higher scores. The quantile regression highlighted that the size of the associations between utility scores and some patient characteristics varied along the utility score distribution, and it provided more accurate estimates than OLS regression. Quantile regression may thus constitute a relevant complement to the analysis of factors associated with utility scores. For policy decision-making, the association of full-time hospitalization with lower utility scores, while part-time care was associated with higher scores, supports the further development of alternatives to full-time hospitalization.
Artes, Paul H; Crabb, David P
2010-01-01
To investigate why the specificity of the Moorfields Regression Analysis (MRA) of the Heidelberg Retina Tomograph (HRT) varies with disc size, and to derive accurate normative limits for neuroretinal rim area to address this problem. Two datasets from healthy subjects (Manchester, UK, n = 88; Halifax, Nova Scotia, Canada, n = 75) were used to investigate the physiological relationship between the optic disc and neuroretinal rim area. Normative limits for rim area were derived by quantile regression (QR) and compared with those of the MRA (derived by linear regression). Logistic regression analyses were performed to quantify the association between disc size and positive classifications with the MRA, as well as with the QR-derived normative limits. In both datasets, the specificity of the MRA depended on optic disc size. The odds of observing a borderline or outside-normal-limits classification increased by approximately 10% for each 0.1 mm² increase in disc area (P < 0.1). The lower specificity of the MRA with large optic discs could be explained by the failure of linear regression to model the extremes of the rim area distribution (observations far from the mean). In comparison, the normative limits predicted by QR were larger for smaller discs (less specific, more sensitive) and smaller for larger discs, such that false-positive rates became independent of optic disc size. Normative limits derived by quantile regression appear to remove the size-dependence of specificity with the MRA. Because quantile regression does not rely on the restrictive assumptions of standard linear regression, it may be a more appropriate method for establishing normative limits in other clinical applications where the underlying distributions are nonnormal or have nonconstant variance.
Updating estimates of low streamflow statistics to account for possible trends
NASA Astrophysics Data System (ADS)
Blum, A. G.; Archfield, S. A.; Hirsch, R. M.; Vogel, R. M.; Kiang, J. E.; Dudley, R. W.
2017-12-01
Given evidence of both increasing and decreasing trends in low flows in many streams, methods are needed to update estimators of low flow statistics used in water resources management. One such metric is the 10-year annual low-flow statistic (7Q10), calculated as the annual minimum seven-day streamflow which is exceeded in nine out of ten years on average. Historical streamflow records may not be representative of current conditions at a site if environmental conditions are changing. We present a new approach to frequency estimation under nonstationary conditions that applies a stationary nonparametric quantile estimator to a subset of the annual minimum flow record. Monte Carlo simulation experiments were used to evaluate this approach across a range of trend and no-trend scenarios. Relative to the standard practice of using the entire available streamflow record, use of a nonparametric quantile estimator combined with selection of the most recent 30 or 50 years of record for 7Q10 estimation was found to improve accuracy and reduce bias. Benefits of data subset selection approaches were greater for higher magnitude trends and for annual minimum flow records with lower coefficients of variation. A nonparametric trend test approach for subset selection did not significantly improve upon always selecting the last 30 years of record. At 174 stream gages in the Chesapeake Bay region, 7Q10 estimators based on the most recent 30 years of flow record were compared to estimators based on the entire period of record. Given the availability of long records of low streamflow, using only a subset of the flow record (about 30 years) can update 7Q10 estimators to better reflect current streamflow conditions.
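The estimator itself is simple to sketch: the 7Q10 is approximated nonparametrically as the 0.10 empirical quantile of annual minimum 7-day flows, optionally restricted to the most recent 30 years. A minimal version on synthetic data follows; the 365-day year and the default quantile-interpolation rule are assumptions of this sketch, not of the paper.

```python
import numpy as np

def annual_min_7day(daily_by_year):
    """Annual minima of the 7-day moving average of daily flows."""
    mins = []
    for flows in daily_by_year:
        seven_day = np.convolve(flows, np.ones(7) / 7, mode="valid")
        mins.append(seven_day.min())
    return np.array(mins)

rng = np.random.default_rng(5)
# synthetic record: 60 years of daily flows with a downward trend in low flows
years = [rng.lognormal(3 - 0.01 * y, 0.5, 365) for y in range(60)]
am7 = annual_min_7day(years)

# stationary nonparametric quantile estimator (10% non-exceedance = 7Q10)
q_full = np.quantile(am7, 0.10)            # entire record
q_recent = np.quantile(am7[-30:], 0.10)    # most recent 30 years only
print(f"7Q10, full record: {q_full:.2f}; last 30 years: {q_recent:.2f}")
```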
Predicting Word Reading Ability: A Quantile Regression Study
ERIC Educational Resources Information Center
McIlraith, Autumn L.
2018-01-01
Predictors of early word reading are well established. However, it is unclear if these predictors hold for readers across a range of word reading abilities. This study used quantile regression to investigate predictive relationships at different points in the distribution of word reading. Quantile regression analyses used preschool and…
Re-assessing the flood risk in Scotland.
Black, Andrew R; Burns, John C
2002-07-22
This paper presents a review of changes in flood risk estimation on Scottish rivers resulting from re-analysis of flood records or from the application of new methods. The review arises at a time when flood damages have received recent prominence through the occurrence of a number of extreme floods in Scotland, and when the possible impacts of climate change on flood risk are receiving considerable attention. An analysis of the nine longest available peaks-over-threshold (POT) flood series for Scottish rivers reveals that, for thresholds yielding two events per year on average, annual POT frequencies on western rivers increased in the 1980s/1990s to maximum recorded values, while in the east, values were highest in the 1950s/1960s. These results support the results of flood modelling work based on rainfall and temperature records from the 1870s, which indicate that, in western catchments, annual POT frequencies in the 1980s/1990s are unprecedented. No general trends in flood magnitude series were found, but an unexpected cluster of extreme floods is identified as having occurred since 1988, resulting in eight of Scotland's 16 largest gauged rivers producing their maximum recorded flows since then. These shifts are related to recent increases in the dominance of westerly airflows, share similarities with the results of climate change modelling, and collectively point to increases in flood risk in many parts of Scotland. The paper also reviews advances in flood risk estimation arising from the publication of the UK Flood Estimation Handbook, developments in the collection and use of historical flood data in flood estimation, and the production of maps of 100-year flood areal extent. Finally, the challenges in flood risk estimation posed by climate change are examined, particularly in relation to the assumption of stationarity.
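A POT series of the kind analyzed here can be extracted with a simple declustering rule. The sketch below is a generic illustration, not the authors' procedure: exceedance clusters are reduced to their maxima and separated by a minimum independence gap, with the threshold tuned to yield roughly two events per year; the gap length and synthetic flows are assumptions.

```python
import numpy as np

def pot_peaks(flows, threshold, min_gap=7):
    """Declustered peaks-over-threshold: one peak per exceedance cluster,
    with clusters separated by at least `min_gap` days below the threshold."""
    peaks = []
    i, n = 0, len(flows)
    while i < n:
        if flows[i] > threshold:
            j = i
            while j < n and flows[j] > threshold:
                j += 1
            peaks.append(flows[i:j].max())   # cluster maximum
            i = j + min_gap                  # enforce an independence gap
        else:
            i += 1
    return np.array(peaks)

rng = np.random.default_rng(6)
daily = rng.gamma(2.0, 15.0, 365 * 40)       # 40 years of synthetic daily flows
# choose the threshold so the series yields on the order of 2 events per year
thr = np.quantile(daily, 0.995)
peaks = pot_peaks(daily, thr)
print(f"threshold {thr:.1f} -> {len(peaks) / 40:.2f} events/year")
```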
NASA Astrophysics Data System (ADS)
Gomes, Zahra; Jarvis, Matt J.; Almosallam, Ibrahim A.; Roberts, Stephen J.
2018-03-01
The next generation of large-scale imaging surveys (such as those conducted with the Large Synoptic Survey Telescope and Euclid) will require accurate photometric redshifts in order to optimally extract cosmological information. Gaussian Process for photometric redshift estimation (GPZ) is a promising new method that has been proven to provide efficient, accurate photometric redshift estimations with reliable variance predictions. In this paper, we investigate a number of methods for improving the photometric redshift estimations obtained using GPZ (but which are also applicable to others). We use spectroscopy from the Galaxy and Mass Assembly Data Release 2 with a limiting magnitude of r < 19.4 along with corresponding Sloan Digital Sky Survey visible (ugriz) photometry and the UKIRT Infrared Deep Sky Survey Large Area Survey near-IR (YJHK) photometry. We evaluate the effects of adding near-IR magnitudes and angular size as features for the training, validation, and testing of GPZ and find that these improve the accuracy of the results by ~15-20 per cent. In addition, we explore a post-processing method of shifting the probability distributions of the estimated redshifts based on their Quantile-Quantile plots and find that it improves the bias by ~40 per cent. Finally, we investigate the effects of using more precise photometry obtained from the Hyper Suprime-Cam Subaru Strategic Program Data Release 1 and find that it produces significant improvements in accuracy, similar to the effect of including additional features.
Estimating flood hydrographs and volumes for Alabama streams
Olin, D.A.; Atkins, J.B.
1988-01-01
The hydraulic design of highway drainage structures involves an evaluation of the effect of the proposed highway structures on lives, property, and stream stability. Flood hydrographs and associated flood volumes are useful tools in evaluating these effects. For design purposes, the Alabama Highway Department needs information on flood hydrographs and volumes associated with flood peaks of specific recurrence intervals (design floods) at proposed or existing bridge crossings. This report provides the engineer with a method to estimate flood hydrographs, volumes, and lagtimes for rural and urban streams in Alabama with drainage areas less than 500 sq mi. Existing computer programs and methods to estimate flood hydrographs and volumes for ungaged streams have been developed in Georgia; these computer programs and methods were applied to streams in Alabama. The report gives detailed instructions on how to estimate flood hydrographs for ungaged rural or urban streams in Alabama with drainage areas less than 500 sq mi, without significant in-channel storage or regulation.
Estimates of present and future flood risk in the conterminous United States
NASA Astrophysics Data System (ADS)
Wing, Oliver E. J.; Bates, Paul D.; Smith, Andrew M.; Sampson, Christopher C.; Johnson, Kris A.; Fargione, Joseph; Morefield, Philip
2018-03-01
Past attempts to estimate rainfall-driven flood risk across the US either have incomplete coverage, coarse resolution or use overly simplified models of the flooding process. In this paper, we use a new 30 m resolution model of the entire conterminous US with a 2D representation of flood physics to produce estimates of flood hazard, which match to within 90% accuracy the skill of local models built with detailed data. These flood depths are combined with exposure datasets of commensurate resolution to calculate current and future flood risk. Our data show that the total US population exposed to serious flooding is 2.6-3.1 times higher than previous estimates, and that nearly 41 million Americans live within the 1% annual exceedance probability floodplain (compared to only 13 million when calculated using FEMA flood maps). We find that population and GDP growth alone are expected to lead to significant future increases in exposure, and this change may be exacerbated in the future by climate change.
Assessing the impact of climate and land use changes on extreme floods in a large tropical catchment
NASA Astrophysics Data System (ADS)
Jothityangkoon, Chatchai; Hirunteeyakul, Chow; Boonrawd, Kowit; Sivapalan, Murugesu
2013-05-01
In the wake of the recent catastrophic floods in Thailand, there is considerable concern about the safety of large dams designed and built some 50 years ago. In this paper a distributed rainfall-runoff model appropriate for extreme flood conditions is used to generate revised estimates of the Probable Maximum Flood (PMF) for the Upper Ping River catchment (area 26,386 km²) in northern Thailand, upstream of the location of the large Bhumipol Dam. The model has two components: a continuous water balance model based on a configuration of parameters estimated from climate, soil and vegetation data, and a distributed flood routing model based on non-linear storage-discharge relationships of the river network under extreme flood conditions. The model is implemented under several alternative scenarios regarding the Probable Maximum Precipitation (PMP) estimates and is also used to estimate the potential effects of both climate change and land use and land cover changes on the extreme floods. These new estimates are compared against estimates using other hydrological models, including the application of the original prediction methods under current conditions. Model simulations and sensitivity analyses indicate that a reasonable Probable Maximum Flood (PMF) at the dam site is 6311 m³/s, which is only slightly higher than the original design flood of 6000 m³/s. As part of an uncertainty assessment, the estimated PMF is sensitive to the design method, input PMP, land use changes, and the floodplain inundation effect. An increase of PMP depth by 5% can cause a 7.5% increase in PMF. Deforestation by 10%, 20%, or 30% can result in PMF increases of 3.1%, 6.2%, and 9.2%, respectively. The modest increase of the estimated PMF (to just 6311 m³/s) in spite of these changes is due to the factoring in of the hydraulic effects of trees and buildings on the floodplain as the flood situation changes from normal floods to extreme floods, when over-bank flows may be the dominant flooding process, leading to a substantial reduction in the PMF estimates.
Paretti, Nicholas V.; Kennedy, Jeffrey R.; Cohn, Timothy A.
2014-01-01
Flooding is among the costliest natural disasters in terms of loss of life and property in Arizona, which is why the accurate estimation of flood frequency and magnitude is crucial for proper structural design and accurate floodplain mapping. Current guidelines for flood frequency analysis in the United States are described in Bulletin 17B (B17B), yet since B17B’s publication in 1982 (Interagency Advisory Committee on Water Data, 1982), several improvements have been proposed as updates for future guidelines. Two proposed updates are the Expected Moments Algorithm (EMA) to accommodate historical and censored data, and a generalized multiple Grubbs-Beck (MGB) low-outlier test. The current guidelines use a standard Grubbs-Beck (GB) method to identify low outliers, changing the determination of the moment estimators because B17B uses a conditional probability adjustment to handle low outliers while EMA censors the low outliers. B17B and EMA estimates are identical if no historical information or censored or low outliers are present in the peak-flow data. EMA with MGB (EMA-MGB) test was compared to the standard B17B (B17B-GB) method for flood frequency analysis at 328 streamgaging stations in Arizona. The methods were compared using the relative percent difference (RPD) between annual exceedance probabilities (AEPs), goodness-of-fit assessments, random resampling procedures, and Monte Carlo simulations. The AEPs were calculated and compared using both station skew and weighted skew. Streamgaging stations were classified by U.S. Geological Survey (USGS) National Water Information System (NWIS) qualification codes, used to denote historical and censored peak-flow data, to better understand the effect that nonstandard flood information has on the flood frequency analysis for each method. Streamgaging stations were also grouped according to geographic flood regions and analyzed separately to better understand regional differences caused by physiography and climate. The B17B-GB and EMA-MGB RPD-boxplot results showed that the median RPDs across all streamgaging stations for the 10-, 1-, and 0.2-percent AEPs, computed using station skew, were approximately zero. As the AEP flow estimates decreased (that is, from 10 to 0.2 percent AEP) the variability in the RPDs increased, indicating that the AEP flow estimate was greater for EMA-MGB when compared to B17B-GB. There was only one RPD greater than 100 percent for the 10- and 1-percent AEP estimates, whereas 19 RPDs exceeded 100 percent for the 0.2-percent AEP. At streamgaging stations with low-outlier data, historical peak-flow data, or both, RPDs ranged from −84 to 262 percent for the 0.2-percent AEP flow estimate. When streamgaging stations were separated by the presence of historical peak-flow data (that is, no low outliers or censored peaks) or by low outlier peak-flow data (no historical data), the results showed that RPD variability was greatest for the 0.2-AEP flow estimates, indicating that the treatment of historical and (or) low-outlier data was different between methods and that method differences were most influential when estimating the less probable AEP flows (1, 0.5, and 0.2 percent). When regional skew information was weighted with the station skew, B17B-GB estimates were generally higher than the EMA-MGB estimates for any given AEP. This was related to the different regional skews and mean square error used in the weighting procedure for each flood frequency analysis. 
The B17B-GB weighted skew analysis used a more positive regional skew determined in USGS Water Supply Paper 2433 (Thomas and others, 1997), while the EMA-MGB analysis used a more negative regional skew with a lower mean square error determined from a Bayesian generalized least squares analysis. Regional groupings of streamgaging stations reflected differences in physiographic and climatic characteristics. Potentially influential low flows (PILFs) were more prevalent in arid regions of the State, and generally AEP flows were larger with EMA-MGB than with B17B-GB for gaging stations with PILFs. In most cases EMA-MGB curves would fit the largest floods more accurately than B17B-GB. In areas of the State with more baseflow, such as along the Mogollon Rim and the White Mountains, streamgaging stations generally had fewer PILFs and more positive skews, causing estimated AEP flows to be larger with B17B-GB than with EMA-MGB. The effect of including regional skew was similar for all regions, and the observed pattern was increasingly greater B17B-GB flows (more negative RPDs) with each decreasing AEP quantile. A variation on a goodness-of-fit test statistic was used to describe each method’s ability to fit the largest floods. The mean absolute percent difference between the measured peak flows and the log-Pearson Type 3 (LP3)-estimated flows, for each method, was averaged over the 90th, 75th, and 50th percentiles of peak-flow data at each site. In most percentile subsets, EMA-MGB on average had smaller differences (1 to 3 percent) between the observed and fitted value, suggesting that the EMA-MGB-LP3 distribution is fitting the observed peak-flow data more precisely than B17B-GB. The smallest EMA-MGB percent differences occurred for the greatest 10 percent (90th percentile) of the peak-flow data. When stations were analyzed by USGS NWIS peak flow qualification code groups, the stations with historical peak flows and no low outliers had average percent differences as high as 11 percent greater for B17B-GB, indicating that EMA-MGB utilized the historical information to fit the largest observed floods more accurately. A resampling procedure was used in which 1,000 random subsamples were drawn, each comprising one-half of the observed data. An LP3 distribution was fit to each subsample using B17B-GB and EMA-MGB methods, and the predicted 1-percent AEP flows were compared to those generated from distributions fit to the entire dataset. With station skew, the two methods were similar in the median percent difference, but with weighted skew EMA-MGB estimates were generally better. At two gages where B17B-GB appeared to perform better, a large number of peak flows were deemed to be PILFs by the MGB test, although they did not appear to depart significantly from the trend of the data (step or dogleg appearance). At two gages where EMA-MGB performed better, the MGB identified several PILFs that were affecting the fitted distribution of the B17B-GB method. Monte Carlo simulations were run for the LP3 distribution using different skews and with different assumptions about the expected number of historical peaks. The primary benefit of running Monte Carlo simulations is that the underlying distribution statistics are known, meaning that the true 1-percent AEP is known. The results showed that EMA-MGB performed as well or better in situations where the LP3 distribution had a zero or positive skew and historical information. 
When the skew for the LP3 distribution was negative, EMA-MGB performed significantly better than B17B-GB and EMA-MGB estimates were less biased by more closely estimating the true 1-percent AEP for 1, 2, and 10 historical flood scenarios.
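For readers unfamiliar with the station computation underlying both methods, a bare-bones B17B-style log-Pearson Type III fit by the method of moments is sketched below on synthetic peaks. It deliberately omits everything that distinguishes EMA-MGB (censoring, historical data, low-outlier tests, regional skew weighting); the record length and parameters are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
peaks = rng.lognormal(mean=7.0, sigma=0.8, size=80)   # synthetic annual peaks, cfs

# moments of the log10-transformed peaks (mean, std, station skew)
logq = np.log10(peaks)
m, s, g = logq.mean(), logq.std(ddof=1), stats.skew(logq, bias=False)

# log-Pearson Type III: Pearson III in log space; 1-percent AEP = 0.99 quantile
lp3 = stats.pearson3(skew=g, loc=m, scale=s)
q01 = 10 ** lp3.ppf(0.99)
print(f"station skew {g:.2f}; 1-percent AEP flow ~ {q01:,.0f} cfs")
```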
NASA Astrophysics Data System (ADS)
Fouchier, Catherine; Maire, Alexis; Arnaud, Patrick; Cantet, Philippe; Odry, Jean
2016-04-01
The starting point of our study was the availability of maps of rainfall quantiles for the entire French mainland territory at a spatial resolution of 1 km². These maps display the rainfall amounts estimated for different rainfall durations (from 15 minutes to 72 hours) and different return periods (from 2 years up to 1,000 years). They are provided by a regionalized stochastic hourly point rainfall generator, the SHYREG method, which was previously developed by Irstea (Arnaud et al., 2007; Cantet and Arnaud, 2014). Being calibrated independently on data from numerous rain gauges (with an average density across the country of one rain gauge per 200 km²), this method suffers from a limitation common to point-process rainfall generators: it can only reproduce point rainfall patterns and has no capacity to generate rainfall fields. It therefore cannot provide areal rainfall quantiles, although these are needed for the construction of design rainfall or for the diagnosis of observed events. One means of bridging this gap between our local rainfall quantiles and areal rainfall quantiles is given by the concept of probabilistic areal reduction factors of rainfall (ARF) as defined by Omolayo (1993). This concept enables areal rainfall of a particular frequency and duration to be estimated from point rainfalls of the same frequency and duration. Assessing such ARF for the whole French territory is of particular interest, since it should allow us to compute areal rainfall quantiles, and eventually watershed rainfall quantiles, from the already available grids of statistical point rainfall of the SHYREG method. Our purpose was then to assess these ARF using long time series of spatial rainfall data. We used two sets of rainfall fields: i) hourly rainfall fields from a 10-year reference database of Quantitative Precipitation Estimation (QPE) over France (Tabary et al., 2012); ii) daily rainfall fields resulting from a 53-year high-resolution atmospheric reanalysis over France with the SAFRAN gauge-based analysis system (Vidal et al., 2010). We then built samples of maximum rainfall for each cell location (the "point" rainfalls) and for different areas centered on each cell location (the areal rainfalls) of these gridded data. To compute rainfall quantiles, we fitted a Gumbel distribution, using the L-moment method, to each of these samples. Our daily and hourly ARF show four main trends: i) sensitivity to the return period, with ARF values decreasing as the return period increases; ii) sensitivity to the rainfall duration, with ARF values decreasing as the rainfall duration decreases; iii) sensitivity to the season, with smaller ARF values in summer than in winter; iv) sensitivity to geographical location, with low ARF values in the French Mediterranean area and ARF values close to 1 in the climatic zones of Northern and Western France (oceanic to semi-continental climate). The results of this data-intensive study, conducted for the first time over the whole French territory, are in agreement with studies conducted abroad (e.g., Allen and DeGaetano, 2005; Overeem et al., 2010) and confirm and widen the results of previous studies carried out in France on smaller areas and with fewer rainfall durations (e.g., Ramos et al., 2006; Neppel et al., 2003).
References:
Allen, R. J., and DeGaetano, A. T. (2005). Areal reduction factors for two eastern United States regions with high rain-gauge density. Journal of Hydrologic Engineering 10(4): 327-335.
Arnaud, P., Fine, J.-A., and Lavabre, J. (2007). An hourly rainfall generation model applicable to all types of climate. Atmospheric Research 85(2): 230-242.
Cantet, P., and Arnaud, P. (2014). Extreme rainfall analysis by a stochastic model: impact of the copula choice on the sub-daily rainfall generation. Stochastic Environmental Research and Risk Assessment 28(6): 1479-1492.
Neppel, L., Bouvier, C., and Lavabre, J. (2003). Areal reduction factor probabilities for rainfall in Languedoc-Roussillon. IAHS-AISH Publication (278): 276-283.
Omolayo, A. S. (1993). On the transposition of areal reduction factors for rainfall frequency estimation. Journal of Hydrology 145(1-2): 191-205.
Overeem, A., Buishand, T. A., Holleman, I., and Uijlenhoet, R. (2010). Extreme value modeling of areal rainfall from weather radar. Water Resources Research 46(9): 10 p.
Ramos, M.-H., Leblois, E., and Creutin, J.-D. (2006). From point to areal rainfall: linking the different approaches for the frequency characterisation of rainfalls in urban areas. Water Science and Technology 54(6-7): 33-40.
Tabary, P., Dupuy, P., L'Henaff, G., Gueguen, C., Moulin, L., Laurantin, O., Merlier, C., and Soubeyroux, J. M. (2012). A 10-year (1997-2006) reanalysis of Quantitative Precipitation Estimation over France: methodology and first results. IAHS-AISH Publication (351): 255-260.
Vidal, J.-P., Martin, E., Franchistéguy, L., Baillon, M., and Soubeyroux, J.-M. (2010). A 50-year high-resolution atmospheric reanalysis over France with the Safran system. International Journal of Climatology 30(11): 1627-1644.
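Following the Gumbel/L-moment estimation described in the abstract above, the ARF at a given return period is simply the ratio of the areal quantile to the point quantile. A minimal sketch on assumed synthetic maxima follows (probability-weighted moments for the first two L-moments; all numbers illustrative).

```python
import numpy as np

EULER = 0.5772156649

def gumbel_lmom(sample):
    """Gumbel parameters (location xi, scale beta) from the first two L-moments."""
    x = np.sort(sample)                              # ascending order
    n = len(x)
    b0 = x.mean()                                    # PWM b0 = sample mean
    b1 = np.sum(np.arange(n) * x) / (n * (n - 1))    # PWM b1
    l1, l2 = b0, 2 * b1 - b0                         # first two L-moments
    beta = l2 / np.log(2)                            # Gumbel: l2 = beta * ln 2
    return l1 - EULER * beta, beta                   # l1 = xi + gamma * beta

def gumbel_quantile(xi, beta, T):
    """T-year return level of the fitted Gumbel distribution."""
    return xi - beta * np.log(-np.log(1 - 1 / T))

rng = np.random.default_rng(8)
point_max = rng.gumbel(50, 12, 50)     # annual max point rainfall (mm)
areal_max = rng.gumbel(42, 9, 50)      # annual max areal rainfall, same cells

T = 10
q_point = gumbel_quantile(*gumbel_lmom(point_max), T)
q_areal = gumbel_quantile(*gumbel_lmom(areal_max), T)
print(f"ARF({T}-yr) = {q_areal / q_point:.2f}")
```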
DOT National Transportation Integrated Search
2014-03-01
Reliable estimates of the magnitude and frequency : of floods are essential for the design of transportation and : water-conveyance structures, flood-insurance studies, and : flood-plain management. Such estimates are particularly : important in dens...
Inventory and mapping of flood inundation using interactive digital image analysis techniques
Rohde, Wayne G.; Nelson, Charles A.; Taranik, J.V.
1979-01-01
LANDSAT digital data and color infrared photographs were used in a multiphase sampling scheme to estimate the area of agricultural land affected by a flood. The LANDSAT data were classified with a maximum likelihood algorithm. Stratification of the LANDSAT data, prior to classification, greatly reduced misclassification errors. The classification results were used to prepare a map overlay showing the areal extent of flooding. These data also provided the statistics required to estimate sample size in a two-phase sampling scheme and provided quick, accurate estimates of areas flooded for the first phase. The measurements made in the second phase, based on ground data and photo-interpretation, were used with two-phase sampling statistics to estimate the area of agricultural land affected by flooding. These results show that LANDSAT digital data can be used to prepare map overlays showing the extent of flooding on agricultural land and, with two-phase sampling procedures, can provide acreage estimates with sampling errors of about 5 percent. This procedure provides a technique for rapidly assessing the areal extent of flood conditions on agricultural land and would provide a basis for designing a sampling framework to estimate the impact of flooding on crop production.
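The two-phase idea can be sketched as a ratio adjustment: the cheap phase-1 classified total is scaled by the phase-2 ratio of ground truth to classified values. The version below is a generic illustration with invented segment data, not the study's exact estimator.

```python
import numpy as np

rng = np.random.default_rng(9)
# Phase 1: LANDSAT-classified flooded area (acres) for 500 sample segments
classified = rng.gamma(4.0, 25.0, 500)
# Phase 2: for a subsample of 50 segments, "true" flooded area from ground
# data and photo-interpretation (classification over-detects by ~10% here)
idx = rng.choice(500, 50, replace=False)
truth = classified[idx] * rng.normal(0.9, 0.08, 50)

# ratio estimator: scale the phase-1 total by the phase-2 ratio
ratio = truth.mean() / classified[idx].mean()
total_est = ratio * classified.sum()
print(f"adjustment ratio {ratio:.3f}; "
      f"estimated flooded area {total_est:,.0f} acres")
```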
Yu, Hwa-Lung; Wang, Chih-Hsin
2013-02-05
Understanding the daily changes in ambient air quality concentrations is important for assessing human exposure and environmental health. However, the fine temporal scales (e.g., hourly) involved in this assessment often lead to high variability in air quality concentrations because of the complex short-term physical and chemical mechanisms among the pollutants. Consequently, high heterogeneity is usually present not only in the averaged pollution levels but also in the intraday variance levels of the daily observations of ambient concentration across space and time. This characteristic decreases the estimation performance of common techniques. This study proposes a novel quantile-based Bayesian maximum entropy (QBME) method to account for the nonstationary and nonhomogeneous characteristics of ambient air pollution dynamics. The QBME method characterizes the spatiotemporal dependence among ambient air quality levels based on their location-specific quantiles and accounts for spatiotemporal variations using a local weighted smoothing technique. The epistemic framework of the QBME method allows researchers to further consider the uncertainty of space-time observations. This study presents the spatiotemporal modeling of daily CO and PM10 concentrations across Taiwan from 1998 to 2009 using the QBME method. Results show that the QBME method can effectively improve estimation accuracy in terms of lower mean absolute errors and standard deviations over space and time, especially for pollutants with strong nonhomogeneous variances across space. In addition, the epistemic framework allows researchers to assimilate site-specific secondary information where observations are absent because of the common preferential sampling issues of environmental data. The proposed QBME method provides a practical and powerful framework for the spatiotemporal modeling of ambient pollutants.
Covariate Measurement Error Correction for Student Growth Percentiles Using the SIMEX Method
ERIC Educational Resources Information Center
Shang, Yi; VanIwaarden, Adam; Betebenner, Damian W.
2015-01-01
In this study, we examined the impact of covariate measurement error (ME) on the estimation of quantile regression and student growth percentiles (SGPs), and find that SGPs tend to be overestimated among students with higher prior achievement and underestimated among those with lower prior achievement, a problem we describe as ME endogeneity in…
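The SIMEX idea is easy to demonstrate on a simple regression slope, even though the study applies it to quantile regression and SGPs. In the sketch below, extra noise with known error variance is added at increasing multiples lambda, the model is refitted, and the coefficient is extrapolated back to lambda = -1 (no measurement error); all data and the quadratic extrapolant are assumptions of this illustration.

```python
import numpy as np

rng = np.random.default_rng(10)
n, sigma_u = 4000, 0.8                      # known measurement-error SD
x_true = rng.normal(0, 1, n)
y = 1.0 + 2.0 * x_true + rng.normal(0, 1, n)
x_obs = x_true + rng.normal(0, sigma_u, n)  # error-prone prior score

def slope(x, y):
    """OLS slope of y on x."""
    return np.polyfit(x, y, 1)[0]

# SIMEX: inflate the error variance by factor (1 + lam), average over
# replicates, then extrapolate the slope back to lam = -1
lams = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
slopes = [np.mean([slope(x_obs + rng.normal(0, sigma_u * np.sqrt(l), n), y)
                   for _ in range(50)]) for l in lams]
quad = np.polyfit(lams, slopes, 2)          # quadratic extrapolant
print("naive slope:", round(slopes[0], 3),
      "SIMEX slope:", round(np.polyval(quad, -1.0), 3))
```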
Importance of record length with respect to estimating the 1-percent chance flood
Feaster, Toby D.
2010-01-01
U.S. Geological Survey (USGS) streamflow gages have been established in every State in the Nation, Puerto Rico, and the Trust Territory of the Pacific Islands. From these streamflow records, estimates of the magnitude and frequency of floods are often developed and used to design transportation and water-conveyance structures to protect lives and property, and to determine flood-insurance rates. Probably the most recognizable flood statistic computed from USGS streamgaging records is the 1-percent (%) chance flood, better known as the 100-year flood. By definition, this is a flood that has a 1% chance of occurring in any given year. The 1% chance flood is a statistical estimate that can be significantly influenced by the length of record and by extreme flood events captured in that record. Consequently, it is typically recommended that flood statistics be updated at some regular interval, such as every 10 years. This paper examines the influence of record length on the 1% chance flood for the Broad River in Georgia and the substantial difference that can occur in the estimate depending on record length and the hydrologic conditions under which that record was collected.
A Fresh Start for Flood Estimation in Ungauged UK Catchments
NASA Astrophysics Data System (ADS)
Giani, Giulia; Woods, Ross
2017-04-01
The standard regression-based method for estimating the median annual flood in ungauged UK catchments has a high standard error (the 95% confidence interval is +/- a factor of 2). This is also the dominant source of uncertainty in statistical estimates of the 100-year flood. Similarly large uncertainties have been reported elsewhere. These large uncertainties make it difficult to produce reliable flood design estimates for ungauged catchments; if the uncertainty could be reduced, flood protection schemes could be made significantly more cost-effective. Here we report on attempts to develop a new practical method for flood estimation in ungauged UK catchments by making more use of knowledge about rainfall-runoff processes. Building on recent research on the seasonality of flooding, we first classify more than 1000 UK catchments into groups according to the seasonality of extreme rainfall and floods, and infer possible causal mechanisms for floods (e.g., Berghuijs et al., Geophysical Research Letters, 2016). For each group we are developing simplified rainfall-runoff-routing relationships (e.g., Viglione et al., Journal of Hydrology, 2010) which can account for spatial and temporal variability in rainfall and flood processes, as well as channel network routing effects. An initial investigation by Viglione et al. suggested that the relationship between rainfall amount and flood peak could be summarised through a dimensionless response number representing the product of the event runoff coefficient and a measure of hydrograph peakedness. Our hypothesis is that this approach is widely applicable and can be used as the basis for flood estimation. Using subdaily and daily rainfall-runoff data for more than 1000 catchments, we identify a subset of catchments in the west of the UK where floods are generated predominantly in winter through the coincidence of heavy rain and low soil moisture deficits. Floods in these catchments can reliably be simulated with simple rainfall-runoff models, so it is reasonable to expect simple flood estimators. We will report on tests of several components of the dimensionless response number hypothesis for these catchments.
Techniques for estimating flood-peak discharges from urban basins in Missouri
Becker, L.D.
1986-01-01
Techniques are defined for estimating the magnitude and frequency of future flood peak discharges of rainfall-induced runoff from small urban basins in Missouri. These techniques were developed from an initial analysis of flood records of 96 gaged sites in Missouri and adjacent states. Final regression equations are based on a balanced, representative sampling of 37 gaged sites in Missouri. This sample included 9 statewide urban study sites, 18 urban sites in St. Louis County, and 10 predominantly rural sites statewide. Short-term records were extended on the basis of long-term climatic records and use of a rainfall-runoff model. Linear least-squares regression analyses were used with log-transformed variables to relate flood magnitudes of selected recurrence intervals (dependent variables) to selected drainage basin indexes (independent variables). For gaged urban study sites within the State, the flood peak estimates are from the frequency curves defined from the synthesized long-term discharge records. Flood frequency estimates are made for ungaged sites by using regression equations that require determination of the drainage basin size and either the percentage of impervious area or a basin development factor. Alternative sets of equations are given for the 2-, 5-, 10-, 25-, 50-, and 100-yr recurrence interval floods. The average standard errors of estimate range from about 33% for the 2-yr flood to 26% for the 100-yr flood. The techniques for estimation are applicable to flood flows that are not significantly affected by storage caused by manmade activities. Flood peak discharge estimating equations are considered applicable for sites on basins draining approximately 0.25 to 40 sq mi.
Tighe, Elizabeth L.; Schatschneider, Christopher
2015-01-01
The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in Adult Basic Education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. PMID:25351773
Quantile based Tsallis entropy in residual lifetime
NASA Astrophysics Data System (ADS)
Khammar, A. H.; Jahanshahi, S. M. A.
2018-02-01
Tsallis entropy is a generalization of order α of the Shannon entropy that, unlike the Shannon entropy, is nonadditive. Shannon entropy may be negative for some distributions, but Tsallis entropy can always be made nonnegative by choosing an appropriate value of α. In this paper, we derive the quantile form of this nonadditive entropy function in the residual lifetime, namely the residual quantile Tsallis entropy (RQTE), and obtain bounds for it in terms of Rényi's residual quantile entropy. We also obtain a relationship between the RQTE and the concept of the proportional hazards model in the quantile setup. Based on the new measure, we propose a stochastic order and aging classes, and study their properties. Finally, we prove characterization theorems for some well-known lifetime distributions. It is shown that the RQTE uniquely determines the parent distribution, unlike the residual Tsallis entropy.
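In quantile terms, writing Q for the quantile function and q = Q' for the quantile density (so that f(Q(u)) = 1/q(u)), the order-α Tsallis entropy and its residual version take roughly the following form; this is a plausible reconstruction for orientation, not necessarily the authors' exact notation.

```latex
% Quantile form of the Tsallis entropy of order \alpha (\alpha > 0, \alpha \neq 1),
% obtained from the density form via the substitution x = Q(p), f(Q(p)) = 1/q(p):
H_\alpha \;=\; \frac{1}{\alpha - 1}\left(1 - \int_0^1 \bigl(q(p)\bigr)^{1-\alpha}\, dp\right)

% Residual quantile Tsallis entropy (RQTE) for the lifetime beyond Q(u),
% i.e., the same quantity computed for the conditional distribution X \mid X > Q(u):
H_\alpha(u) \;=\; \frac{1}{\alpha - 1}\left(1 - \frac{1}{(1-u)^{\alpha}} \int_u^1 \bigl(q(p)\bigr)^{1-\alpha}\, dp\right)
```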
On the Mean Squared Error of Nonparametric Quantile Estimators under Random Right-Censorship.
1986-09-01
[Report documentation page; abstract garbled in source. The recoverable fragment indicates that results are derived in Section 3 and that the result for the kernel estimator Qn is derived in Section 4, drawing on order statistic methods.]
The Significance of the Record Length in Flood Frequency Analysis
NASA Astrophysics Data System (ADS)
Senarath, S. U.
2013-12-01
Of all natural hazards, floods are the most costly in many regions of the world. For example, floods cause over a third of Europe's average annual catastrophe losses and affect about two thirds of the people impacted by natural catastrophes. Increased attention is being paid to determining flow estimates associated with pre-specified return periods so that flood-prone areas can be adequately protected against floods of particular magnitudes. Flood frequency analysis, conducted by fitting an appropriate probability distribution to the observed annual maximum flow data, is the usual means of obtaining these flow estimates, and it therefore plays an integral role in determining flood risk in flood-prone watersheds. A long annual maximum flow record is vital for obtaining accurate estimates of discharges associated with high return periods. However, in many areas of the world, flood frequency analysis is conducted with limited flow data or short annual maximum flow records, which inevitably leads to flow estimates that are subject to error, especially for high return period flows. In this study, several statistical techniques are used to identify errors caused by short annual maximum flow records. The flow estimates used in the error analysis are obtained by fitting a log-Pearson Type III distribution to the flood time series. These errors can then be used to better evaluate the return period flows in data-limited streams. The study findings, therefore, have important implications for hydrologists, water resources engineers, and floodplain managers.
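As a concrete illustration of the fitting step (not the study's own code), the sketch below applies the method of moments in log space with Python and scipy on a synthetic record; it omits the regional-skew weighting and low-outlier screening of operational practice such as Bulletin 17B/17C:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    peaks = rng.lognormal(mean=6.0, sigma=0.5, size=40)   # stand-in annual maxima

    logq = np.log10(peaks)
    m, s = logq.mean(), logq.std(ddof=1)
    g = stats.skew(logq, bias=False)                      # station skew of the logs

    for T in (2, 10, 50, 100, 500):
        k = stats.pearson3.ppf(1 - 1 / T, g)              # Pearson III frequency factor
        print(T, round(10 ** (m + k * s), 1))             # T-year flood estimate

Re-running this with subsets of the record (say, 15 of the 40 years) shows how quickly the 100- and 500-year estimates destabilize, which is the record-length effect the study quantifies.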
Epic Flooding in Georgia, 2009
Gotvald, Anthony J.; McCallum, Brian E.
2010-01-01
Metropolitan Atlanta, September 2009 Floods: The epic floods experienced in the Atlanta area in September 2009 were extremely rare. Eighteen streamgages in the Metropolitan Atlanta area had flood magnitudes much greater than the estimated 0.2-percent (500-year) annual exceedance probability. The Federal Emergency Management Agency (FEMA) reported that 23 counties in Georgia were declared disaster areas due to this flood and that 16,981 homes and 3,482 businesses were affected by floodwaters. Ten lives were lost in the flood. The total estimated damages exceed $193 million (H.E. Longenecker, Federal Emergency Management Agency, written commun., November 2009). On Sweetwater Creek near Austell, Ga., just north of Interstate 20, the peak stage was more than 6 feet higher than the estimated peak stage of the 0.2-percent (500-year) flood. Flood magnitudes in Cobb County on Sweetwater, Butler, and Powder Springs Creeks greatly exceeded the estimated 0.2-percent (500-year) floods for these streams. In Douglas County, the Dog River at Ga. Highway 5 near Fairplay had a peak stage nearly 20 feet higher than the estimated peak stage of the 0.2-percent (500-year) flood. On the Chattahoochee River, the U.S. Geological Survey (USGS) gage at Vinings reached the highest level recorded in the past 81 years. Gwinnett, De Kalb, Fulton, and Rockdale Counties also had record flooding. South Georgia, March and April 2009 Floods: The March and April 2009 floods in South Georgia were smaller in magnitude than the September floods but still caused significant damage. No lives were lost in this flood. Approximately $60 million in public infrastructure damage occurred to roads, culverts, bridges, and a water treatment facility (Joseph T. McKinney, Federal Emergency Management Agency, written commun., July 2009). Flow at the Satilla River near Waycross exceeded the 0.5-percent (200-year) flood. Flows at seven other stations in South Georgia exceeded the 1-percent (100-year) flood.
Ahearn, Elizabeth A.
2009-01-01
A spring nor'easter affected the East Coast of the United States from April 15 to 18, 2007. In Connecticut, rainfall varied from 3 inches to more than 7 inches. The combined effects of heavy rainfall over a short duration, high winds, and high tides led to widespread flooding, storm damage, power outages, evacuations, and disruptions to traffic and commerce. The storm caused at least 18 fatalities (none in Connecticut). A Presidential Disaster Declaration was issued on May 11, 2007, for two counties in western Connecticut - Fairfield and Litchfield. This report documents hydrologic and meteorologic aspects of the April 2007 flood and includes estimates of the magnitude of the peak discharges and peak stages during the flood at 28 streamflow-gaging stations in western Connecticut. These data were used to perform flood-frequency analyses. Flood-frequency estimates provided in this report are expressed in terms of exceedance probabilities (the probability of a flood reaching or exceeding a particular magnitude in any year). Flood-frequency estimates for the 0.50, 0.20, 0.10, 0.04, 0.02, 0.01, and 0.002 exceedance probabilities (also expressed as 50-, 20-, 10-, 4-, 2-, 1-, and 0.2-percent exceedance probability, respectively) were computed for 24 of the 28 streamflow-gaging stations. Exceedance probabilities can further be expressed in terms of recurrence intervals (2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals, respectively). Flood-frequency estimates computed in this study were compared to the flood-frequency estimates used to derive the water-surface profiles in previously published Federal Emergency Management Agency (FEMA) Flood Insurance Studies. The estimates in this report update and supersede previously published flood-frequency estimates for streamflow-gaging stations in Connecticut by incorporating additional years of annual peak discharges, including the peaks for the April 2007 flood. In the southwest coastal region of Connecticut, the April 2007 peak discharges for streamflow-gaging stations with records extending back to 1955 were the second highest peak discharges on record; the 1955 annual peak discharges are the highest peak discharges in the station records. In the Housatonic and South Central Coast Basins, the April 2007 peak discharges for streamflow-gaging stations with records extending back to 1930 or earlier ranked between the fourth and eighth highest discharges on record, with the 1936, 1938, and 1955 floods as the largest floods in the station records. The peak discharges for the April 2007 flood have exceedance probabilities ranging from 0.10 to 0.02 (a 10- to 2-percent chance of being exceeded in a given year, respectively), with the majority (80 percent) of the stations having exceedance probabilities between 0.10 and 0.04. At three stations - Norwalk River at South Wilton, Pootatuck River at Sandy Hook, and Still River at Robertsville - the April 2007 peak discharges have an exceedance probability of 0.02. Flood-frequency estimates made after the April 2007 flood were compared to flood-frequency estimates used to derive the water-surface profiles (also called flood profiles) in FEMA Flood Insurance Studies developed for communities. In general, the comparison indicated that at the 0.10 exceedance probability (a 10-percent chance of being exceeded in a given year), the discharges from the current (2007) flood-frequency analysis are larger than the discharges in the FEMA Flood Insurance Studies, with a median change of about +10 percent.
In contrast, at the 0.01 exceedance probability (a 1-percent chance of being exceeded in a year), the discharges from the current flood-frequency analysis are smaller than the discharges in the FEMA Flood Insurance Studies, with a median change of about -13 percent. Several stations had more than +25 percent change in discharges at the 0.10 exceedance probability and are in the following communities: Winchester (Still River at Robertsville)
The gender gap reloaded: are school characteristics linked to labor market performance?
Konstantopoulos, Spyros; Constant, Amelie
2008-06-01
This study examines the wage gender gap of young adults in the 1970s, 1980s, and 2000 in the US. Using quantile regression we estimate the gender gap across the entire wage distribution. We also study the importance of high school characteristics in predicting future labor market performance. We conduct analyses for three major racial/ethnic groups in the US: Whites, Blacks, and Hispanics, employing data from two rich longitudinal studies: NLS and NELS. Our results indicate that while some school characteristics are positive and significant predictors of future wages for Whites, they are less so for the two minority groups. We find significant wage gender disparities favoring men across all three surveys in the 1970s, 1980s, and 2000. The wage gender gap is more pronounced in higher paid jobs (90th quantile) for all groups, indicating the presence of a persistent and alarming "glass ceiling."
Jason B. Dunham; Brian S. Cade; James W. Terrell
2002-01-01
We used regression quantiles to model potentially limiting relationships between the standing crop of cutthroat trout Oncorhynchus clarki and measures of stream channel morphology. Regression quantile models indicated that variation in fish density was inversely related to the width:depth ratio of streams but not to stream width or depth alone. The...
Superquantile/CVaR Risk Measures: Second-Order Theory
2015-07-31
[Report documentation page; abstract garbled in source. Recoverable fragments describe second-order superquantile risk minimization and superquantile regression, a proposed second-order version of quantile regression that is deeply tied to generalized regression.]
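For background, the superquantile (conditional value-at-risk) at level alpha is the average of outcomes in the upper tail beyond the alpha-quantile; a Monte Carlo sketch in Python on an assumed standard normal shows the relation to the ordinary quantile:

    import numpy as np

    rng = np.random.default_rng(3)
    y = rng.standard_normal(100_000)

    alpha = 0.9
    q = np.quantile(y, alpha)          # quantile (value-at-risk) at level alpha
    sq = y[y >= q].mean()              # superquantile (CVaR): mean of the upper tail
    print(q, sq)                       # the superquantile always sits above the quantile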
Fifty-year flood-inundation maps for Comayagua, Honduras
Kresch, David L.; Mastin, Mark C.; Olsen, T.D.
2002-01-01
After the devastating floods caused by Hurricane Mitch in 1998, maps of the areas and depths of the 50-year-flood inundation at 15 municipalities in Honduras were prepared as a tool for agencies involved in reconstruction and planning. This report, which is one in a series of 15, presents maps of areas in the municipality of Comayagua that would be inundated by 50-year floods on Rio Humuya and Rio Majada. Geographic Information System (GIS) coverages of the flood inundation are available on a computer in the municipality of Comayagua as part of the Municipal GIS project and on the Internet at the Flood Hazard Mapping Web page (http://mitchnts1.cr.usgs.gov/projects/floodhazard.html). These coverages allow users to view the flood inundation in much more detail than is possible using the maps in this report. Water-surface elevations for 50-year floods on Rio Humuya and Rio Majada at Comayagua were estimated using HEC-RAS, a one-dimensional, steady-flow, step-backwater computer program. The channel and floodplain cross sections used in HEC-RAS were developed from an airborne light-detection-and-ranging (LIDAR) topographic survey of the area. The 50-year-flood discharge for Rio Humuya at Comayagua, 1,400 cubic meters per second, was estimated using a regression equation that relates the 50-year-flood discharge to drainage area and mean annual precipitation. The reasonableness of the regression discharge was evaluated by comparing it with drainage-area-adjusted 50-year-flood discharges estimated for three long-term Rio Humuya stream-gaging stations. The drainage-area-adjusted 50-year-flood discharges estimated from the gage records ranged from 946 to 1,365 cubic meters per second. Because the regression-equation discharge agrees closely with the high end of the range of discharges estimated from the gaging-station records, it was used for the hydraulic modeling to ensure that the resulting 50-year-flood water-surface elevations would not be underestimated. The 50-year-flood discharge for Rio Majada at Comayagua (230 cubic meters per second) was estimated using the regression equation because there are no long-term gaging stations on this river from which to estimate the discharge.
Estimation of flood discharges at selected recurrence intervals for streams in New Hampshire.
DOT National Transportation Integrated Search
2008-01-01
This report provides estimates of flood discharges at selected recurrence intervals for streamgages in and adjacent to New Hampshire and equations for estimating flood discharges at recurrence intervals of 2-, 5-, 10-, 25-, 50-, 100-, and 500-years f...
Nonuniform sampling by quantiles.
Craft, D Levi; Sonstrom, Reilly E; Rovnyak, Virginia G; Rovnyak, David
2018-03-01
A flexible strategy for choosing samples nonuniformly from a Nyquist grid using the concept of statistical quantiles is presented for broad classes of NMR experimentation. Quantile-directed scheduling is intuitive and flexible for any weighting function, promotes reproducibility and seed independence, and is generalizable to multiple dimensions. In brief, weighting functions are divided into regions of equal probability, which define the samples to be acquired. Quantile scheduling therefore achieves close adherence to a probability distribution function, thereby minimizing gaps for any given degree of subsampling of the Nyquist grid. A characteristic of quantile scheduling is that one-dimensional, weighted NUS schedules are deterministic, however higher dimensional schedules are similar within a user-specified jittering parameter. To develop unweighted sampling, we investigated the minimum jitter needed to disrupt subharmonic tracts, and show that this criterion can be met in many cases by jittering within 25-50% of the subharmonic gap. For nD-NUS, three supplemental components to choosing samples by quantiles are proposed in this work: (i) forcing the corner samples to ensure sampling to specified maximum values in indirect evolution times, (ii) providing an option to triangular backfill sampling schedules to promote dense/uniform tracts at the beginning of signal evolution periods, and (iii) providing an option to force the edges of nD-NUS schedules to be identical to the 1D quantiles. Quantile-directed scheduling meets the diverse needs of current NUS experimentation, but can also be used for future NUS implementations such as off-grid NUS and more. A computer program implementing these principles (a.k.a. QSched) in 1D- and 2D-NUS is available under the general public license. Copyright © 2018 Elsevier Inc. All rights reserved.
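A minimal sketch of the core idea in Python, under assumed names and an illustrative exponential weighting: split the weighting function's cumulative probability into slices of equal probability and take one grid point per slice. The jittering, corner-forcing, and backfill options described above are omitted:

    import numpy as np

    def quantile_schedule(weights, n_samples):
        """Choose n_samples grid indices at the quantile centers of the
        probability distribution defined by the weighting function."""
        w = np.asarray(weights, dtype=float)
        cdf = np.cumsum(w) / w.sum()
        levels = (np.arange(n_samples) + 0.5) / n_samples   # equal-probability slices
        return np.searchsorted(cdf, levels)

    grid = 256
    weighting = np.exp(-np.arange(grid) / 80.0)   # illustrative signal-decay weighting
    print(quantile_schedule(weighting, 64))       # deterministic 1D schedule

Because the schedule is fully determined by the weighting function, repeated runs give identical 1D schedules, which is the reproducibility and seed independence noted above; duplicate indices, if any arise for aggressive subsampling, would need thinning or the paper's jittering.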
Muller, Benjamin J.; Cade, Brian S.; Schwarzkopf, Lin
2018-01-01
Many different factors influence animal activity. Often, the value of an environmental variable may influence significantly the upper or lower tails of the activity distribution. For describing relationships with heterogeneous boundaries, quantile regressions predict a quantile of the conditional distribution of the dependent variable. A quantile count model extends linear quantile regression methods to discrete response variables, and is useful if activity is quantified by trapping, where there may be many tied (equal) values in the activity distribution, over a small range of discrete values. Additionally, different environmental variables in combination may have synergistic or antagonistic effects on activity, so examining their effects together, in a modeling framework, is a useful approach. Thus, model selection on quantile counts can be used to determine the relative importance of different variables in determining activity, across the entire distribution of capture results. We conducted model selection on quantile count models to describe the factors affecting activity (numbers of captures) of cane toads (Rhinella marina) in response to several environmental variables (humidity, temperature, rainfall, wind speed, and moon luminosity) over eleven months of trapping. Environmental effects on activity are understudied in this pest animal. In the dry season, model selection on quantile count models suggested that rainfall positively affected activity, especially near the lower tails of the activity distribution. In the wet season, wind speed limited activity near the maximum of the distribution, while minimum activity increased with minimum temperature. This statistical methodology allowed us to explore, in depth, how environmental factors influenced activity across the entire distribution, and is applicable to any survey or trapping regime, in which environmental variables affect activity.
NASA Astrophysics Data System (ADS)
Zolina, Olga; Simmer, Clemens; Kapala, Alice; Mächel, Hermann; Gulev, Sergey; Groisman, Pavel
2014-05-01
We present new high-resolution daily precipitation grids developed at the Meteorological Institute, University of Bonn, and the German Weather Service (DWD) under the STAMMEX project (Spatial and Temporal Scales and Mechanisms of Extreme Precipitation Events over Central Europe). Daily precipitation grids have been developed from the daily-observing precipitation network of DWD, which runs one of the world's densest rain gauge networks, comprising more than 7,500 stations. Several quality-controlled daily gridded products with homogenized sampling were developed, covering the periods 1931-onwards (with 0.5-degree resolution), 1951-onwards (0.25 and 0.5 degree), and 1971-2000 (0.1 degree). Different methods were tested to select the gridding methodology that minimizes errors of integral grid estimates over hilly terrain. Besides daily precipitation values with uncertainty estimates (which include standard estimates of the kriging uncertainty as well as error estimates derived by a bootstrapping algorithm), the STAMMEX data sets include a variety of statistics that characterize temporal and spatial dynamics of the precipitation distribution (quantiles, extremes, wet/dry spells, etc.). Comparisons with existing continental-scale daily precipitation grids (e.g., CRU, ECA E-OBS, GCOS), which include considerably fewer observations than STAMMEX, demonstrate the added value of high-resolution grids for extreme rainfall analyses. These data exhibit spatial variability patterns and trends in precipitation extremes that are missed or incorrectly reproduced over Central Europe by coarser-resolution grids based on sparser networks. The STAMMEX dataset can be used for high-quality climate diagnostics of precipitation variability, as a reference for reanalyses and remotely sensed precipitation products (including the upcoming Global Precipitation Mission products), and as input into regional climate and operational weather forecast models. We will present numerous applications of the STAMMEX grids, ranging from case studies of major Central European floods to long-term changes in different precipitation statistics, including those accounting for the alternation of dry and wet periods and precipitation intensities associated with prolonged rainy episodes.
A framework for global river flood risk assessments
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.
2012-08-01
There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate. The framework estimates hazard at high resolution (~1 km²) using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood routing model, and importantly, a flood extent downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km² resolution with a new downscaling algorithm, applied on Bangladesh as a first case-study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard and damage estimates has been performed using the Dartmouth Flood Observatory database and damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
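The final combination step, collapsing per-return-period impacts into an annual expected damage indicator, amounts to integrating damage over annual exceedance probability; a minimal Python sketch with hypothetical loss figures (not values from the Bangladesh case study) follows:

    import numpy as np

    T = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0, 500.0])        # return periods, years
    damage = np.array([0.0, 10.0, 35.0, 80.0, 130.0, 190.0, 320.0])  # hypothetical losses

    p = 1.0 / T                       # annual exceedance probabilities (decreasing in T)
    dp = p[:-1] - p[1:]
    # Expected annual damage: trapezoid rule over exceedance probability
    ead = float(np.sum(0.5 * (damage[:-1] + damage[1:]) * dp))
    print(ead)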
The Dynamics of the Evolution of the Black-White Test Score Gap
ERIC Educational Resources Information Center
Sohn, Kitae
2012-01-01
We apply a quantile version of the Oaxaca-Blinder decomposition to estimate the counterfactual distribution of the test scores of Black students. In the Early Childhood Longitudinal Study, Kindergarten Class of 1998-1999 (ECLS-K), we find that the gap initially appears only at the top of the distribution of test scores. As children age, however,…
Fifty-year flood-inundation maps for Choluteca, Honduras
Kresch, David L.; Mastin, Mark C.; Olsen, T.D.
2002-01-01
After the devastating floods caused by Hurricane Mitch in 1998, maps of the areas and depths of 50-year-flood inundation at 15 municipalities in Honduras were prepared as a tool for agencies involved in reconstruction and planning. This report, which is one in a series of 15, presents maps of areas in the municipality of Choluteca that would be inundated by 50-year floods on Rio Choluteca and Rio Iztoca. Geographic Information System (GIS) coverages of the flood inundation are available on a computer in the municipality of Choluteca as part of the Municipal GIS project and on the Internet at the Flood Hazard Mapping Web page (http://mitchnts1.cr.usgs.gov/projects/floodhazard.html). These coverages allow users to view the flood inundation in much more detail than is possible using the maps in this report. Water-surface elevations for 50-year-floods on Rio Choluteca and Rio Iztoca at Choluteca were estimated using HEC-RAS, a one-dimensional, steady-flow, step-backwater computer program. The channel and floodplain cross sections used in HEC-RAS were developed from an airborne light-detection-and-ranging (LIDAR) topographic survey of the area. The estimated 50-year-flood discharge for Rio Choluteca at Choluteca is 4,620 cubic meters per second, which is the drainage-area-adjusted weighted-average of two independently estimated 50-year-flood discharges for the gaging station Rio Choluteca en Puente Choluteca. One discharge, 4,913 cubic meters per second, was estimated from a frequency analysis of the 17 years of peak discharge record for the gage, and the other, 2,650 cubic meters per second, was estimated from a regression equation that relates the 50-year-flood discharge to drainage area and mean annual precipitation. The weighted-average of the two discharges at the gage is 4,530 cubic meters per second. The 50-year-flood discharge for the study area reach of Rio Choluteca was estimated by multiplying the weighted discharge at the gage by the ratio of the drainage areas upstream from the two locations. The 50-year-flood discharge for Rio Iztoca, which was estimated from the regression equation, is 430 cubic meters per second.
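The drainage-area adjustment described above scales a gage estimate by the ratio of contributing areas; the Python sketch below uses hypothetical areas, since the report does not list them, and an exponent b = 1 for the simple ratio, whereas regional studies often use an exponent below 1:

    def area_adjust(q_gage, a_gage, a_site, b=1.0):
        """Transfer a flood quantile from a gaged to an ungaged location
        by scaling with the ratio of contributing drainage areas."""
        return q_gage * (a_site / a_gage) ** b

    # Hypothetical areas in km^2; 4,530 m^3/s is the weighted gage estimate above.
    # With these assumed areas the result lands near the 4,620 m^3/s cited.
    print(area_adjust(4530.0, a_gage=7500.0, a_site=7650.0))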
Spillway sizing of large dams in Austria
NASA Astrophysics Data System (ADS)
Reszler, Ch.; Gutknecht, D.; Blöschl, G.
2003-04-01
This paper discusses the basic philosophy of defining and calculating design floods for large dams in Austria, both for the construction of new dams and for the re-assessment of the safety of existing dams. Currently the consensus is to choose flood peak values corresponding to a probability of exceedance of 2×10⁻⁴ for a given year. A two-step procedure is proposed to estimate the design flood discharges - a rapid assessment and a detailed assessment. In the rapid assessment, the design discharge is chosen as a constant multiple of flood values read from a map of regionalised floods. The safety factor, or multiplier, takes care of the uncertainties of the local estimation and the regionalisation procedure. If the current design level of a spillway exceeds the value so estimated, no further calculations are needed. Otherwise (and for new dams) a detailed assessment is required. The idea of the detailed assessment is to draw upon all existing sources of information to constrain the uncertainties. The three main sources are local flood frequency analysis, where flood data are available; regional flood estimation from hydrologically similar catchments; and rainfall-runoff modelling using design storms as inputs. The three values obtained by these methods are then assessed and weighted in terms of their reliability to facilitate selection of the design flood. The uncertainty assessment of the various methods is based on confidence intervals, estimates of regional heterogeneity, data availability, and sensitivity analyses of the rainfall-runoff model. As the definition of the design floods discussed above is based on probability concepts, it is also important to examine the excess risk, i.e. the possibility of the occurrence of a flood exceeding the design levels. The excess risk is evaluated based on a so-called Safety Check Flood (SCF), similar to existing practice in other countries in Europe. The SCF is a vehicle to analyse the damage potential of an event of this magnitude, providing guidance on protective measures for dealing with very extreme floods. The SCF is used to check the vulnerability of the system with regard to structural stability, morphological effects, etc., and to develop alarm plans and disaster mitigation procedures. The basis for estimating the SCF is the uncertainty assessment of the design flood values estimated by the three methods, including unlikely combinations of the controlling factors and attendant uncertainties. Finally, we discuss the impact on the downstream valley of floods exceeding the design values and of smaller floods, and illustrate the basic concepts with examples from the recent flood of August 2002.
Characteristics of the April 2007 Flood at 10 Streamflow-Gaging Stations in Massachusetts
Zarriello, Phillip J.; Carlson, Carl S.
2009-01-01
A large 'nor'easter' storm on April 15-18, 2007, brought heavy rains to the southern New England region that, coupled with normal seasonal high flows and associated wet soil-moisture conditions, caused extensive flooding in many parts of Massachusetts and neighboring states. To characterize the magnitude of the April 2007 flood, a peak-flow frequency analysis was undertaken at 10 selected streamflow-gaging stations in Massachusetts to determine the magnitude of flood flows at 5-, 10-, 25-, 50-, 100-, 200-, and 500-year return intervals. The magnitudes of flood flows at various return intervals were determined from the logarithms of the annual peaks fit to a Pearson Type III probability distribution. Analysis included augmenting the station record with longer-term records from one or more nearby stations to provide a common period of comparison that includes notable floods in 1936, 1938, and 1955. The April 2007 peak flow was among the highest recorded or estimated since 1936, often ranking between the 3rd and 5th highest peak for that period. In general, the peak-flow frequency analysis indicates the April 2007 peak flow has an estimated return interval between 25 and 50 years; at stations in the northeastern and central areas of the state, the storm was less severe, resulting in flows with return intervals of about 5 and 10 years, respectively. At Merrimack River at Lowell, the April 2007 peak flow approached a 100-year return interval that was computed from post-flood-control records and the 1936 and 1938 peak flows adjusted for flood control. In general, the magnitude of flood flow for a given return interval computed from the streamflow-gaging station period of record was greater than those used to calculate flood profiles in various community flood-insurance studies. In addition, the magnitude of the updated flood flow and the current (2008) stage-discharge relation at a given streamflow-gaging station often produced a flood stage that was considerably different from the flood stage indicated in the flood-insurance study flood profile at that station. Equations for estimating the flow magnitudes for 5-, 10-, 25-, 50-, 100-, 200-, and 500-year floods were developed from the relation of the magnitude of flood flows to drainage area calculated for the six streamflow-gaging stations with the longest unaltered records. These equations produced a more conservative estimate of flood flows (higher discharges) than the existing regional equations for estimating flood flows at ungaged rivers in Massachusetts. Large differences in the magnitude of flood flows for various return intervals determined in this study compared with results from existing regional equations and flood insurance studies indicate a need for updating regional analyses and equations for estimating the expected magnitude of flood flows in Massachusetts.
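The final step described above, relating flood magnitude to drainage area across the six long-record stations, is a log-log regression of the form Q = aA^b; a Python sketch with hypothetical numbers (not the report's data) follows:

    import numpy as np

    # Hypothetical (drainage area in mi^2, 100-year peak flow in ft^3/s) pairs
    area = np.array([25.0, 60.0, 110.0, 240.0, 480.0, 1050.0])
    q100 = np.array([1800.0, 3500.0, 5600.0, 10200.0, 17800.0, 33000.0])

    b, log_a = np.polyfit(np.log10(area), np.log10(q100), 1)   # slope, intercept
    a = 10.0 ** log_a
    print(f"Q100 ~= {a:.0f} * A^{b:.2f}")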
Evaluation of levee setbacks for flood-loss reduction, Middle Mississippi River, USA
NASA Astrophysics Data System (ADS)
Dierauer, Jennifer; Pinter, Nicholas; Remo, Jonathan W. F.
2012-07-01
One-dimensional hydraulic modeling and flood-loss modeling were used to test the effectiveness of levee setbacks for flood-loss reduction along the Middle Mississippi River (MMR). Four levee scenarios were assessed: (1) the present-day levee configuration, (2) a 1000 m levee setback, (3) a 1500 m levee setback, and (4) an optimized setback configuration. Flood losses were estimated using FEMA's Hazus-MH (Hazards US Multi-Hazard) loss-estimation software on a structure-by-structure basis for a range of floods from the 2- to the 500-year events. These flood-loss estimates were combined with a levee-reliability model to calculate probability-weighted damage estimates. In the simplest case, the levee setback scenarios tested here reduced flood losses compared to current conditions for large, infrequent flooding events but increased flood losses for smaller, more frequent flood events. These increases occurred because levee protection was removed for some of the existing structures. When combined with buyouts of unprotected structures, levee setbacks reduced flood losses for all recurrence intervals. The "optimized" levee setback scenario, involving a levee configuration manually planned to protect existing high-value infrastructure, reduced damages with or without buyouts. This research shows that levee setbacks in combination with buyouts are an economically viable approach for flood-risk reduction along the study reach and likely elsewhere where levees are widely employed for flood control. Designing a levee setback around existing high-value infrastructure can maximize the benefit of the setback while simultaneously minimizing the costs. The optimized levee setback scenario analyzed here produced payback periods (costs divided by benefits) of less than 12 years. With many aging levees failing current inspections across the US, and flood losses spiraling up over time, levee setbacks are a viable solution for reducing flood exposure and flood levels.
Techniques for estimating flood hydrographs for ungaged urban watersheds
Stricker, V.A.; Sauer, V.B.
1984-01-01
The Clark Method, modified slightly, was used to develop a synthetic, dimensionless hydrograph that can be used to estimate flood hydrographs for ungaged urban watersheds. Application of the technique results in a typical (average) flood hydrograph for a given peak discharge. The inputs necessary to apply the technique are an estimate of basin lagtime and the recurrence-interval peak discharge. Equations for this purpose were obtained from a recent nationwide study on flood frequency in urban watersheds. A regression equation was developed that relates flood volumes to drainage area size, basin lagtime, and peak discharge. This equation is useful where storage of floodwater may be a part of the design for flood prevention. (USGS)
Flood Scenario Simulation and Disaster Estimation of Ba-Ma Creek Watershed in Nantou County, Taiwan
NASA Astrophysics Data System (ADS)
Peng, S. H.; Hsu, Y. K.
2018-04-01
The present study proposes several scenario simulations of flood disaster according to the historical flood event and planning requirements in the Ba-Ma Creek Watershed, located in Nantou County, Taiwan. The simulations were made using FLO-2D, a numerical model that computes the velocity and depth of flood flow over two-dimensional terrain. The calculated data were then utilized to estimate the possible damage incurred by the flood disaster. The results thus obtained can serve as references for disaster prevention. Moreover, the simulated results could be employed for flood-disaster estimation using the method suggested by the Water Resources Agency of Taiwan. Finally, conclusions and perspectives are presented.
Design Life Level: Quantifying risk in a changing climate
NASA Astrophysics Data System (ADS)
Rootzén, Holger; Katz, Richard W.
2013-09-01
In the past, the concepts of return levels and return periods have been standard and important tools for engineering design. However, these concepts are based on the assumption of a stationary climate and do not apply to a changing climate, whether local or global. In this paper, we propose a refined concept, Design Life Level, which quantifies risk in a nonstationary climate and can serve as the basis for communication. In current practice, typical hydrologic risk management focuses on a standard (e.g., in terms of a high quantile corresponding to the specified probability of failure for a single year). Nevertheless, the basic information needed for engineering design should consist of (i) the design life period (e.g., the next 50 years, say 2015-2064); and (ii) the probability (e.g., 5% chance) of a hazardous event (typically, in the form of the hydrologic variable exceeding a high level) occurring during the design life period. Capturing both of these design characteristics, the Design Life Level is defined as an upper quantile (e.g., 5%) of the distribution of the maximum value of the hydrologic variable (e.g., water level) over the design life period. We relate this concept and variants of it to existing literature and illustrate how they, and some useful complementary plots, may be computed and used. One practically important consideration concerns quantifying the statistical uncertainty in estimating a high quantile under nonstationarity.
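Numerically, the Design Life Level is the level whose chance of being exceeded at least once during the design life equals the chosen risk; a minimal Python sketch, assuming a Gumbel annual-maximum model with a hypothetical linear trend in location and independence between years, follows:

    import numpy as np
    from scipy import stats

    years = np.arange(2015, 2065)                  # 50-year design life
    loc = 100.0 + 0.3 * (years - years[0])         # hypothetical upward trend in location
    scale = 25.0

    levels = np.linspace(100.0, 600.0, 2001)
    # P(design-life maximum <= z) = product over years of the annual CDF at z
    cdf_max = np.prod(stats.gumbel_r.cdf(levels[:, None], loc=loc, scale=scale), axis=1)

    alpha = 0.05                                   # 5% chance of exceedance in the design life
    dll = np.interp(1.0 - alpha, cdf_max, levels)  # Design Life Level
    print(dll)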
Evolution of precipitation extremes in two large ensembles of climate simulations
NASA Astrophysics Data System (ADS)
Martel, Jean-Luc; Mailhot, Alain; Talbot, Guillaume; Brissette, François; Ludwig, Ralf; Frigon, Anne; Leduc, Martin; Turcotte, Richard
2017-04-01
Recent studies project significant changes in the future distribution of precipitation extremes due to global warming. It is likely that extreme precipitation intensity will increase in a future climate and that extreme events will be more frequent. In this work, annual maximum daily precipitation series from the Canadian Earth System Model (CanESM2) 50-member large ensemble (spatial resolution of 2.8° × 2.8°) and the Community Earth System Model (CESM1) 40-member large ensemble (spatial resolution of 1° × 1°) are used to investigate extreme precipitation over the historical (1980-2010) and future (2070-2100) periods. The use of these ensembles results in 1,500 (30 years × 50 members) and 1,200 (30 years × 40 members) simulated years, respectively, over both the historical and future periods. These large datasets allow the computation of empirical daily extreme precipitation quantiles for large return periods. Using the CanESM2 and CESM1 large ensembles, extreme daily precipitation with return periods ranging from 2 to 100 years is computed in the historical and future periods to assess the impact of climate change. Results indicate that daily precipitation extremes generally increase in the future over most land grid points and that these increases will also affect the 100-year extreme daily precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for service levels of water infrastructures and public safety. Estimated increases associated with very extreme precipitation events (e.g. 100-year) will drastically change the likelihood of flooding and its extent in a future climate. These results, although interesting, need to be extended to sub-daily durations relevant for urban flooding protection and urban infrastructure design (e.g. sewer networks, culverts). Models and simulations at finer spatial and temporal resolution are therefore needed.
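With on the order of 1,500 pooled member-years per period, return levels up to roughly the 100-year event can be read off empirically rather than fitted; a Python sketch on stand-in data, using Weibull plotting positions (one common choice), follows:

    import numpy as np

    rng = np.random.default_rng(2)
    annmax = rng.gumbel(loc=40.0, scale=12.0, size=1500)   # stand-in pooled annual maxima

    x = np.sort(annmax)[::-1]                  # largest first
    rank = np.arange(1, x.size + 1)
    T = (x.size + 1) / rank                    # Weibull plotting-position return period

    for target in (2, 20, 100):
        print(target, round(x[np.argmin(np.abs(T - target))], 1))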
Using cost-benefit concepts in design floods improves communication of uncertainty
NASA Astrophysics Data System (ADS)
Ganora, Daniele; Botto, Anna; Laio, Francesco; Claps, Pierluigi
2017-04-01
Flood frequency analysis, i.e. the study of the relationship between the magnitude and the rarity of high flows in a river, is the usual procedure adopted to assess flood hazard, preliminary to the planning and design of flood protection measures. It is grounded in fitting a probability distribution to the peak discharge values recorded at gauging stations, so the final estimates over a region are affected by uncertainty due to limited sample availability and to the possible alternatives in terms of the probabilistic model and the parameter-estimation methods used. In the last decade, the scientific community addressed this issue by developing a number of methods to quantify such uncertainty components. Usually, uncertainty is visually represented through confidence bands, which are easy to understand but have not been demonstrated to be useful for design purposes: they tend to disorient decision makers, as the design flood is no longer univocally defined, making the decision process undetermined. These considerations motivated the development of the uncertainty-compliant design flood estimator (UNCODE) procedure (Botto et al., 2014), which allows one to select meaningful flood design values accounting for the associated uncertainty by considering additional constraints based on cost-benefit criteria. The method provides an explicit multiplication factor that corrects the traditional (without-uncertainty) design flood estimates to incorporate the effects of uncertainty at the same safety level. Even though the UNCODE method was developed for design purposes, it can be a powerful and robust tool for clarifying the effects of uncertainty in statistical estimation. As the procedure produces increased design flood estimates, this outcome demonstrates how uncertainty leads to more expensive flood protection measures, or to insufficiency of current defenses. Moreover, the UNCODE approach can be used to assess the "value" of data, as the costs of flood prevention can be lowered by reducing uncertainty with longer observed flood records. As the multiplication factor is dimensionless, the examples of application provided show how this approach allows simple comparisons of the effects of uncertainty in different catchments, helping to build ranking procedures for planning purposes. REFERENCES: Botto, A., Ganora, D., Laio, F., and Claps, P.: Uncertainty compliant design flood estimation, Water Resources Research, 50, doi:10.1002/2013WR014981, 2014.
Superquantile/CVaR Risk Measures: Second-Order Theory
2014-07-17
[Report documentation page; abstract garbled in source. Recoverable fragments describe a second-order version of quantile regression built on superquantiles, with a methodology for second-order superquantiles parallel to that of quantile regression. Keywords: superquantiles, conditional value-at-risk, second-order superquantiles, mixed superquantiles.]
NASA Astrophysics Data System (ADS)
Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel
2017-04-01
Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change, or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, and local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, which includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
Flood Damage and Loss Estimation for Iowa on Web-based Systems using HAZUS
NASA Astrophysics Data System (ADS)
Yildirim, E.; Sermet, M. Y.; Demir, I.
2016-12-01
The importance of decision support systems for flood emergency response and loss estimation increases with the social and economic impacts of floods. Several software systems are available to researchers and decision makers for estimating flood damage. HAZUS-MH is one of the most widely used desktop programs, developed by FEMA (Federal Emergency Management Agency), to estimate the economic loss and social impacts of disasters such as earthquakes, hurricanes, and flooding (riverine and coastal). HAZUS applies a loss-estimation methodology implemented through a geographic information system (GIS). HAZUS contains structural, demographic, and vehicle information across the United States. Thus, it allows decision makers to understand and predict possible casualties and flood damage by running flood simulations through the GIS application. However, it does not represent real-time conditions because it uses static data. To close this gap, this research presents an overview of a web-based infrastructure coupling HAZUS with real-time data provided by IFIS (Iowa Flood Information System). IFIS, developed by the Iowa Flood Center, is a one-stop web platform for accessing community-based flood conditions, forecasts, visualizations, inundation maps, and flood-related data, information, and applications. Large volumes of real-time observational data from a variety of sensors and remote sensing resources (radars, rain gauges, stream sensors, etc.) and flood inundation models are staged in a user-friendly map environment that is accessible to the general public. By providing cross-sectional analyses between HAZUS-MH and IFIS datasets, emergency managers are able to evaluate flood damage during flood events more easily and in real time. By matching data from the HAZUS-MH census-tract layer with IFC gauges, the economic effects of flooding can be observed and evaluated by decision makers. The system will also provide visualization of the data using augmented reality for see-through displays. Emergency management experts can take advantage of this visualization mode to manage flood response activities in real time. The forecast system developed by the Iowa Flood Center will also be used to predict the probable damage of floods.
Consistency of extreme flood estimation approaches
NASA Astrophysics Data System (ADS)
Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf
2017-04-01
Estimates of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (the SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used to assess the potential strengths and weaknesses of each method, as well as to evaluate the consistency of these methods.
Rosa, Sarah N.; Oki, Delwyn S.
2010-01-01
Reliable estimates of the magnitude and frequency of floods are necessary for the safe and efficient design of roads, bridges, water-conveyance structures, and flood-control projects and for the management of flood plains and flood-prone areas. StreamStats provides a simple, fast, and reproducible method to define drainage-basin characteristics and estimate the frequency and magnitude of peak discharges in Hawaii's streams using recently developed regional regression equations. StreamStats allows the user to estimate the magnitude of floods for streams where data from stream-gaging stations do not exist. Existing estimates of the magnitude and frequency of peak discharges in Hawaii can be improved with continued operation of existing stream-gaging stations and installation of additional gaging stations for areas where limited stream-gaging data are available.
Ries, Kernell G. (compiler); with sections by Atkins, J.B.; Hummel, P.R.; Gray, Matthew J.; Dusenbury, R.; Jennings, M.E.; Kirby, W.H.; Riggs, H.C.; Sauer, V.B.; Thomas, W.O.
2007-01-01
The National Streamflow Statistics (NSS) Program is a computer program that should be useful to engineers, hydrologists, and others for planning, management, and design applications. NSS compiles all current U.S. Geological Survey (USGS) regional regression equations for estimating streamflow statistics at ungaged sites in an easy-to-use interface that operates on computers with Microsoft Windows operating systems. NSS expands on the functionality of the USGS National Flood Frequency Program, and replaces it. The regression equations included in NSS are used to transfer streamflow statistics from gaged to ungaged sites through the use of watershed and climatic characteristics as explanatory or predictor variables. Generally, the equations were developed on a statewide or metropolitan-area basis as part of cooperative study programs. Equations are available for estimating rural and urban flood-frequency statistics, such as the 100-year flood, for every state, for Puerto Rico, and for the island of Tutuila, American Samoa. Equations are available for estimating other statistics, such as the mean annual flow, monthly mean flows, flow-duration percentiles, and low-flow frequencies (such as the 7-day, 10-year low flow) for less than half of the states. All equations available for estimating streamflow statistics other than flood-frequency statistics assume rural (non-regulated, non-urbanized) conditions. The NSS output provides indicators of the accuracy of the estimated streamflow statistics. The indicators may include any combination of the standard error of estimate, the standard error of prediction, the equivalent years of record, or 90-percent prediction intervals, depending on what was provided by the authors of the equations. The program includes several other features that can be used only for flood-frequency estimation. These include the ability to generate flood-frequency plots and plots of typical flood hydrographs for selected recurrence intervals, estimates of the probable maximum flood, extrapolation of the 500-year flood when an equation for estimating it is not available, and weighting techniques to improve flood-frequency estimates for gaging stations and ungaged sites on gaged streams. This report describes the regionalization techniques used to develop the equations in NSS and provides guidance on the applicability and limitations of the techniques. The report also includes a user's manual and a summary of equations available for estimating basin lagtime, which is needed by the program to generate flood hydrographs. The NSS software and accompanying database, and the documentation for the regression equations included in NSS, are available on the Web at http://water.usgs.gov/software/.
Re-Evaluation of the 1921 Peak Discharge at Skagit River near Concrete, Washington
Mastin, M.C.
2007-01-01
The peak discharge record at the U.S. Geological Survey (USGS) gaging station at Skagit River near Concrete, Washington, is a key record that has come under intense scrutiny by the scientific and lay communities in the last 4 years. A peak discharge of 240,000 cubic feet per second for the flood on December 13, 1921, was determined in 1923 by USGS hydrologist James Stewart by means of a slope-area measurement. USGS then determined the peak discharges of three other large floods on the Skagit River (1897, 1909, and 1917) by extending the stage-discharge rating through the 1921 flood measurement. The 1921 estimate of peak discharge was recalculated by Flynn and Benson of the USGS after a channel-roughness verification was completed based on the 1949 flood on the Skagit River. The 1949 recalculation indicated that the peak discharge probably was 6.2 percent lower than Stewart's original estimate, but the USGS did not officially change the peak discharge from Stewart's estimate because it was not more than a 10-percent change (which is the USGS guideline for revising peak flows) and the estimate already had error bands of 15 percent. All these flood peaks are now being used by the U.S. Army Corps of Engineers to determine the 100-year flood discharge for the Skagit River Flood Study, so any method to confirm or improve the 1921 peak discharge estimate is warranted. During the last 4 years, two floods have occurred on the Skagit River (2003, 2006) that have enabled the USGS to collect additional data, do further analysis, and yet again re-evaluate the 1921 peak discharge estimate. Since 1949, an island/bar in the study reach has reforested itself. This has complicated the flow hydraulics and made the most recent recalculation of the 1921 flood, based on channel-roughness verification that used 2003 and 2006 flood data, less reliable. However, this recent recalculation did indicate that the original peak-discharge calculation by Stewart may be high, and it added to a body of evidence indicating that a revision of the 1921 peak discharge estimate is appropriate. The USGS has determined that a lower peak-discharge estimate (5.0 percent lower), similar to the 1949 estimate, is most appropriate based on (1) a recalculation of the 1921 flood using a channel-roughness verification from the 1949 flood data, (2) a recalculation of the 1921 flood using a channel-roughness verification from 2003 and 2006 flood data, and (3) a straight-line extension of the stage-discharge relation at the gage based on current-meter discharge measurements. Given the significance of the 1921 flood peak, revising the estimate is appropriate even though the change is less than the 10-percent guideline established by the USGS for revision. Revising the peak is warranted because all work subsequent to 1921 points to the 1921 peak being lower than originally published.
Flood Frequency Curves - Use of information on the likelihood of extreme floods
NASA Astrophysics Data System (ADS)
Faber, B.
2011-12-01
Investment in the infrastructure that reduces flood risk for flood-prone communities must incorporate information on the magnitude and frequency of flooding in that area. Traditionally, that information has been a probability distribution of annual maximum streamflows developed from the historical gaged record at a stream site. Practice in the United States fits a log-Pearson Type III distribution to the annual maximum flows of an unimpaired streamflow record, using the method of moments to estimate the distribution parameters. The procedure assumes that annual peak-streamflow events are (1) independent, (2) identically distributed, and (3) form a representative sample of the overall probability distribution. Each of these assumptions can be challenged. We rarely have enough data to form a representative sample, and therefore must compute and display the uncertainty in the estimated flood distribution. But is there a wet/dry cycle that makes precipitation less than independent between successive years? Are the peak flows caused by different types of events from different statistical populations? How does a watershed or climate that changes over time (non-stationarity) affect the probability distribution of floods? Potential approaches to avoiding these assumptions range from estimating trend and shift and removing them from early data (thus forming a homogeneous data set) to methods that estimate statistical parameters that vary with time. A further issue in estimating a probability distribution of flood magnitude (the flood frequency curve) is whether a purely statistical approach can accurately capture the range and frequency of the floods that are of interest. A meteorologically based analysis produces "probable maximum precipitation" (PMP) and subsequently a "probable maximum flood" (PMF) that attempts to describe an upper bound on flood magnitude in a particular watershed. This analysis can help constrain the upper tail of the probability distribution, well beyond the range of gaged data or even historical or paleo-flood data, which can be very important in risk analyses performed for flood risk management and dam and levee safety studies.
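The log-Pearson Type III fitting procedure described above can be condensed to a few lines. The following sketch fits the distribution to log-transformed annual maxima by the method of moments and evaluates a T-year quantile; the synthetic record is a stand-in for real data, and details of official practice (e.g., skew weighting) are omitted.

```python
import numpy as np
from scipy import stats

def lp3_quantile(annual_peaks, return_period):
    """Log-Pearson Type III flood quantile by the method of moments.

    Moments (mean, standard deviation, skew) are computed from the log10
    of the annual maximum flows; the T-year quantile uses the
    standardized Pearson Type III frequency factor.
    """
    logs = np.log10(np.asarray(annual_peaks, dtype=float))
    m, s = logs.mean(), logs.std(ddof=1)
    g = stats.skew(logs, bias=False)
    # pearson3(skew) is standardized to zero mean and unit variance,
    # so its ppf is the frequency factor K(T, skew).
    k = stats.pearson3.ppf(1.0 - 1.0 / return_period, g)
    return 10.0 ** (m + k * s)

rng = np.random.default_rng(1)
peaks = rng.lognormal(mean=7.0, sigma=0.5, size=60)  # synthetic 60-yr record
print(round(lp3_quantile(peaks, 100)))
```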
Tortorelli, R.L.; Bergman, D.L.
1985-01-01
Statewide regression relations for Oklahoma were determined for estimating peak discharge of floods for selected recurrence intervals from 2 to 500 years. The independent variables required for estimating flood discharge for rural streams are contributing drainage area and mean annual precipitation. Main-channel slope, a variable used in previous reports, was found to contribute very little to the accuracy of the relations and was not used. The regression equations are applicable for watersheds with drainage areas less than 2,500 square miles that are not significantly affected by regulation from manmade works. These relations are presented in graphical form for easy application. Limitations on the use of the regression relations and the reliability of regression estimates for rural unregulated streams are discussed. Basin and climatic characteristics, log-Pearson Type III statistics and the flood-frequency relations for 226 gaging stations in Oklahoma and adjacent states are presented. Regression relations are investigated for estimating flood magnitude and frequency for watersheds affected by regulation from small FRS (floodwater retarding structures) built by the U.S. Soil Conservation Service in their watershed protection and flood prevention program. Gaging-station data from nine FRS regulated sites in Oklahoma and one FRS regulated site in Kansas are used. For sites regulated by FRS, an adjustment of the statewide rural regression relations can be used to estimate flood magnitude and frequency. The statewide regression equations are used by substituting the drainage area below the FRS, or drainage area that represents the percent of the basin unregulated, in the contributing drainage area parameter to obtain flood-frequency estimates. Flood-frequency curves and flow-duration curves are presented for five gaged sites to illustrate the effects of FRS regulation on peak discharge.
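Regression relations of the kind developed in the report take the form Q_T = a A^b P^c. The sketch below shows how such an equation is evaluated and how the report's FRS adjustment substitutes only the unregulated portion of the drainage area; the coefficients are placeholders, not the published Oklahoma values.

```python
def peak_discharge_regression(drainage_area_mi2, mean_annual_precip_in,
                              a=0.12, b=0.65, c=1.8):
    """Evaluate a two-variable regional flood regression equation.

    Form matches the report's rural relations, Q_T = a * A^b * P^c, with
    contributing drainage area A (mi^2) and mean annual precipitation P
    (inches). The coefficients a, b, c here are placeholders, not the
    published values.
    """
    return a * drainage_area_mi2 ** b * mean_annual_precip_in ** c

# FRS adjustment described in the report: substitute only the unregulated
# portion of the drainage area in the contributing-area parameter.
print(round(peak_discharge_regression(250.0, 36.0)))   # unregulated basin
print(round(peak_discharge_regression(100.0, 36.0)))   # FRS-regulated basin
```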
A framework for global river flood risk assessments
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.
2013-05-01
There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~ 1 km2 using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and, importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood-routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied to Bangladesh as a first case-study area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. Validation of modelled damage estimates was performed using observed damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
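One way to turn return-period impact estimates into the framework's headline risk indicator, annual expected damage, is numerical integration of damage over annual exceedance probability, as in this illustrative sketch (all values invented):

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Annual expected damage from a damage-probability curve.

    Sketch of the risk-indicator step: impact estimates for a set of
    return periods are integrated over annual exceedance probability
    using the trapezoidal rule.
    """
    p = 1.0 / np.asarray(return_periods, dtype=float)  # exceedance probability
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                              # sort ascending in p
    p, d = p[order], d[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# Illustrative damages (million USD) for 2- to 1000-year floods.
print(expected_annual_damage([1000, 100, 50, 10, 5, 2],
                             [900.0, 600.0, 400.0, 120.0, 40.0, 0.0]))
```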
Flood type specific construction of synthetic design hydrographs
NASA Astrophysics Data System (ADS)
Brunner, Manuela I.; Viviroli, Daniel; Sikorska, Anna E.; Vannier, Olivier; Favre, Anne-Catherine; Seibert, Jan
2017-02-01
Accurate estimates of flood peaks, corresponding volumes, and hydrographs are required to design safe and cost-effective hydraulic structures. In this paper, we propose a statistical approach for the estimation of the design variables peak and volume by constructing synthetic design hydrographs for different flood types such as flash-floods, short-rain floods, long-rain floods, and rain-on-snow floods. Our approach relies on the fitting of probability density functions to observed flood hydrographs of a certain flood type and accounts for the dependence between peak discharge and flood volume. It makes use of the statistical information contained in the data and retains the process information of the flood type. The method was tested based on data from 39 mesoscale catchments in Switzerland and provides catchment specific and flood type specific synthetic design hydrographs for all of these catchments. We demonstrate that flood type specific synthetic design hydrographs are meaningful in flood-risk management when combined with knowledge on the seasonality and the frequency of different flood types.
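One simple realization of the paper's central idea, using a probability density function as the dimensionless hydrograph shape and scaling it to a design peak and volume, is sketched below; this illustrates the concept only and is not the authors' exact method.

```python
import numpy as np
from scipy import stats
from scipy.special import gamma as gamma_fn

def synthetic_design_hydrograph(q_peak, volume, shape=4.0, n=200):
    """Build a synthetic design hydrograph from a gamma density shape.

    The gamma pdf (shape > 1) serves as the hydrograph shape; scaling by
    the design volume makes the hydrograph integrate to that volume, and
    the gamma scale parameter is solved in closed form from the pdf's
    modal value so the maximum ordinate equals the design peak.
    """
    a = shape
    pdf_max_unit_scale = (a - 1) ** (a - 1) * np.exp(-(a - 1)) / gamma_fn(a)
    scale = volume * pdf_max_unit_scale / q_peak   # makes max(V*f) == q_peak
    t = np.linspace(0.0, stats.gamma.ppf(0.999, a, scale=scale), n)
    q = volume * stats.gamma.pdf(t, a, scale=scale)
    return t, q  # time in seconds, discharge in m^3/s

t, q = synthetic_design_hydrograph(q_peak=350.0, volume=5.0e6)
print(round(float(q.max()), 1))  # ~350.0
```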
Quantile regression analyses of associated factors for body mass index in Korean adolescents.
Kim, T H; Lee, E K; Han, E
2015-05-01
This study examined the influence of home and school environments and individual health-risk behaviours on body-weight outcomes in Korean adolescents. This was a cross-sectional observational study. Quantile regression models were used to explore heterogeneity in the association of specific factors with body mass index (BMI) over the entire conditional BMI distribution. A nationally representative web-based survey of youths was used. A paternal education level of college or above was associated with lower BMI for girls, whereas a maternal education level of college or above was associated with higher BMI for boys; for both, the magnitude of the association became larger at the upper quantiles of the conditional BMI distribution. Girls with good family economic status were more likely to have higher BMIs than those with average family economic status, particularly at the upper quantiles of the conditional BMI distribution. Attending a co-ed school was associated with lower BMI for both genders, with a larger association at the upper quantiles. Substantial screen time for TV watching, video games, or internet surfing was associated with higher BMI, with a larger association at the upper quantiles for both girls and boys. Preventive dental care was negatively associated with BMI, whereas suicide consideration was positively associated with the BMIs of both genders, with a larger association at higher quantiles. These findings suggest that interventions aimed at behavioural changes and positive parental roles are needed to effectively address high adolescent BMI. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
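For readers unfamiliar with the technique, the following sketch shows how conditional-quantile effects of the kind reported above are estimated with statsmodels; the data are synthetic stand-ins for the survey, constructed so a covariate matters more in the upper tail.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey: daily screen hours influence the
# upper tail of the conditional BMI distribution more than the median.
rng = np.random.default_rng(0)
n = 2000
screen = rng.uniform(0, 8, n)
noise = rng.gumbel(0, 1.5, n)                      # right-skewed, like BMI
bmi = 20 + 0.15 * screen + 0.25 * screen * (noise > 2) + noise
df = pd.DataFrame({"bmi": bmi, "screen": screen})

# Fit the same linear specification at several quantiles and compare slopes.
for q in (0.25, 0.50, 0.90):
    fit = smf.quantreg("bmi ~ screen", df).fit(q=q)
    print(f"tau={q:.2f}  screen coefficient={fit.params['screen']:.3f}")
```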
Regional regression of flood characteristics employing historical information
Tasker, Gary D.; Stedinger, J.R.
1987-01-01
Streamflow gauging networks provide hydrologic information for use in estimating the parameters of regional regression models. The regional regression models can be used to estimate flood statistics, such as the 100-year peak, at ungauged sites as functions of drainage-basin characteristics. A recent innovation in regional regression is the use of a generalized least squares (GLS) estimator that accounts for unequal station record lengths and sample cross-correlation among the flows. However, this technique does not account for historical flood information. A method is proposed here to adjust this generalized least squares estimator to account for possible information about historical floods available at some stations in a region. The historical information is assumed to be in the form of observations of all peaks above a threshold during a long period outside the systematic record period. A Monte Carlo simulation experiment was performed to compare the GLS estimator adjusted for historical floods with the unadjusted GLS estimator and the ordinary least squares estimator. Results indicate that using the GLS estimator adjusted for historical information significantly improves the regression model.
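A minimal numerical sketch of the GLS estimator is given below; the residual-covariance construction (model error plus record-length-dependent sampling error with cross-correlation) is illustrative, not Stedinger and Tasker's exact formulation.

```python
import numpy as np

def gls_regression(X, y, record_lengths, cross_corr=0.3, model_error=0.05):
    """Generalized least squares fit for a regional regression model.

    The residual covariance combines model error with sampling error that
    shrinks with station record length and reflects cross-correlation
    between sites; beta = (X' L^-1 X)^-1 X' L^-1 y.
    """
    n = len(y)
    se = 1.0 / np.sqrt(np.asarray(record_lengths, dtype=float))
    Lam = model_error * np.eye(n) + cross_corr * np.outer(se, se)
    Lam[np.diag_indices(n)] = model_error + se ** 2   # full sampling variance
    Li = np.linalg.inv(Lam)
    return np.linalg.solve(X.T @ Li @ X, X.T @ Li @ y)

rng = np.random.default_rng(2)
n = 40
log_area = rng.uniform(1, 3, n)                      # log10 drainage area
X = np.column_stack([np.ones(n), log_area])
y = 0.8 + 0.7 * log_area + rng.normal(0, 0.15, n)    # log10 of 100-yr peak
print(gls_regression(X, y, rng.integers(10, 80, n)))  # ~[0.8, 0.7]
```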
NASA Astrophysics Data System (ADS)
Longo, Elisa; Tito Aronica, Giuseppe; Di Baldassarre, Giuliano; Mukolwe, Micah
2015-04-01
Flooding is one of the most impactful natural hazards. Looking at the damage data from natural hazards in Europe collected in the International Disaster Database (EM-DAT), one can see a significant increase over the past four decades in both the frequency of floods and the associated economic damages. Similarly dramatic trends are found when analyzing other types of flood losses, such as the number of people affected by floods, made homeless, injured, or killed. To deal with this increase in flood risk, growing efforts are being made to promote integrated flood risk management: at the end of 2007, the European Community (EC) issued the Flood Directive (F.D.) 2007/60/EC. One of its major innovations is that the F.D. 2007/60/EC requires Member States to produce risk maps and then take appropriate measures to reduce the evaluated risk. The main goal of this research was to estimate flood damage using a computer code based on a recently developed method (KULTURisk, www.kulturisk.eu) and to compare the estimated damage with the observed one. The study area was the municipality of Eilenburg, which in 2002 was subjected to a destructive flood event. Flood damage maps were produced with the new procedure (KULTURisk) and the estimates were compared with observed data. This study showed the possibility of extending the lessons learned from the Eilenburg case study to other similar contexts. The outcomes of this test provide interesting insights into flood risk mapping, which are expected to contribute to raising awareness of flooding issues, to planning (structural and/or non-structural) flood risk reduction measures, and to supporting better land-use and urban planning.
ERIC Educational Resources Information Center
Guo, Jiin-Huarng; Luh, Wei-Ming
2008-01-01
This study proposes an approach for determining appropriate sample size for Welch's F test when unequal variances are expected. Given a certain maximum deviation in population means and using the quantile of F and t distributions, there is no need to specify a noncentrality parameter and it is easy to estimate the approximate sample size needed…
ERIC Educational Resources Information Center
Andrews, Rodney J.; Li, Jing; Lovenheim, Michael F.
2012-01-01
This paper uses administrative data on schooling and earnings from Texas to estimate the effect of college quality on the distribution of earnings. We proxy college quality using the college sector from which students graduate and focus on identifying how graduating from UT-Austin, Texas A&M, or a community college affects the distribution of…
NASA Astrophysics Data System (ADS)
Ono, T.; Takahashi, T.
2017-12-01
Non-structural mitigation measures, such as flood hazard maps based on estimated inundation areas, have become more important because heavy rains exceeding the design rainfall have occurred frequently in recent years. However, conventional methods may underestimate the inundation area because the assumed locations of dike breach in river flood analysis are limited to cases exceeding the high-water level. The objective of this study is to consider the uncertainty in the estimated inundation area arising from differences in the location of dike breach in river flood analysis. This study proposes multiple flood scenarios that automatically set multiple locations of dike breach in the river flood analysis, on the premise that the location of a dike breach cannot be predicted correctly. The proposed method uses an interval of dike breach, that is, the distance between adjacent breach locations: breach locations are set at every interval along the dike. The 2D shallow water equations were adopted as the governing equations of the river flood analysis, solved with a leap-frog scheme on a staggered grid. The river flood analysis was verified against the 2015 Kinugawa River flooding, and the proposed multiple flood scenarios were applied to the Akutagawa River in Takatsuki City. The computations for the Akutagawa River show, by comparing the maximum inundation depths computed for adjacent breach locations, that the proposed method prevents underestimation of the inundated area. Further, analyses of the spatial distribution of inundation class and of the maximum inundation depth at each measurement point identified the optimum interval of dike breach, which evaluates the maximum inundation area using the minimum number of assumed breach locations. In brief, this study found the optimum interval of dike breach for the Akutagawa River, enabling the maximum inundation area to be estimated efficiently and accurately. River flood analysis using the proposed method will contribute to mitigating flood disasters by improving the accuracy of the estimated inundation area.
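For orientation, the sketch below shows the staggered-grid layout typical of such solvers on a 1-D linearized shallow-water problem; the study's model is 2-D, nonlinear, and leap-frog in time, so this is only a reduced illustration of the grid staggering.

```python
import numpy as np

# 1-D linearized shallow-water sketch on a staggered grid: depths h live
# at cell centers, velocities u at cell faces (walls at both ends).
g, H = 9.81, 2.0                 # gravity (m/s^2), mean depth (m)
nx, dx = 200, 50.0               # cells, grid spacing (m)
dt = 0.5 * dx / np.sqrt(g * H)   # CFL-limited time step (s)
h = np.exp(-((np.arange(nx) - 100) * dx / 500.0) ** 2)  # initial mound (m)
u = np.zeros(nx + 1)             # face velocities (m/s)

for _ in range(400):
    # update face velocities from the surface gradient ...
    u[1:-1] -= g * dt / dx * (h[1:] - h[:-1])
    # ... then cell depths from the face-velocity divergence
    h -= H * dt / dx * (u[1:] - u[:-1])

print(round(float(h.max()), 3))
```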
Ries, Kernell G.; Crouse, Michele Y.
2002-01-01
For many years, the U.S. Geological Survey (USGS) has been developing regional regression equations for estimating flood magnitude and frequency at ungaged sites. These regression equations are used to transfer flood characteristics from gaged to ungaged sites through the use of watershed and climatic characteristics as explanatory or predictor variables. Generally, these equations have been developed on a statewide or metropolitan-area basis as part of cooperative study programs with specific State Departments of Transportation. In 1994, the USGS released a computer program titled the National Flood Frequency Program (NFF), which compiled all available USGS regression equations for estimating the magnitude and frequency of floods in the United States and Puerto Rico. NFF was developed in cooperation with the Federal Highway Administration and the Federal Emergency Management Agency. Since the initial release of NFF, the USGS has produced new equations for many areas of the Nation. A new version of NFF has been developed that incorporates these new equations and provides additional functionality and ease of use. NFF version 3 provides regression-equation estimates of flood-peak discharges for unregulated rural and urban watersheds, flood-frequency plots, and plots of typical flood hydrographs for selected recurrence intervals. The program also provides weighting techniques to improve estimates of flood-peak discharges for gaging stations and ungaged sites. The information provided by NFF should be useful to engineers and hydrologists for planning and design applications. This report describes the flood-regionalization techniques used in NFF and provides guidance on the applicability and limitations of the techniques. The NFF software and the documentation for the regression equations included in NFF are available at http://water.usgs.gov/software/nff.html.
Summary of U.S. Geological Survey reports documenting flood profiles of streams in Iowa, 1963-2012
Eash, David A.
2014-01-01
This report is part of an ongoing program that is publishing flood profiles of streams in Iowa. The program is managed by the U.S. Geological Survey in cooperation with the Iowa Department of Transportation and the Iowa Highway Research Board (Project HR-140). Information from flood profiles is used by engineers to analyze and design bridges, culverts, and roadways. This report summarizes 47 U.S. Geological Survey flood-profile reports that were published for streams in Iowa during a 50-year period from 1963 to 2012. Flood events profiled in the reports range from 1903 to 2010. Streams in Iowa that have been selected for the preparation of flood-profile reports typically have drainage areas of 100 square miles or greater, and the documented flood events have annual exceedance probabilities of less than 2 to 4 percent. This report summarizes flood-profile measurements, changes in flood-profile report content throughout the years, streams that were profiled in the reports, the occurrence of flood events profiled, and annual exceedance-probability estimates of observed flood events. To develop flood profiles for selected flood events for selected stream reaches, the U.S. Geological Survey measured high-water marks and river miles at selected locations. A total of 94 stream reaches have been profiled in U.S. Geological Survey flood-profile reports. Three rivers in Iowa have been profiled along the same stream reach for five different flood events and six rivers in Iowa have been profiled along the same stream reach for four different flood events. Floods were profiled for June flood events for 18 different years, followed by July flood events for 13 years, May flood events for 11 years, and April flood events for 9 years. Most of the flood-profile reports include estimates of annual exceedance probabilities of observed flood events at streamgages located along profiled stream reaches. Comparisons of 179 historic and updated annual exceedance-probability estimates indicate few differences that are considered substantial between the historic and updated estimates for the observed flood events. Overall, precise comparisons for 114 observed flood events indicate that updated annual exceedance probabilities have increased for most of the observed flood events compared to the historic annual exceedance probabilities. Multiple large flood events exceeding the 2-percent annual exceedance-probability discharge estimate occurred at 37 of 98 selected streamgages during 1960–2012. Five large flood events were recorded at two streamgages in Ames during 1990–2010 and four large flood events were recorded at four other streamgages during 1973–2010. Results of Kendall’s tau trend-analysis tests for 35 of 37 selected streamgages indicate that a statistically significant trend is not evident for the 1963–2012 period of record; nor is an overall clear positive or negative trend evident for the 37 streamgages.
An operational procedure for rapid flood risk assessment in Europe
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc
2017-07-01
The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities. Extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.
Rapid flood loss estimation for large scale floods in Germany
NASA Astrophysics Data System (ADS)
Schröter, Kai; Kreibich, Heidi; Merz, Bruno
2013-04-01
Rapid evaluations of flood events are needed for efficient response, both in emergency management and in financial appraisal. Beyond that, closely monitoring and documenting the formation and development of flood events and their impacts allows an improved understanding and in-depth analyses of the interplay between the meteorological, hydrological, hydraulic, and societal causes leading to flood damage. This contribution focuses on the development of a methodology for the rapid assessment of flood events. The initial focus is on predicting damage to residential buildings caused by large-scale floods in Germany. For this purpose an operational flood event analysis system is being developed. This system has basic spatial thematic data available and supports the capture of data about the current flood situation, including the retrieval of online gauge data and the integration of remote sensing data. Further, it provides functionality to evaluate the current flood situation, to assess the hazard extent and intensity, and to estimate the current flood impact using the flood loss estimation model FLEMOps+r. The operation of the flood event analysis system is demonstrated for the flood event of January 2011, with a focus on the Elbe/Saale region. On these grounds, further requirements and the potential for improving the information basis, for instance by including hydrological and/or hydraulic model results as well as information from social sensors, are discussed.
Estimating the number of motor units using random sums with independently thinned terms.
Müller, Samuel; Conforto, Adriana Bastos; Z'graggen, Werner J; Kaelin-Lang, Alain
2006-07-01
The problem of estimating the number of motor units N in a muscle is embedded in a general stochastic model using the notion of thinning from point-process theory. In the paper a new moment-type estimator for the number of motor units in a muscle is defined, derived using random sums with independently thinned terms. Asymptotic normality of the estimator is shown, and its practical value is demonstrated with bootstrap and approximate confidence intervals for a data set from a 31-year-old, healthy, right-handed female volunteer. Moreover, simulation results are presented, and Monte Carlo based quantiles, means, and variances are calculated for N ∈ {300, 600, 1000}.
Inman, Ernest J.
1997-01-01
Flood-frequency relations were computed for 28 urban stations for 2-, 25-, and 100-year recurrence-interval floods, and the computations were compared to the corresponding recurrence-interval floods computed from the estimating equations of a 1995 investigation. Two stations were excluded from further comparisons or analyses because neither station had a significant flood during the period of observed record. The comparisons, based on Student's t-test statistics at the 0.05 level of significance, indicate that the mean residuals of the 25- and 100-year floods were negatively biased by 26.2 percent and 31.6 percent, respectively, at the 26 stations. However, the mean residuals of the 2-year floods were 2.5 percent lower than the mean of the 2-year floods computed from the equations and were not significantly biased. The reason for this negative bias is that the period of observed record at the 26 stations was relatively dry. At 25 of the 26 stations, the two highest simulated peaks used to develop the estimating equations occurred many years before the observed record began. However, no attempt was made to adjust the estimating equations, because higher peaks could occur after the period of observed record and an adjustment to the equations would cause an underestimation of design floods.
Applications of ASFCM(Assessment System of Flood Control Measurement) in Typhoon Committee Members
NASA Astrophysics Data System (ADS)
Kim, C.
2013-12-01
Due to extreme weather conditions associated with global warming and the greenhouse effect, the risk of flood damage has increased, along with the scale of the damage. It therefore became necessary to update previous damage-assessment systems to reflect climate change and the changing scale of flood damage. In this regard, a comprehensive and integrated system is needed to evaluate the most effective flood control measures while reducing uncertainty in their socio-economic impacts. The Assessment System of Structural Flood Control Measures (ASFCM) was developed for determining investment priorities among flood control measures and for planning social infrastructure projects. ASFCM consists of three modules: 1) the initial setup and inputs module, 2) the flood and damage estimation module, and 3) the socio-economic analysis module. First, a database for flood damage estimation must be constructed, containing the initial input data on the estimation unit, property, historical flood damages, and the topographic and hydrological data of the area of application. Local characteristics must then be classified to construct the flood damage data: five classes (big city, medium-size city, small city, farming area, and mountain area) are distinguished using population density as the criterion. The next step is floodplain simulation with HEC-RAS, which was selected to simulate inundation. From the database and the damage estimation, it is possible to estimate the total damage (direct damage only), that is, the cost of restoring socio-economic activities to the safe level that existed before the flood. The last module provides the economic analysis index (benefit/cost ratio) using Multidimensional Flood Damage Analysis. Consequently, ASFCM provides a reference index for constructing flood control measures and for planning non-structural systems to reduce water-related damage, and it encourages flood control planners and managers to consider and apply socio-economic analysis results. ASFCM was applied in the Republic of Korea, Thailand, and the Philippines to review its efficiency and applicability. Figure 1. ASFCM application (An-yang Stream, Republic of Korea).
Brennan, Angela K.; Cross, Paul C.; Creely, Scott
2015-01-01
Synthesis and applications. Our analysis of elk group size distributions using quantile regression suggests that private land, irrigation, open habitat, elk density and wolf abundance can affect large elk group sizes. Thus, to manage larger groups by removal or dispersal of individuals, we recommend incentivizing hunting on private land (particularly if irrigated) during the regular and late hunting seasons, promoting tolerance of wolves on private land (if elk aggregate in these areas to avoid wolves) and creating more winter range and varied habitats. Relationships to the variables of interest also differed by quantile, highlighting the importance of using quantile regression to examine response variables more completely to uncover relationships important to conservation and management.
NASA Astrophysics Data System (ADS)
Pal, Debdatta; Mitra, Subrata Kumar
2018-01-01
This study used a quantile autoregressive distributed lag (QARDL) model to capture the asymmetric impact of rainfall on food production in India. The coefficient on rainfall in the QARDL increased up to the 75th quantile and decreased thereafter, though it remained positive. Another interesting finding is that at the 90th quantile and above, the coefficient on rainfall, though positive, was not statistically significant; the benefit of high rainfall for crop production is therefore inconclusive. In contrast, the impact of the other determinants, such as fertilizer and pesticide consumption, was fairly uniform over the whole distribution of food grain production.
An assessment of temporal effect on extreme rainfall estimates
NASA Astrophysics Data System (ADS)
Das, Samiran; Zhu, Dehua; Chi-Han, Cheng
2018-06-01
This study assesses temporal behaviour, in terms of the inter-decadal variability of extreme daily rainfall of a stated return period relevant to hydrologic risk analysis, using a novel regional parametric approach. The assessment is based on the annual maximum daily rainfall series of 180 meteorological stations in the Yangtze River Basin over a 50-year period (1961-2010). The analysis reveals that although quantile estimates tend to be higher when computed from data of the 1990s, the effect is not substantial enough to justify excluding the data of any decade from the extreme rainfall estimation process for hydrologic risk analysis.
A minimum distance estimation approach to the two-sample location-scale problem.
Zhang, Zhiyi; Yu, Qiqing
2002-09-01
As reported by Kalbfleisch and Prentice (1980), the generalized Wilcoxon test fails to detect a difference between the lifetime distributions of male and female mice that died from thymic leukemia. This failure results from the test's inability to detect a distributional difference when a location shift and a scale change exist simultaneously. In this article, we propose an estimator based on minimizing an average distance between two independent quantile processes under a location-scale model. Large-sample inference on the proposed estimator, with possible right-censorship, is discussed. The mouse leukemia data are used as an example for illustration purposes.
NASA Astrophysics Data System (ADS)
Paudel, Y.; Botzen, W. J. W.; Aerts, J. C. J. H.
2013-03-01
This study applies Bayesian inference to estimate flood risk for 53 dyke-ring areas in the Netherlands, focusing particularly on data scarcity and the extreme behaviour of catastrophe risk. The probability density curves of flood damage are estimated through Monte Carlo simulations. Based on these results, flood insurance premiums are estimated using two different practical methods that each account in different ways for an insurer's risk aversion and the dispersion rate of loss data. This study is of practical relevance because insurers have been considering the introduction of flood insurance in the Netherlands, where it is currently not generally available.
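The pricing step can be illustrated with a minimal Monte Carlo sketch: simulated annual losses yield a pure premium (the mean) and, as one simple way to reflect an insurer's risk aversion, a standard-deviation loading. The loss model below is a placeholder, not the study's dyke-ring damage distributions.

```python
import numpy as np

def premium_estimates(losses, risk_aversion=0.5):
    """Pure and loaded premiums from simulated flood losses.

    The pure premium is the expected annual loss; the loaded premium adds
    a standard-deviation loading, one simple premium principle among the
    several that practitioners use.
    """
    losses = np.asarray(losses, dtype=float)
    pure = losses.mean()
    loaded = pure + risk_aversion * losses.std(ddof=1)
    return pure, loaded

rng = np.random.default_rng(3)
occurs = rng.random(100_000) < 0.01                 # 1-in-100-year flood
damage = occurs * rng.lognormal(4.0, 1.0, 100_000)  # annual damage, M EUR
print(premium_estimates(damage))
```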
Climate, orography and scale controls on flood frequency in Triveneto (Italy)
NASA Astrophysics Data System (ADS)
Persiano, Simone; Castellarin, Attilio; Salinas, Jose Luis; Domeneghetti, Alessio; Brath, Armando
2016-05-01
The growing concern about the possible effects of climate change on flood-frequency regimes is leading authorities to review previously proposed reference procedures for design-flood estimation, such as national flood-frequency models. Our study focuses on Triveneto, a broad geographical region in north-eastern Italy. A reference procedure for design-flood estimation in Triveneto is available from the Italian CNR research project "VA.PI.", which considered Triveneto as a single homogeneous region and developed a regional model using annual maximum series (AMS) of peak discharges that were collected up to the 1980s by the former Italian Hydrometeorological Service. We consider a very detailed AMS database that we recently compiled for 76 catchments located in Triveneto. All 76 study catchments are characterized in terms of several geomorphologic and climatic descriptors. The objective of our study is threefold: (1) to inspect climatic and scale controls on the flood-frequency regime; (2) to check for possible changes in the flood-frequency regime by looking at changes in time of the regional L-moments of annual maximum floods; (3) to develop an updated reference procedure for design-flood estimation in Triveneto using a focused-pooling approach (i.e., Region of Influence, RoI). Our study leads to the following conclusions: (1) climatic and scale controls on the flood-frequency regime in Triveneto are similar to the controls recently found in Europe; (2) a single year characterized by extreme floods can have a remarkable influence on regional flood-frequency models and on analyses for detecting possible changes in the flood-frequency regime; (3) no significant change was detected in the flood-frequency regime, yet an update of the existing reference procedure for design-flood estimation is highly recommended, and we propose the RoI approach for properly representing climate and scale controls on flood frequency in Triveneto, which cannot be regarded as a single homogeneous region.
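A minimal sketch of the RoI idea, selecting a pooling group by similarity in standardized catchment descriptors, is given below; the pool size and the choice of descriptors are placeholders, not the procedure developed in the study.

```python
import numpy as np

def region_of_influence(descriptors, target_idx, pool_size=15):
    """Select a pooling group by similarity in catchment descriptors.

    Descriptors (e.g., area, elevation, a climate index) are standardized
    and the pooling group is the set of catchments closest to the target
    site in Euclidean distance.
    """
    X = np.asarray(descriptors, dtype=float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    d = np.linalg.norm(Z - Z[target_idx], axis=1)
    return np.argsort(d)[1:pool_size + 1]      # skip the target site itself

rng = np.random.default_rng(4)
desc = rng.normal(size=(76, 3))                # 76 catchments, 3 descriptors
print(region_of_influence(desc, target_idx=0))
```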
A statistical approach to evaluate flood risk at the regional level: an application to Italy
NASA Astrophysics Data System (ADS)
Rossi, Mauro; Marchesini, Ivan; Salvati, Paola; Donnini, Marco; Guzzetti, Fausto; Sterlacchini, Simone; Zazzeri, Marco; Bonazzi, Alessandro; Carlesi, Andrea
2016-04-01
Floods are frequent and widespread in Italy, causing multiple fatalities and extensive damage to public and private structures every year. A prerequisite for the development of mitigation schemes, including financial instruments such as insurance, is the ability to quantify their costs, starting from the estimation of the underlying flood hazard. However, comprehensive and coherent information on flood-prone areas, and estimates of the frequency and intensity of flood events, are often not available at scales appropriate for risk pooling and diversification. In Italy, the River Basin Hydrogeological Plans (PAI), prepared by basin administrations, are the basic descriptive, regulatory, technical, and operational tools for environmental planning in flood-prone areas. Nevertheless, such plans do not cover the entire Italian territory, having significant gaps along the minor hydrographic network and in ungauged basins. Several process-based modelling approaches have been used by different basin administrations for flood hazard assessment, resulting in an inhomogeneous hazard zonation of the territory. As a result, flood hazard assessments and expected damage estimates are not always coherent across the different Italian basin administrations. To overcome these limitations, we propose a simplified multivariate statistical approach to regional flood hazard zonation coupled with a flood impact model. This modelling approach has been applied in different Italian basin administrations, allowing a preliminary but coherent and comparable estimation of flood hazard and its impact. Model performance is evaluated by comparing the predicted flood-prone areas with the corresponding PAI zonation. The proposed approach will provide standardized information (following the EU Floods Directive specifications) on flood risk at a regional level, which can in turn be more readily applied to assess the economic impacts of floods. Furthermore, under the assumption of an appropriate statistical characterization of flood risk, the proposed procedure could be applied straightforwardly outside the national borders, particularly in areas with similar geo-environmental settings.
Challenges estimating the return period of extreme floods for reinsurance applications
NASA Astrophysics Data System (ADS)
Raven, Emma; Busby, Kathryn; Liu, Ye
2013-04-01
Mapping and modelling extreme natural events is fundamental within the insurance and reinsurance industry for assessing risk. For example, insurers might use a 1 in 100-year flood hazard map to set the annual premium of a property, whilst a reinsurer might assess the national-scale loss associated with the 1 in 200-year return period for capital and regulatory requirements. Using examples from a range of international flood projects, we focus on exploring how to define what the n-year flood looks like for predictive uses in re/insurance applications, whilst considering challenges posed by short historical flow records and the spatial and temporal complexities of flood. First, we shall explore the use of extreme value theory (EVT) statistics for extrapolating data beyond the range of observations in a marginal analysis. In particular, we discuss how to estimate the return period of historical flood events and explore the impact that a range of statistical decisions have on these estimates. Decisions include: (1) selecting which distribution type to apply (e.g. generalised Pareto distribution (GPD) vs. generalised extreme value distribution (GEV)); (2) if the former, the choice of the threshold above which the GPD is fitted to the data; and (3) the necessity to perform a cluster analysis to group flow peaks so that they temporally represent individual flood events. Second, we summarise a specialised multivariate extreme value model, which combines the marginal analysis above with dependence modelling to generate industry-standard event sets containing thousands of simulated, equi-probable floods across a region/country. These events represent the typical range of anticipated flooding across a region and can be used to estimate the largest or most widespread events that are expected to occur. Finally, we summarise how a reinsurance catastrophe model combines the event set with detailed flood hazard maps to estimate the financial cost of floods, both for the full event set and for individual extreme events. Since the predicted loss estimates, typically in the form of a curve plotting return period against modelled loss, are used in the pricing of reinsurance, we demonstrate the importance of the estimated return period and of understanding the uncertainties associated with it.
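The marginal EVT analysis described above can be sketched in a few lines: fit a GPD to peaks over a threshold and convert the fit into a T-year return level. Declustering, which the abstract stresses, is omitted for brevity, and the data are synthetic.

```python
import numpy as np
from scipy import stats

def pot_return_level(flows, threshold, years, return_period):
    """Return level from a peaks-over-threshold GPD fit.

    A generalized Pareto distribution is fitted to exceedances of the
    chosen threshold; the T-year level follows from the fitted
    shape/scale and the annual exceedance rate:
    x_T = u + (sigma/xi) * ((rate*T)^xi - 1).
    """
    exc = flows[flows > threshold] - threshold
    shape, _, scale = stats.genpareto.fit(exc, floc=0.0)
    rate = len(exc) / years                 # exceedances per year
    m = rate * return_period                # expected exceedances in T years
    return threshold + scale / shape * (m ** shape - 1.0)

rng = np.random.default_rng(5)
daily = rng.gumbel(50.0, 15.0, 365 * 40)    # 40 years of synthetic daily flow
print(round(pot_return_level(daily, np.quantile(daily, 0.99), 40, 200), 1))
```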
Flood area and damage estimation in Zhejiang, China.
Liu, Renyi; Liu, Nan
2002-09-01
A GIS-based method to estimate flood area and damage is presented in this paper. It is oriented to developing countries like China, where labor for GIS data collection is readily available but tools such as HEC-GeoRAS might not be, and where local authorities are often not predisposed to pay for commercial GIS platforms. To calculate the flood area, two cases, non-source flooding and source flooding, are distinguished, and a seed-spread algorithm suitable for source flooding is described. The flood damage estimate is calculated in raster format by overlaying the flood area with thematic maps and relating this to other socioeconomic data. Several measures used to improve geometric accuracy and computing efficiency are presented. Management issues related to the application of this method are also discussed, including the cost-effectiveness of the approximate method in practice and the complementarity of the two technical approaches (self-programming and adopting commercial GIS software). The applications show that this approach has practical significance for flood fighting and control in developing countries like China.
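The seed-spread idea for source flooding can be illustrated with a breadth-first search over a DEM raster, as in this minimal sketch; the paper's algorithm may differ in details such as connectivity and spread rules.

```python
from collections import deque

def seed_spread_flood(dem, seed, water_level):
    """Seed-spread (source-flooding) inundation on a DEM raster.

    Starting from a seed cell, flooding spreads to 4-connected neighbours
    whose ground elevation lies below the water level, so low-lying areas
    that are not connected to the source stay dry.
    """
    rows, cols = len(dem), len(dem[0])
    flooded = [[False] * cols for _ in range(rows)]
    queue = deque([seed])
    flooded[seed[0]][seed[1]] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < rows and 0 <= cc < cols
                    and not flooded[rr][cc] and dem[rr][cc] < water_level):
                flooded[rr][cc] = True
                queue.append((rr, cc))
    return flooded

# The pit at (0, 3) stays dry: below water level but not connected.
dem = [[1.0, 2.5, 9.0, 0.5],
       [1.2, 2.0, 9.0, 9.0],
       [1.1, 1.8, 2.2, 2.4]]
print(seed_spread_flood(dem, seed=(0, 0), water_level=3.0))
```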
Benchmarking an operational procedure for rapid flood mapping and risk assessment in Europe
NASA Astrophysics Data System (ADS)
Dottori, Francesco; Salamon, Peter; Kalas, Milan; Bianchi, Alessandra; Feyen, Luc
2016-04-01
The development of real-time methods for rapid flood mapping and risk assessment is crucial to improve emergency response and mitigate flood impacts. This work describes the benchmarking of an operational procedure for rapid flood risk assessment based on the flood predictions issued by the European Flood Awareness System (EFAS). The daily forecasts produced for the major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations, based on the hydro-meteorological dataset of EFAS. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in near real-time in terms of flood-prone areas, potential economic damage, affected population, infrastructures and cities. Extensive testing of the operational procedure was carried out using the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-derived flood footprints, while ground-based estimations of economic damage and affected population are compared against modelled estimates. We evaluated the skill of flood hazard and risk estimations derived from EFAS flood forecasts with different lead times and combinations. The assessment includes a comparison of several alternative approaches to produce and present the information content, in order to meet the requests of EFAS users. The tests provided good results and showed the potential of the developed real-time operational procedure in helping emergency response and management.
Challenges in estimating the health impact of Hurricane Sandy using macro-level flood data.
NASA Astrophysics Data System (ADS)
Lieberman-Cribbin, W.; Liu, B.; Schneider, S.; Schwartz, R.; Taioli, E.
2016-12-01
Background: Hurricane Sandy caused extensive physical and economic damage, but the long-term health impacts are unknown. Flooding is a central component of hurricane exposure, influencing health through multiple pathways that unfold over the months after floodwaters recede. This study assesses concordance between Federal Emergency Management Agency (FEMA) and self-reported flood exposure after Hurricane Sandy to elucidate discrepancies in flood exposure assessments. Methods: Three-meter-resolution New York State flood data were obtained from the FEMA Modeling Task Force Hurricane Sandy Impact Analysis. FEMA data were compared to self-reported flood data obtained through validated questionnaires from New York City and Long Island residents following Sandy. Flooding was defined both as a dichotomous and as a continuous variable, and analyses were performed in SAS v9.4 and ArcGIS 10.3.1. Results: There was moderate agreement between FEMA and self-reported measures of flood exposure, both dichotomous (kappa statistic 0.46) and continuous (Spearman's correlation coefficient 0.50). Flooding was self-reported and recorded by FEMA in 23.6% of cases, while the two measures agreed on no flooding in 51.1% of cases. Flooding was self-reported but not recorded by FEMA in 8.5% of cases, and not self-reported but indicated by FEMA in 16.8% of cases. In this last instance, most respondents (173/207; 83.6%) resided in an apartment (no flooding reported). Spatially, concordance was greatest in the interior of New York City and Long Island, while the greatest areas of discordance were concentrated in the Rockaway Peninsula and Long Beach, especially among those living in apartments. Conclusions: There were significant discrepancies between FEMA and self-reported flood data. While macro-level FEMA flood data provide a relatively less expensive and faster way to estimate exposure across the larger geographic areas affected by Hurricane Sandy than micro-level estimates from cohort studies, macro-level exposure estimates may underestimate the full flooding and health impacts of the hurricane. Future disaster preparedness efforts must integrate micro- and macro-level flood exposures to produce the most accurate evaluation of health impacts in affected populations.
Estimated value of insurance premium due to Citarum River flood by using Bayesian method
NASA Astrophysics Data System (ADS)
Sukono; Aisah, I.; Tampubolon, Y. R. H.; Napitupulu, H.; Supian, S.; Subiyanto; Sidi, P.
2018-03-01
Citarum River floods in South Bandung, West Java, Indonesia, happen almost every year, causing property damage and economic loss. The risk of loss can be mitigated through a flood insurance program. In this paper, we discuss the estimation of insurance premiums due to Citarum River floods using a Bayesian method. The flood-loss risk data are assumed to follow a Pareto distribution with a fat right tail. The distribution-model parameters are estimated by the Bayesian method. First, parameter estimation is carried out under the assumption that the prior comes from the Gamma distribution family, while the observed data follow a Pareto distribution. Second, flood-loss data are simulated based on the probability of damage in each flood-affected area. The analysis yields the following premium estimates based on the pure premium principle: for a loss of IDR 629.65 million, a premium of IDR 338.63 million; for a loss of IDR 584.30 million, a premium of IDR 314.24 million; and for a loss of IDR 574.53 million, a premium of IDR 308.95 million. These premium estimates can serve as a reference for setting reasonable premiums, so as neither to overburden the insured nor to produce losses for the insurer.
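With a known minimum loss, the Gamma prior in this setup is conjugate to the Pareto shape parameter, so the posterior is available in closed form. The sketch below illustrates that update and a pure-premium calculation; the prior parameters and the simulated losses are placeholders, not the study's data.

```python
import numpy as np

def pareto_bayes_premium(losses, x_min, prior_a=2.0, prior_b=1.0):
    """Bayesian pure premium for Pareto-distributed flood losses.

    Losses follow a Pareto distribution with known minimum x_min; a
    Gamma(a, b) prior on the shape is conjugate, giving the posterior
    Gamma(a + n, b + sum(log(x_i / x_min))). The pure premium plugs the
    posterior-mean shape into the Pareto mean x_min * alpha / (alpha - 1).
    """
    losses = np.asarray(losses, dtype=float)
    post_a = prior_a + losses.size
    post_b = prior_b + np.sum(np.log(losses / x_min))
    alpha = post_a / post_b                 # posterior mean of the shape
    if alpha <= 1.0:
        raise ValueError("Pareto mean is undefined for shape <= 1")
    return x_min * alpha / (alpha - 1.0)

rng = np.random.default_rng(6)
sim = 100.0 * (1.0 + rng.pareto(2.5, size=500))  # Pareto(min 100, shape 2.5)
print(round(pareto_bayes_premium(sim, x_min=100.0), 1))  # near 166.7
```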
Fifty-year flood-inundation maps for Juticalpa, Honduras
Kresch, David L.; Mastin, M.C.; Olsen, T.D.
2002-01-01
After the devastating floods caused by Hurricane Mitch in 1998, maps of the areas and depths of 50-year-flood inundation at 15 municipalities in Honduras were prepared as a tool for agencies involved in reconstruction and planning. This report, which is one in a series of 15, presents maps of areas in the municipality of Juticalpa that would be inundated by a 50-year flood of Rio Juticalpa. Geographic Information System (GIS) coverages of the flood inundation are available on a computer in the municipality of Juticalpa as part of the Municipal GIS project and on the Internet at the Flood Hazard Mapping Web page (http://mitchnts1.cr.usgs.gov/projects/floodhazard.html). These coverages allow users to view the flood inundation in much more detail than is possible using the maps in this report. Water-surface elevations for a 50-year-flood on Rio Juticalpa at Juticalpa were estimated using HEC-RAS, a one-dimensional, steady-flow, step-backwater computer program. The channel and floodplain cross sections used in HEC-RAS were developed from an airborne light-detection-and-ranging (LIDAR) topographic survey of the area. The estimated 50-year-flood discharge for Rio Juticalpa at Juticalpa, 1,360 cubic meters per second, was computed as the drainage-area-adjusted weighted average of two independently estimated 50-year-flood discharges for the gaging station Rio Juticalpa en El Torito, located about 2 kilometers upstream from Juticalpa. One discharge, 1,551 cubic meters per second, was estimated from a frequency analysis of the 33 years of peak-discharge record for the gage, and the other, 486 cubic meters per second, was estimated from a regression equation that relates the 50-year-flood discharge to drainage area and mean annual precipitation. The weighted-average of the two discharges at the gage is 1,310 cubic meters per second. The 50-year flood discharge for the study area reach of Rio Juticalpa was estimated by multiplying the weighted discharge at the gage by the ratio of the drainage areas upstream from the two locations.
Ahearn, Elizabeth A.
2004-01-01
Multiple linear-regression equations were developed to estimate the magnitudes of floods in Connecticut for recurrence intervals ranging from 2 to 500 years. The equations can be used for nonurban, unregulated stream sites in Connecticut with drainage areas ranging from about 2 to 715 square miles. Flood-frequency data and hydrologic characteristics from 70 streamflow-gaging stations and the upstream drainage basins were used to develop the equations. The hydrologic characteristics (drainage area, mean basin elevation, and 24-hour rainfall) are used in the equations to estimate the magnitude of floods. Average standard errors of prediction for the equations are 31.8, 32.7, 34.4, 35.9, 37.6, and 45.0 percent for the 2-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals, respectively. Simplified equations using only one hydrologic characteristic (drainage area) also were developed. The regression analysis is based on generalized least-squares regression techniques. Observed flows (log-Pearson Type III analysis of the annual maximum flows) from five streamflow-gaging stations in urban basins in Connecticut were compared to flows estimated from national three-parameter and seven-parameter urban regression equations. The comparison shows that the three- and seven-parameter equations used in conjunction with the new statewide equations generally provide reasonable estimates of flood flows for urban sites in Connecticut, although a national urban flood-frequency study indicated that the three-parameter equations significantly underestimated flood flows in many regions of the country. Verification of the accuracy of the three-parameter or seven-parameter national regression equations using new data from Connecticut stations was beyond the scope of this study. A technique for calculating flood flows at streamflow-gaging stations using a weighted average also is described. Two estimates of flood flows, one based on the log-Pearson Type III analyses of the annual maximum flows at the gaging station and the other from the regression equation, are weighted together based on the years of record at the gaging station and the equivalent years of record determined from the regression. Weighted averages of flood flows for the 2-, 10-, 25-, 50-, 100-, and 500-year recurrence intervals are tabulated for the 70 streamflow-gaging stations used in the regression analysis. Generally, weighted averages give the most accurate estimate of flood flows at gaging stations. An evaluation of Connecticut's streamflow-gaging network was performed to determine whether the spatial coverage and range of geographic and hydrologic conditions are adequately represented for transferring flood characteristics from gaged to ungaged sites. Fifty-one of 54 stations in the current (2004) network support one or more flood needs of federal, state, and local agencies. Twenty-five of 54 stations in the current network are considered high-priority stations by the U.S. Geological Survey because of their contribution to the long-term understanding of floods and their application for regional flood analysis. Enhancements to the network to improve its overall effectiveness for regionalization can be made by increasing the spatial coverage of gaging stations, establishing stations in regions of the state that are not well represented, and adding stations in basins with drainage-area sizes not represented.
Additionally, the usefulness of the network for characterizing floods can be maintained and improved by continuing operation at the current stations because flood flows can be more accurately estimated at stations with continuous, long-term record.
NASA Astrophysics Data System (ADS)
Sandercock, Peter; Wyrwoll, Karl-Heinz
2005-12-01
The discharge regimes of the large rivers of northern Australia are characterized by the occurrence of extreme flood events with far-reaching environmental and societal impacts. In January 1998 the largest flood ever recorded on the Katherine River, northern Australia, resulted in widespread inundation and resultant damage to the town of Katherine. The occurrence of the flood emphasized the unreliability of the then available flood probability estimates and prompted a palaeoflood approach to estimate the recurrence interval of the event. The location of Katherine is ideal for such a study, as the town is located immediately downstream from Katherine Gorge, which provides the necessary bedrock-confined channel required for such an approach. In addition, previous work in Katherine Gorge had demonstrated that the gorge sections hold suitable deposits for palaeoflood stage reconstruction. The results of the present study show that at least two flow events with discharges similar to the 1998 flood have occurred within the last 600 years, and that high-magnitude floods are a general feature of the discharge record of the Katherine River over the last c. 2000 years. Furthermore, because the study was undertaken within a few months of the occurrence of the 1998 flood, it provided the opportunity to evaluate the previously obtained flood discharge estimates and draw attention to the general uncertainties associated with palaeoflood studies. Our results emphasize that palaeoflood stage estimates based on slackwater deposits need to be treated as conservative estimates only. More specifically, with respect to the 1998 event, our study demonstrates that the controls of flood peak were more complex than simply flood routing through the gorge sections. It is clear that the areas downstream from Katherine Gorge made an important contribution to the flood peak of the 1998 event.
Techniques for estimating magnitude and frequency of floods in Minnesota
Guetzkow, Lowell C.
1977-01-01
Estimating relations have been developed to provide engineers and designers with improved techniques for defining flow-frequency characteristics to satisfy hydraulic planning and design requirements. The magnitude and frequency of floods up to the 100-year recurrence interval can be determined for most streams in Minnesota by the methods presented. By multiple-regression analysis, equations have been developed for estimating flood-frequency relations at ungaged sites on natural-flow streams. Eight distinct hydrologic regions are delineated within the State, with boundaries defined generally by river-basin divides. Regression equations are provided for each region that relate selected frequency floods to significant basin parameters. For main-stem streams, graphs are presented showing floods for selected recurrence intervals plotted against contributing drainage area. Flow-frequency estimates for intervening sites along the Minnesota River, Mississippi River, and the Red River of the North can be derived from these graphs. Flood-frequency characteristics are tabulated for 201 gaging stations having 10 or more years of record.
Estimating earnings losses due to mental illness: a quantile regression approach.
Marcotte, Dave E; Wilcox-Gök, Virginia
2003-09-01
The ability of workers to remain productive and sustain earnings when afflicted with mental illness depends importantly on access to appropriate treatment and on flexibility and support from employers. In the United States there is substantial variation in access to health care and sick leave and other employment flexibilities across the earnings distribution. Consequently, a worker's ability to work and how much his/her earnings are impeded likely depend upon his/her position in the earnings distribution. Because of this, focusing on average earnings losses may provide insufficient information on the impact of mental illness in the labor market. In this paper, we examine the effects of mental illness on earnings by recognizing that effects could vary across the distribution of earnings. Using data from the National Comorbidity Survey, we employ a quantile regression estimator to identify the effects at key points in the earnings distribution. We find that earnings effects vary importantly across the distribution. While average effects are often not large, mental illness more commonly imposes earnings losses at the lower tail of the distribution, especially for women. In only one case do we find an illness to have negative effects across the distribution. Mental illness can have larger negative impacts on economic outcomes than previously estimated, even if those effects are not uniform. Consequently, researchers and policy makers alike should not be placated by findings that mean earnings effects are relatively small. Such estimates miss important features of how and where mental illness is associated with real economic losses for the ill.
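To illustrate the estimator, the following sketch fits quantile regressions of simulated log earnings on an illness indicator at several quantiles using statsmodels; the data-generating process is invented so that losses concentrate in the lower tail, mimicking the pattern the study reports:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated earnings data: the "illness" effect is larger in the lower
# tail. All numbers are made up for illustration.
rng = np.random.default_rng(0)
n = 2000
ill = rng.integers(0, 2, n)
base = rng.lognormal(mean=10.0, sigma=0.6, size=n)
earnings = base * np.where(ill == 1, rng.uniform(0.5, 1.0, n), 1.0)
df = pd.DataFrame({"earnings": np.log(earnings), "ill": ill})

# Estimate the illness coefficient at several quantiles of log earnings;
# a mean (OLS) regression would report only the average of these effects.
for tau in (0.1, 0.25, 0.5, 0.75, 0.9):
    fit = smf.quantreg("earnings ~ ill", df).fit(q=tau)
    print(f"tau={tau:.2f}  illness effect = {fit.params['ill']:+.3f}")
```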
Paleohydrologic techniques used to define the spatial occurrence of floods
Jarrett, R.D.
1990-01-01
Defining the cause and spatial characteristics of floods may be difficult because of limited streamflow and precipitation data. New paleohydrologic techniques that incorporate information from geomorphic, sedimentologic, and botanic studies provide important supplemental information to define homogeneous hydrologic regions. These techniques also help to define the spatial structure of rainstorms and floods and improve regional flood-frequency estimates. The occurrence and the non-occurrence of paleohydrologic evidence of floods, such as flood bars, alluvial fans, and tree scars, provide valuable hydrologic information. The paleohydrologic research to define the spatial characteristics of floods improves the understanding of flood hydrometeorology. This research was used to define the areal extent and contributing drainage area of flash floods in Colorado. Also, paleohydrologic evidence was used to define the spatial boundaries for the Colorado foothills region in terms of the meteorologic cause of flooding and elevation. In general, above 2300 m, peak flows are caused by snowmelt. Below 2300 m, peak flows primarily are caused by rainfall. The foothills region has an upper elevation limit of about 2300 m and a lower elevation limit of about 1500 m. Regional flood-frequency estimates that incorporate the paleohydrologic information indicate that the Big Thompson River flash flood of 1976 had a recurrence interval of approximately 10,000 years. This contrasts markedly with 100 to 300 years determined by using conventional hydrologic analyses. Flood-discharge estimates based on rainfall-runoff methods in the foothills of Colorado result in larger values than those estimated with regional flood-frequency relations, which are based on long-term streamflow data. Preliminary hydrologic and paleohydrologic research indicates that intense rainfall does not occur at higher elevations in other Rocky Mountain states and that the highest elevations for rainfall-producing floods vary by latitude. The study results have implications for floodplain management and design of hydraulic structures in the mountains of Colorado and other Rocky Mountain States.
Estimating Paleoflood Magnitude From Tree-Ring Anatomy and the Height of Abrasion Scars
NASA Astrophysics Data System (ADS)
Yanosky, T. M.; Jarrett, R. D.
2003-12-01
Evidence of floods preserved in the growth rings of trees can be used to extend the historical record of flooding or to estimate the magnitude of extraordinary floods on ungaged streams. Floods that damage the aerial parts of trees during the growing season sometimes induce striking anatomical changes in subsequent growth of rings in the lower trunk. In ring-porous species, this growth most commonly produces concentric bands of atypically large vessels within the latewood. The number and diameter of anomalous vessels seem positively related to the amount of flood damage, and thus can be used to refine estimates of flood magnitude when also considering the position of the tree relative to the channel and its approximate height during the flood. Floods of long duration on low-gradient streams are less likely to damage trees directly, but prolonged root flooding often results in the formation of narrow rings with atypically small vessels; shorter-duration floods, sometimes inundating roots for as little as several days, are followed by the production of fibers (non-conducting cells) with large lumens and thin walls that appear as light-colored bands compared to earlier-formed tissue. In these instances, a series of trees increasingly distant from the channel can be used to estimate a minimum flood elevation. Abrasion scars from flood-borne debris often are the most easily observed evidence of flood damage and, like anatomical abnormalities, can be precisely dated. The relation between the heights of scars and maximum flood stages depends in part upon channel slope. Previous studies have indicated that scar heights along low-gradient streams are the same or slightly lower than maximum flood elevations. Along the high-gradient (6% maximum slope) Buffalo Creek, Colorado USA, scar heights measured in 102 trees following a flood in 1996 ranged from -0.6 to +1.5 m relative to the actual crest elevation. Scar elevations exceeding flood elevations by 3-4 m, however, were observed following a flood in 2002 along a small Colorado stream with slopes ranging from 6 to 15%.
Estimation of flood-frequency characteristics of small urban streams in North Carolina
Robbins, J.C.; Pope, B.F.
1996-01-01
A statewide study was conducted to develop methods for estimating the magnitude and frequency of floods of small urban streams in North Carolina. This type of information is critical in the design of bridges, culverts, and water-control structures, the establishment of flood-insurance rates and flood-plain regulation, and for other uses by urban planners and engineers. Concurrent records of rainfall and runoff data collected in small urban basins were used to calibrate rainfall-runoff models. Historic rainfall records were used with the calibrated models to synthesize a long-term record of annual peak discharges. The synthesized records of annual peak discharges were used in a statistical analysis to determine flood-frequency distributions. These frequency distributions were used with distributions from previous investigations to develop a database for 32 small urban basins in the Blue Ridge-Piedmont, Sand Hills, and Coastal Plain hydrologic areas. The study basins ranged in size from 0.04 to 41.0 square miles. Data describing the size and shape of the basin, level of urban development, and climate and rural flood characteristics also were included in the database. Estimation equations were developed by relating flood-frequency characteristics to basin characteristics in a generalized least-squares regression analysis. The most significant basin characteristics are drainage area, impervious area, and rural flood discharge. The model error and prediction errors for the estimating equations were less than those for the national flood-frequency equations previously reported. The resulting equations, which have prediction errors generally less than 40 percent, can be used to estimate flood-peak discharges for 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals for small urban basins across the State, assuming negligible, sustainable, in-channel detention or basin storage.
NASA Astrophysics Data System (ADS)
Chapman, Sandra; Stainforth, David; Watkins, Nicholas
2016-04-01
Characterizing how our climate is changing includes local information which can inform adaptation planning decisions. This requires quantifying the geographical patterns in changes at specific quantiles or thresholds in distributions of variables such as daily surface temperature. Here we focus on these local changes and on a model-independent method to transform daily observations into patterns of local climate change. Our method [1] is a simple mathematical deconstruction of how the difference between two observations from two different time periods can be assigned to the combination of natural statistical variability and/or the consequences of secular climate change. This deconstruction facilitates an assessment of how fast different quantiles of the distributions are changing. This involves determining not only which quantiles and geographical locations show the greatest change, but also those at which any change is highly uncertain. For temperature, changes in the distribution itself can yield robust results [2]. We demonstrate how the fundamental timescales of anthropogenic climate change limit the identification of societally relevant aspects of changes. We show that it is nevertheless possible to extract, solely from observations, some confident quantified assessments of change at certain thresholds and locations [3]. We demonstrate this approach using E-OBS gridded data [4]: time series of local daily surface temperature from specific locations across Europe over the last 60 years. [1] Chapman, S. C., D. A. Stainforth, N. W. Watkins, On estimating long term local climate trends, Phil. Trans. Royal Soc. A, 371, 20120287 (2013). [2] Stainforth, D. A., S. C. Chapman, N. W. Watkins, Mapping climate change in European temperature distributions, ERL, 8, 034031 (2013). [3] Chapman, S. C., Stainforth, D. A., Watkins, N. W., Limits to the quantification of local climate change, ERL, 10, 094018 (2015). [4] Haylock, M. R., et al., A European daily high-resolution gridded dataset of surface temperature and precipitation, J. Geophys. Res. (Atmospheres), 113, D20119 (2008).
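A simple way to operationalize the idea of quantile-specific change between two observation periods (a schematic stand-in for the cited method, using synthetic data in place of E-OBS):

```python
import numpy as np

rng = np.random.default_rng(42)

# Two synthetic 30-year samples of daily summer temperature at one
# location (stand-ins for two observation periods; values are made up).
early = rng.normal(21.0, 4.0, 30 * 92)
late = rng.normal(22.0, 4.5, 30 * 92)

def quantile_change(a, b, q, n_boot=2000):
    """Change in the q-th quantile between samples a and b, with a
    bootstrap 90% interval to flag quantiles where change is uncertain."""
    point = np.quantile(b, q) - np.quantile(a, q)
    boots = [np.quantile(rng.choice(b, b.size), q)
             - np.quantile(rng.choice(a, a.size), q) for _ in range(n_boot)]
    lo, hi = np.percentile(boots, [5, 95])
    return point, lo, hi

for q in (0.05, 0.5, 0.95):
    d, lo, hi = quantile_change(early, late, q)
    print(f"q={q:.2f}: Δ={d:+.2f} °C (90% CI {lo:+.2f}..{hi:+.2f})")
```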
NASA Astrophysics Data System (ADS)
Singh, Vishal; Goyal, Manish Kumar
2016-01-01
This paper highlights the spatial and temporal variability of the precipitation lapse rate (PLR) and of precipitation extreme indices (PEIs) through mesoscale characterization of the Teesta river catchment in northern Sikkim, in the eastern Himalayas. The PLR is an important variable for snowmelt runoff models. In a mountainous region, the PLR can vary between lower-elevation and higher-elevation parts of the catchment. In this study, the PLR was computed by accounting for elevation differences ranging from around 1,500 m to 7,000 m. Precipitation variability and extremes were analysed using multiple mathematical tools, namely quantile regression, the spatial mean, the spatial standard deviation, the Mann-Kendall test, and Sen's slope estimator. To this end, daily precipitation from measured/observed gridded points for the historical period (1980-2005), and projections for the 21st century (2006-2100) simulated by the CMIP5 ESM-2M model (Coupled Model Intercomparison Project Phase 5 Earth System Model 2) under three different radiative forcing scenarios (Representative Concentration Pathways, RCPs), were used. The outcomes of this study suggest that the PLR varies significantly from lower to higher elevations. The PEI-based analysis showed that extreme high-intensity events increase significantly, especially after the 2040s. The PEI-based observations also showed that the number of wet days increases for all the RCPs. The quantile regression plots showed significant increments in the upper and lower quantiles of the various extreme indices. The Mann-Kendall test and Sen's estimator clearly indicated significant changes in the frequency and intensity of the precipitation indices across all the sub-basins and RCP scenarios in an intra-decadal time-series domain. RCP8.5 showed the most extreme projected outcomes.
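Two of the trend tools named above are compact enough to sketch. The following Python implements the Mann-Kendall test (normal approximation, without tie correction) and Sen's slope on a synthetic annual series:

```python
import numpy as np
from scipy import stats

def mann_kendall_sen(y):
    """Mann-Kendall trend statistic (normal approximation, no tie
    correction) and Sen's slope for a 1-D series y."""
    y = np.asarray(y, dtype=float)
    n = y.size
    s = sum(np.sign(y[j] - y[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - stats.norm.cdf(abs(z)))
    # Sen's slope: median of all pairwise slopes.
    slopes = [(y[j] - y[i]) / (j - i)
              for i in range(n - 1) for j in range(i + 1, n)]
    return z, p, np.median(slopes)

# Example: an annual extreme-precipitation index with an imposed trend.
rng = np.random.default_rng(1)
series = 50 + 0.4 * np.arange(40) + rng.normal(0, 5, 40)
z, p, sen = mann_kendall_sen(series)
print(f"Z = {z:.2f}, p = {p:.3f}, Sen's slope = {sen:.2f} per year")
```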
Etilé, Fabrice; Sharma, Anurag
2015-09-01
This study compares the impact of a sugar-sweetened beverage (SSB) tax on moderate and high consumers in Australia. The key methodological contribution is that price response heterogeneity is identified while controlling for censoring of consumption at zero and endogeneity of expenditure by using a finite mixture instrumental variable Tobit model. The SSB price elasticity estimates show a decreasing trend across increasing consumption quantiles, from -2.3 at the median to -0.2 at the 95th quantile. Although high consumers of SSBs have a less elastic demand for SSBs, their very high consumption levels imply that a tax would achieve a greater reduction in consumption and greater health gains. Our results also suggest that an SSB tax would represent a small fiscal burden for consumers whatever their pre-policy level of consumption, and that an excise tax should be preferred to an ad valorem tax.
Income elasticity of health expenditures in Iran.
Zare, Hossein; Trujillo, Antonio J; Leidman, Eva; Buttorff, Christine
2013-09-01
Because of its policy implications, the income elasticity of health care expenditures is a subject of much debate. Governments may have an interest in subsidizing the care of those with low income. Using more than two decades of data from the Iran Household Expenditure and Income Survey, this article investigates the relationship between income and health care expenditure in urban and rural areas in Iran, a resource-rich, upper-middle-income country. We implemented spline and quantile regression techniques to obtain a more robust description of the relationship of interest. This study finds non-uniform effects of income on health expenditures. Although the results show that health care is a necessity for all income brackets, spline regression estimates indicate that the income elasticity is lowest for the poorest Iranians in urban and rural areas. This suggests that they will show low flexibility in medical expenses as income fluctuates. Further, a quantile regression model assessing the effect of income at different levels of medical expenditure suggests that households with lower medical expenses are less elastic.
NASA Astrophysics Data System (ADS)
Shahzad, Syed Jawad Hussain; Hernandez, Jose Areola; Hanif, Waqas; Kayani, Ghulam Mujtaba
2018-09-01
We investigate the dynamics of efficiency and long memory, and the impact of trading volume on the efficiency of returns and volatilities of four major traded currencies, namely, the EUR, GBP, CHF and JPY. We do so by implementing full sample and rolling window multifractal detrended fluctuation analysis (MF-DFA) and a quantile-on-quantile (QQ) approach. This paper sheds new light by employing high frequency (5-min interval) data spanning from Jan 1, 2007 to Dec 31, 2016. Realized volatilities are estimated using Andersen et al.'s (2001) measure, while the QQ method employed is drawn from Sim and Zhou (2015). We find evidence of higher efficiency levels in the JPY and CHF currency markets. The impact of trading volume on efficiency is only significant for the JPY and CHF currencies. The GBP currency appears to be the least efficient, followed by the EUR. Implications of the results are discussed.
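A bare-bones MF-DFA sketch (Python): compute the fluctuation function F_q(s) over window sizes s and read the generalized Hurst exponent h(q) from the log-log slope. This is a simplified non-overlapping-window variant of the procedure named in the abstract, run here on synthetic fat-tailed returns rather than 5-minute FX data:

```python
import numpy as np

def mfdfa(x, scales, qs):
    """Bare-bones MF-DFA: generalized Hurst exponents h(q) from the
    log-log slope of F_q(s). Linear (order-1) detrending and
    non-overlapping windows, for brevity."""
    profile = np.cumsum(x - np.mean(x))
    Fq = {q: [] for q in qs}
    for s in scales:
        n_seg = profile.size // s
        t = np.arange(s)
        rms = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            coef = np.polyfit(t, seg, 1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        rms = np.asarray(rms)
        for q in qs:
            if q == 0:  # q = 0 needs the logarithmic-average form
                Fq[q].append(np.exp(0.5 * np.mean(np.log(rms ** 2))))
            else:
                Fq[q].append(np.mean(rms ** q) ** (1.0 / q))
    # h(q): slope of log F_q(s) versus log s; strong variation of h with
    # q indicates multifractality (and, loosely, lower efficiency).
    return {q: np.polyfit(np.log(scales), np.log(Fq[q]), 1)[0] for q in qs}

rng = np.random.default_rng(7)
returns = rng.standard_t(df=4, size=2 ** 14)  # fat-tailed stand-in returns
print(mfdfa(returns, scales=[16, 32, 64, 128, 256, 512], qs=[-2, 0, 2]))
```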
Flood extent and water level estimation from SAR using data-model integration
NASA Astrophysics Data System (ADS)
Ajadi, O. A.; Meyer, F. J.
2017-12-01
Synthetic Aperture Radar (SAR) images have long been recognized as a valuable data source for flood mapping. Compared to other sources, SAR's weather and illumination independence and large-area coverage at high spatial resolution support reliable, frequent, and detailed observations of developing flood events. Accordingly, SAR has the potential to greatly aid in the near-real-time monitoring of natural hazards, such as flood detection, if combined with automated image processing. This research works towards increasing the reliability and temporal sampling of SAR-derived flood hazard information by integrating information from multiple SAR sensors and SAR modalities (images and Interferometric SAR (InSAR) coherence) and by combining SAR-derived change-detection information with hydrologic and hydraulic flood forecast models. First, the combination of multi-temporal SAR intensity images and coherence information for generating flood extent maps is introduced. The application of least-squares estimation integrates flood information from multiple SAR sensors, thus increasing the temporal sampling. SAR-based flood extent information will be combined with a Digital Elevation Model (DEM) to reduce false alarms and to estimate water depth and flood volume. The SAR-based flood extent map is assimilated into the Hydrologic Engineering Center River Analysis System (HEC-RAS) model to aid in hydraulic model calibration. The developed technology improves the accuracy of flood information by exploiting information from both data and models. It also provides enhanced flood information to decision-makers, supporting flood response and improving emergency relief efforts.
NASA Astrophysics Data System (ADS)
Cunderlik, Juraj M.; Burn, Donald H.
2002-04-01
Improving techniques of flood frequency estimation at ungauged sites is one of the foremost goals of contemporary hydrology. A river's flood regime reflects the composite hydrologic response of the catchment to flood-producing processes. In this sense, the process of identifying homogeneous pooling groups can plausibly be based on catchment similarity in flood regime. Unfortunately, the application of any pooling approach that is based on flood regime is restricted to gauged sites. Because the flood regime can be markedly determined by the rainfall regime, catchment similarity in rainfall regime can be an alternative option for identifying flood frequency pooling groups. An advantage of such a pooling approach is that rainfall data are usually spatially and temporally more abundant than flood data, and the approach can also be applied at ungauged sites. Therefore, in this study we quantified the linkage between rainfall and flood regime and explored the appropriateness of substituting rainfall regime for flood regime in regional pooling schemes. Two different approaches to describing rainfall regime similarity using tools of directional statistics were tested and used to evaluate the potential of the rainfall regime for identifying hydrologically homogeneous pooling groups. The outputs were compared to an existing pooling framework adopted in the Flood Estimation Handbook. The results demonstrate that regional pooling based on rainfall regime information leads to a high number of initially homogeneous groups and seems to be a sound pooling alternative for catchments with a close linkage between rain and flood regimes.
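One common way to describe a flood (or rainfall) regime with directional statistics, in the spirit of the approach described here: map each annual-maximum date to an angle on the unit circle and summarize the sample by its mean direction (the average event date) and mean resultant length r (the strength of seasonality). A hedged sketch:

```python
import numpy as np

def seasonality(days_of_year):
    """Mean event date and seasonal regularity from directional
    statistics: each annual-maximum date is mapped to an angle; the
    mean direction gives the average date and the mean resultant
    length r (0..1) the strength of seasonality."""
    theta = 2 * np.pi * (np.asarray(days_of_year, dtype=float) / 365.25)
    xbar, ybar = np.mean(np.cos(theta)), np.mean(np.sin(theta))
    mean_day = (np.arctan2(ybar, xbar) % (2 * np.pi)) * 365.25 / (2 * np.pi)
    r = np.hypot(xbar, ybar)
    return mean_day, r

# A winter-dominated flood record: dates near 1 January wrap correctly
# around the year boundary, which a plain arithmetic mean would not do.
print(seasonality([350, 362, 5, 20, 340, 15, 30, 355]))
```

Catchments can then be grouped by the distance between their (mean date, r) pairs, which is one way such similarity measures feed a pooling scheme.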
NASA Astrophysics Data System (ADS)
Zamora-Reyes, D.; Hirschboeck, K. K.; Paretti, N. V.
2012-12-01
Bulletin 17B (B17B) has prevailed for 30 years as the standard manual for determining flood frequency in the United States. Recently proposed updates to B17B include revisiting the issue of flood heterogeneity and improving flood estimates by using the Expected Moments Algorithm (EMA), which can better address low outliers and accommodate information on historical peaks. Incorporating information on mixed populations, such as flood-causing mechanisms, into flood estimates for regions that have noticeable flood heterogeneity can be statistically challenging when systematic flood records are short. The problem is magnified when the population sample size is reduced by decomposing the record, especially if multiple flood mechanisms are involved. In B17B, the guidelines for dealing with mixed populations focus primarily on how to rule out any need to perform a mixed-population analysis. However, in some regions mixed flood populations are critically important determinants of regional flood frequency variations and should be explored from this perspective. Arizona is an area with a heterogeneous mixture of flood processes due to warm-season convective thunderstorms, cool-season synoptic-scale storms, and tropical cyclone-enhanced convective activity occurring in the late summer or early fall. USGS station data throughout Arizona were compiled into a database and each flood peak (annual and partial duration series) was classified according to its meteorological cause. Using these data, we have explored the role of flood heterogeneity in Arizona flood estimates through composite flood frequency analysis based on mixed flood populations using EMA. First, for selected stations, the three flood-causing populations were separated out from the systematic annual flood series record and analyzed individually. Second, to create composite probability curves, the individual curves for each of the three populations were generated and combined using Crippen's (1978) composite probability equations for sites that have two or more independent flood populations. Finally, the individual probability curves generated for each of the three flood-causing populations were compared with both the site's composite probability curve and the standard B17B curve to explore the influence of heterogeneity, using the 100-year and 200-year flood estimates as a basis of comparison. Results showed that sites located in southern Arizona and along the abrupt elevation transition zone of the Mogollon Rim exhibit a better fit to the systematic data using their composite probability curves than the curves derived from standard B17B analysis. Synoptic storm floods and tropical cyclone-enhanced floods had the greatest influence on 100-year and 200-year flood estimates. This was especially true in southern Arizona, even though summer convective floods are much more frequent and therefore dominate the composite curve. Using the EMA approach also influenced our results because all possible low outliers were censored by the built-in Multiple Grubbs-Beck Test, providing a better fit to the systematic data in the upper probabilities. In conclusion, flood heterogeneity can play an important role in regional flood frequency variations in Arizona, and understanding its influence is important when making projections about future flood variations.
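Under the independence assumption behind composite probability equations of this kind, the combination step reduces to a product rule: the chance the annual maximum exceeds a discharge is one minus the chance that no population produces an exceedance. A sketch with illustrative return periods (not values from the study):

```python
import numpy as np

def composite_exceedance(p_exceed_by_population):
    """Composite annual exceedance probability for independent flood
    populations (e.g., convective, synoptic, tropical cyclone)."""
    p = np.asarray(p_exceed_by_population, dtype=float)
    return 1.0 - np.prod(1.0 - p, axis=0)

# A discharge that is the 200-yr convective, 150-yr synoptic, and 500-yr
# tropical-cyclone flood: the composite event is more frequent than the
# event under any single population taken alone.
p = composite_exceedance([1 / 200, 1 / 150, 1 / 500])
print(f"composite AEP = {p:.5f}  (T ≈ {1/p:.0f} years)")
```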
Risk to life due to flooding in post-Katrina New Orleans
NASA Astrophysics Data System (ADS)
Miller, A.; Jonkman, S. N.; Van Ledden, M.
2015-01-01
Since the catastrophic flooding of New Orleans due to Hurricane Katrina in 2005, the city's hurricane protection system has been improved to provide protection against a hurricane load with a 1/100 per year exceedance frequency. This paper investigates the risk to life in post-Katrina New Orleans. In a flood risk analysis the probabilities and consequences of various flood scenarios have been analyzed for the central area of the city (the metro bowl) to give a preliminary estimate of the risk to life in the post-Katrina situation. A two-dimensional hydrodynamic model has been used to simulate flood characteristics of various breaches. The model for estimation of fatality rates is based on the loss of life data for Hurricane Katrina. Results indicate that - depending on the flood scenario - the estimated loss of life in case of flooding ranges from about 100 to nearly 500, with the highest life loss due to breaching of the river levees leading to large flood depths. The probability and consequence estimates are combined to determine the individual risk and societal risk for New Orleans. When compared to risks of other large-scale engineering systems (e.g., other flood-prone areas, dams and the nuclear sector) and acceptable risk criteria found in literature, the risks for the metro bowl are found to be relatively high. Thus, despite major improvements to the flood protection system, the flood risk to life of post-Katrina New Orleans is still expected to be significant. Indicative effects of reduction strategies on the risk level are discussed as a basis for further evaluation and discussion.
Fast Flood damage estimation coupling hydraulic modeling and Multisensor Satellite data
NASA Astrophysics Data System (ADS)
Fiorini, M.; Rudari, R.; Delogu, F.; Candela, L.; Corina, A.; Boni, G.
2011-12-01
Damage estimation requires a good representation of the elements at risk and their vulnerability, knowledge of the extent of the flooded area, and a description of the hydraulic forcing. In this work the real-time use of a simplified two-dimensional hydraulic model constrained by satellite-retrieved flooded areas is analyzed. The main features of such a model are computational speed and simple start-up, requiring only a simplified subset of boundary and initial conditions rather than complex input data. Those characteristics allow the model to be fast enough to be used in real time for the simulation of flooding events. The model fills the gap of information left by single satellite scenes of the flooded area, allowing for the estimation of the maximum flooding extent and magnitude. The static information provided by Earth observation (such as the SAR-derived extent of flooded areas at a certain time) is interpreted in a dynamically consistent way, and useful hydraulic information (e.g., water depth, water speed, and the evolution of flooded areas) is provided. This information is merged with satellite identification of the elements exposed to risk, characterized in terms of their vulnerability to floods, in order to obtain fast estimates of flood damages. The model has been applied to several flooding events worldwide. Among other activations, events in Mediterranean areas such as Veneto, Italy (October 2010), Basilicata, Italy (March 2011), and Shkoder (January 2010 and December 2010) are considered and compared with larger floods such as the Queensland flood of December 2010.
Heritability Across the Distribution: An Application of Quantile Regression
Petrill, Stephen A.; Hart, Sara A.; Schatschneider, Christopher; Thompson, Lee A.; Deater-Deckard, Kirby; DeThorne, Laura S.; Bartlett, Christopher
2016-01-01
We introduce a new method for analyzing twin data called quantile regression. Through the application presented here, quantile regression is able to assess the genetic and environmental etiology of any skill or ability, at multiple points in the distribution of that skill or ability. This method is compared to the Cherny et al. (Behav Genet 22:153–162, 1992) method in an application to four different reading-related outcomes in 304 pairs of first-grade same sex twins enrolled in the Western Reserve Reading Project. Findings across the two methods were similar; both indicated some variation across the distribution of the genetic and shared environmental influences on non-word reading. However, quantile regression provides more details about the location and size of the measured effect. Applications of the technique are discussed. PMID:21877231
Statistical Models and Inference Procedures for Structural and Materials Reliability
1990-12-01
Some general stress-strength models were also developed and applied to the failure of systems subject to cyclic loading. Involved in the failure ... process control ideas and sequential design and analysis methods. Finally, smooth nonparametric quantile function estimators were studied. All of ...
NASA Astrophysics Data System (ADS)
Kalyanapu, A. J.; Dullo, T. T.; Gangrade, S.; Kao, S. C.; Marshall, R.; Islam, S. R.; Ghafoor, S. K.
2017-12-01
Hurricane Harvey, which made landfall in southern Texas in August 2017, was one of the most destructive hurricanes of the 2017 hurricane season. During its active period, many areas in the coastal Texas region received more than 40 inches of rain. This downpour caused significant flooding, resulting in about 77 casualties, displacing more than 30,000 people, inundating hundreds of thousands of homes, and causing an estimated more than $70 billion in direct damage. One of the significantly affected areas is Harris County, where the city of Houston, TX, is located. Covering two HUC-8 drainage basins (about 2,702 mi2), this county experienced more than 80% of its annual average rainfall during this event. This study presents an effort to reconstruct the flooding caused by extreme rainfall due to Hurricane Harvey in Harris County, Texas. This computationally intensive task was performed at a 30-m spatial resolution using a rapid flood model called Flood2D-GPU, a graphics processing unit (GPU) accelerated model, on Oak Ridge National Laboratory's (ORNL) Titan supercomputer. For this task, the hourly rainfall estimates from the National Centers for Environmental Prediction Stage IV Quantitative Precipitation Estimate were fed into the Variable Infiltration Capacity (VIC) hydrologic model and the Routing Application for Parallel computation of Discharge (RAPID) routing model to estimate flow hydrographs at 69 locations for the Flood2D-GPU simulation. Preliminary results of the simulation, including flood inundation extents, maps of flood depths, and inundation duration, will be presented. Future efforts will focus on calibrating and validating the simulation results and assessing the flood damage to better understand the impacts of Hurricane Harvey.
Flood Map for the Winooski River in Waterbury, Vermont, 2014
Olson, Scott A.
2015-01-01
High-water marks from Tropical Storm Irene were available for seven locations along the study reach. The high-water marks were used to estimate water-surface profiles and discharges resulting from Tropical Storm Irene throughout the study reach. From a comparison of the estimated water-surface profile for Tropical Storm Irene with the water-surface profiles for the 1- and 0.2-percent annual exceedance probability (AEP) floods, it was determined that the high-water elevations resulting from Tropical Storm Irene exceeded the estimated 1-percent AEP flood throughout the Winooski River study reach but did not exceed the estimated 0.2-percent AEP flood at any location within the study reach.
Effects of a flooding event on a threatened black bear population in Louisiana
O'Connell-Goode, Kaitlin C.; Lowe, Carrie L.; Clark, Joseph D.
2014-01-01
The Louisiana black bear, Ursus americanus luteolus, is listed as threatened under the Endangered Species Act as a result of habitat loss and human-related mortality. Information on population-level responses of large mammals to flooding events is scarce, and we had a unique opportunity to evaluate the viability of the Upper Atchafalaya River Basin (UARB) black bear population before and after a significant flooding event. We began collecting black bear hair samples in 2007 for a DNA mark-recapture study to estimate abundance (N) and apparent survival (φ). In 2011, the Morganza Spillway was opened to divert floodwaters from the Mississippi River through the UARB, inundating > 50% of our study area, potentially impacting recovery of this important bear population. To evaluate the effects of this flooding event on bear population dynamics, we used a robust design multistate model to estimate changes in transition rates from the flooded area to non-flooded area (ψF→NF) before (2007–2010), during (2010–2011) and after (2011–2012) the flood. Average N across all years of study was 63.2 (SE = 5.2), excluding the year of the flooding event. Estimates of ψF→NF increased from 0.014 (SE = 0.010; meaning that 1.4% of the bears moved from the flooded area to non-flooded areas) before flooding to 0.113 (SE = 0.045) during the flood year, and then decreased to 0.028 (SE = 0.035) after the flood. Although we demonstrated a flood effect on transition rates as hypothesized, the effect was small (88.7% of the bears remained in the flooded area during flooding) and φ was unchanged, suggesting that the 2011 flooding event had minimal impact on survival and site fidelity.
Statistical analysis of the uncertainty related to flood hazard appraisal
NASA Astrophysics Data System (ADS)
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and of the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of the potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds that link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
NASA Astrophysics Data System (ADS)
Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.
2015-08-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims to analyse fluvial and pluvial flood hazard individually, but also to develop a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a graphics processing unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and percentile maps showing the uncertainty. The results are critically discussed and ways for their usage in flood risk management are outlined.
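The pluvial part of the analysis rests on a peak-over-threshold frequency estimation. A minimal sketch of that ingredient (Python, with synthetic daily rainfall; the study additionally used a stochastic rain storm generator and hydrodynamic simulation):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
daily_rain = rng.gamma(shape=0.4, scale=12.0, size=40 * 365)  # synthetic, mm

# Peak-over-threshold: fit a generalized Pareto distribution to excesses
# over a high threshold, then convert to return levels via the average
# number of threshold exceedances per year.
u = np.quantile(daily_rain, 0.99)
excess = daily_rain[daily_rain > u] - u
c, loc, scale = stats.genpareto.fit(excess, floc=0.0)
lam = excess.size / 40.0  # exceedances per year

def return_level(T):
    # Daily rainfall exceeded on average once every T years.
    return u + stats.genpareto.ppf(1.0 - 1.0 / (lam * T), c, loc=0.0,
                                   scale=scale)

for T in (10, 50, 100):
    print(f"{T:>3}-yr daily rainfall ≈ {return_level(T):.1f} mm")
```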
Techniques for estimating flood-peak discharges of rural, unregulated streams in Ohio
Koltun, G.F.
2003-01-01
Regional equations for estimating 2-, 5-, 10-, 25-, 50-, 100-, and 500-year flood-peak discharges at ungaged sites on rural, unregulated streams in Ohio were developed by means of ordinary and generalized least-squares (GLS) regression techniques. One-variable, simple equations and three-variable, full-model equations were developed on the basis of selected basin characteristics and flood-frequency estimates determined for 305 streamflow-gaging stations in Ohio and adjacent states. The average standard errors of prediction ranged from about 39 to 49 percent for the simple equations, and from about 34 to 41 percent for the full-model equations. Flood-frequency estimates determined by means of log-Pearson Type III analyses are reported along with weighted flood-frequency estimates, computed as a function of the log-Pearson Type III estimates and the regression estimates. Values of explanatory variables used in the regression models were determined from digital spatial data sets by means of a geographic information system (GIS), with the exception of drainage area, which was determined by digitizing the area within basin boundaries manually delineated on topographic maps. Use of GIS-based explanatory variables represents a major departure in methodology from that described in previous reports on estimating flood-frequency characteristics of Ohio streams. Examples are presented illustrating application of the regression equations to ungaged sites on ungaged and gaged streams. A method is provided to adjust regression estimates for ungaged sites by use of weighted and regression estimates for a gaged site on the same stream. A region-of-influence method, which employs a computer program to estimate flood-frequency characteristics for ungaged sites based on data from gaged sites with similar characteristics, was also tested and compared to the GLS full-model equations. For all recurrence intervals, the GLS full-model equations had superior prediction accuracy relative to the simple equations and therefore are recommended for use.
NASA Astrophysics Data System (ADS)
Palán, Ladislav; Punčochář, Petr
2017-04-01
Looking at the impact of flooding from a worldwide perspective, in the last 50 years flooding has caused over 460,000 fatalities and serious material damage. Combining the economic losses from the ten costliest flood events of the same period returns a loss (in present value) exceeding 300bn USD. Locally, in Brazil, flood is the most damaging natural peril, with an alarming increase in event frequency: 5 of the 10 biggest flood losses ever recorded have occurred after 2009. The amount of economic and insured losses caused by various flood types was the key driver of the local probabilistic flood model development. Considering the area of Brazil (the fifth-biggest country in the world) and the scattered distribution of insured exposure, the domain covered by the model was limited to the entire state of São Paulo and 53 additional regions. The model quantifies losses on approximately 90% of the exposure (for regular property lines) of key insurers. Based on detailed exposure analysis, Impact Forecasting has developed this tool using long-term local hydrological data series (Agencia Nacional de Aguas) from riverine gauge stations and a digital elevation model (Instituto Brasileiro de Geografia e Estatística). To provide the most accurate representation of the local hydrological behaviour needed for the probabilistic simulation, hydrological data processing focused on frequency analyses of seasonal peak flows, done by fitting an appropriate extreme-value statistical distribution, and on stochastic event set generation consisting of synthetically derived flood events respecting the realistic spatial and frequency patterns visible in the entire period of hydrological observation. Data were tested for homogeneity, consistency, and any significant breakpoint in the time series, so that either the entire record or only subparts of it were used for further analysis. The realistic spatial patterns of stochastic events are reproduced through the innovative use of a D-vine copula scheme to generate the probabilistic flood event set. The derived design flows for selected rivers inside the model domain were used as input for two-dimensional hydrodynamic inundation modelling (using the tool TUFLOW by BMT WBM) on a mesh size of 30 x 30 metres. Outputs from the inundation modelling and the stochastic event set were implemented in Aon Benfield's platform ELEMENTS, developed and managed internally by Impact Forecasting, Aon Benfield's catastrophe model development centre. The model was designed to evaluate the potential financial impact of fluvial flooding on portfolios of insurance and/or reinsurance companies. The structure of the presented model follows the typical scheme of a financial loss catastrophe model and combines hazard with exposure and vulnerability to produce potential financial loss expressed in the form of a loss exceedance probability curve and many other insured perspectives, such as average annual loss and event or quantile loss tables. The model can take financial inputs as well as provide a split of results for an exactly specified location or related higher administrative units: municipalities and 5-digit postal codes.
Methodology and Implications of Maximum Paleodischarge Estimates for Mountain Channels
Pruess, J.; Wohl, E.E.; Jarrett, R.D.
1998-01-01
Historical and geologic records may be used to enhance magnitude estimates for extreme floods along mountain channels, as demonstrated in this study from the San Juan Mountains of Colorado. Historical photographs and local newspaper accounts from the October 1911 flood indicate the likely extent of flooding and damage. A checklist designed to organize and numerically score evidence of flooding was used in 15 field reconnaissance surveys in the upper Animas River valley of southwestern Colorado. Step-backwater flow modeling estimated the discharges necessary to create longitudinal flood bars observed at 6 additional field sites. According to these analyses, maximum unit discharge peaks at approximately 1.3 m3/s per km2 around 2200 m elevation, with decreased unit discharges at both higher and lower elevations. These results (1) are consistent with Jarrett's (1987, 1990, 1993) maximum 2300-m elevation limit for flash flooding in the Colorado Rocky Mountains, and (2) suggest that current Probable Maximum Flood (PMF) estimates based on a 24-h rainfall of 30 cm at elevations above 2700 m are unrealistically large. The methodology used for this study should be readily applicable to other mountain regions where systematic streamflow records are of short duration or nonexistent.
Model synthesis in frequency analysis of Missouri floods
Hauth, Leland D.
1974-01-01
Synthetic flood records for 43 small-stream sites aided in the definition of techniques for estimating the magnitude and frequency of floods in Missouri. The long-term synthetic flood records were generated by use of a digital computer model of the rainfall-runoff process. A relatively short period of concurrent rainfall and runoff data observed at each of the 43 sites was used to calibrate the model, and rainfall records covering from 66 to 78 years for four Missouri sites and pan-evaporation data were used to generate the synthetic records. Flood magnitude and frequency characteristics of both the synthetic records and observed long-term flood records available for 109 large-stream sites were used in a multiple-regression analysis to define relations for estimating future flood characteristics at ungaged sites. That analysis indicated that drainage basin size and slope were the most useful estimating variables. It also indicated that a more complex regression model than the commonly used log-linear one was needed for the range of drainage basin sizes available in this study.
Estimation of Flood Discharges at Selected Recurrence Intervals for Streams in New Hampshire
Olson, Scott A.
2009-01-01
This report provides estimates of flood discharges at selected recurrence intervals for streamgages in and adjacent to New Hampshire and equations for estimating flood discharges at recurrence intervals of 2, 5, 10, 25, 50, 100, and 500 years for ungaged, unregulated, rural streams in New Hampshire. The equations were developed using generalized least-squares regression. Flood-frequency and drainage-basin characteristics from 117 streamgages were used in developing the equations. The drainage-basin characteristics used as explanatory variables in the regression equations include drainage area, mean April precipitation, percentage of wetland area, and main channel slope. The average standard errors of prediction for estimating the 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence interval flood discharges with these equations are 30.0, 30.8, 32.0, 34.2, 36.0, 38.1, and 43.4 percent, respectively. Flood discharges at selected recurrence intervals for selected streamgages were computed following the guidelines in Bulletin 17B of the U.S. Interagency Advisory Committee on Water Data. To determine the flood-discharge exceedance probabilities at streamgages in New Hampshire, a new generalized skew coefficient map covering the State was developed. The standard error of the data on the new map is 0.298. To improve estimates of flood discharges at selected recurrence intervals for 20 streamgages with short-term records (10 to 15 years), record extension using the two-station comparison technique was applied. The two-station comparison method uses data from a streamgage with a long-term record to adjust the frequency characteristics at a streamgage with a short-term record. A technique for adjusting a flood-discharge frequency curve computed from a streamgage record with results from the regression equations is described in this report. Also, a technique is described for estimating flood discharge at a selected recurrence interval for an ungaged site upstream or downstream from a streamgage using a drainage-area adjustment. The final regression equations and the flood-discharge frequency data used in this study will be available in StreamStats, a World Wide Web application providing automated regression-equation solutions for user-selected sites on streams.
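The two-station comparison adjusts short-record frequency characteristics using a correlated long-record station. The sketch below uses a MOVE.1-type maintenance-of-variance extension as a simplified stand-in for the report's Bulletin 17B procedure; all names and numbers are illustrative:

```python
import numpy as np

def move1_extend(y_short, x_concurrent, x_full):
    """MOVE.1-style record extension (a simplified stand-in for the
    two-station comparison): log-space values at the short-record site
    (y) are related to a nearby long-record site (x) so that the mean
    and variance observed in the concurrent period are preserved."""
    my, sy = np.mean(y_short), np.std(y_short, ddof=1)
    mx, sx = np.mean(x_concurrent), np.std(x_concurrent, ddof=1)
    r = np.corrcoef(y_short, x_concurrent)[0, 1]
    sign = 1.0 if r >= 0 else -1.0
    return my + sign * (sy / sx) * (x_full - mx)

# y_short: 12 years at the short-record gage; x: the same 12 years plus
# the long station's full record (all values are illustrative log10 cfs).
rng = np.random.default_rng(5)
x_full = rng.normal(3.5, 0.25, 60)
x_conc = x_full[:12]
y_short = 0.9 * x_conc + rng.normal(0.3, 0.08, 12)
y_extended = move1_extend(y_short, x_conc, x_full)
print(y_extended[:5])
```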
NASA Astrophysics Data System (ADS)
Ehmele, Florian; Kunz, Michael
2016-04-01
Several major flood events have occurred in Germany in the past 15-20 years, especially in the eastern parts along the rivers Elbe and Danube. Examples include the major floods of 2002 and 2013, with an estimated loss of about 2 billion Euros each. The last major flood events in the State of Baden-Württemberg in southwest Germany occurred in the years 1978 and 1993/1994 along the rivers Rhine and Neckar, with an estimated total loss of about 150 million Euros (converted) each. Flood hazard originates from a combination of different meteorological, hydrological, and hydraulic processes. Currently there is no defined methodology available for evaluating and quantifying the flood hazard and related risk for larger areas or whole river catchments instead of single gauges. In order to estimate the probable maximum loss for higher return periods (e.g., 200 years; PML200), a stochastic model approach is designed, since observational data are limited in time and space. In our approach, precipitation is linearly composed of three elements: background precipitation, orographically-induced precipitation, and a convectively-driven part. We use the linear theory of orographic precipitation formation for the stochastic precipitation model (SPM), which is based on fundamental statistics of relevant atmospheric variables. For an adequate number of historic flood events, the corresponding atmospheric conditions and parameters are determined in order to calculate a probability density function (pdf) for each variable. This method encompasses scenarios that are theoretically possible but have not yet occurred. This work is part of the FLORIS-SV (FLOod RISk Sparkassen Versicherung) project and establishes the first step of a complete flood risk modelling chain. On the basis of the generated stochastic precipitation event set, hydrological and hydraulic simulations will be performed to estimate discharge and water level. The resulting stochastic flood event set will be used to quantify the flood risk and to estimate the probable maximum loss (e.g., PML200) for a given property portfolio (buildings, industry).
NASA Astrophysics Data System (ADS)
Formetta, Giuseppe; Stewart, Elizabeth; Bell, Victoria; Reynard, Nick
2017-04-01
Estimation of peak discharge for an assigned return period is a crucial issue in engineering hydrology. It is required for designing and managing hydraulic infrastructure such as dams, reservoirs and bridges. In the UK, the Flood Estimation Handbook (FEH) recommends the use of the index flood method to estimate the design flood as the product of a local scale factor (the index flood, IF) and a dimensionless regional growth factor (GF). For gauged catchments the IF is usually estimated as the median annual maximum flood (QMED), while for ungauged catchments it is computed through multiple linear regression models based on a set of morpho-climatic indices of the basin. The GF is estimated by fitting the annual maxima with the generalised logistic distribution (GL) using two methods depending on the record length and the target return period: single-site or pooled analysis. The single-site analysis estimates the GF from the annual maxima of the subject site alone; the pooled analysis uses data from a set of catchments hydrologically similar to the subject site. In this work estimates of floods up to 100-year return period obtained from the FEH approach are compared to those obtained using Grid-to-Grid, a continuous physically-based hydrological model. The model converts rainfall and potential evapotranspiration into river flows by modelling surface/sub-surface runoff, lateral water movements, and snow-pack. It is configured on a 1km2 grid resolution and it uses spatial datasets of topography, soil, and land cover. It was set up in Great Britain and has been evaluated for the period 1960-2014 in forward-mode (i.e. without parameter calibration) using daily meteorological forcing data. The modelled floods with a given return period (5, 10, 30, 50, and 100 years) were computed from the modelled discharge annual maxima and compared to the FEH estimates for 100 catchments in Great Britain. Preliminary results suggest that there is a good agreement between modelled and measured floods with a correlation coefficient that ranges from 0.8 for low return periods to 0.65 for the highest. It is shown that model performance is robust and independent of catchment features such as area and mean annual rainfall. The promising results for Great Britain support the aspiration that continuous simulation from large-scale hydrological models, supported by the increasing availability of global weather, climate and hydrological products, could be used to develop robust methods to help engineers estimate design floods in regions with limited gauge data or affected by environmental change.
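The FEH index-flood calculation that the model is compared against has a compact closed form: the design flood is QMED times a generalised logistic growth factor. A sketch with illustrative pooled parameters beta and k (not values from the study):

```python
def gl_growth_factor(T, beta, k):
    """Growth factor for the generalised logistic distribution,
    normalised so that the factor is 1 at the median (T = 2 years):
    z(T) = 1 + (beta / k) * (1 - ((1 - F) / F) ** k), with F = 1 - 1/T.
    beta and k here are illustrative pooled parameters."""
    F = 1.0 - 1.0 / T
    return 1.0 + (beta / k) * (1.0 - ((1.0 - F) / F) ** k)

qmed = 85.0  # index flood (m^3/s), e.g. the median annual maximum flood
for T in (2, 10, 30, 50, 100):
    q_T = qmed * gl_growth_factor(T, beta=0.25, k=-0.15)
    print(f"T = {T:>3} yr: Q ≈ {q_T:.0f} m^3/s")
```

A negative shape parameter k gives the heavy upper tail typically found in pooled UK analyses, so the growth factor rises steeply at long return periods.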
Documentary evidence of past floods in Europe and their utility in flood frequency estimation
NASA Astrophysics Data System (ADS)
Kjeldsen, T. R.; Macdonald, N.; Lang, M.; Mediero, L.; Albuquerque, T.; Bogdanowicz, E.; Brázdil, R.; Castellarin, A.; David, V.; Fleig, A.; Gül, G. O.; Kriauciuniene, J.; Kohnová, S.; Merz, B.; Nicholson, O.; Roald, L. A.; Salinas, J. L.; Sarauskiene, D.; Šraj, M.; Strupczewski, W.; Szolgay, J.; Toumazis, A.; Vanneuville, W.; Veijalainen, N.; Wilson, D.
2014-09-01
This review outlines the use of documentary evidence of historical flood events in contemporary flood frequency estimation in European countries. The study shows that despite widespread consensus in the scientific literature on the utility of documentary evidence, the actual migration from academic to practical application has been limited. A detailed review of flood frequency estimation guidelines from different countries showed that the value of historical data is generally recognised, but practical methods for systematic and routine inclusion of this type of data into risk analysis are in most cases not available. Studies of historical events were identified in most countries, and good examples of national databases attempting to collate the available information were identified. The conclusion is that there is considerable potential for improving the reliability of the current flood risk assessments by harvesting the valuable information on past extreme events contained in the historical data sets.
Strategically placing green infrastructure: cost-effective land conservation in the floodplain.
Kousky, Carolyn; Olmstead, Sheila M; Walls, Margaret A; Macauley, Molly
2013-04-16
Green infrastructure approaches have attracted increased attention from local governments as a way to lower flood risk and provide an array of other environmental services. The peer-reviewed literature, however, offers few estimates of the economic impacts of such approaches at the watershed scale. We estimate the avoided flood damages and the costs of preventing development of floodplain parcels in the East River Watershed of Wisconsin's Lower Fox River Basin. Results suggest that the costs of preventing conversion of all projected floodplain development would exceed the flood damage mitigation benefits by a substantial margin. However, targeting of investments to high-benefit, low-cost parcels can reverse this equation, generating net benefits. The analysis demonstrates how any flood-prone community can use a geographic-information-based model to estimate the flood damage reduction benefits of green infrastructure, compare them to the costs, and target investments to design cost-effective nonstructural flood damage mitigation policies.
Variable screening via quantile partial correlation
Ma, Shujie; Tsai, Chih-Ling
2016-01-01
In quantile linear regression with ultra-high dimensional data, we propose an algorithm for screening all candidate variables and subsequently selecting relevant predictors. Specifically, we first employ quantile partial correlation for screening, and then we apply the extended Bayesian information criterion (EBIC) for best subset selection. Our proposed method can successfully select predictors when the variables are highly correlated, and it can also identify variables that make a contribution to the conditional quantiles but are marginally uncorrelated or weakly correlated with the response. Theoretical results show that the proposed algorithm can yield the sure screening set. By controlling the false selection rate, model selection consistency can be achieved theoretically. In practice, we propose using EBIC for best subset selection so that the resulting model is screening consistent. Simulation studies demonstrate that the proposed algorithm performs well, and an empirical example is presented. PMID:28943683
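A simplified flavour of the screening step (marginal quantile correlation rather than the paper's quantile partial correlation, and without the EBIC subset-selection stage) can be sketched as follows:

```python
import numpy as np

def quantile_correlation(x, y, tau):
    """Marginal quantile correlation, a simplified screening statistic
    in the spirit of quantile-based screening: the correlation between
    x and psi_tau(y - Q_tau(y)), where psi_tau(u) = tau - 1{u < 0}."""
    psi = tau - (y < np.quantile(y, tau)).astype(float)
    return np.mean((x - x.mean()) * psi) / np.sqrt(
        np.var(x) * tau * (1 - tau))

rng = np.random.default_rng(9)
n, p = 500, 200
X = rng.normal(size=(n, p))
# Only the first three predictors matter; the third acts in the tail only
# (it scales the noise), so mean-based screening would tend to miss it.
y = 1.5 * X[:, 0] - X[:, 1] + X[:, 2] * rng.normal(size=n)

scores = [abs(quantile_correlation(X[:, j], y, tau=0.75)) for j in range(p)]
top = np.argsort(scores)[::-1][:10]
print("top-ranked predictors:", top)  # screening keeps a small active set
```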
Design flood estimation in ungauged basins: probabilistic extension of the design-storm concept
NASA Astrophysics Data System (ADS)
Berk, Mario; Špačková, Olga; Straub, Daniel
2016-04-01
Design flood estimation in ungauged basins is an important hydrological task, which in engineering practice is typically solved with the design storm concept. However, neglecting the uncertainty in the hydrological response of the catchment through the assumption of average-recurrence-interval (ARI) neutrality between rainfall and runoff can lead to flawed design flood estimates. Additionally, selecting a single critical rainfall duration neglects the contribution of other rainfall durations to the probability of extreme flood events. In this study, the design flood problem is approached with concepts from structural reliability that enable a consistent treatment of multiple uncertainties in estimating the design flood. The uncertainties of key model parameters are represented probabilistically, and the First-Order Reliability Method (FORM) is used to compute the flood exceedance probability. As an important by-product, the FORM analysis provides the most likely parameter combination leading to a flood with a certain exceedance probability; i.e., it enables one to find representative scenarios for, e.g., a 100-year or a 1000-year flood. Possible different rainfall durations are incorporated by formulating the event of a given design flood as a series system. The method is directly applicable in practice, since the description of the rainfall depth-duration characteristics requires the same inputs as classical design storm methods, which are commonly provided by meteorological services. The proposed methodology is applied to a case study of the Trauchgauer Ach catchment in Bavaria; SCS Curve Number (CN) and unit hydrograph models are used for modeling the hydrological process. The results indicate, in accordance with past experience, that the traditional design storm concept underestimates design floods.
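The core FORM computation can be sketched with a generic optimizer: find the point on the limit-state surface closest to the origin in standard-normal space; its distance is the reliability index, and it doubles as the most likely flood scenario. Everything in the sketch below (the runoff model, distributions, and threshold) is illustrative, not the paper's catchment model:

```python
import numpy as np
from scipy import optimize, stats

def peak_flow(u):
    # Map standard-normal variables u = (u1, u2) to physical inputs:
    # a lognormal runoff coefficient and a Gumbel rainfall depth, then
    # apply a crude linear catchment response. All parameters invented.
    runoff_coef = stats.lognorm.ppf(stats.norm.cdf(u[0]), s=0.3, scale=0.45)
    rain_mm = stats.gumbel_r.ppf(stats.norm.cdf(u[1]), loc=60.0, scale=18.0)
    return runoff_coef * rain_mm * 2.5

q_design = 180.0  # flood threshold of interest (m^3/s)
g = lambda u: q_design - peak_flow(u)  # g <= 0: the flood is exceeded

# FORM: minimize ||u||^2 subject to g(u) = 0 (SLSQP handles the equality
# constraint); beta is the distance to the design point.
res = optimize.minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
                        constraints={"type": "eq", "fun": g})
beta = np.sqrt(res.fun)
print(f"beta = {beta:.2f}, exceedance probability ≈ {stats.norm.cdf(-beta):.4f}")
print("design point (most likely flood scenario):", res.x)
```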
Risk to life due to flooding in post-Katrina New Orleans
NASA Astrophysics Data System (ADS)
Miller, A.; Jonkman, S. N.; Van Ledden, M.
2014-01-01
After the catastrophic flooding of New Orleans by Hurricane Katrina in 2005, the city's hurricane protection system was improved to provide protection against a hurricane load with a 1/100 per year exceedance frequency. This paper investigates the risk to life in post-Katrina New Orleans. In a risk-based approach, the probabilities and consequences of various flood scenarios have been analyzed for the central area of the city (the metro bowl) to give a preliminary estimate of the risk to life in the post-Katrina situation. A two-dimensional hydrodynamic model has been used to simulate the flood characteristics of various breaches. The model for estimating fatality rates is based on the loss-of-life data from Hurricane Katrina. Results indicate that, depending on the flood scenario, the estimated loss of life in case of flooding ranges from about 100 to nearly 500, with the highest life loss due to breaching of the river levees leading to large flood depths. The probability and consequence estimates are combined to determine the individual risk and societal risk for New Orleans. When compared to the risks of other large-scale engineering systems (e.g., other flood-prone areas, dams, and the nuclear sector) and acceptable risk criteria found in the literature, the risks for the metro bowl are found to be relatively high. Thus, despite major improvements to the flood protection system, the flood risk of post-Katrina New Orleans is still expected to be significant. Effects of reduction strategies on the risk level are discussed as a basis for further evaluation.
Fifty-year flood-inundation maps for Catacamas, Honduras
Kresch, David L.; Mastin, Mark C.; Olsen, T.D.
2002-01-01
After the devastating floods caused by Hurricane Mitch in 1998, maps of the areas and depths of the 50-year-flood inundation at 15 municipalities in Honduras were prepared as a tool for agencies involved in reconstruction and planning. This report, which is one in a series of 15, presents maps of areas in the municipality of Catacamas that would be inundated by a 50-year flood of the Rio Catacamas. Geographic Information System (GIS) coverages of the flood inundation are available on a computer in the municipality of Catacamas as part of the Municipal GIS project and on the Internet at the Flood Hazard Mapping Web page (http://mitchnts1.cr.usgs.gov/projects/floodhazard.html). These coverages allow users to view the flood inundation in much more detail than is possible using the maps in this report. Water-surface elevations for a 50-year flood on the Rio Catacamas at Catacamas were estimated using HEC-RAS, a one-dimensional, steady-flow, step-backwater computer program. The channel and floodplain cross sections used in HEC-RAS were developed from an airborne light-detection-and-ranging (LIDAR) topographic survey of the area. The 50-year-flood discharge for the Rio Catacamas at Catacamas, 216 cubic meters per second, was estimated using a regression equation that relates the 50-year-flood discharge to drainage area and mean annual precipitation, because there are no long-term stream-gaging stations on the river from which to estimate the discharge. The drainage area and mean annual precipitation estimated for the Rio Catacamas at Catacamas are 45.4 square kilometers and 1,773 millimeters, respectively.
Technique for estimating depth of 100-year floods in Tennessee
Gamble, Charles R.; Lewis, James G.
1977-01-01
Preface: A method is presented for estimating the depth of the 100-year flood in four hydrologic areas in Tennessee. Depths at 151 gaging stations on streams that were not significantly affected by man-made changes were related to basin characteristics by multiple-regression techniques. Equations derived from the analysis can be used to estimate the depth of the 100-year flood if the size of the drainage basin is known.
1998-03-01
...benefit estimation techniques used to monetize the value of flood hazard reduction in the City of Roanoke. Each method was then used to estimate ... behavior. This framework justifies interpreting people's choices to infer and then monetize their preferences. If individuals have well-ordered and ... Journal of Agricultural Economics, 68 (1986) 2: 280-290. Soule, Don M., and Claude M. Vaughn, "Flood Protection Benefits as Reflected in Property ...
Non-inferiority tests for anti-infective drugs using control group quantiles.
Fay, Michael P; Follmann, Dean A
2016-12-01
In testing for non-inferiority of anti-infective drugs, the primary endpoint is often the difference in the proportion of failures between the test and control group at a landmark time. The landmark time is chosen to approximately correspond to the qth historic quantile of the control group, and the non-inferiority margin is selected to be reasonable for the target level q. For designing these studies, a troubling issue is that the landmark time must be pre-specified, but there is no guarantee that the proportion of control failures at the landmark time will be close to the target level q. If the landmark time is far from the target control quantile, then the pre-specified non-inferiority margin may no longer be reasonable. Exact variable margin tests have been developed by Röhmel and Kieser to address this problem, but these tests can have poor power if the observed control failure rate at the landmark time is far from its historic value. We develop a new variable margin non-inferiority test where we continue sampling until a pre-specified proportion of failures, q, have occurred in the control group, where q is the target quantile level. The test does not require any assumptions on the failure time distributions, and hence no knowledge of the true qth control quantile for the study is needed. Our new test is exact and has power comparable to (or greater than) its competitors when the true control quantile from the study equals (or differs moderately from) its historic value. Our nivm R package performs the test and gives confidence intervals on the difference in failure rates at the true target control quantile. The tests can be applied to time to cure or other numeric variables as well. A substantial proportion of new anti-infective drugs being developed use non-inferiority tests in their development, and typically a pre-specified landmark time and its associated difference margin are set at the design stage to match a specific target control quantile. If, through a changing standard of care or the selection of a different population, the target quantile for the control group changes from its historic value, then the appropriateness of the pre-specified margin at the landmark time may be questionable. Our proposed test avoids this problem by sampling until a pre-specified proportion of the controls have failed. © The Author(s) 2016.
Estimates of Present and Future Flood Risk in the Conterminous United States
NASA Astrophysics Data System (ADS)
Wing, O.; Bates, P. D.; Smith, A.; Sampson, C. C.; Johnson, K.; Fargione, J.; Morefield, P.
2017-12-01
Past attempts to estimate flood risk across the USA either have incomplete coverage, coarse resolution, or use overly simplified models of the flooding process. In this paper, we use a new 30-m-resolution model of the entire conterminous US (CONUS) with realistic flood physics to produce estimates of flood hazard that match the skill of local models built with detailed data to within 90% accuracy. Socio-economic data of commensurate resolution are combined with these flood depths to estimate current and future flood risk. Future population and land-use projections from the US Environmental Protection Agency (USEPA) are employed to indicate how flood risk might change through the 21st century, while present-day estimates utilize the Federal Emergency Management Agency (FEMA) National Structure Inventory and a USEPA map of population distribution. Our data show that the total CONUS population currently exposed to serious flooding is 2.6 to 3.1 times higher than previous estimates, with nearly 41 million Americans living within the so-called 1-in-100-year (1% annual probability) floodplain, compared to only 13 million according to FEMA flood maps. Moreover, socio-economic change alone leads to significant future increases in flood exposure and risk, even before climate change impacts are accounted for. The share of the population living on the 1-in-100-year floodplain is projected to increase from 13.3% in the present day to 15.6-15.8% in 2050 and 16.4-16.8% in 2100. The area of developed land within this floodplain, currently at 150,000 km2, is likely to increase by 37-72% by 2100 based on the scenarios selected. $5.5 trillion worth of assets currently lies on the 1% floodplain; we project that by 2100 this figure will exceed $10 trillion. With this detailed spatial information on present-day flood risk, federal and state agencies can take appropriate action to mitigate losses. Use of the USEPA population and land-use projections means that particular attention can be paid to floodplains where development is projected. Steps to conserve such areas or ensure adequate defenses are in place could avoid the exposure of trillions of dollars of assets, not to mention the human suffering caused by loss of property and life.
Fan, Qin; Davlasheridze, Meri
2016-06-01
Climate change is expected to worsen the negative effects of natural disasters like floods. The negative impacts, however, can be mitigated by individuals' adjustments through migration and relocation behaviors. Previous literature has identified flood risk as one significant driver in relocation decisions, but no prior study examines the effect of the National Flood Insurance Program's voluntary program-the Community Rating System (CRS)-on residential location choice. This article fills this gap and tests the hypothesis that flood risk and the CRS-creditable flood control activities affect residential location choices. We employ a two-stage sorting model to empirically estimate the effects. In the first stage, individuals' risk perception and preference heterogeneity for the CRS activities are considered, while mean effects of flood risk and the CRS activities are estimated in the second stage. We then estimate heterogeneous marginal willingness to pay (WTP) for the CRS activities by category. Results show that age, ethnicity and race, educational attainment, and prior exposure to risk explain risk perception. We find significant values for the CRS-creditable mitigation activities, which provides empirical evidence for the benefits associated with the program. The marginal WTP for an additional credit point earned for public information activities, including hazard disclosure, is found to be the highest. Results also suggest that water amenities dominate flood risk. Thus, high amenity values may increase exposure to flood risk, and flood mitigation projects should be strategized in coastal regions accordingly. © 2015 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Apel, Heiko; Martínez Trepat, Oriol; Nghia Hung, Nguyen; Thi Chinh, Do; Merz, Bruno; Viet Dung, Nguyen
2016-04-01
Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either a fluvial or pluvial flood hazard, studies of a combined fluvial and pluvial flood hazard are hardly available. Thus, this study aims to analyse a fluvial and a pluvial flood hazard individually, but also to develop a method for the analysis of a combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. The fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for a 2-dimensional hydrodynamic inundation simulation for Can Tho city. The pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rainstorm generator. Inundation for all flood scenarios was simulated by a 2-dimensional hydrodynamic model implemented on a Graphics Processing Unit (GPU) for time-efficient flood propagation modelling. The combined fluvial-pluvial flood scenarios were derived by adding rainstorms to the fluvial flood events during the highest fluvial water levels. The probabilities of occurrence of the combined events were determined assuming independence of the two flood types and taking the seasonality and probability of coincidence into account. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation taking into account the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with the expectation (median) and the uncertainty shown in percentile maps. The results are critically discussed and their usage in flood risk management is outlined.
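As a toy illustration of the independence assumption described above (not the paper's copula- and simulation-based procedure), the annual probability of a combined event can be sketched as the product of the two marginal exceedance probabilities, thinned by the fraction of the year in which the high-water and storm seasons overlap; all numbers below are illustrative.

```python
# Toy combined fluvial-pluvial probability under independence.
def combined_annual_probability(p_fluvial, p_pluvial, overlap_fraction):
    # Scale the joint probability by the chance that the rainstorm
    # actually falls within the high-water window of the year.
    return p_fluvial * p_pluvial * overlap_fraction

# e.g. 10-year flood, 10-year rainstorm, 3-month coincidence window:
print(combined_annual_probability(0.1, 0.1, 0.25))  # -> 0.0025 per year
```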
Olson, Scott A.; with a section by Veilleux, Andrea G.
2014-01-01
This report provides estimates of flood discharges at selected annual exceedance probabilities (AEPs) for streamgages in and adjacent to Vermont and equations for estimating flood discharges at AEPs of 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent (recurrence intervals of 2-, 5-, 10-, 25-, 50-, 100-, 200-, and 500-years, respectively) for ungaged, unregulated, rural streams in Vermont. The equations were developed using generalized least-squares regression. Flood-frequency and drainage-basin characteristics from 145 streamgages were used in developing the equations. The drainage-basin characteristics used as explanatory variables in the regression equations include drainage area, percentage of wetland area, and the basin-wide mean of the average annual precipitation. The average standard errors of prediction for estimating the flood discharges at the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent AEP with these equations are 34.9, 36.0, 38.7, 42.4, 44.9, 47.3, 50.7, and 55.1 percent, respectively. Flood discharges at selected AEPs for streamgages were computed by using the Expected Moments Algorithm. To improve estimates of the flood discharges for given exceedance probabilities at streamgages in Vermont, a new generalized skew coefficient was developed. The new generalized skew for the region is a constant, 0.44. The mean square error of the generalized skew coefficient is 0.078. This report describes a technique for using results from the regression equations to adjust an AEP discharge computed from a streamgage record. This report also describes a technique for using a drainage-area adjustment to estimate flood discharge at a selected AEP for an ungaged site upstream or downstream from a streamgage. The final regression equations and the flood-discharge frequency data used in this study will be available in StreamStats. StreamStats is a World Wide Web application providing automated regression-equation solutions for user-selected sites on streams.
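The drainage-area adjustment mentioned above can be sketched as a simple power-law transfer of a gage discharge to a nearby ungaged site. This is a generic form of the technique, not the report's exact procedure; the exponent b and all numbers are illustrative assumptions.

```python
# Generic drainage-area adjustment for transferring an AEP discharge
# from a streamgage to an ungaged site upstream or downstream.
def area_adjusted_discharge(q_gage, area_gage_km2, area_ungaged_km2, b=0.7):
    """Q_ungaged = Q_gage * (A_ungaged / A_gage)^b."""
    return q_gage * (area_ungaged_km2 / area_gage_km2) ** b

# e.g. a 1-percent AEP discharge of 250 m^3/s at a gage draining 300 km^2,
# transferred to a site with 20% more drainage area:
print(area_adjusted_discharge(250.0, 300.0, 360.0))  # ~ 284 m^3/s
```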
Quantiles for Finite Mixtures of Normal Distributions
ERIC Educational Resources Information Center
Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.
2006-01-01
Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
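A short sketch of the computation this abstract describes may help: the quantile of a finite mixture of normals has no closed form and is obtained by numerically inverting the mixture CDF (it is not a weighted combination of the component quantiles). The weights and component parameters below are illustrative.

```python
# Quantile of a finite normal mixture by numerical inversion of its CDF.
import numpy as np
from scipy import stats, optimize

def mixture_quantile(q, weights, means, sds):
    """Invert F(x) = sum_k w_k * Phi((x - mu_k) / sigma_k) by root finding."""
    weights, means, sds = map(np.asarray, (weights, means, sds))
    cdf = lambda x: float(weights @ stats.norm.cdf(x, loc=means, scale=sds))
    lo = float(np.min(means - 10 * sds))   # brackets wide enough for q in (0, 1)
    hi = float(np.max(means + 10 * sds))
    return optimize.brentq(lambda x: cdf(x) - q, lo, hi)

# Two-component example: the 0.95 quantile of 0.7*N(0,1) + 0.3*N(4,2).
print(mixture_quantile(0.95, [0.7, 0.3], [0.0, 4.0], [1.0, 2.0]))
```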
Merging information from multi-model flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
Multi-model ensembles are becoming widely accepted for flood frequency change analysis. The use of multiple models results in large uncertainty around estimates of flood magnitudes, due to both uncertainty in model selection and natural variability of river flow. The challenge is therefore to extract the most meaningful signal from the multi-model predictions, accounting for both model quality and uncertainties in individual model estimates. The study demonstrates the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach facilitates explicit treatment of shared multi-model discrepancy as well as the probabilistic nature of the flood estimates, by treating the available models as a sample from a hypothetical complete (but unobserved) set of models. The advantages of the approach are: 1) to ensure adequate 'baseline' conditions with which to compare future changes; 2) to reduce flood estimate uncertainty; 3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; 4) to adjust multi-model consistency criteria when model biases are large; and 5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
NASA Astrophysics Data System (ADS)
Akyurek, Z.; Bozoglu, B.; Girayhan, T.
2015-12-01
Flooding has the potential to significantly affect economic activities as well as to disrupt or displace populations. Changing climate regimes, such as extreme precipitation events, increase flood vulnerability and put additional stresses on infrastructure. In this study, flood modelling is carried out for an urbanized area, namely Samsun-Terme in the Black Sea region of Turkey. MIKE21 with a flexible grid is used in 2-dimensional shallow-water flow modelling. 1:1000-scale maps including the buildings for the urbanized area and 1:5000-scale maps for the rural parts are used to obtain the DTM needed in the flood modelling. The bathymetry of the river is obtained from additional surveys. The main river passing through the urbanized area has a capacity of Q5, according to the design discharge obtained by a simple ungauged discharge estimation that depends on catchment area only. The effects of existing structures, like bridges across the river, on the flooding are presented. The upstream structural measures are studied on a scenario basis. Four sub-catchments of the Terme River are considered as contributing to the downstream flooding. The existing circumstances of the Terme River indicate that the meanders of the river have a major effect on the flood situation and lead to an approximately 35% reduction in the peak discharge between the upstream and downstream of the river. It is observed that if the flow from the upstream catchments can be retarded through a detention pond constructed in at least two of the upstream catchments, the estimated Q100 flood can be conveyed by the river without overtopping the river channel. The operation of the upstream detention ponds and the scenarios to convey Q500 without causing flooding are also presented. Structural management measures to address changes in flood characteristics in water management planning are discussed. Flood risk is obtained by using the flood hazard maps and water depth-damage functions plotted for a variety of building types and occupancies. The estimated mean annual hazard for the area is calculated as $340,000, and it is estimated that the upstream structural management measures can decrease the direct economic risk by 11% for the 500-year return-period flood.
Uncertainty and sensitivity assessment of flood risk assessments
NASA Astrophysics Data System (ADS)
de Moel, H.; Aerts, J. C.
2009-12-01
Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based flood management approach. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments assess flood risks in monetary terms (damage estimated for specific situations or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This results from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo-based approach. Furthermore, the total uncertainty is also attributed to the different input parameters using a variance-based sensitivity analysis. Assessing and visualizing the uncertainties of the final risk estimate will help decision makers make better-informed decisions, and attributing this uncertainty to the input parameters helps to identify which parameters are most important when it comes to uncertainty in the final estimate and should therefore receive additional attention in further research.
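In the spirit of the Monte Carlo and variance-based analysis described above (a simplified sketch, not the authors' model), the snippet below propagates three illustrative uncertain inputs of a monetary risk estimate and attributes output variance with a crude binning estimator of the first-order index S_i = Var(E[Y|X_i]) / Var(Y). All distributions are assumptions.

```python
# Monte Carlo propagation plus a crude first-order sensitivity estimate.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Uncertain inputs: flood probability, exposed value, damage fraction.
p_flood = rng.uniform(0.005, 0.02, n)          # annual exceedance probability
exposure = rng.lognormal(np.log(5e8), 0.3, n)  # value at risk (EUR)
dmg_frac = rng.beta(2, 8, n)                   # depth-damage uncertainty

risk = p_flood * exposure * dmg_frac           # annual damage estimate

def first_order_index(x, y, bins=50):
    """Var(E[Y|X]) / Var(Y), estimated by averaging y within quantile bins of x."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.bincount(idx, minlength=bins)
    var_cond = np.average((cond_means - y.mean()) ** 2, weights=counts)
    return var_cond / y.var()

for name, x in [("p_flood", p_flood), ("exposure", exposure), ("dmg_frac", dmg_frac)]:
    print(name, round(first_order_index(x, risk), 3))
```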
How are flood risk estimates affected by the choice of return-periods?
NASA Astrophysics Data System (ADS)
Ward, P. J.; de Moel, H.; Aerts, J. C. J. H.
2011-12-01
Flood management is increasingly adopting a risk-based approach, whereby flood risk is the product of the probability and consequences of flooding. One of the most common approaches in flood risk assessment is to estimate the damage that would occur for floods of several exceedance probabilities (or return periods), to plot these on an exceedance probability-loss curve (risk curve), and to estimate risk as the area under the curve. However, there is little insight into how the selection of the return periods (which ones and how many) used to calculate risk actually affects the final risk estimate. To gain such insights, we developed and validated an inundation model capable of rapidly simulating inundation extent and depth, and dynamically coupled this to an existing damage model. The method was applied to a section of the River Meuse in the southeast of the Netherlands. Firstly, we estimated risk based on a risk curve using yearly return periods from 2 to 10 000 yr (€ 34 million p.a.). We found that the overall risk is greatly affected by the number of return periods used to construct the risk curve, with over-estimations of annual risk between 33% and 100% when only three return periods are used. In addition, binary assumptions on dike failure can have a large effect (a factor-of-two difference) on risk estimates. Also, the minimum and maximum return periods considered in the curve affect the risk estimate considerably. The results suggest that more research is needed to develop relatively simple inundation models that can be used to produce large numbers of inundation maps, complementary to more complex 2-D/3-D hydrodynamic models. It also suggests that research into flood risk could benefit by paying more attention to the damage caused by relatively high-probability floods.
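The risk-curve calculation discussed above can be sketched in a few lines: damages at a handful of return periods are integrated over exceedance probability with the trapezoidal rule, and using fewer return periods visibly shifts the estimate. The damage figures below are illustrative, not those of the Meuse case study.

```python
# Expected annual damage as the area under the probability-loss curve.
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Trapezoidal area under the exceedance-probability/loss curve."""
    p = 1.0 / np.asarray(return_periods, dtype=float)   # exceedance probabilities
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                               # integrate over increasing p
    return np.trapz(d[order], p[order])

coarse = expected_annual_damage([10, 100, 1000], [5e6, 4e7, 1.2e8])
fine = expected_annual_damage([2, 5, 10, 50, 100, 500, 1000],
                              [0, 1e6, 5e6, 2.5e7, 4e7, 9e7, 1.2e8])
print(coarse, fine)  # the coarse three-point curve over-estimates the risk here
```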
Flood-frequency relations for urban streams in Georgia; 1994 update
Inman, Ernest J.
1995-01-01
A statewide study of flood magnitude and frequency in urban areas of Georgia was made to develop methods of estimating flood characteristics at ungaged urban sites. A knowledge of the magnitude and frequency of floods is needed for the design of highway drainage structures, establishing flood-insurance rates, and other uses by urban planners and engineers. A U.S. Geological Survey rainfall-runoff model was calibrated for 65 urban drainage basins ranging in size from 0.04 to 19.1 square miles in 10 urban areas of Georgia. Rainfall-runoff data were collected for a period of 5 to 7 years at each station, beginning in 1973 in Metropolitan Atlanta and ending in 1993 in Thomasville, Ga. Calibrated models were used to synthesize long-term annual flood peak discharges for these basins from existing long-term rainfall records. The 2- to 500-year flood-frequency estimates were developed for each basin by fitting a Pearson Type III frequency distribution curve to the logarithms of these annual peak discharges. Multiple-regression analyses were used to define relations between the station flood-frequency data and several physical basin characteristics, of which drainage area and total impervious area were the most statistically significant. Using these regression equations and basin characteristics, the magnitude and frequency of floods at ungaged urban basins can be estimated throughout Georgia.
Valuing preferences over stormwater management outcomes including improved hydrologic function
NASA Astrophysics Data System (ADS)
Londoño Cadavid, Catalina; Ando, Amy W.
2013-07-01
Stormwater runoff causes environmental problems such as flooding, soil erosion, and water pollution. Conventional stormwater management has focused primarily on flood reduction, while a new generation of decentralized stormwater solutions yields ancillary benefits such as healthier aquatic habitat, improved surface water quality, and increased water table recharge. Previous research has estimated values for flood reduction from stormwater management, but no estimates exist for the willingness to pay (WTP) for some of the other environmental benefits of alternative approaches to stormwater control. This paper uses a choice experiment survey of households in Champaign-Urbana, Illinois, to estimate the values of several attributes of stormwater management outcomes. We analyzed data from 131 surveyed households in randomly selected neighborhoods. We find that people value reduced basement flooding more than reductions in yard or street flooding, but WTP for basement flood reduction in the area only exists if individuals are currently experiencing significant flooding themselves. Citizens value both improved water quality and improved hydrologic function and aquatic habitat from runoff reduction. Thus, widespread investment in low impact development stormwater solutions could have very large total benefits, and stormwater managers should be wary of policies and infrastructure plans that reduce flooding at the expense of water quality and aquatic habitat.
Flood return level analysis of Peaks over Threshold series under changing climate
NASA Astrophysics Data System (ADS)
Li, L.; Xiong, L.; Hu, T.; Xu, C. Y.; Guo, S.
2016-12-01
Obtaining insights into future flood estimation is of great significance for water planning and management. Traditional flood return level analysis with the stationarity assumption has been challenged by changing environments. A method that takes the nonstationary context into consideration has been extended to derive flood return levels for Peaks over Threshold (POT) series. For POT series, a Poisson distribution is normally assumed to describe the arrival rate of exceedance events, but this distribution assumption has at times been reported as invalid. The Negative Binomial (NB) distribution is therefore proposed as an alternative to the Poisson distribution assumption. Flood return levels were extrapolated in a nonstationary context for the POT series of the Weihe basin, China, under future climate scenarios. The results show that flood return levels estimated under nonstationarity can differ depending on whether a Poisson or an NB distribution is assumed. The difference is found to be related to the threshold value of the POT series. The study indicates the importance of distribution selection in flood return level analysis under nonstationarity and provides a reference on the impact of climate change on flood estimation in the Weihe basin for the future.
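A sketch of POT return levels under the two exceedance-count models contrasted above: with annual exceedance count N and GPD-distributed excesses with CDF F, the annual maximum M satisfies P(M <= z) = G_N(F(z)), where G_N is the probability generating function of N, and the T-year level solves G_N(F(z)) = 1 - 1/T. The threshold and parameter values below are illustrative, not the Weihe basin fit, and the sketch is stationary for simplicity.

```python
# POT return levels with Poisson vs Negative Binomial exceedance counts.
import numpy as np
from scipy import optimize, stats

u, sigma, xi = 500.0, 120.0, 0.1   # POT threshold and GPD scale/shape
lam = 3.0                          # mean number of exceedances per year

def pgf_poisson(s, lam=lam):
    return np.exp(lam * (s - 1.0))

def pgf_negbin(s, lam=lam, r=1.5):  # NB with mean lam and dispersion r
    return (r / (r + lam * (1.0 - s))) ** r

def return_level(T, pgf):
    f = lambda z: pgf(stats.genpareto.cdf(z - u, c=xi, scale=sigma)) - (1 - 1 / T)
    return optimize.brentq(f, u + 1e-6, u + 100 * sigma)

for T in (10, 50, 100):
    print(T, round(return_level(T, pgf_poisson), 1),
          round(return_level(T, pgf_negbin), 1))  # the two assumptions diverge
```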
Combining information from multiple flood projections in a hierarchical Bayesian framework
NASA Astrophysics Data System (ADS)
Le Vine, Nataliya
2016-04-01
This study demonstrates, in the context of flood frequency analysis, the potential of a recently proposed hierarchical Bayesian approach to combine information from multiple models. The approach explicitly accommodates shared multimodel discrepancy as well as the probabilistic nature of the flood estimates, and treats the available models as a sample from a hypothetical complete (but unobserved) set of models. The methodology is applied to flood estimates from multiple hydrological projections (the Future Flows Hydrology data set) for 135 catchments in the UK. The advantages of the approach are shown to be: (1) to ensure adequate "baseline" with which to compare future changes; (2) to reduce flood estimate uncertainty; (3) to maximize use of statistical information in circumstances where multiple weak predictions individually lack power, but collectively provide meaningful information; (4) to diminish the importance of model consistency when model biases are large; and (5) to explicitly consider the influence of the (model performance) stationarity assumption. Moreover, the analysis indicates that reducing shared model discrepancy is the key to further reduction of uncertainty in the flood frequency analysis. The findings are of value regarding how conclusions about changing exposure to flooding are drawn, and to flood frequency change attribution studies.
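A simplified empirical-Bayes analogue of this kind of multimodel combination can be sketched as random-effects pooling: each projection supplies an estimate with a sampling variance, a shared between-model discrepancy tau^2 is estimated by the DerSimonian-Laird moment method, and the estimates are pooled with discrepancy-inflated weights. This is a stand-in for, not a reproduction of, the paper's full hierarchical Bayesian model; all numbers are illustrative.

```python
# Random-effects pooling of multimodel flood-quantile estimates.
import numpy as np

def combine(m, v):
    m, v = np.asarray(m, float), np.asarray(v, float)
    w = 1.0 / v                                    # fixed-effect weights
    m_fe = np.sum(w * m) / np.sum(w)
    q = np.sum(w * (m - m_fe) ** 2)                # heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(m) - 1)) / c)        # shared between-model discrepancy
    w_re = 1.0 / (v + tau2)                        # discrepancy-inflated weights
    est = np.sum(w_re * m) / np.sum(w_re)
    return est, np.sqrt(1.0 / np.sum(w_re)), tau2

# Five projections of a 100-year flood (m^3/s) with their variances:
est, se, tau2 = combine([820, 760, 905, 840, 780],
                        [40**2, 55**2, 60**2, 35**2, 50**2])
print(f"combined: {est:.0f} +/- {se:.0f}, tau^2 = {tau2:.0f}")
```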
Techniques for estimating magnitude and frequency of floods on streams in Indiana
Glatfelter, D.R.
1984-01-01
A rainfall-runoff model was used to synthesize long-term peak data at 11 gaged locations on small streams. Flood-frequency curves developed from the long-term synthetic data were combined with curves based on short-term observed data to provide weighted estimates of flood magnitude and frequency at the rainfall-runoff stations.
Pruess, J.; Wohl, E.E.; Jarrett, R.D.
1998-01-01
Historical and geologic records may be used to enhance magnitude estimates for extreme floods along mountain channels, as demonstrated in this study from the San Juan Mountains of Colorado. Historical photographs and local newspaper accounts from the October 1911 flood indicate the likely extent of flooding and damage. A checklist designed to organize and numerically score evidence of flooding was used in 15 field reconnaissance surveys in the upper Animas River valley of southwestern Colorado. Step-backwater flow modeling estimated the discharges necessary to create longitudinal flood bars observed at 6 additional field sites. According to these analyses, maximum unit discharge peaks at approximately 1.3 m³ s⁻¹ km⁻² around 2200 m elevation, with decreased unit discharges at both higher and lower elevations. These results (1) are consistent with Jarrett's (1987, 1990, 1993) maximum 2300-m elevation limit for flash flooding in the Colorado Rocky Mountains, and (2) suggest that current Probable Maximum Flood (PMF) estimates based on a 24-h rainfall of 30 cm at elevations above 2700 m are unrealistically large. The methodology used for this study should be readily applicable to other mountain regions where systematic streamflow records are of short duration or nonexistent.
Spatial quantile regression using INLA with applications to childhood overweight in Malawi.
Mtambo, Owen P L; Masangwi, Salule J; Kazembe, Lawrence N M
2015-04-01
Analyses of childhood overweight have mainly used mean regression. However, quantile regression is more appropriate, as it provides the flexibility to analyse the determinants of overweight corresponding to quantiles of interest. The main objective of this study was to fit a Bayesian additive quantile regression model with structured spatial effects for childhood overweight in Malawi using the 2010 Malawi DHS data. Inference was fully Bayesian using the R-INLA package. The significant determinants of childhood overweight ranged from socio-demographic factors, such as type of residence, to child and maternal factors, such as child age and maternal BMI. We observed significant positive structured spatial effects on childhood overweight in some districts of Malawi. We recommend that childhood malnutrition policy makers consider timely interventions based on the risk factors identified in this paper, including spatial targeting of interventions. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ying, Yung-Hsiang; Wu, Chin-Chih; Chang, Koyin
2013-01-01
To understand the impact of drinking and driving laws on drinking and driving fatality rates, this study explored the different effects these laws have on areas with varying severity rates for drinking and driving. Unlike previous studies, this study employed quantile regression analysis. Empirical results showed that policies based on local conditions must be used to effectively reduce drinking and driving fatality rates; that is, different measures should be adopted to target the specific conditions in various regions. For areas with low fatality rates (low quantiles), people’s habits and attitudes toward alcohol should be emphasized instead of transportation safety laws because “preemptive regulations” are more effective. For areas with high fatality rates (or high quantiles), “ex-post regulations” are more effective, and impact these areas approximately 0.01% to 0.05% more than they do areas with low fatality rates. PMID:24084673
Microarray image analysis: background estimation using quantile and morphological filters.
Bengtsson, Anders; Bengtsson, Henrik
2006-02-28
In a microarray experiment the difference in expression between genes on the same slide is up to 10³-fold or more. At low expression, even a small error in the estimate will have great influence on the final test and reference ratios. In addition to the true spot intensity, the scanned signal consists of different kinds of noise referred to as background. In order to assess the true spot intensity, background must be subtracted. The standard approach to estimating background intensities is to assume they are equal to the intensity levels between spots. In the literature, morphological opening is suggested to be one of the best methods for estimating background this way. This paper examines fundamental properties of rank and quantile filters, which include morphological filters at the extremes, with focus on their ability to estimate between-spot intensity levels. The bias and variance of these filter estimates are driven by the number of background pixels used and their distributions. A new rank-filter algorithm is implemented and compared to methods available in Spot by CSIRO and GenePix Pro by Axon Instruments. Spot's morphological opening has a mean bias between -47 and -248, compared to a bias between 2 and -2 for the rank filter, and the variability of the morphological opening estimate is 3 times higher than for the rank filter. The mean bias of Spot's second method, morph.close.open, is between -5 and -16, and its variability is approximately the same as for morphological opening. The variability of GenePix Pro's region-based estimate is more than ten times higher than the variability of the rank-filter estimate, with slightly more bias. The large variability is because the size of the background window changes with spot size. To overcome this, a non-adaptive region-based method is implemented. Its bias and variability are comparable to those of the rank filter. The performance of more advanced rank filters is equal to that of the best region-based methods. However, in order to get unbiased estimates these filters have to be implemented with great care. The performance of morphological opening is in general poor, with a substantial spatially dependent bias.
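A minimal sketch of quantile-filter background estimation in the spirit of the rank filters examined above, using SciPy's percentile filter; the window size and quantile are illustrative tuning choices, not the paper's settings.

```python
# Moving low-quantile filter as a background estimate for a spotted image.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
image = rng.gamma(2.0, 50.0, (128, 128))           # noisy background
image[32:36, 32:36] += 4000.0                      # a bright "spot"

# 5th-percentile filter over a window larger than a spot: spots barely
# perturb the estimate, unlike a plain minimum (morphological erosion).
background = ndimage.percentile_filter(image, percentile=5, size=15)
corrected = image - background
print(background.mean(), corrected[32:36, 32:36].mean())
```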
NASA Astrophysics Data System (ADS)
Yi, J.; Choi, C.
2014-12-01
Rainfall observation and forecasting using remote sensing, such as RADAR (Radio Detection and Ranging) and satellite images, are widely used to address the increased damage caused by rapid weather changes like regional storms and flash floods. The flood runoff was calculated by using an adaptive neuro-fuzzy inference system, data-driven models, and MAPLE (McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation) forecasted precipitation data as the input variables. The result of the flood estimation method using the neuro-fuzzy technique and RADAR-forecasted precipitation data was evaluated by comparing it with the actual data. The adaptive neuro-fuzzy method was applied to the Chungju Reservoir basin in Korea. Six rainfall events during the flood seasons in 2010 and 2011 were used for the input data. The reservoir inflow estimation results were compared according to the rainfall data used for the training, checking, and testing data in the model setup process. The results of the 15 models with different combinations of the input variables were compared and analyzed. Using a relatively larger clustering radius and the largest observed flood as training data produced better flood estimates in this study. The model using the MAPLE forecasted precipitation data showed better results for inflow estimation in the Chungju Reservoir.
Flood control surveys in the northeast
Arthur Bevan
1947-01-01
Floods are a grave danger to our Nation's resources. It is estimated that floods cost the United States at least $100 million every year. The recent Mississippi floods, which dramatically brought the seriousness of the situation to public attention, cost half a billion dollars in direct damages. The Northeast carries a heavy burden of flood losses. In 1936, floods...
Adequacy of satellite derived rainfall data for stream flow modeling
Artan, G.; Gadain, Hussein; Smith, Jodie; Asante, Kwasi; Bandaragoda, C.J.; Verdin, J.P.
2007-01-01
Floods are the most common and widespread climate-related hazard on Earth. Flood forecasting can reduce the death toll associated with floods. Satellites offer effective and economical means for calculating areal rainfall estimates in sparsely gauged regions. However, satellite-based rainfall estimates have had limited use in flood forecasting and hydrologic stream flow modeling because the rainfall estimates were considered to be unreliable. In this study we present the calibration and validation results from a spatially distributed hydrologic model driven by daily satellite-based estimates of rainfall for sub-basins of the Nile and Mekong Rivers. The results demonstrate the usefulness of remotely sensed precipitation data for hydrologic modeling when the hydrologic model is calibrated with such data. However, the remotely sensed rainfall estimates cannot be used confidently with hydrologic models that are calibrated with rain gauge measured rainfall, unless the model is recalibrated. © Springer Science+Business Media, Inc. 2007.
Large-scale application of the flood damage model RAilway Infrastructure Loss (RAIL)
NASA Astrophysics Data System (ADS)
Kellermann, Patric; Schönberger, Christine; Thieken, Annegret H.
2016-11-01
Experience has shown that river floods can significantly hamper the reliability of railway networks and cause extensive structural damage and disruption. As a result, the national railway operator in Austria had to cope with financial losses of more than EUR 100 million due to flooding in recent years. Comprehensive information on potential flood risk hot spots as well as on expected flood damage in Austria is therefore needed for strategic flood risk management. In view of this, the flood damage model RAIL (RAilway Infrastructure Loss) was applied to estimate (1) the expected structural flood damage and (2) the resulting repair costs of railway infrastructure due to a 30-, 100- and 300-year flood in the Austrian Mur River catchment. The results were then used to calculate the expected annual damage of the railway subnetwork and subsequently analysed in terms of their sensitivity to key model assumptions. Additionally, the impact of risk aversion on the estimates was investigated, and the overall results were briefly discussed against the background of climate change and possibly resulting changes in flood risk. The findings indicate that the RAIL model is capable of supporting decision-making in risk management by providing comprehensive risk information on the catchment level. It is furthermore demonstrated that an increased risk aversion of the railway operator has a marked influence on flood damage estimates for the study area and, hence, should be considered with regard to the development of risk management strategies.
Economic valuation of flood mitigation services: A case study from the Otter Creek, VT.
NASA Astrophysics Data System (ADS)
Galford, G. L.; Ricketts, T.; Bryan, K. L.; ONeil-Dunne, J.; Polasky, S.
2014-12-01
The ecosystem services provided by wetlands are widely recognized but difficult to quantify. In particular, estimating the effect of landcover and land use on downstream flood outcomes remains challenging, but is increasingly important in light of climate change predictions of increased precipitation in many areas. Economic valuation can help incorporate ecosystem services into decisions and enable communities to plan for climate and flood resiliency. Here we estimate the economic value of Otter Creek wetlands for Middlebury, VT in mitigating the flood that followed Tropical Storm Irene, as well as for ten historic floods. Observationally, hydrographs above and below the wetlands for each storm indicated that the wetlands functioned as a temporary reservoir, slowing the delivery of water to Middlebury. We compare observed floods, based on Middlebury's hydrograph, with simulated floods for scenarios without wetlands. To simulate these "without wetlands" scenarios, we assume the same volume of water was delivered to Middlebury, but in a shorter time pulse similar to a hydrograph upstream of the wetlands. For scenarios with and without wetlands, we map the spatial extent of flooding using LiDAR digital elevation data. We then estimate flood depth at each affected building, and calculate monetary losses as a function of the flood depth and house value using established depth-damage relationships. For example, we expect damages equal to 20% of the house's value for a flood depth of two feet in a two-story home with a basement. We define the value of flood mitigation services as the difference in damages between the with- and without-wetlands scenarios, and find that the Otter Creek wetlands reduced flood damage in Middlebury by 88% following Tropical Storm Irene. Using the 10 additional historic floods, we estimate an ongoing mean value of $400,000 in avoided damages per year. Economic impacts of this magnitude stress the importance of wetland conservation and warrant the consideration of ecosystem services in land use decisions. Our study indicates that here and elsewhere, green infrastructure may have the potential to increase the resilience of communities to projected changes in climate.
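The depth-damage step described above can be sketched as follows: interpolate a damage fraction from the flood depth at each building, multiply by the structure value, and take the difference between the with- and without-wetlands scenarios as the mitigation value. The curve points and all numbers below are illustrative, in the spirit of "about 20% damage at two feet of water".

```python
# Toy depth-damage valuation of avoided flood losses.
import numpy as np

DEPTHS_FT = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 10.0])
DAMAGE_FRAC = np.array([0.0, 0.12, 0.20, 0.31, 0.40, 0.55])  # two-story w/ basement

def damages(depth_ft, house_value):
    """Interpolated damage fraction times structure value."""
    frac = np.interp(depth_ft, DEPTHS_FT, DAMAGE_FRAC)
    return frac * house_value

values = np.array([250e3, 310e3, 180e3])
depth_with = np.array([0.4, 1.1, 0.0])       # observed flood (wetlands present)
depth_without = np.array([2.3, 3.5, 1.8])    # simulated sharper hydrograph

avoided = damages(depth_without, values).sum() - damages(depth_with, values).sum()
print(f"avoided damages: ${avoided:,.0f}")
```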
Whetstone, B.H.
1982-01-01
A program to collect and analyze flood data from small streams in South Carolina was conducted from 1967-75, as a cooperative research project with the South Carolina Department of Highways and Public Transportation and the Federal Highway Administration. As a result of that program, a technique is presented for estimating the magnitude and frequency of floods on small streams in South Carolina with drainage areas ranging in size from 1 to 500 square miles. Peak-discharge data from 74 stream-gaging stations (25 small streams were synthesized, whereas 49 stations had long-term records) were used in multiple regression procedures to obtain equations for estimating magnitude of floods having recurrence intervals of 10, 25, 50, and 100 years on small natural streams. The significant independent variable was drainage area. Equations were developed for the three physiographic provinces of South Carolina (Coastal Plain, Piedmont, and Blue Ridge) and can be used for estimating floods on small streams. (USGS)
Birthweight Related Factors in Northwestern Iran: Using Quantile Regression Method
Fallah, Ramazan; Kazemnejad, Anoshirvan; Zayeri, Farid; Shoghli, Alireza
2016-01-01
Introduction: Birthweight is one of the most important predictors of health status in adulthood. Having a balanced birthweight is one of the priorities of the health system in most industrialized and developed countries. This indicator is used to assess the growth and health status of infants. The aim of this study was to assess the birthweight of neonates by using quantile regression in Zanjan province. Methods: This descriptive analytical study was carried out using pre-registered (March 2010 - March 2012) data on neonates in urban/rural health centers of Zanjan province, using multiple-stage cluster sampling. Data were analyzed using multiple linear regression and the quantile regression method in SAS 9.2 statistical software. Results: Of 8,456 newborn babies, 4,146 (49%) were female. The mean age of the mothers was 27.1±5.4 years. The mean birthweight of the neonates was 3,104 ± 431 grams. Five hundred seventy-three (6.8%) of the neonates weighed less than 2,500 grams. In all quantiles, the gestational age of the neonates (p<0.05) and the weight and educational level of the mothers (p<0.05) showed a significant linear relationship with the birthweight of the neonates. However, the sex and birth rank of the neonates, and the mothers' age, place of residence (urban/rural), and career were not significant in all quantiles (p>0.05). Conclusion: This study revealed that the results of multiple linear regression and quantile regression were not identical. We strongly recommend the use of quantile regression when an asymmetric response variable or data with outliers are available. PMID:26925889
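The comparison the study reports can be sketched with statsmodels, which provides both OLS and quantile regression: on heteroscedastic data, the mean-regression coefficient differs from the coefficients traced across quantiles. The variable names and simulated data below are illustrative, not the Zanjan data.

```python
# OLS vs quantile regression on heteroscedastic simulated birthweights.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2000
df = pd.DataFrame({
    "gest_age": rng.normal(39, 1.5, n),
    "mat_weight": rng.normal(65, 10, n),
})
# Noise grows for preterm births, so quantile slopes differ from the mean slope.
noise = rng.standard_normal(n) * (300 + 20 * (40 - df["gest_age"]).clip(lower=0))
df["birthweight"] = (3100 + 120 * (df["gest_age"] - 39)
                     + 8 * (df["mat_weight"] - 65) + noise)

ols = smf.ols("birthweight ~ gest_age + mat_weight", df).fit()
for q in (0.1, 0.5, 0.9):
    qr = smf.quantreg("birthweight ~ gest_age + mat_weight", df).fit(q=q)
    print(f"q={q}: gest_age effect = {qr.params['gest_age']:.1f}")
print(f"OLS : gest_age effect = {ols.params['gest_age']:.1f}")
```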
Curran, Janet H.; Barth, Nancy A.; Veilleux, Andrea G.; Ourso, Robert T.
2016-03-16
Estimates of the magnitude and frequency of floods are needed across Alaska for engineering design of transportation and water-conveyance structures, flood-insurance studies, flood-plain management, and other water-resource purposes. This report updates methods for estimating flood magnitude and frequency in Alaska and conterminous basins in Canada. Annual peak-flow data through water year 2012 were compiled from 387 streamgages on unregulated streams with at least 10 years of record. Flood-frequency estimates were computed for each streamgage using the Expected Moments Algorithm to fit a Pearson Type III distribution to the logarithms of annual peak flows. A multiple Grubbs-Beck test was used to identify potentially influential low floods in the time series of peak flows for censoring in the flood frequency analysis. For two new regional skew areas, flood-frequency estimates using station skew were computed for stations with at least 25 years of record for use in a Bayesian least-squares regression analysis to determine a regional skew value. The consideration of basin characteristics as explanatory variables for regional skew resulted in improvements in precision too small to warrant the additional model complexity, and a constant model was adopted. Regional Skew Area 1 in eastern-central Alaska had a regional skew of 0.54 and an average variance of prediction of 0.45, corresponding to an effective record length of 22 years. Regional Skew Area 2, encompassing coastal areas bordering the Gulf of Alaska, had a regional skew of 0.18 and an average variance of prediction of 0.12, corresponding to an effective record length of 59 years. Station flood-frequency estimates for study sites in regional skew areas were then recomputed using a weighted skew incorporating the station skew and regional skew. In a new regional skew exclusion area outside the regional skew areas, the density of long-record streamgages was too sparse for regional analysis, and station skew was used for all estimates. Final station flood-frequency estimates for all study streamgages are presented for the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities. Regional multiple-regression analysis was used to produce equations for estimating flood frequency statistics from explanatory basin characteristics. Basin characteristics, including physical and climatic variables, were updated for all study streamgages using a geographic information system and geospatial source data. Screening for similar-sized nested basins eliminated hydrologically redundant sites, and screening for eligibility for analysis of explanatory variables eliminated regulated peaks, outburst peaks, and sites with indeterminate basin characteristics. An ordinary least-squares regression used flood-frequency statistics and basin characteristics for 341 streamgages (284 in Alaska and 57 in Canada) to determine the most suitable combination of basin characteristics for a flood-frequency regression model and to explore regional grouping of streamgages for explaining variability in flood-frequency statistics across the study area. The most suitable model for explaining flood frequency used drainage area and mean annual precipitation as explanatory variables for the entire study area as a region. Final regression equations for estimating the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probability discharges in Alaska and conterminous basins in Canada were developed using a generalized least-squares regression.
The average standard error of prediction for the regression equations for the various annual exceedance probabilities ranged from 69 to 82 percent, and the pseudo-coefficient of determination (pseudo-R2) ranged from 85 to 91 percent. The regional regression equations from this study were incorporated into the U.S. Geological Survey StreamStats program for a limited area of the State, the Cook Inlet Basin. StreamStats is a national web-based geographic information system application that facilitates retrieval of streamflow statistics and associated information. StreamStats retrieves published data for gaged sites and, for user-selected ungaged sites, delineates drainage areas from topographic and hydrographic data, computes basin characteristics, and computes flood frequency estimates using the regional regression equations.
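The station/regional skew weighting used in studies of this kind (the Bulletin 17B-style inverse-MSE weighting; a generic sketch, not necessarily this report's exact formula) can be written in a few lines. The station values below are hypothetical, while the regional values echo Regional Skew Area 1.

```python
# Inverse-MSE weighting of station and regional skew coefficients.
def weighted_skew(g_station, mse_station, g_regional, mse_regional):
    w_s, w_r = 1.0 / mse_station, 1.0 / mse_regional
    return (w_s * g_station + w_r * g_regional) / (w_s + w_r)

# e.g. a hypothetical station skew of 0.9 (MSE 0.30) combined with the
# Area 1 regional skew of 0.54 (average variance of prediction 0.45):
print(round(weighted_skew(0.9, 0.30, 0.54, 0.45), 3))  # ~ 0.756
```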
Data-driven modeling of surface temperature anomaly and solar activity trends
Friedel, Michael J.
2012-01-01
A novel two-step modeling scheme is used to reconstruct and analyze surface temperature and solar activity data at global, hemispheric, and regional scales. First, the self-organizing map (SOM) technique is used to extend annual modern climate data from the century to millennial scale. The SOM component planes are used to identify and quantify strength of nonlinear relations among modern surface temperature anomalies (<150 years), tropical and extratropical teleconnections, and Palmer Drought Severity Indices (0–2000 years). Cross-validation of global sea and land surface temperature anomalies verifies that the SOM is an unbiased estimator with less uncertainty than the magnitude of anomalies. Second, the quantile modeling of SOM reconstructions reveal trends and periods in surface temperature anomaly and solar activity whose timing agrees with published studies. Temporal features in surface temperature anomalies, such as the Medieval Warm Period, Little Ice Age, and Modern Warming Period, appear at all spatial scales but whose magnitudes increase when moving from ocean to land, from global to regional scales, and from southern to northern regions. Some caveats that apply when interpreting these data are the high-frequency filtering of climate signals based on quantile model selection and increased uncertainty when paleoclimatic data are limited. Even so, all models find the rate and magnitude of Modern Warming Period anomalies to be greater than those during the Medieval Warm Period. Lastly, quantile trends among reconstructed equatorial Pacific temperature profiles support the recent assertion of two primary El Niño Southern Oscillation types. These results demonstrate the efficacy of this alternative modeling approach for reconstructing and interpreting scale-dependent climate variables.
NASA Astrophysics Data System (ADS)
Restrepo-Estrada, Camilo; de Andrade, Sidgley Camargo; Abe, Narumi; Fava, Maria Clara; Mendiondo, Eduardo Mario; de Albuquerque, João Porto
2018-02-01
Floods are one of the most devastating types of worldwide disasters in terms of human, economic, and social losses. If authoritative data is scarce, or unavailable for some periods, other sources of information are required to improve streamflow estimation and early flood warnings. Georeferenced social media messages are increasingly being regarded as an alternative source of information for coping with flood risks. However, existing studies have mostly concentrated on the links between geo-social media activity and flooded areas. Thus, there is still a gap in research with regard to the use of social media as a proxy for rainfall-runoff estimations and flood forecasting. To address this, we propose using a transformation function that creates a proxy variable for rainfall by analysing geo-social media messages and rainfall measurements from authoritative sources, which are later incorporated within a hydrological model for streamflow estimation. We found that the combined use of official rainfall values with the social media proxy variable as input for the Probability Distributed Model (PDM), improved streamflow simulations for flood monitoring. The combination of authoritative sources and transformed geo-social media data during flood events achieved a 71% degree of accuracy and a 29% underestimation rate in a comparison made with real streamflow measurements. This is a significant improvement on the respective values of 39% and 58%, achieved when only authoritative data were used for the modelling. This result is clear evidence of the potential use of derived geo-social media data as a proxy for environmental variables for improving flood early-warning systems.
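A toy sketch of the proxy idea described above: hourly geo-social message counts are rescaled against co-located gauge rainfall on a training window, and the transformed series fills gauge gaps before feeding a rainfall-runoff model. The simple linear rescaling below stands in for the paper's transformation function; the PDM model is not reproduced, and all data are simulated.

```python
# Fitting and applying a rainfall proxy from geo-social message counts.
import numpy as np

def fit_proxy(messages_train, rain_train):
    """Least-squares map from message counts to rainfall depth (mm)."""
    A = np.column_stack([messages_train,
                         np.ones_like(messages_train)]).astype(float)
    coef, *_ = np.linalg.lstsq(A, rain_train, rcond=None)
    return lambda msgs: np.clip(coef[0] * msgs + coef[1], 0.0, None)

rng = np.random.default_rng(3)
rain = rng.gamma(0.6, 8.0, 240)                # hourly gauge rainfall (mm)
messages = rng.poisson(0.8 * rain + 0.5)       # flood-related posts

to_rain = fit_proxy(messages[:160], rain[:160])
gap = rain[160:].copy()
gap[::3] = np.nan                              # simulated gauge outages
filled = np.where(np.isnan(gap), to_rain(messages[160:]), gap)
print(np.corrcoef(filled, rain[160:])[0, 1])   # proxy-filled series vs truth
```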
NASA Astrophysics Data System (ADS)
Werren, G.; Balin, D.; Reynard, E.; Lane, S. N.
2012-04-01
Flood modelling is essential for flood hazard assessment. Modelling becomes a challenge in small, ungauged watersheds prone to flash floods, like the ones draining the town of Beni Mellal (Morocco). Four temporary streams meet in the urban area of Beni Mellal, producing sheet floods every year that are harmful to infrastructure and to people. Here, statistical analysis may not give realistic results, but the study of these repeated real flash-flood events may provide a better understanding of watershed-specific hydrology. This study is part of a larger cooperation project between Switzerland and Morocco aimed at knowledge transfer in disaster risk reduction, especially through hazard mapping and the implementation of hazard maps in land-use planning. Hydrologic and hydraulic modelling was carried out to obtain hazard maps. An important point was to find open-source data and methods that could still produce a realistic model for the area concerned, in order to provide easy-to-use, cost-effective tools for risk management in developing countries like Morocco, where routine data collection is largely lacking. The rainfall data used for modelling are the Web-available TRMM 3-hour, 0.25-degree estimates provided by the Tropical Rainfall Measuring Mission (TRMM). Hydrologic modelling for discharge estimation was undertaken using methods available in the HEC-HMS software provided by the US Army Corps of Engineers (USACE). Several transfer models were used, so as to choose the best-suited method available. As no model calibration was possible because no measured flow data were available, a one-at-a-time sensitivity analysis was performed on the chosen parameters in order to detect their influence on the results. But the most important verification method remained field observation, through post-flood field campaigns aimed at mapping water surfaces and depths in the flooded areas, as well as river-section monitoring, where rough discharge estimates could be obtained using empirical equations. Another information source was local knowledge, as people could give a rough estimation of the time of concentration by describing flood evolution. Finally, hydraulic modelling of the flooded areas in the urban perimeter was performed using the USACE HEC-RAS software. A specific challenge at this stage was field morphology, as the flooded areas form large alluvial fans, with very different flood behaviour compared to flood plains. Model "calibration" at this stage was undertaken using the mapped water surfaces and depths. Great care was taken in field geometry design, where field observations, measured cross sections, and field images were used to improve the existing DTM data. The model included protection dikes already built by local authorities in their flood-fighting effort. Because of flash-flood-specific behaviour, only maximal flooded surfaces and flow velocities were simulated, through steady-flow analysis in HEC-RAS. The discharge estimates obtained for the chosen event were comparable to 10-year return periods as estimated by the watershed authorities. Times of concentration correspond to this previous estimation and to local people's descriptions. The modelled water surfaces reflect field reality. Flash-flood modelling demands extensive knowledge of the studied field in order to compensate for data scarcity. However, more precise data, like the radar rainfall estimates available in Morocco, would definitely improve outputs.
From this perspective, better data access at the local level and good use of the available methods could benefit the disaster risk reduction effort as a whole.
Allowances for evolving coastal flood risk under uncertain local sea-level rise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buchanan, Maya K.; Kopp, Robert E.; Oppenheimer, Michael
2016-06-03
Estimates of future flood hazards made under the assumption of stationary mean sea level are biased low due to sea-level rise (SLR). However, adjustments to flood return levels made assuming fixed increases of sea level are also inadequate when applied to sea level that is rising over time at an uncertain rate. SLR allowances—the height adjustment from historic flood levels that maintain under uncertainty the annual expected probability of flooding—are typically estimated independently of individual decision-makers' preferences, such as time horizon, risk tolerance, and confidence in SLR projections. We provide a framework of SLR allowances that employs complete probability distributions of local SLR and a range of user-defined flood risk management preferences. Given non-stationary and uncertain sea-level rise, these metrics provide estimates of flood protection heights and offsets for different planning horizons in coastal areas. In conclusion, we illustrate the calculation of various allowance types for a set of long-duration tide gauges along U.S. coastlines.
Bias-adjusted satellite-based rainfall estimates for predicting floods: Narayani Basin
Shrestha, M.S.; Artan, G.A.; Bajracharya, S.R.; Gautam, D.K.; Tokar, S.A.
2011-01-01
In Nepal, as the spatial distribution of rain gauges is not sufficient to provide a detailed perspective on the highly varied spatial nature of rainfall, satellite-based rainfall estimates provide an opportunity for timely estimation. This paper presents flood prediction for the Narayani Basin at the Devghat hydrometric station (32,000 km2) using bias-adjusted satellite rainfall estimates and the Geospatial Stream Flow Model (GeoSFM), a spatially distributed, physically based hydrologic model. GeoSFM was calibrated with gridded gauge-observed rainfall inputs (kriging interpolation) for 2003 and validated for 2004, with Nash-Sutcliffe efficiencies above 0.7 for both simulations. With the National Oceanic and Atmospheric Administration Climate Prediction Center's rainfall estimates (CPC-RFE2.0) and the same calibrated parameters, model performance for 2003 deteriorated, but it improved after recalibration with CPC-RFE2.0, indicating the need to recalibrate the model with satellite-based rainfall estimates. Adjusting the CPC-RFE2.0 by seasonal, monthly, and 7-day moving-average ratios improved model performance further. Furthermore, a new gauge-satellite merged rainfall estimate, obtained by ingesting local rain-gauge data, resulted in a significant improvement in flood predictability. The results indicate the applicability of satellite-based rainfall estimates in flood prediction with appropriate bias correction. © 2011 The Authors. Journal of Flood Risk Management © 2011 The Chartered Institution of Water and Environmental Management.
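A minimal sketch of a moving-average ratio bias adjustment of the kind described (window length, column names, and the ratio guard rails are assumptions):

```python
import numpy as np
import pandas as pd

def bias_adjust(df, window="7D"):
    """Scale satellite rainfall by a 7-day moving gauge/satellite ratio.

    df: daily DataFrame with a DatetimeIndex and columns
        'gauge' and 'satellite' (rainfall in mm/day).
    """
    roll = df.rolling(window).sum()
    # Ratio of accumulated gauge to accumulated satellite rainfall,
    # clipped to avoid wild corrections when accumulations are tiny
    ratio = (roll["gauge"] / roll["satellite"]).clip(0.2, 5.0)
    return df["satellite"] * ratio.fillna(1.0)
```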
Extreme river flow dependence in Northern Scotland
NASA Astrophysics Data System (ADS)
Villoria, M. Franco; Scott, M.; Hoey, T.; Fischbacher-Smith, D.
2012-04-01
Various methods for the spatial analysis of hydrologic data have been developed recently. Here we present results using the conditional probability approach proposed by Keef et al. [Appl. Stat. (2009): 58, 601-18] to investigate spatial interdependence in extreme river flows in Scotland. This approach does not require the specification of a correlation function, being most suitable for relatively small geographical areas. The work is motivated by the Flood Risk Management (Scotland) Act 2009, which requires maps of flood risk that take account of spatial dependence in extreme river flow. The method is based on two conditional measures of spatial flood risk: first, the conditional probability PC(p) that a set of sites Y = (Y1,...,Yd) within a region C of interest exceeds a flow threshold Qp at time t (or any lag of t), given that the specified conditioning site exceeds it (X > Qp); and, second, the expected number of sites within C that will exceed a flow Qp on average (given that X > Qp). The conditional probabilities are estimated using the conditional distribution of Y|X = x (for large x), which can be modeled using a semi-parametric approach (Heffernan and Tawn [Roy. Statist. Soc. Ser. B (2004): 66, 497-546]). Once the model is fitted, pseudo-samples can be generated to estimate functionals of the joint tails of the distribution of (Y,X). Conditional return level plots were directly compared to traditional return level plots, thus improving our understanding of the dependence structure of extreme river flow events. Confidence intervals were calculated using block bootstrapping methods (100 replicates). We report results from applying this approach to a set of four rivers (Dulnain, Lossie, Ewe and Ness) in Northern Scotland. These sites were chosen based on data quality, spatial location and catchment characteristics. The river Ness, being the largest (catchment size 1,839.1 km2), was chosen as the conditioning river. Both the Ewe (441.1 km2) and Ness catchments have predominantly impermeable bedrock, with the Ewe's catchment being very wet. The Lossie (216 km2) and Dulnain (272.2 km2) both contain significant areas of glacial deposits. River flow in the Dulnain is usually affected by snowmelt. In all cases, the conditional probability of each of the three rivers (Dulnain, Lossie, Ewe) decreases as the event in the conditioning river (Ness) becomes more extreme. The Ewe, despite being the furthest of the three sites from the Ness, shows the strongest dependence, with relatively high (>0.4) conditional probabilities even for very extreme events (>0.995). Although the Lossie is closer geographically to the Ness than the Ewe, it shows relatively low conditional probabilities and can be considered independent of the Ness for very extreme events (>0.990). The conditional probabilities seem to reflect the different catchment characteristics and dominant precipitation-generating events, with the Ewe being more similar to the Ness than the other two rivers. This interpretation suggests that the conditional method may yield improved estimates of extreme events, but the approach is time consuming. An alternative model that is easier to implement, using spatial quantile regression, is currently being investigated; this would also allow the introduction of further covariates, essential as the effects of climate change are incorporated into estimation procedures.
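The first of these measures has a simple empirical analogue; the sketch below estimates P(Y > Qp | X > Qp) directly from paired flow series (the Heffernan-Tawn semi-parametric model used in the study extrapolates beyond what this empirical version can resolve):

```python
import numpy as np

def conditional_exceedance(x, y, p=0.99):
    """Empirical P(Y > Qp | X > Qp) for paired daily flow series.

    x: flows at the conditioning site (e.g. the Ness).
    y: flows at a site of interest (e.g. the Ewe).
    """
    qx, qy = np.quantile(x, p), np.quantile(y, p)
    extreme_days = x > qx          # days when the conditioning site is extreme
    return np.mean(y[extreme_days] > qy)
```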
Chen, Le-Yu; Ho, Christine
2016-09-01
Incense burning for rituals or religious purposes is an important tradition in many countries. However, incense smoke contains particulate matter and gas products such as carbon monoxide, sulfur, and nitrogen dioxide, which are potentially harmful to health. We analyzed the relationship between prenatal incense burning and birth weight and head circumference at birth using the Taiwan Birth Cohort Study. We also analyzed whether the associations varied by sex and along the distribution of birth outcomes. We performed ordinary least squares (OLS) and quantile regression analysis on a sample of 15,773 term births (> 37 gestational weeks; 8,216 boys and 7,557 girls) in Taiwan in 2005. The associations were estimated separately for boys and girls as well as for the population as a whole. We controlled extensively for factors that may be correlated with incense burning and birth weight and head circumference, such as parental religion, demographics, and health characteristics, as well as pregnancy-related variables. Findings from fully adjusted OLS regressions indicated that exposure to incense was associated with lower birth weight in boys (-18 g; 95% CI: -36, -0.94) but not girls (1 g; 95% CI: -17, 19; interaction p-value = 0.31). Associations with head circumference were negative for boys (-0.95 mm; 95% CI: -1.8, -0.16) and girls (-0.71 mm; 95% CI: -1.5, 0.11; interaction p-value = 0.73). Quantile regression results suggested that the negative associations were larger among the lower quantiles of birth outcomes. OLS regressions showed that prenatal incense burning was associated with lower birth weight for boys and smaller head circumference for boys and girls. The associations were more pronounced among the lower quantiles of birth outcomes. Further research is necessary to confirm whether incense burning has differential effects by sex. Chen LY, Ho C. 2016. Incense burning during pregnancy and birth weight and head circumference among term births: The Taiwan Birth Cohort Study. Environ Health Perspect 124:1487-1492; http://dx.doi.org/10.1289/ehp.1509922.
Chen, Le-Yu; Ho, Christine
2016-01-01
Background: Incense burning for rituals or religious purposes is an important tradition in many countries. However, incense smoke contains particulate matter and gas products such as carbon monoxide, sulfur, and nitrogen dioxide, which are potentially harmful to health. Objectives: We analyzed the relationship between prenatal incense burning and birth weight and head circumference at birth using the Taiwan Birth Cohort Study. We also analyzed whether the associations varied by sex and along the distribution of birth outcomes. Methods: We performed ordinary least squares (OLS) and quantile regression analysis on a sample of 15,773 term births (> 37 gestational weeks; 8,216 boys and 7,557 girls) in Taiwan in 2005. The associations were estimated separately for boys and girls as well as for the population as a whole. We controlled extensively for factors that may be correlated with incense burning and birth weight and head circumference, such as parental religion, demographics, and health characteristics, as well as pregnancy-related variables. Results: Findings from fully adjusted OLS regressions indicated that exposure to incense was associated with lower birth weight in boys (–18 g; 95% CI: –36, –0.94) but not girls (1 g; 95% CI: –17, 19; interaction p-value = 0.31). Associations with head circumference were negative for boys (–0.95 mm; 95% CI: –1.8, –0.16) and girls (–0.71 mm; 95% CI: –1.5, 0.11; interaction p-value = 0.73). Quantile regression results suggested that the negative associations were larger among the lower quantiles of birth outcomes. Conclusions: OLS regressions showed that prenatal incense burning was associated with lower birth weight for boys and smaller head circumference for boys and girls. The associations were more pronounced among the lower quantiles of birth outcomes. Further research is necessary to confirm whether incense burning has differential effects by sex. Citation: Chen LY, Ho C. 2016. Incense burning during pregnancy and birth weight and head circumference among term births: The Taiwan Birth Cohort Study. Environ Health Perspect 124:1487–1492; http://dx.doi.org/10.1289/ehp.1509922 PMID:26967367
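A toy illustration of the OLS-versus-quantile-regression contrast the study relies on, using synthetic data (variable names and effect sizes are invented; the statsmodels API is standard):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
incense = rng.integers(0, 2, n)               # binary prenatal exposure
noise = rng.normal(0, 400, n)
# Synthetic outcome: a larger negative effect in the lower tail
bw = 3_300 - 10 * incense - 30 * incense * (noise < -400) + noise
df = pd.DataFrame({"bw": bw, "incense": incense})

ols = smf.ols("bw ~ incense", df).fit()
q10 = smf.quantreg("bw ~ incense", df).fit(q=0.10)    # 10th percentile
q50 = smf.quantreg("bw ~ incense", df).fit(q=0.50)    # median

# The 10th-percentile coefficient is more negative than the OLS mean effect
print(ols.params["incense"], q10.params["incense"], q50.params["incense"])
```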
The Importance of Studying Past Extreme Floods to Prepare for Uncertain Future Extremes
NASA Astrophysics Data System (ADS)
Burges, S. J.
2016-12-01
Hoyt and Langbein (1955), in their book "Floods", wrote: "...meteorologic and hydrologic conditions will combine to produce superfloods of unprecedented magnitude. We have every reason to believe that in most rivers past floods may not be an accurate measure of ultimate flood potentialities. It is this superflood with which we are always most concerned." I provide several examples to offer some historical perspective on assessing extreme floods. In one example, flooding in the Miami Valley, OH, in 1913 claimed 350 lives. The engineering and socio-economic challenges facing the Morgan Engineering Co. in deciding how to mitigate against future flood damage and loss of life when limited information was available provide guidance about ways to face an uncertain hydroclimatic future, particularly one of a changed climate. A second example forces us to examine mixed flood populations and illustrates the huge uncertainty in assigning flood magnitude and exceedance probability to extreme floods in such cases. There is large uncertainty in flood frequency estimates; knowledge of the total flood hydrograph, not the peak flood flow rate alone, is what is needed for hazard-mitigation assessment or design. Some challenges in estimating the complete flood hydrograph in an uncertain future climate, including demands on hydrologic models and their inputs, are addressed.
Magnitude and frequency of floods in Washington
Cummans, J.E.; Collings, Michael R.; Nasser, Edmund George
1975-01-01
Relations are provided to estimate the magnitude and frequency of floods on Washington streams. Annual-peak-flow data from stream-gaging stations on unregulated streams having 10 years or more of record were used to determine a log-Pearson Type III frequency curve for each station. Flood magnitudes having recurrence intervals of 2, 5, 10, 25, 50, and 100 years were then related to physical and climatic indices of the drainage basins by multiple-regression analysis using the Biomedical Computer Program BMD02R. These regression relations are useful for estimating flood magnitudes of the specified recurrence intervals at ungaged or short-record sites. Separate sets of regression equations were defined for western and eastern parts of the State, and the State was further subdivided into 12 regions in which the annual floods exhibit similar flood characteristics. Peak flows are related most significantly in western Washington to drainage-area size and mean annual precipitation. In eastern Washington, they are related most significantly to drainage-area size, mean annual precipitation, and percentage of forest cover. Standard errors of estimate of the estimating relations range from 25 to 129 percent, and the smallest errors are generally associated with the more humid regions.
NASA Astrophysics Data System (ADS)
Jeong, C.; Om, J.; Hwang, J.; Joo, K.; Heo, J.
2013-12-01
Recently, the frequency of extreme floods has been increasing due to climate change and global warming. Severe flood damage is mainly caused by the collapse of flood-control structures such as dams and dikes. In order to reduce these disasters, disaster management systems (DMS) based on flood forecasting, inundation mapping, and emergency action plans (EAPs) have been studied. The estimation of inundation damage and a practical EAP are especially crucial to the DMS. However, it is difficult to predict inundation and take proper action through a DMS in a real emergency because several techniques for inundation-damage estimation are not integrated, and in Korea the EAP is supplied in the form of a document. In this study, an integrated simulation system including rainfall frequency analysis, rainfall-runoff modeling, inundation prediction, surface-runoff analysis, and inland-flood analysis was developed. Using this system coupled with standard GIS data, inundation damage can be estimated comprehensively and automatically. A standard EAP based on BIM (Building Information Modeling) was also established in this system. It is therefore expected that inundation damage over the entire area, including buildings, can be predicted and managed.
Floods of May 30 to June 15, 2008, in the Iowa and Cedar River basins, eastern Iowa
Linhart, Mike S.; Eash, David A.
2010-01-01
As a result of prolonged and intense periods of rainfall in late May and early June 2008, along with heavier-than-normal snowpack the previous winter, record flooding occurred in Iowa in the Iowa River and Cedar River Basins. The storms were part of an exceptionally wet period from May 29 through June 12, when an Iowa statewide average of 9.03 inches of rain fell; the normal statewide average for the same period is 2.45 inches. From May 29 to June 13, the 16-day rainfall totals recorded at rain gages in Iowa Falls and Clutier were 14.00 and 13.83 inches, respectively. Within the Iowa River Basin, peak discharges of 51,000 cubic feet per second (flood-probability estimate of 0.2 to 1 percent) at the 05453100 Iowa River at Marengo, Iowa, streamflow-gaging station (streamgage) on June 12, and of 39,900 cubic feet per second (flood-probability estimate of 0.2 to 1 percent) at the 05453520 Iowa River below Coralville Dam near Coralville, Iowa, streamgage on June 15 are the largest floods on record for those sites. A peak discharge of 41,100 cubic feet per second (flood-probability estimate of 0.2 to 1 percent) on June 15 at the 05454500 Iowa River at Iowa City, Iowa, streamgage is the fourth highest on record, but is the largest flood since regulation by the Coralville Dam began in 1958. Within the Cedar River Basin, the May 30 to June 15, 2008, flood is the largest on record at all six streamgages in Iowa located on the mainstem of the Cedar River and at five streamgages located on the major tributaries. Flood-probability estimates for 10 of these 11 streamgages are less than 1 percent. Peak discharges of 112,000 cubic feet per second (flood-probability estimate of 0.2 to 1 percent) at the 05464000 Cedar River at Waterloo, Iowa, streamgage on June 11 and of 140,000 cubic feet per second (flood-probability estimate of less than 0.2 percent) at the 05464500 Cedar River at Cedar Rapids, Iowa, streamgage on June 13 are the largest floods on record for those sites. Downstream from the confluence of the Iowa and Cedar Rivers, the peak discharge of 188,000 cubic feet per second (flood-probability estimate of less than 0.2 percent) at the 05465500 Iowa River at Wapello, Iowa, streamgage on June 14, 2008, is the largest flood on record in the Iowa River and Cedar River Basins since 1903. High-water marks were measured at 88 locations along the Iowa River between State Highway 99 near Oakville and U.S. Highway 69 in Belmond, a distance of 319 river miles. High-water marks were measured at 127 locations along the Cedar River between Fredonia near the mouth (confluence with the Iowa River) and Riverview Drive north of Charles City, a distance of 236 river miles. The high-water marks were used to develop flood profiles for the Iowa and Cedar Rivers.
Spline methods for approximating quantile functions and generating random samples
NASA Technical Reports Server (NTRS)
Schiess, J. R.; Matthews, C. G.
1985-01-01
Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
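A minimal sketch of the underlying idea (using scipy's smoothing spline rather than the paper's B-spline and rational-spline formulations; the plotting positions and smoothing factor are assumptions):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(42)
sample = rng.gamma(shape=2.0, scale=1.5, size=500)   # skewed example data

# Empirical quantile function: sorted data against plotting positions
xs = np.sort(sample)
p = (np.arange(1, xs.size + 1) - 0.5) / xs.size

q_spline = UnivariateSpline(p, xs, k=3, s=xs.size)   # cubic spline Q(p)

# Inverse-transform sampling: evaluate the fitted quantile function
# at uniform random probabilities to generate new samples
u = rng.uniform(0.0, 1.0, size=10_000)
simulated = q_spline(u)
```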
Estimating extreme losses for the Florida Public Hurricane Model—part II
NASA Astrophysics Data System (ADS)
Gulati, Sneh; George, Florence; Hamid, Shahid
2018-02-01
Rising global temperatures are leading to an increase in the number of extreme events and losses (http://www.epa.gov/climatechange/science/indicators/). Accurate estimation of these extreme losses is critical to insurance companies seeking to protect themselves against them. In a previous paper, Gulati et al. (2014) discussed probable maximum loss (PML) estimation for the Florida Public Hurricane Loss Model (FPHLM) using parametric and nonparametric methods. In this paper, we investigate the use of semi-parametric methods to do the same. Detailed analysis of the data shows that the annual losses from the FPHLM do not tend to be very heavy tailed, and therefore neither the popular Hill estimator nor the moment estimator works well. However, the Pickands estimator with a threshold around the 84th percentile provides a good fit for the extreme quantiles of the losses.
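For reference, a sketch of the Pickands estimator on synthetic loss data (the loss distribution and the choice of k are illustrative; the threshold is placed near the 84th percentile as in the study):

```python
import numpy as np

def pickands_xi(losses, k):
    """Pickands estimate of the extreme-value shape parameter xi,
    from the k-th, 2k-th and 4k-th largest observations."""
    x = np.sort(losses)[::-1]                  # descending order statistics
    num = x[k - 1] - x[2 * k - 1]
    den = x[2 * k - 1] - x[4 * k - 1]
    return np.log(num / den) / np.log(2.0)

rng = np.random.default_rng(1)
losses = rng.pareto(3.0, size=2_000)           # synthetic annual losses
n = losses.size
k = int(n * (1 - 0.84) / 4)                    # 4k-th order statistic sits
print(pickands_xi(losses, k))                  # ... near the 84th percentile
```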
Restoration of Monotonicity Respecting in Dynamic Regression
Huang, Yijian
2017-01-01
Dynamic regression models, including the quantile regression model and Aalen's additive hazards model, are widely adopted to investigate evolving covariate effects. Yet lack of monotonicity respecting with standard estimation procedures remains an outstanding issue. Advances have recently been made, but none provides a complete resolution. In this article, we propose a novel adaptive interpolation method to restore monotonicity respecting, by successively identifying and then interpolating the nearest monotonicity-respecting points of an original estimator. Under mild regularity conditions, the resulting regression coefficient estimator is shown to be asymptotically equivalent to the original. Our numerical studies have demonstrated that the proposed estimator is much smoother and may have better finite-sample efficiency than the original, as well as, when available (only in special cases), other competing monotonicity-respecting estimators. Illustration with a clinical study is provided. PMID:29430068
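A toy sketch of the general idea (keep the points that already respect monotonicity and interpolate between them); this is not the paper's adaptive procedure, whose anchor-point selection is more refined:

```python
import numpy as np

def restore_monotone(t, y):
    """Monotone repair of an estimated coefficient path y(t).

    Keeps points that are no smaller than every earlier kept point,
    then linearly interpolates the remaining points between them.
    """
    keep = [0]
    for i in range(1, len(y)):
        if y[i] >= y[keep[-1]]:
            keep.append(i)
    keep = np.asarray(keep)
    return np.interp(t, t[keep], y[keep])
```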
A framework for global river flood risk assessment
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.
2012-04-01
There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) international financing institutes and disaster management agencies to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse-resolution hazard probability distributions, derived from global hydrological model runs (typical scale about 0.5 degree resolution), with high-resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use and economic value generally are strongly variable in space and time. The framework therefore estimates hazard at a high resolution (~1 km2) by using (a) global forcing data sets of the current (or, in scenario mode, future) climate; (b) a global hydrological model; (c) a global flood routing model; and (d), importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models, and the sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model. Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm, applied to a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show sensitivities of the estimated risks with regard to the use of different climate input datasets, decisions made in the downscaling algorithm, and different approaches to establish distributed estimates of GDP and asset exposure to flooding.
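The risk step of such a framework reduces, per grid cell or region, to integrating damage over the hazard probability distribution; a minimal sketch with invented numbers:

```python
import numpy as np

# Hazard-damage pairs for one region (illustrative values only):
aep    = np.array([0.5, 0.2, 0.1, 0.02, 0.01, 0.002])  # annual exceedance prob.
damage = np.array([0.0, 1.0, 3.0, 12.0, 20.0, 45.0])   # damage (million USD)

# Expected annual damage: integral of damage over exceedance probability
order = np.argsort(aep)
ead = np.trapz(damage[order], aep[order])
print(ead)
```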
Yanosky, Thomas M.
1983-01-01
Ash trees along the Potomac River flood plain near Washington, D.C., were studied to determine changes in wood anatomy related to flood damage, and anomalous growth was compared to flood records for April 15 to August 31, 1930-79. Collectively, anatomical evidence was detected for 33 of the 34 growing-season floods during the study period. Evidence of 12 floods prior to 1930 was also noted, including catastrophic ones in 1889 and 1924. Trees damaged after the transition from earlywood to latewood growth typically formed 'flood rings' of enlarged vessels within the latewood zone. Trees damaged near the beginning of the growth year developed flood rings within, or contiguous with, the earlywood. Both patterns are assumed to have developed when flood-damaged trees produced a second crop of leaves. Trees damaged by high-magnitude floods developed well-formed flood rings along the entire height and around the entire circumference of the stem. Small floods were generally associated with diffuse or discontinuous anomalies restricted to stem apices. Frequency of flood rings was positively related to flood magnitude, and the time of flood generation during the tree-growth season was estimated from the radial position of anomalous growth relative to annual ring width. Reconstructing tree heights in a year of flood-ring formation gives a minimum stage estimate along local stream reaches. Some trees provided evidence of numerous floods. Those with the greatest number of flood rings grew on frequently flooded surfaces subject to flood-flow velocities of at least 1 m/s, and more typically greater than 2 m/s. Tree size, more than age, was related to flood-ring formation. Trees kept small by frequent flood damage had more flood rings than taller trees of comparable age. (USGS)
NASA Astrophysics Data System (ADS)
Kinoshita, Youhei; Tanoue, Masahiro; Watanabe, Satoshi; Hirabayashi, Yukiko
2018-01-01
This study represents the first attempt to quantify the effects of autonomous adaptation on the projection of global flood hazards and to assess future flood risk by including this effect. A vulnerability scenario, which varies according to the autonomous adaptation effect for conventional disaster mitigation efforts, was developed based on historical vulnerability values derived from flood damage records and a river inundation simulation. Coupled with general circulation model outputs and future socioeconomic scenarios, potential future flood fatalities and economic losses were estimated. By including the effect of autonomous adaptation, our multimodel ensemble estimates projected a 2.0% decrease in potential flood fatalities and an 821% increase in potential economic losses by 2100 under the highest emission scenario together with a large population increase. Vulnerability changes reduced potential flood consequences by 64%-72% in terms of potential fatalities and 28%-42% in terms of potential economic losses by 2100. Although socioeconomic changes made the greatest contribution to the potential increased consequences of future floods, about half of the increase in potential economic losses was mitigated by autonomous adaptation. There is a clear and positive relationship between the global temperature increase from the pre-industrial level and the estimated mean potential flood economic loss, while there is a negative relationship with potential fatalities due to the autonomous adaptation effect. A bootstrapping analysis suggests a significant increase in potential flood fatalities (+5.7%) without any adaptation if the temperature increases by 1.5 °C-2.0 °C, whereas the increase in potential economic loss (+0.9%) was not significant. Our method enables the effects of autonomous adaptation and additional adaptation efforts on climate-induced hazards to be distinguished, which would be essential for the accurate estimation of the cost of adaptation to climate change.
Fifty-year flood-inundation maps for Tocoa, Honduras
Kresch, David L.; Mastin, Mark C.; Olsen, T.D.
2002-01-01
After the devastating floods caused by Hurricane Mitch in 1998, maps of the areas and depths of the 50-year-flood inundation at 15 municipalities in Honduras were prepared as a tool for agencies involved in reconstruction and planning. This report, which is one in a series of 15, presents maps of areas in the municipality of Tocoa that would be inundated by a 50-year flood of Rio Tocoa. Geographic Information System (GIS) coverages of the flood inundation are available on a computer in the municipality of Tocoa as part of the Municipal GIS project and on the Internet at the Flood Hazard Mapping Web page (http://mitchnts1.cr.usgs.gov/projects/floodhazard.html). These coverages allow users to view the flood inundation in much more detail than is possible using the maps in this report. Water-surface elevations for an estimated 50-year-flood on Rio Tocoa at Tocoa were estimated using HEC-RAS, a one-dimensional, steady-flow, step-backwater computer program. The channel and floodplain cross sections used in HEC-RAS were developed from an airborne light-detection-and-ranging (LIDAR) topographic survey of the area and a ground survey at one bridge. There are no nearby long-term stream-gaging stations on Rio Tocoa; therefore, the 50-year-flood discharge for Rio Tocoa, 552 cubic meters per second, was estimated using a regression equation that relates the 50-year-flood discharge to drainage area and mean annual precipitation. The drainage area and mean annual precipitation estimated for Rio Tocoa at Tocoa are 204 square kilometers and 1,987 millimeters, respectively. It was assumed that a portion of the 50-year flood, 200 cubic meters per second, would escape the main channel and flow down a side channel before re-entering the main channel again near the lower end of the study area.
The end of trend-estimation for extreme floods under climate change?
NASA Astrophysics Data System (ADS)
Schulz, Karsten; Bernhardt, Matthias
2016-04-01
An increased risk of flood events is one of the major threats under future climate change conditions. Therefore, many recent studies have investigated trends in the occurrence of flood extremes using historic long-term river discharge data as well as simulations from combined global/regional climate and hydrological models. Severe floods are relatively rare events, and the robust estimation of their probability of occurrence requires long time series of data. Following a method outlined by the IPCC research community, trends in extreme floods are calculated based on the difference in discharge values exceeding e.g. a 100-year level (Q100) between two 30-year windows, representing prevailing conditions in a reference and a future time period, respectively. Following this approach, we analysed multiple, synthetically derived 2,000-year trend-free annual maximum runoff series generated using three different extreme value distributions (EVDs). The parameters were estimated from long-term runoff data of four large European watersheds (Danube, Elbe, Rhine, Thames). Both the Q100 values estimated from 30-year moving windows and the subsequently derived trends showed enormous variations with time: for example, fitting the extreme value (Gumbel) distribution to the Danube data, trends of Q100 in the synthetic time series range from -4,480 to 4,028 m³/s per 100 years (Q100 = 10,071 m³/s, for reference). Similar results were found when applying other extreme value distributions (Weibull and log-Normal) to all of the watersheds considered. This variability or "background noise" in estimated trends of flood extremes makes it almost impossible to significantly distinguish any real trend in observed as well as modelled data when such an approach is applied. These uncertainties, even though known in principle, are hardly addressed and discussed by the climate change impact community. Any decision making and flood risk management, including the dimensioning of flood protection measures, that is based on such studies might therefore be fundamentally flawed.
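The experiment is easy to reproduce in outline; a sketch with invented Gumbel parameters (the study estimated them from the four watersheds' records):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
loc, scale = 5_000.0, 1_500.0                      # hypothetical Gumbel fit
annual_max = stats.gumbel_r.rvs(loc, scale, size=2_000, random_state=rng)

# Q100 estimated from every 30-year window of the trend-free series
q100 = []
for start in range(0, annual_max.size - 30 + 1):
    params = stats.gumbel_r.fit(annual_max[start:start + 30])
    q100.append(stats.gumbel_r.ppf(0.99, *params))
q100 = np.array(q100)

# Differences between windows 100 years apart mimic the IPCC-style trend;
# the spread is enormous even though the series has no trend at all
trend = q100[100:] - q100[:-100]
print(trend.min(), trend.max())
```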
Natural Flood Management Plus: Scaling Up Nature Based Solutions to Larger Catchments
NASA Astrophysics Data System (ADS)
Quinn, Paul; Nicholson, Alex; Adams, Russ
2017-04-01
It has been established that networks of natural flood management (NFM) features, such as ponds and wetlands, can have a significant effect on flood flow and pollution at local scales (less than 10 km2). However, it is much less certain that NFM and nature-based solutions (NBS) can have an impact at larger scales and protect larger cities. This is especially true for recent storms in the UK such as Storm Desmond, which caused devastation across the north of England. It is possible, using observed rainfall and runoff data, to estimate the amounts of storage that would be required to impact on extreme flood events. Here we will show a toolkit that estimates the amount of storage that can be accrued through a dense network of NFM features. The analysis suggests that the use of many hundreds of small NFM features can have a significant impact on peak flow; however, we still require more storage in order to address extreme events and to satisfy flood engineers who may propose more traditional flood defences. We will also show case studies of larger NFM features positioned on flood plains that can store significantly more flood flow. Example designs of NFM Plus features will be shown. The storage aggregation tool will then show the degree to which storing large amounts of flood flow in NFM Plus features can contribute to flood management and estimate the likely costs. Used together, smaller and larger NFM features can produce significant flood storage at a much lower cost than traditional schemes.
NASA Astrophysics Data System (ADS)
Wobus, Cameron; Gutmann, Ethan; Jones, Russell; Rissing, Matthew; Mizukami, Naoki; Lorie, Mark; Mahoney, Hardee; Wood, Andrew W.; Mills, David; Martinich, Jeremy
2017-12-01
A growing body of work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus potentially increasing flood damages in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1 % annual exceedance probability (1 % AEP, or 100-year) flood events at 57 116 stream reaches across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations and trajectories of future damages that vary substantially depending on the greenhouse gas (GHG) emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches USD 4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long term in terms of reduced flood damages. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages on a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1 % AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results indicate that monetary damages from inland flooding could be significantly reduced through substantial GHG mitigation.
NASA Astrophysics Data System (ADS)
Wobus, C. W.; Gutmann, E. D.; Jones, R.; Rissing, M.; Mizukami, N.; Lorie, M.; Mahoney, H.; Wood, A.; Mills, D.; Martinich, J.
2017-12-01
A growing body of recent work suggests that the extreme weather events that drive inland flooding are likely to increase in frequency and magnitude in a warming climate, thus increasing monetary damages from flooding in the future. We use hydrologic projections based on the Coupled Model Intercomparison Project Phase 5 (CMIP5) to estimate changes in the frequency of modeled 1% annual exceedance probability flood events at 57,116 locations across the contiguous United States (CONUS). We link these flood projections to a database of assets within mapped flood hazard zones to model changes in inland flooding damages throughout the CONUS over the remainder of the 21st century, under two greenhouse gas (GHG) emissions scenarios. Our model generates early 21st century flood damages that reasonably approximate the range of historical observations, and trajectories of future damages that vary substantially depending on the GHG emissions pathway. The difference in modeled flood damages between higher and lower emissions pathways approaches $4 billion per year by 2100 (in undiscounted 2014 dollars), suggesting that aggressive GHG emissions reductions could generate significant monetary benefits over the long term in terms of reduced flood risk. Although the downscaled hydrologic data we used have been applied to flood impacts studies elsewhere, this research expands on earlier work to quantify changes in flood risk by linking future flood exposure to assets and damages at a national scale. Our approach relies on a series of simplifications that could ultimately affect damage estimates (e.g., use of statistical downscaling, reliance on a nationwide hydrologic model, and linking damage estimates only to 1% AEP floods). Although future work is needed to test the sensitivity of our results to these methodological choices, our results suggest that monetary damages from inland flooding could be substantially reduced through more aggressive GHG mitigation policies.
Jennings, M.E.; Thomas, W.O.; Riggs, H.C.
1994-01-01
For many years, the U.S. Geological Survey (USGS) has been involved in the development of regional regression equations for estimating flood magnitude and frequency at ungaged sites. These regression equations are used to transfer flood characteristics from gaged to ungaged sites through the use of watershed and climatic characteristics as explanatory or predictor variables. Generally these equations have been developed on a statewide or metropolitan-area basis as part of cooperative study programs with specific State Departments of Transportation or specific cities. The USGS, in cooperation with the Federal Highway Administration and the Federal Emergency Management Agency, has compiled all the current (as of September 1993) statewide and metropolitan-area regression equations into a micro-computer program titled the National Flood Frequency Program. This program includes regression equations for estimating flood-peak discharges and techniques for estimating a typical flood hydrograph for a given recurrence-interval peak discharge for unregulated rural and urban watersheds. These techniques should be useful to engineers and hydrologists for planning and design applications. This report summarizes the statewide regression equations for rural watersheds in each State, summarizes the applicable metropolitan-area or statewide regression equations for urban watersheds, describes the National Flood Frequency Program for making these computations, and provides much of the reference information on the explanatory variables needed to run the program.
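Such equations are typically fitted in log space; a generic sketch of their form (symbols only, since the coefficients are state- and region-specific):

$$\log_{10} Q_T = b_0 + b_1 \log_{10} A + b_2 \log_{10} P$$

where $Q_T$ is the $T$-year peak discharge, $A$ is drainage area, $P$ is mean annual precipitation (or another climatic characteristic), and the $b_i$ are regression coefficients estimated from gaged sites.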
Emerson, Joanne B; Keady, Patricia B; Brewer, Tess E; Clements, Nicholas; Morgan, Emily E; Awerbuch, Jonathan; Miller, Shelly L; Fierer, Noah
2015-03-03
Flood-damaged homes typically have elevated microbial loads, and their occupants have an increased incidence of allergies, asthma, and other respiratory ailments, yet the microbial communities in these homes remain under-studied. Using culture-independent approaches, we characterized bacterial and fungal communities in homes in Boulder, CO, USA, 2-3 months after the historic September 2013 flooding event. We collected passive air samples from basements in 50 homes (36 flood-damaged, 14 non-flooded), and we sequenced the bacterial 16S rRNA gene (V4-V5 region) and the fungal ITS1 region from these samples for community analyses. Quantitative PCR was used to estimate the abundances of bacteria and fungi in the passive air samples. Results indicate significant differences in bacterial and fungal community composition between flooded and non-flooded homes. Fungal abundances were estimated to be three times higher in flooded homes relative to non-flooded homes, but there were no significant differences in bacterial abundances. Penicillium (fungi) and Pseudomonadaceae and Enterobacteriaceae (bacteria) were among the most abundant taxa in flooded homes. Our results suggest that bacterial and fungal communities continue to be affected by flooding, even after relative humidity has returned to baseline levels and remediation has removed any visible evidence of flood damage.
Hurricane Sandy's flood frequency increasing from year 1800 to 2100.
Lin, Ning; Kopp, Robert E; Horton, Benjamin P; Donnelly, Jeffrey P
2016-10-25
Coastal flood hazard varies in response to changes in storm surge climatology and the sea level. Here we combine probabilistic projections of the sea level and storm surge climatology to estimate the temporal evolution of flood hazard. We find that New York City's flood hazard has increased significantly over the past two centuries and is very likely to increase more sharply over the 21st century. Due to the effect of sea level rise, the return period of Hurricane Sandy's flood height decreased by a factor of ∼3× from year 1800 to 2000 and is estimated to decrease by a further ∼4.4× from 2000 to 2100 under a moderate-emissions pathway. When potential storm climatology change over the 21st century is also accounted for, Sandy's return period is estimated to decrease by ∼3× to 17× from 2000 to 2100.
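A back-of-envelope sketch of why a fixed sea-level offset shortens return periods (the Gumbel parameters and 0.5 m rise below are invented; the paper uses full probabilistic SLR and surge projections):

```python
import numpy as np
from scipy import stats

loc, scale = 1.0, 0.35                         # hypothetical Gumbel fit of
                                               # annual max flood height (m)
h100 = stats.gumbel_r.ppf(1 - 1 / 100, loc, scale)   # today's 100-yr height

slr = 0.5                                      # hypothetical mean SLR (m)
p_new = stats.gumbel_r.sf(h100, loc + slr, scale)    # new exceedance prob.
print(1 / p_new)                               # much shorter return period
```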
Hurricane Sandy’s flood frequency increasing from year 1800 to 2100
Lin, Ning; Kopp, Robert E.; Horton, Benjamin P.; Donnelly, Jeffrey P.
2016-01-01
Coastal flood hazard varies in response to changes in storm surge climatology and the sea level. Here we combine probabilistic projections of the sea level and storm surge climatology to estimate the temporal evolution of flood hazard. We find that New York City’s flood hazard has increased significantly over the past two centuries and is very likely to increase more sharply over the 21st century. Due to the effect of sea level rise, the return period of Hurricane Sandy’s flood height decreased by a factor of ∼3× from year 1800 to 2000 and is estimated to decrease by a further ∼4.4× from 2000 to 2100 under a moderate-emissions pathway. When potential storm climatology change over the 21st century is also accounted for, Sandy’s return period is estimated to decrease by ∼3× to 17× from 2000 to 2100. PMID:27790992
Kennedy, Jeffrey R.; Paretti, Nicholas V.; Veilleux, Andrea G.
2014-01-01
Regression equations, which allow predictions of n-day flood-duration flows for selected annual exceedance probabilities at ungaged sites, were developed using generalized least-squares regression and flood-duration flow frequency estimates at 56 streamgaging stations within a single, relatively uniform physiographic region in the central part of Arizona, between the Colorado Plateau and Basin and Range Province, called the Transition Zone. Drainage area explained most of the variation in the n-day flood-duration annual exceedance probabilities, but mean annual precipitation and mean elevation were also significant variables in the regression models. Standard error of prediction for the regression equations varies from 28 to 53 percent and generally decreases with increasing n-day duration. Outside the Transition Zone there are insufficient streamgaging stations to develop regression equations, but flood-duration flow frequency estimates are presented at select streamgaging stations.
NASA Astrophysics Data System (ADS)
Le Bihan, Guillaume; Payrastre, Olivier; Gaume, Eric; Moncoulon, David; Pons, Frédéric
2017-11-01
Up to now, flash flood monitoring and forecasting systems, based on rainfall radar measurements and distributed rainfall-runoff models, generally aimed at estimating flood magnitudes - typically discharges or return periods - at selected river cross sections. The approach presented here goes one step further by proposing an integrated forecasting chain for the direct assessment of flash flood possible impacts on inhabited areas (number of buildings at risk in the presented case studies). The proposed approach includes, in addition to a distributed rainfall-runoff model, an automatic hydraulic method suited for the computation of flood extent maps on a dense river network and over large territories. The resulting catalogue of flood extent maps is then combined with land use data to build a flood impact curve for each considered river reach, i.e. the number of inundated buildings versus discharge. These curves are finally used to compute estimated impacts based on forecasted discharges. The approach has been extensively tested in the regions of Alès and Draguignan, located in the south of France, where well-documented major flash floods recently occurred. The article presents two types of validation results. First, the automatically computed flood extent maps and corresponding water levels are tested against rating curves at available river gauging stations as well as against local reference or observed flood extent maps. Second, a rich and comprehensive insurance claim database is used to evaluate the relevance of the estimated impacts for some recent major floods.
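A minimal sketch of the impact-curve step for a single reach (all values invented): the catalogue of flood extent maps gives building counts at a few simulated discharges, and forecasted discharges are converted to impacts by interpolation:

```python
import numpy as np

q_catalogue = np.array([50.0, 100.0, 200.0, 400.0])  # simulated discharges (m3/s)
buildings   = np.array([0, 12, 85, 310])             # inundated buildings

def forecast_impact(q_forecast):
    """Interpolate the reach's impact curve at a forecasted discharge."""
    return np.interp(q_forecast, q_catalogue, buildings)

print(forecast_impact(260.0))                        # about 152 buildings
```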
Canyon formation constraints on the discharge of catastrophic outburst floods of Earth and Mars
NASA Astrophysics Data System (ADS)
Lapotre, Mathieu G. A.; Lamb, Michael P.; Williams, Rebecca M. E.
2016-07-01
Catastrophic outburst floods carved amphitheater-headed canyons on Earth and Mars, and the steep headwalls of these canyons suggest that some formed by upstream headwall propagation through waterfall erosion processes. Because topography evolves in concert with water flow during canyon erosion, we suggest that bedrock canyon morphology preserves hydraulic information about canyon-forming floods. In particular, we propose that for a canyon to form with a roughly uniform width by upstream headwall retreat, erosion must occur around the canyon head, but not along the sidewalls, such that canyon width is related to flood discharge. We develop a new theory for bedrock canyon formation by megafloods based on flow convergence of large outburst floods toward a horseshoe-shaped waterfall. The model is developed for waterfall erosion by rock toppling, a candidate erosion mechanism in well fractured rock, like columnar basalt. We apply the model to 14 terrestrial (Channeled Scablands, Washington; Snake River Plain, Idaho; and Ásbyrgi canyon, Iceland) and nine Martian (near Ares Vallis and Echus Chasma) bedrock canyons and show that predicted flood discharges are nearly 3 orders of magnitude less than previously estimated, and predicted flood durations are longer than previously estimated, from less than a day to a few months. Results also show a positive correlation between flood discharge per unit width and canyon width, which supports our hypothesis that canyon width is set in part by flood discharge. Despite lower discharges than previously estimated, the flood volumes remain large enough for individual outburst floods to have perturbed the global hydrology of Mars.
Intersection of All Top Quantile
This layer combines the top quantiles of the CES, CEVA, and EJSM layers so that viewers can see the overlap of "hot spots" for each method. This layer was created by James Sadd of Occidental College, Los Angeles.
Intersection of Screening Methods High Quantile
This layer combines the high quantiles of the CES, CEVA, and EJSM layers so that viewers can see the overlap of "hot spots" for each method. This layer was created by James Sadd of Occidental College, Los Angeles.
Freni, G; La Loggia, G; Notaro, V
2010-01-01
Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases is in most cases overcome by combining the output of urban drainage models with damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become standard practice in hydraulic research and application. Flooding damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancement, pushed forward by increasing computer capacity. The details of the flood propagation process on the surface, and of the interconnections between underground and surface drainage systems, have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly dependent on data availability; this remains the main bottleneck in expected flood damage estimation. Such functions are usually affected by significant uncertainty, intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the damage-depth function with the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. The paper demonstrates that the use of detailed hydraulic models may not be justified given the higher computational cost and the significant uncertainty in damage estimation curves. This is mainly because a large part of the total uncertainty comes from the depth-damage curves. Improving the estimation of these curves may provide better results in terms of uncertainty reduction than the adoption of detailed hydraulic models.
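A toy Monte Carlo comparison of the two uncertainty sources discussed above (all spreads invented): freeze one source at its mean and look at the residual spread of the damage estimate:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000
depth = rng.normal(1.2, 0.15, n)     # hydraulic-model spread of flood depth (m)
slope = rng.normal(40.0, 15.0, n)    # damage-curve spread (kEUR per metre)

# Spread of damage when only one source varies at a time
d_curve_fixed = 40.0 * np.clip(depth, 0, None)          # depth varies
d_depth_fixed = np.clip(slope, 0, None) * 1.2           # curve varies
print(np.std(d_curve_fixed), np.std(d_depth_fixed))     # curve term dominates
```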
NASA Astrophysics Data System (ADS)
Horat, Christoph; Antonetti, Manuel; Wernli, Heini; Zappa, Massimiliano
2017-04-01
Flash floods evolve rapidly during and after heavy precipitation events and represent a risk for society, especially in mountainous areas. Knowledge of meteorological variables and their temporal development is often not sufficient to predict their occurrence. Therefore, information about the state of the hydrological system derived from hydrological models is used. However, these models rely on strong simplifying assumptions and therefore need to be calibrated. This prevents their application in catchments where no runoff data are available. Here we present a flash-flood forecasting chain including: (i) a nowcasting product which combines radar and rain gauge rainfall data (CombiPrecip); (ii) meteorological data from numerical weather prediction models at the currently finest available resolution (COSMO-1, COSMO-E); (iii) operationally available soil moisture estimations from the PREVAH hydrological model; and (iv) a process-based runoff generation module with no need for calibration (RGM-PRO). This last component uses information on the spatial distribution of dominant runoff processes (DRPs), which can be derived with different mapping approaches, and is parameterised a priori based on expert knowledge. First, we compared the performance of RGM-PRO with that of a traditional conceptual runoff generation module for several events on the Swiss Emme catchment, as well as on its nested catchments. Different DRP maps were furthermore tested to evaluate the sensitivity of the forecasting chain to the mapping approaches. Then, we benchmarked the new forecasting chain against the traditional chain used on the Swiss Verzasca catchment. The results show that RGM-PRO performs similarly to, or even better than, the traditional calibrated conceptual module on the investigated catchments. The use of strongly simplified DRP mapping approaches still leads to satisfying results, mainly because the largest uncertainty source is the meteorological input data. On the Verzasca catchment, RGM-PRO outperformed the traditional forecast chain in terms of mean absolute error, independently of the lead time and threshold quantile, whereas the Brier skill score did not show any clear preference. Probabilistic input data generally led to better results than those obtained with deterministic forecasts.
Guay, Joel R.; Harmon, Jerry G.; McPherson, Kelly R.
1998-01-01
The damage caused by the January 1997 floods along the Cosumnes River and Deer Creek generated new interest in planning and managing land use in the study area. The 1997 floodflow peak, the highest on record and considered to be a 150-year flood, caused levee failures at 24 locations. In order to provide a technical basis for floodplain management practices, the U.S. Geological Survey, in cooperation with the Federal Emergency Management Agency, completed a flood-inundation map of the Cosumnes River and Deer Creek drainage from Dillard Road bridge to State Highway 99. Flood frequency was estimated from streamflow records for the Cosumnes River at Michigan Bar and Deer Creek near Sloughhouse. Cross sections along a study reach, where the two rivers generally flow parallel to one another, were used with a step-backwater model (WSPRO) to estimate the water-surface profile for floods of selected recurrence intervals. A flood-inundation map was developed to show flood boundaries for the 100-year flood. Water-surface profiles were developed for the 5-, 10-, 50-, 100-, and 500-year floods.
Amugsi, Dickson A; Dimbuene, Zacharie T; Kimani-Murage, Elizabeth W; Mberu, Blessing; Ezeh, Alex C
2017-04-01
To investigate the differential effects of dietary diversity (DD) and maternal characteristics on child linear growth at different points of the conditional distribution of height-for-age Z-score (HAZ) in sub-Saharan Africa. Secondary analysis of data from nationally representative cross-sectional samples of singleton children aged 0-59 months, born to mothers aged 15-49 years. The outcome variable was child HAZ. Quantile regression was used to perform the multivariate analysis. The most recent Demographic and Health Surveys from Ghana, Nigeria, Kenya, Mozambique and Democratic Republic of Congo (DRC). The present analysis was restricted to children aged 6-59 months (n 31 604). DD was associated positively with HAZ in the first four quantiles (5th, 10th, 25th and 50th) and the highest quantile (90th) in Nigeria. The largest effect occurred at the very bottom (5th quantile) and the very top (90th quantile) of the conditional HAZ distribution. In DRC, DD was significantly and positively associated with HAZ in the two lower quantiles (5th, 10th). The largest effects of maternal education occurred at the lower end of the conditional HAZ distribution in Ghana, Nigeria and DRC. Maternal BMI and height also had positive effects on HAZ at different points of the conditional distribution of HAZ. Our analysis shows that the association between DD and maternal factors and HAZ differs along the conditional HAZ distribution. Intervention measures need to take into account the heterogeneous effect of the determinants of child nutritional status along the different percentiles of the HAZ distribution.
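The core technique in this study, quantile regression fitted at several points of the conditional HAZ distribution, can be sketched in a few lines. Below is a minimal sketch using statsmodels on synthetic data; the variable names (haz, dd, mother_edu) and effect sizes are illustrative assumptions, not values from the DHS datasets.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
dd = rng.integers(0, 8, n)               # dietary diversity score (0-7)
mother_edu = rng.integers(0, 15, n)      # years of maternal education
# Heteroscedastic noise, so covariate effects differ across quantiles
haz = -2.0 + 0.08 * dd + 0.05 * mother_edu + rng.normal(0, 1 + 0.1 * dd, n)
df = pd.DataFrame({"haz": haz, "dd": dd, "mother_edu": mother_edu})

model = smf.quantreg("haz ~ dd + mother_edu", df)
for q in (0.05, 0.10, 0.25, 0.50, 0.90):
    res = model.fit(q=q)
    print(f"q={q:.2f}  dd={res.params['dd']:+.3f}  "
          f"edu={res.params['mother_edu']:+.3f}")
```

Comparing the fitted coefficients across q is exactly how such studies detect effects concentrated at the bottom or top of the conditional distribution.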
Ensuring the consistency of Flow Duration Curve reconstructions: the 'quantile solidarity' approach
NASA Astrophysics Data System (ADS)
Poncelet, Carine; Andreassian, Vazken; Oudin, Ludovic
2015-04-01
Flow Duration Curves (FDCs) are a hydrologic tool describing the distribution of streamflows at a catchment outlet. FDCs are used, among other purposes, for calibrating hydrological models, managing water quality and classifying catchments. For gauged catchments, empirical FDCs can be computed from streamflow records. For ungauged catchments, on the other hand, FDCs cannot be obtained from streamflow records and must therefore be obtained in another manner, for example through reconstructions. Regression-based reconstructions are methods that estimate each quantile separately from catchment attributes (climatic or physical features). The advantage of this category of methods is that it is informative about the processes and it is non-parametric. However, the large number of regression parameters required can cause unwanted artifacts, typically reconstructions whose quantiles are not always increasing. In this paper we propose a new approach named Quantile Solidarity (QS), which is applied under strict proxy-basin test conditions (Klemeš, 1986) to a set of 600 French catchments. Half of the catchments are considered as gauged and used to calibrate the regression and compute its residuals. The QS approach consists of a three-step regionalization scheme, which first links quantile values to physical descriptors, then reduces the number of regression parameters and finally exploits the spatial correlation of the residuals. The innovation is the use of parameter continuity across the quantiles to dramatically reduce the number of parameters. The second half of the catchments is used as an independent validation set, over which we show that the QS approach ensures strictly increasing FDC reconstructions in ungauged conditions. Reference: V. KLEMEŠ (1986) Operational testing of hydrological simulation models, Hydrological Sciences Journal, 31:1, 13-24
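The consistency requirement that QS is designed to guarantee can be illustrated in isolation. The sketch below (not the QS regionalization scheme itself, only a post-hoc repair of the artifact it prevents) forces independently regressed quantiles to be non-decreasing; values are hypothetical.

```python
import numpy as np

probs = np.array([0.05, 0.25, 0.50, 0.75, 0.95])   # non-exceedance probs
q_hat = np.array([0.8, 2.1, 1.9, 4.0, 7.5])        # raw per-quantile regressions

# 2.1 -> 1.9 violates monotonicity; a cumulative maximum repairs it
q_fixed = np.maximum.accumulate(q_hat)
print(q_fixed)   # [0.8 2.1 2.1 4.  7.5]
```

QS avoids needing such a repair by tying the regression parameters together across quantiles, rather than patching the output.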
Flood-frequency characteristics of Wisconsin streams
Walker, John F.; Peppler, Marie C.; Danz, Mari E.; Hubbard, Laura E.
2017-05-22
Flood-frequency characteristics for 360 gaged sites on unregulated rural streams in Wisconsin are presented for percent annual exceedance probabilities ranging from 0.2 to 50, using a statewide skewness map developed for this report. Equations relating flood frequency to drainage-basin characteristics were developed by multiple-regression analyses. Flood-frequency characteristics for ungaged sites on unregulated rural streams can be estimated with the equations presented in this report. The State was divided into eight areas of similar physiographic characteristics. The most significant basin characteristics are drainage area, soil saturated hydraulic conductivity, main-channel slope, and several land-use variables. The standard error of prediction for the equation for the 1-percent annual exceedance probability flood ranges from 56 to 70 percent for Wisconsin streams; these values are larger than results presented in previous reports. The increase in the standard error of prediction is likely due to increased variability of the annual-peak discharges, resulting in increased variability in the magnitude of flood peaks at higher frequencies. For each of the unregulated rural streamflow-gaging stations, a weighted estimate based on the at-site log-Pearson Type III analysis and the multiple-regression results was determined. The weighted estimate generally has a lower uncertainty than either the log-Pearson Type III or multiple-regression estimates. For regulated streams, a graphical method for estimating flood-frequency characteristics was developed from the relations of discharge and drainage area for selected annual exceedance probabilities. Graphs for the major regulated streams in Wisconsin are presented in the report.
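The weighting of at-site and regression estimates mentioned above is, in generic form, an inverse-variance average in log space. The sketch below illustrates that generic idea only, not the exact procedure of the report; all input numbers are hypothetical.

```python
import numpy as np

def weighted_estimate(q_site, var_site, q_reg, var_reg):
    """Inverse-variance weighting of two log10 flood-quantile estimates."""
    w_site, w_reg = 1.0 / var_site, 1.0 / var_reg
    log_q = (w_site * np.log10(q_site) + w_reg * np.log10(q_reg)) / (w_site + w_reg)
    return 10 ** log_q, 1.0 / (w_site + w_reg)   # estimate, combined variance

q, v = weighted_estimate(q_site=850.0, var_site=0.015, q_reg=700.0, var_reg=0.040)
print(f"weighted 1-percent AEP estimate ~ {q:.0f} cfs (log10 variance {v:.4f})")
```

Because the combined variance is smaller than either input variance, the weighted estimate always carries lower uncertainty than its components, which is the property the report exploits.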
Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis
2014-01-01
Reliable estimates of the magnitude and frequency of floods are essential for the design of transportation and water-conveyance structures, flood insurance studies, and flood-plain management. Flood-frequency estimates are particularly important in densely populated urban areas. The U.S. Geological Survey (USGS) used a multistate approach to update methods for determining the magnitude and frequency of floods in urban and small, rural streams that are not substantially affected by regulation or tidal fluctuations in Georgia, South Carolina, and North Carolina (Feaster and others, 2014). The multistate approach has the advantage over a single-state approach of increasing the number of streamflow-gaging stations (streamgages) available for analysis, expanding the geographical coverage to allow application of regional regression equations across state boundaries, and building on a previous flood-frequency investigation of rural streamgages in the Southeastern United States. This investigation was funded as part of a cooperative program of water-resources investigations between the USGS, the South Carolina Department of Transportation, and the North Carolina Department of Transportation. In addition, much of the data and information for the Georgia streamgages was funded through a similar cooperative program with the Georgia Department of Transportation.
Magnitude of flood flows for selected annual exceedance probabilities for streams in Massachusetts
Zarriello, Phillip J.
2017-05-11
The U.S. Geological Survey, in cooperation with the Massachusetts Department of Transportation, determined the magnitude of flood flows at selected annual exceedance probabilities (AEPs) at streamgages in Massachusetts and from these data developed equations for estimating flood flows at ungaged locations in the State. Flood magnitudes were determined for the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent AEPs at 220 streamgages, 125 of which are in Massachusetts and 95 are in the adjacent States of Connecticut, New Hampshire, New York, Rhode Island, and Vermont. AEP flood flows were computed for streamgages using the expected moments algorithm weighted with a recently computed regional skewness coefficient for New England. Regional regression equations were developed to estimate the magnitude of floods for selected AEP flows at ungaged sites from 199 selected streamgages and for 60 potential explanatory basin characteristics. AEP flows for 21 of the 125 streamgages in Massachusetts were not used in the final regional regression analysis, primarily because of regulation or redundancy. The final regression equations used generalized least squares methods to account for streamgage record length and correlation. Drainage area, mean basin elevation, and basin storage explained 86 to 93 percent of the variance in flood magnitude from the 50- to 0.2-percent AEPs, respectively. The estimates of AEP flows at streamgages can be improved by using a weighted estimate that is based on the magnitude of the flood and associated uncertainty from the at-site analysis and the regional regression equations. Weighting procedures for estimating AEP flows at an ungaged site on a gaged stream also are provided that improve estimates of flood flows at the ungaged site when hydrologic characteristics do not abruptly change. Urbanization expressed as the percentage of imperviousness provided some explanatory power in the regional regression; however, it was not statistically significant at the 95-percent confidence level for any of the AEPs examined. The effect of urbanization on flood flows indicates a complex interaction with other basin characteristics. Another complicating factor is the assumption of stationarity, that is, the assumption that annual peak flows exhibit no significant trend over time. The results of the analysis show that stationarity does not prevail at all of the streamgages. About 27 percent of streamgages in Massachusetts and about 42 percent of streamgages in adjacent States with 20 or more years of systematic record used in the study show a significant positive trend at the 95-percent confidence level. The remaining streamgages had both positive and negative trends, but the trends were not statistically significant. Trends were shown to vary over time. In particular, during the past decade (2004–2013), peak flows were persistently above normal, which may give the impression of positive trends. Only continued monitoring will provide the information needed to determine whether recent increases in annual peak flows are a normal oscillation or a true trend. The analysis used 37 years of additional data obtained since the last comprehensive study of flood flows in Massachusetts. In addition, new methods for computing flood flows at streamgages and regionalization improved estimates of flood magnitudes at gaged and ungaged locations and better defined the uncertainty of the estimates of AEP floods.
Berenbrock, Charles
2003-01-01
Improved flood-frequency estimates for short-term (10 or fewer years of record) streamflow-gaging stations were needed to support instream flow studies by the U.S. Forest Service, which are focused on quantifying water rights necessary to maintain or restore productive fish habitat. Because peak-flow data for short-term gaging stations can be biased by having been collected during an unusually wet, dry, or otherwise unrepresentative period of record, the data may not represent the full range of potential floods at a site. To test whether peak-flow estimates for short-term gaging stations could be improved, the two-station comparison method was used to adjust the logarithmic mean and logarithmic standard deviation of peak flows for seven short-term gaging stations in the Salmon and Clearwater River Basins, central Idaho. Correlation coefficients determined from regression of peak flows for paired short-term and long-term (more than 10 years of record) gaging stations over a concurrent period of record indicated that the mean and standard deviation of peak flows for all short-term gaging stations would be improved. Flood-frequency estimates for seven short-term gaging stations were determined using the adjusted mean and standard deviation. The original (unadjusted) flood-frequency estimates for three of the seven short-term gaging stations differed from the adjusted estimates by less than 10 percent, probably because the data were collected during periods representing the full range of peak flows. Unadjusted flood-frequency estimates for four short-term gaging stations differed from the adjusted estimates by more than 10 percent; unadjusted estimates for Little Slate Creek and Salmon River near Obsidian differed from adjusted estimates by nearly 30 percent. These large differences probably are attributable to unrepresentative periods of peak-flow data collection.
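The two-station comparison idea can be sketched compactly: the log-space mean and standard deviation of the short record are shifted toward what the concurrent relation with a long record implies about the full period. The sketch below is a simplified Matalas-Jacobs-style adjustment on synthetic data; it omits the bias-correction terms of the full published procedure, and all numbers are hypothetical.

```python
import numpy as np

def adjust_short_record(y_short, x_concurrent, x_long):
    """Adjust the log-space mean and standard deviation of a short
    peak-flow record using a correlated long record (simplified)."""
    y = np.log10(y_short)         # short-record peaks (concurrent years)
    x = np.log10(x_concurrent)    # long-record peaks, same years
    xl = np.log10(x_long)         # long-record peaks, full period
    r = np.corrcoef(x, y)[0, 1]
    b = r * y.std(ddof=1) / x.std(ddof=1)          # regression slope
    mean_adj = y.mean() + b * (xl.mean() - x.mean())
    var_adj = y.var(ddof=1) + b**2 * (xl.var(ddof=1) - x.var(ddof=1))
    return mean_adj, np.sqrt(max(var_adj, 1e-12))

rng = np.random.default_rng(2)
long_rec = 10 ** rng.normal(3.0, 0.25, 40)                   # 40-year station
short_rec = long_rec[:8] * 10 ** rng.normal(0.1, 0.08, 8)    # 8 concurrent years
print(adjust_short_record(short_rec, long_rec[:8], long_rec))
```

The adjusted mean and standard deviation then feed the flood-frequency distribution in place of the raw short-record moments.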
DOT National Transportation Integrated Search
1997-06-01
This report presents: (1) calculation of flood frequency for the Ward Creek watershed using eight flood prediction models, (2) establishment of the rating curve (stage-discharge relation) for the Ward Creek watershed, (3) evaluation of these flood pr...
Flood prediction, its risk and mitigation for the Babura River with GIS
NASA Astrophysics Data System (ADS)
Tarigan, A. P. M.; Hanie, M. Z.; Khair, H.; Iskandar, R.
2018-03-01
This paper describes flood prediction along the Babura River, whose catchment lies within the comparatively larger watershed of the Deli River, which crosses the central part of Medan City. The flood plain and ensuing inundation area were simulated using HEC-RAS based on the available rainfall, catchment, and river cross-section data. The results were presented in a GIS format in which the city map of Medan and other infrastructure layers were stacked for spatial analysis. The resulting GIS shows that 13 sub-districts were likely affected by the flood, from which the risk of flood damage could be estimated. For flood mitigation, 6 locations for evacuation centres were identified and 15 evacuation routes were recommended to reach the centres. It is hoped that the flood prediction and risk estimation in this study will support stakeholder preparedness for the threat of flood disaster.
Storm and flood of July 5, 1989, in northern New Castle County, Delaware
Paulachok, G.N.; Simmons, R.H.; Tallman, A.J.
1995-01-01
On July 5, 1989, intense rainfall from the remnants of Tropical Storm Allison caused severe flooding in northern New Castle County, Delaware. The flooding claimed three lives, and damage was estimated to be $5 million. Flood conditions were aggravated locally by rapid runoff from expansive urban areas. Record-breaking floods occurred on many streams in northern New Castle County. Peak discharges at three active, continuous-record streamflow-gaging stations, one active crest-stage station, and two discontinued streamflow-gaging stations exceeded previously recorded maximums. Estimated recurrence intervals for peak flow at the three active, continuous-record streamflow stations exceeded 100 years. The U.S. Geological Survey conducted comprehensive post-flood surveys to determine peak water-surface elevations on affected streams and their tributaries during the flood of July 5, 1989. Detailed surveys were performed near bridge crossings to provide additional information on the extent and severity of the flooding and the effects of hydraulic constrictions on floodwaters.
Flood Extent Mapping Using Dual-Polarimetric SENTINEL-1 Synthetic Aperture Radar Imagery
NASA Astrophysics Data System (ADS)
Jo, M.-J.; Osmanoglu, B.; Zhang, B.; Wdowinski, S.
2018-04-01
Rapid generation of synthetic aperture radar (SAR) based flood extent maps provides valuable data in disaster response efforts thanks to the cloud-penetrating ability of microwaves. We present a method using dual-polarimetric SAR imagery acquired by the Sentinel-1a/b satellites. A false-colour map is generated from pre- and post-disaster imagery, allowing operators to distinguish between pre-existing standing water and recently flooded areas. The method works best in areas of standing water and provides mixed results in urban areas. A flood depth map is also estimated using an external DEM. We present the methodology, its estimated accuracy, and investigations into improving the response in urban areas.
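The pre/post false-colour construction can be sketched with plain array operations. The snippet below is an illustration only, with random arrays standing in for calibrated, co-registered VV backscatter; real Sentinel-1 processing additionally needs radiometric calibration and terrain correction.

```python
import numpy as np

def stretch(db, lo=-25.0, hi=0.0):
    """Map backscatter in dB to [0, 1] for display."""
    return np.clip((db - lo) / (hi - lo), 0.0, 1.0)

rng = np.random.default_rng(4)
pre_vv = rng.uniform(-25, -5, (100, 100))    # stand-in pre-event VV (dB)
post_vv = rng.uniform(-25, -5, (100, 100))   # stand-in post-event VV (dB)

# R = pre, G = B = post: open water is dark in SAR, so pixels that were
# bright before and dark after (newly flooded) render red, while
# permanent water stays dark in all three channels.
rgb = np.dstack([stretch(pre_vv), stretch(post_vv), stretch(post_vv)])
```

This channel assignment is one common convention; the key point is that the colour directly encodes the pre-to-post backscatter change.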
Zhang, Qun; Zhang, Qunzhi; Sornette, Didier
2016-01-01
We augment the existing literature using the Log-Periodic Power Law Singular (LPPLS) structures in the log-price dynamics to diagnose financial bubbles by providing three main innovations. First, we introduce the quantile regression to the LPPLS detection problem. This allows us to disentangle (at least partially) the genuine LPPLS signal and the a priori unknown complicated residuals. Second, we propose to combine the many quantile regressions with a multi-scale analysis, which aggregates and consolidates the obtained ensembles of scenarios. Third, we define and implement the so-called DS LPPLS Confidence™ and Trust™ indicators that enrich considerably the diagnostic of bubbles. Using a detailed study of the "S&P 500 1987" bubble and presenting analyses of 16 historical bubbles, we show that the quantile regression of LPPLS signals contributes useful early warning signals. The comparison between the constructed signals and the price development in these 16 historical bubbles demonstrates their significant predictive ability around the real critical time when the burst/rally occurs.
NASA Astrophysics Data System (ADS)
Yang, Peng; Xia, Jun; Zhang, Yongyong; Han, Jian; Wu, Xia
2017-11-01
Because drought is a very common and widespread natural disaster, it has attracted a great deal of academic interest. Based on 12-month time-scale standardized precipitation indices (SPI12) calculated from precipitation data recorded between 1960 and 2015 at 22 weather stations in the Tarim River Basin (TRB), this study aims to identify trends in SPI and in drought duration, severity, and frequency at various quantiles, and to perform cluster analysis of drought events in the TRB. The results indicated that (1) both precipitation and temperature at most stations in the TRB exhibited significant positive trends during 1960-2015; (2) multiple scales of SPIs changed significantly around 1986; (3) based on quantile regression analysis of temporal drought changes, the positive SPI slopes indicated less severe and less frequent droughts at lower quantiles, but clear variation was detected in the drought frequency; and (4) significantly different trends were found in drought frequency, probably between severe droughts and more moderate ones.
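The SPI computation underlying the study follows a standard recipe: accumulate precipitation over the chosen time scale, fit a gamma distribution, and map the fitted probabilities to standard-normal z-scores. Below is a deliberately simplified sketch: it fits one gamma to all months rather than month-by-month and ignores zero-precipitation handling, and the synthetic series is a stand-in for station data.

```python
import numpy as np
from scipy import stats

def spi(precip_monthly, scale=12):
    """Simplified SPI: gamma fit to moving sums, mapped to z-scores."""
    s = np.convolve(precip_monthly, np.ones(scale), mode="valid")
    a, loc, b = stats.gamma.fit(s, floc=0)          # fit gamma, loc fixed at 0
    cdf = stats.gamma.cdf(s, a, loc=loc, scale=b)   # non-exceedance probability
    return stats.norm.ppf(cdf)                      # standard-normal transform

rng = np.random.default_rng(0)
monthly = rng.gamma(2.0, 15.0, 12 * 56)   # 1960-2015-length stand-in series
print(spi(monthly)[:5].round(2))
```

Negative SPI values mark drier-than-median conditions; drought events are then defined by runs of SPI below a threshold, which supplies the duration, severity and frequency variables analyzed in the abstract.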
Quantile regression for the statistical analysis of immunological data with many non-detects.
Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth
2012-07-07
Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
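The robustness claimed here can be demonstrated directly: an upper-quantile fit is unchanged by the values assigned to non-detects, as long as they stay below the fitted quantile. The sketch below, on synthetic data with hypothetical variable names, fits the same 75th-percentile regression under two arbitrary substitution choices and obtains (up to solver tolerance) the same coefficient.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 300
group = rng.integers(0, 2, n)                  # e.g. treatment arm
y = np.exp(rng.normal(0.5 * group, 1.0, n))    # a skewed immune marker
lod = 1.0                                      # detection limit
nd = y < lod
print(f"{nd.mean():.0%} non-detects")

# Points below the fitted quantile enter the optimum only through their
# position relative to it, so the substituted value does not matter.
for fill in (0.0, lod / 2):
    y_fill = np.where(nd, fill, y)
    df = pd.DataFrame({"y": y_fill, "group": group})
    res = smf.quantreg("y ~ group", df).fit(q=0.75)
    print(f"fill={fill:.2f}  group effect = {res.params['group']:.3f}")
```

This invariance is what lets quantile regression tolerate far more non-detects than substitution-based means analyses.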
Probabilistic forecasting for extreme NO2 pollution episodes.
Aznarte, José L
2017-10-01
In this study, we investigate the suitability of quantile regression for predicting extreme concentrations of NO2. In contrast to the usual point forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows prediction of the full probability distribution, which in turn allows models to be built specifically for the tails of this distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measurements, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we show that the predictions are accurate, reliable and sharp. Besides, we study the relative importance of the independent variables involved, and show how the variables important for the median quantile differ from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible way to present probabilistic forecasts that maximizes their usefulness.
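One way to obtain an exceedance probability from a set of predicted quantiles, though not necessarily the paper's exact method, is to interpolate the quantile function. A minimal sketch with hypothetical predicted values:

```python
import numpy as np

# Hypothetical predicted conditional quantiles for one horizon (ug/m3)
taus = np.array([0.05, 0.25, 0.50, 0.75, 0.95, 0.99])
q_pred = np.array([38.0, 61.0, 80.0, 104.0, 152.0, 190.0])

def prob_exceed(threshold):
    """P(NO2 > threshold), linearly interpolating the predicted
    quantile function and clipping outside the modelled range."""
    if threshold <= q_pred[0]:
        return 1.0 - taus[0]
    if threshold >= q_pred[-1]:
        return 1.0 - taus[-1]
    return float(1.0 - np.interp(threshold, q_pred, taus))

print(f"P(NO2 > 100) ~ {prob_exceed(100.0):.2f}")   # ~0.29 here
```

The non-crossing of the predicted quantiles must hold for the interpolation to define a valid distribution, which ties back to the monotonicity issues discussed elsewhere in this collection.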
Influence of model reduction on uncertainty of flood inundation predictions
NASA Astrophysics Data System (ADS)
Romanowicz, R. J.; Kiczko, A.; Osuch, M.
2012-04-01
Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedance, e.g. a 100- or 500-year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of the flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions, and the estimates of model parameters, which are usually identified by solving an inverse problem based on the available noisy observations. Therefore, there is large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for an urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number, and without any water infrastructure. The results indicate that the roughness parameter values of a 1-D HEC-RAS model can be adjusted to compensate for the reduction in model structure; however, the price paid is reduced model robustness. Apart from this relatively simple question of model reduction, we also address the more fundamental question of the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that these uncertainties have a substantial influence on flood risk assessment, and we present a simplified methodology allowing that influence to be assessed. This work was supported by the National Science Centre of Poland (grant 2011/01/B/ST10/06866).
Coupling Radar Rainfall Estimation and Hydrological Modelling For Flash-flood Hazard Mitigation
NASA Astrophysics Data System (ADS)
Borga, M.; Creutin, J. D.
Flood risk mitigation is accomplished through managing either or both the hazard and vulnerability. Flood hazard may be reduced through structural measures which alter the frequency of flood levels in the area. The vulnerability of a community to flood loss can be mitigated through changing or regulating land use and through flood warning and effective emergency response. When dealing with flash-flood hazard, it is generally accepted that the most effective way (and in many instances the only affordable in a sustainable perspective) to mitigate the risk is by reducing the vulnerability of the involved communities, in particular by implementing flood warning systems and community self-help programs. However, both the inherent characteristics of the atmospheric and hydrologic processes involved in flash-flooding and the changing societal needs provide a tremendous challenge to traditional flood forecasting and warning concepts. In fact, the targets of these systems are traditionally localised like urbanised sectors or hydraulic structures. Given the small spatial scale that characterises flash floods and the development of dispersed urbanisation, transportation, green tourism and water sports, human lives and property are exposed to flash flood risk in a scattered manner. This must be taken into consideration in flash flood warning strategies and the investigated region should be considered as a whole and every section of the drainage network as a potential target for hydrological warnings. Radar technology offers the potential to provide information describing rain intensities almost continuously in time and space. Recent research results indicate that coupling radar information to distributed hydrologic modelling can provide hydrologic forecasts at all potentially flooded points of a region. Nevertheless, very few flood warning services use radar data more than on a qualitative basis. After a short review of current understanding in this area, two issues are examined: advantages and caveats of using radar rainfall estimates in operational flash flood forecasting, and methodological problems associated with the use of hydrological models for distributed flash flood forecasting with rainfall input estimated from radar.
Estimating parameter values of a socio-hydrological flood model
NASA Astrophysics Data System (ADS)
Holkje Barendrecht, Marlies; Viglione, Alberto; Kreibich, Heidi; Vorogushyn, Sergiy; Merz, Bruno; Blöschl, Günter
2018-06-01
Socio-hydrological modelling studies published so far show that dynamically coupled human-flood models are a promising tool to represent the phenomena and feedbacks in human-flood systems. So far these models are mostly generic and have not been developed and calibrated to represent specific case studies. We believe that applying and calibrating these types of models to real-world case studies can help us further develop our understanding of the phenomena that occur in these systems. In this paper we propose a method to estimate the parameter values of a socio-hydrological model and test it by applying it to an artificial case study. We postulate a model that describes the feedbacks between floods, awareness and preparedness. After simulating hypothetical time series for a given combination of parameters, we sample a few data points for our variables and try to estimate the parameters from these data points using Bayesian inference. The results show that, if we are able to collect data for a case study, we would, in theory, be able to estimate the parameter values for a socio-hydrological flood model.
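The inference step, recovering parameters from a few sampled points of a simulated trajectory, can be sketched with a basic Metropolis sampler. The toy exponential-decay model below stands in for the paper's coupled awareness-preparedness dynamics; parameter names, priors and noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
t_obs = np.array([1.0, 4.0, 9.0, 15.0])            # only a few data points
true = dict(a0=0.9, mu=0.15)                       # "unknown" truth
y_obs = true["a0"] * np.exp(-true["mu"] * t_obs) + rng.normal(0, 0.03, 4)

def log_post(a0, mu, sigma=0.03):
    """Gaussian likelihood with flat priors on (0, 1) for both parameters."""
    if not (0 < a0 < 1 and 0 < mu < 1):
        return -np.inf
    resid = y_obs - a0 * np.exp(-mu * t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

theta = np.array([0.5, 0.5])                       # starting guess
lp = log_post(*theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.02, 2)          # random-walk proposal
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis acceptance
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

post = np.array(samples[5000:])                    # discard burn-in
print(post.mean(axis=0).round(3))                  # posterior means (a0, mu)
```

With only four observations the posterior stays wide, which mirrors the paper's point that sparse real-world data still constrain, but do not pin down, the model parameters.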
Tortorelli, R.L.
1996-01-01
The flash flood in southwestern Oklahoma City, Oklahoma, on May 8, 1993, was the result of an intense 3-hour rainfall on saturated ground or impervious surfaces. The total precipitation of 5.28 inches was close to the 3-hour, 100-year frequency and produced extensive flooding. The most serious flooding was on Twin, Brock, and Lightning Creeks. Four people died in this flood. Over 1,900 structures were damaged along the three creeks. There were about $3 million in damages to Oklahoma City public facilities, the majority of which were in the three basins. A study was conducted to determine the magnitude of the May 8, 1993, flood peak discharge in these three creeks in southwestern Oklahoma City and compare these peaks with published flood estimates. Flood peak-discharge estimates for these creeks were determined at 11 study sites using a step-backwater analysis to match the flood water-surface profiles defined by high-water marks. The unit discharges during peak runoff ranged from 881 cubic feet per second per square mile for Lightning Creek at SW 44th Street to 3,570 cubic feet per second per square mile for Brock Creek at SW 59th Street. The ratios of the 1993 flood peak discharges to the Federal Emergency Management Agency 100-year flood peak discharges ranged from 1.25 to 3.29. The water-surface elevations ranged from 0.2 foot to 5.9 feet above the Federal Emergency Management Agency 500-year flood water-surface elevations. The very large flood peaks in these three small urban basins were the result of very intense rainfall in a short period of time, close to 100 percent runoff due to ground surfaces being essentially impervious, and city streets acting as efficient conveyances to the main channels. The unit discharges compare in magnitude to those of other extraordinary Oklahoma urban floods.
Adige river in Trento flooding map, 1892: private or public risk transfer?
NASA Astrophysics Data System (ADS)
Ranzi, Roberto
2016-04-01
For the determination of flood risk, hydrologists and hydraulic engineers focus their attention mainly on the estimation of the physical factors determining the flood hazard, while economists and experts in the social sciences deal mainly with the estimation of vulnerability and exposure. The fact that flood zoning involves both hydrological and socio-economic aspects, however, was already clear in the XIX century, when the impact of floods on inundated areas started to appear in flood maps, for instance in the UK and in Italy. A pioneering 'flood risk' map for the Adige river in Trento, Italy, was published as early as 1892, taking into account in detail the hazard intensity in terms of velocity and depth, the frequency of occurrence, the vulnerability, and the economic costs of flood protection with river embankments. This map can certainly be reinterpreted as a pioneering flood risk map, and possibly as the first for an Italian river and worldwide. Risk levels were divided into three categories and seven sub-categories, depending on flood water depth, velocity, frequency and damage costs. Interestingly, at that time the map was used to share the cost of levee repair and enhancement after the severe September 1882 flood as a function of the estimated level of protection of the respective areas against flood risk. The sharing of costs between public bodies, the railway company and private owners was debated for about 20 years, and in the end the public bore most of the costs. This shows that already at that time the economic assessment of structural flood protections was based on objective and rational cost-benefit criteria, that hydraulic risk mapping was perceived by society as fundamental for the design of flood protection systems, and that a balanced cost sharing between public and private parties was an accepted approach, although some protests arose at the time.
Shao, Wanyun; Xian, Siyuan; Lin, Ning; Kunreuther, Howard; Jackson, Nida; Goidel, Kirby
2017-01-01
Over the past several decades, the economic damage from flooding in coastal areas has greatly increased due to rapid coastal development coupled with possible climate change impacts. One effective way to mitigate excessive economic losses from flooding is to purchase flood insurance. However, only a minority of coastal residents have taken this preventive measure. Using original survey data for all coastal counties of the United States Gulf Coast merged with contextual data, this study examines the effects of external influences and perceptions of flood-related risks on individuals' voluntary purchase of flood insurance. It is found that the estimated flood hazard conveyed through the U.S. Federal Emergency Management Agency's (FEMA's) flood maps, the intensities and consequences of past storms and flooding events, and perceived flood-related risks significantly affect individuals' voluntary purchase of flood insurance. This behavior is also influenced by home ownership, trust in local government, education, and income. These findings have several important policy implications. First, FEMA's flood maps have been effective in conveying local flood risks to coastal residents and, correspondingly, in influencing their decisions to voluntarily seek flood insurance in the U.S. Gulf Coast. Flood maps therefore should be updated frequently to reflect timely and accurate information about flood hazards. Second, policy makers should design strategies to increase homeowners' trust in local government, to better communicate flood risks to residents, to address the affordability issue for low-income households, and to better inform less educated homeowners through educational programs. Future studies should examine voluntary flood insurance behavior across countries that are vulnerable to flooding.
Nonlinear, discrete flood event models, 1. Bayesian estimation of parameters
NASA Astrophysics Data System (ADS)
Bates, Bryson C.; Townley, Lloyd R.
1988-05-01
In this paper (Part 1), a Bayesian procedure for parameter estimation is applied to discrete flood event models. The essence of the procedure is the minimisation of a sum-of-squares function for models in which the computed peak discharge is nonlinear in terms of the parameters. This objective function depends on the observed and computed peak discharges for several storms on the catchment, information on the structure of observation error, and prior information on parameter values. The posterior covariance matrix gives a measure of the precision of the estimated parameters. The procedure is demonstrated using rainfall and runoff data from seven Australian catchments. It is concluded that the procedure is a powerful alternative to conventional parameter estimation techniques in situations where a number of floods are available for parameter estimation. Parts 2 and 3 (Bates, this volume; Bates and Townley, this volume) discuss the application of statistical nonlinearity measures and prediction uncertainty analysis to calibrated flood models.
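The computational core, minimising a sum of squares over a model nonlinear in its parameters and reading a precision estimate off the curvature at the optimum, can be sketched as follows. The toy event model and all numbers are hypothetical, and the sketch omits the prior-information and observation-error-structure terms of the full Bayesian objective.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy event model: Qp = c * P^k * A, calibrated to observed peaks
A = 120.0                                        # catchment area, km2
P = np.array([22.0, 35.0, 51.0, 64.0, 80.0])     # storm rainfall, mm
Qp_obs = np.array([18.0, 41.0, 75.0, 104.0, 150.0])  # observed peaks, m3/s

def residuals(theta):
    c, k = theta
    return Qp_obs - c * P**k * A

sol = least_squares(residuals, x0=[0.01, 1.0])

# Gauss-Newton approximation to the (posterior) covariance at the optimum
J = sol.jac
s2 = (sol.fun @ sol.fun) / (len(P) - 2)          # residual variance
cov = np.linalg.inv(J.T @ J) * s2
print("estimates:", sol.x.round(4))
print("std errors:", np.sqrt(np.diag(cov)).round(4))
```

Under a Gaussian error model with flat priors, this covariance plays the role of the posterior covariance matrix described in the abstract.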
Flooding and Atmospheric Rivers across the Western United States
NASA Astrophysics Data System (ADS)
Villarini, G.; Barth, N. A.; White, K. D.
2017-12-01
Flood frequency analysis across the western United States is complicated by annual peak flow records that frequently contain flows generated by distinctly different flood-generating mechanisms. Among the different flood agents, atmospheric rivers (ARs) are responsible for large, regional-scale floods. USGS streamgaging stations in the central Columbia River Basin in the Pacific Northwest, the Sierra Nevada, the central and southern California coast, and central Arizona show a mixture of 30-70% AR-generated flood peaks over the complete period of record. Bulletin 17B and its proposed update (Draft Bulletin 17C) continue to recognize difficulties in determining flood frequency estimates from streamflow records that contain flood peaks arising from different flood-generating mechanisms, as is the case in the western United States. They recommend developing separate frequency curves when the hydrometeorologic mechanisms that generated the annual peak flows can be separated into distinct subpopulations. Yet challenges arise when trying to consistently quantify the physical (hydrometeorologic) processes that generated the observed flows, and even more so when trying to account for them in flood frequency estimation. This study provides a general statistical framework to perform process-driven flood frequency analysis using a weighted mixed-population approach, highlighting the role that ARs play in the flood peak distribution.
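One simple form a weighted mixed-population analysis can take, offered here as an illustration rather than the study's actual framework, is a mixture of subpopulation distributions weighted by the fraction of AR-generated peaks. Distribution families and parameters below are invented for the sketch.

```python
import numpy as np
from scipy import stats
from scipy.optimize import brentq

p_ar = 0.45                                  # fraction of AR-generated peaks
F_ar = stats.gumbel_r(loc=400.0, scale=180.0)     # AR subpopulation fit
F_other = stats.gumbel_r(loc=250.0, scale=80.0)   # non-AR subpopulation fit

def aep(q):
    """Annual exceedance probability under the mixed population."""
    return 1.0 - (p_ar * F_ar.cdf(q) + (1 - p_ar) * F_other.cdf(q))

# Invert numerically for the 1-percent AEP (100-year) flood
q100 = brentq(lambda q: aep(q) - 0.01, 100.0, 10000.0)
print(f"mixed-population 1% AEP flood ~ {q100:.0f} m3/s")
```

Because the AR subpopulation has the heavier upper tail here, it dominates the rare quantiles even when it supplies fewer than half of the annual peaks, which is the qualitative behaviour the abstract highlights.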
NASA Astrophysics Data System (ADS)
Adler, R. F.; Wu, H.
2016-12-01
The Global Flood Monitoring System (GFMS) (http://flood.umd.edu) has been developed and used in recent years to provide real-time flood detection, streamflow estimates and inundation calculations for most of the globe. The GFMS is driven by satellite-based precipitation, with the accuracy of the flood estimates depending primarily on the accuracy of the precipitation analyses and the land surface and routing models used. The routing calculations are done at both 12 km and 1 km resolution. Users of GFMS results include international and national flood response organizations. The devastating floods of October 2015 in South Carolina are analyzed, showing that the GFMS-estimated streamflow is accurate and useful, indicating significant flooding in the upstream basins. Further downstream, the GFMS underestimates streamflow due to the presence of dams, which are not accounted for in GFMS. Other examples are given for Yemen and Somalia and for Sri Lanka and southern India. A forecast flood event associated with a typhoon hitting Taiwan is also examined. One-kilometer resolution inundation mapping from GFMS holds the promise of highly useful information for flood disaster response. The algorithm is briefly described and examples are shown for recent cases where inundation estimates from optical and Synthetic Aperture Radar (SAR) satellite sensors are available. For a case of significant flooding along the Brazos River in Texas in May and June, the GFMS-calculated streamflow compares favorably with observations. Available Landsat-based (May 28) and MODIS-based (June 2) inundation analyses from the University of Colorado show generally good agreement with the GFMS inundation calculation in most of the area where skies were clear and the optical techniques could be applied. The GFMS provides very useful disaster response information on a timely basis. However, there is still significant room for improvement, including improved precipitation information from NASA's Global Precipitation Measurement (GPM) mission, inclusion of dam algorithms in the routing model, and integration with or assimilation of observed flood extent from satellite optical and SAR sensors.
Mao, Fei; Zhu, Xiaoming; Lu, Bin; Li, Yiming
2018-04-01
SUDOSCAN (Impeto Medical, Paris, France) has been shown to be a new and non-invasive method for detecting renal dysfunction in type 2 diabetes mellitus (T2DM) patients. In this study, we sought to compare the diabetic kidney dysfunction score (DKD-score) of SUDOSCAN with the estimated glomerular filtration rate (eGFR) using quantile regression analysis, an approach entirely different from previous studies. A total of 223 Chinese T2DM patients were enrolled in the study. SUDOSCAN, renal function tests (including blood urea nitrogen, creatinine and uric acid) and 99mTc-diethylenetriamine pentaacetic acid (99mTc-DTPA) renal dynamic imaging were performed in all T2DM patients. The DKD-score of SUDOSCAN was compared with eGFR detected by 99mTc-DTPA renal dynamic imaging through quantile regression analysis. Its validity and utility were further determined through bias and precision tests. The quantile regression analysis demonstrated that the relationship with eGFR was inverse and significant for almost all percentiles of DKD-score. The coefficients decreased as the percentile of DKD-score increased. In the validation data set, both the bias and the precision increased with eGFR (median difference, -21.2 ml/min/1.73 m2 for all individuals vs. -4.6 ml/min/1.73 m2 for eGFR between 0 and 59 ml/min/1.73 m2; interquartile range [IQR] for the difference, -25.4 ml/min/1.73 m2 vs. -14.7 ml/min/1.73 m2). The eGFR category misclassification rates were 10% in the eGFR 0-59 ml/min/1.73 m2 group, 57.3% in the 60-90 group, and 87.2% in the eGFR > 90 ml/min/1.73 m2 group. The DKD-score of SUDOSCAN could be used to detect renal dysfunction in T2DM patients. A higher prognostic value of DKD-score was detected when the eGFR level was lower.
Developing a GIS based integrated approach to flood management in Trinidad, West Indies.
Ramlal, Bheshem; Baban, Serwan M J
2008-09-01
Trinidad and Tobago is plagued with a perennial flooding problem. The higher levels of rainfall in the wet season often lead to extensive flooding in the low-lying areas of the country. This has led to significant damage to livestock, agricultural produce, homes and businesses, particularly in the Caparo River Basin. Clearly, there is a need to develop flood mitigation and management strategies for the areas most affected. This paper utilizes geographic information systems to map the extent of the flooding, estimate soil loss due to erosion, and estimate sediment loading in the rivers of the Caparo River Basin. In addition, the project required the development of a watershed management plan and a flood control plan. The results indicate that flooding was caused by several factors, including clear-cutting of vegetative cover, especially in areas of steep slopes, which led to sediment-filled rivers and narrow waterways. Other factors include poor agricultural practices and uncontrolled development in floodplains. Recommendations for managing floods in the Caparo River Basin are provided.
Quantile regression in the presence of monotone missingness with sensitivity analysis
Liu, Minzhao; Daniels, Michael J.; Perri, Michael G.
2016-01-01
In this paper, we develop methods for longitudinal quantile regression when there is monotone missingness. In particular, we propose pattern mixture models with a constraint that provides a straightforward interpretation of the marginal quantile regression parameters. Our approach allows sensitivity analysis which is an essential component in inference for incomplete data. To facilitate computation of the likelihood, we propose a novel way to obtain analytic forms for the required integrals. We conduct simulations to examine the robustness of our approach to modeling assumptions and compare its performance to competing approaches. The model is applied to data from a recent clinical trial on weight management. PMID:26041008
Quantile Regression with Censored Data
ERIC Educational Resources Information Center
Lin, Guixian
2009-01-01
The Cox proportional hazards model and the accelerated failure time model are frequently used in survival data analysis. They are powerful, yet have limitation due to their model assumptions. Quantile regression offers a semiparametric approach to model data with possible heterogeneity. It is particularly powerful for censored responses, where the…
Flood-hazard mapping in Honduras in response to Hurricane Mitch
Mastin, M.C.
2002-01-01
The devastation in Honduras due to flooding from Hurricane Mitch in 1998 prompted the U.S. Agency for International Development, through the U.S. Geological Survey, to develop a country-wide systematic approach to flood-hazard mapping and a demonstration of the method at selected sites as part of a reconstruction effort. The design discharge chosen for flood-hazard mapping was the flood with an average return interval of 50 years; this selection was based on discussions with the U.S. Agency for International Development and the Honduran Public Works and Transportation Ministry. A regression equation for estimating the 50-year flood discharge, using drainage area and annual precipitation as the explanatory variables, was developed based on data from 34 long-term gaging sites. This equation, which has a standard error of prediction of 71.3 percent, was used in a geographic information system to estimate the 50-year flood discharge at any location on any river in the country. The flood-hazard mapping method was demonstrated at 15 selected municipalities. High-resolution digital elevation models of the floodplain were obtained using an airborne laser-terrain mapping system. Field verification showed that the digital elevation models had mean errors ranging from -0.57 to 0.14 meter in the vertical dimension. From these models, water-surface elevation cross sections were obtained and used in a numerical, one-dimensional, steady-flow step-backwater model to estimate water-surface profiles corresponding to the 50-year flood discharge. From these water-surface profiles, maps of the area and depth of inundation were created at 13 of the 15 selected municipalities. At La Lima, only the area and depth of inundation corresponding to the channel capacity in the city were mapped. At Santa Rosa de Aguan, no numerical model was created; the 50-year flood and the maps of area and depth of inundation are based on the estimated 50-year storm tide.
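Regression equations of this kind usually take a power-law form in the explanatory variables. The sketch below shows how such an equation is applied once fitted; the functional form is typical of these studies, but the coefficients are placeholders, not the values fitted for Honduras, and the inputs are hypothetical.

```python
def q50_m3s(area_km2, precip_mm, c=1.0e-3, a=0.75, b=1.1):
    """Power-law regional regression of the form Q50 = c * A^a * P^b.
    Coefficients here are illustrative placeholders only."""
    return c * area_km2**a * precip_mm**b

q = q50_m3s(100.0, 1500.0)              # hypothetical basin
lo, hi = q / 1.713, q * 1.713           # crude band from a ~71.3% standard error
print(f"Q50 ~ {q:.0f} m3/s (rough band {lo:.0f} to {hi:.0f})")
```

Embedding such a function in a GIS, evaluated with the drainage area and precipitation accumulated upstream of each river cell, is what lets the discharge be estimated "at any location on any river".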
A coupled weather generator - rainfall-runoff approach on hourly time steps for flood risk analysis
NASA Astrophysics Data System (ADS)
Winter, Benjamin; Schneeberger, Klaus; Dung Nguyen, Viet; Vorogushyn, Sergiy; Huttenlau, Matthias; Merz, Bruno; Stötter, Johann
2017-04-01
The evaluation of potential monetary damage from flooding is an essential part of flood risk management. One possibility for estimating the monetary risk is to analyze long time series of observed flood events and their corresponding damages. In reality, however, only few flood events are documented. This limitation can be overcome by generating a set of synthetic, physically and spatially plausible flood events and subsequently estimating the resulting monetary damages. In the present work, a set of synthetic flood events is generated by continuous rainfall-runoff simulation in combination with a coupled weather generator and a temporal disaggregation procedure for the study area of Vorarlberg (Austria). Most flood risk studies focus on daily time steps; however, the mesoscale alpine study area is characterized by short concentration times, leading to large differences between daily mean and daily maximum discharge. Accordingly, an hourly time step is needed for the simulations. The hourly meteorological input for the rainfall-runoff model is generated in a two-step approach: a synthetic daily dataset is generated by a multivariate and multisite weather generator and subsequently disaggregated to hourly time steps with a k-Nearest-Neighbour model. Following the event generation procedure, the negative consequences of flooding are analyzed. The flood damage for each synthetic event is estimated by combining the synthetic discharge at representative points of the river network with a loss-probability relation for each community in the study area. The loss-probability relation is based on exposure and susceptibility analyses on a single-object basis (residential buildings) for certain return periods. For these impact analyses, official inundation maps of the study area are used. Finally, by analyzing the full event time series of damages, the expected annual damage, or the losses associated with a certain probability of occurrence, can be estimated for the entire study area.
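The k-Nearest-Neighbour disaggregation step borrows the observed hourly pattern of a similar day and rescales it to the synthetic daily total. A minimal sketch of that idea, with random arrays standing in for an observed hourly library and similarity measured on the daily total alone (real implementations typically also condition on season and neighbouring days):

```python
import numpy as np

def knn_disaggregate(day_total, library_daily, library_hourly, k=5, rng=None):
    """Rescale the hourly pattern of one of the k most similar library
    days to match the target daily total."""
    rng = rng or np.random.default_rng()
    idx = np.argsort(np.abs(library_daily - day_total))[:k]   # k nearest days
    pick = rng.choice(idx)                                    # sample one
    pattern = library_hourly[pick] / library_hourly[pick].sum()
    return day_total * pattern                                # 24 hourly values

rng = np.random.default_rng(3)
lib_hourly = rng.gamma(0.4, 1.0, (365, 24))   # stand-in observed hourly rain
lib_daily = lib_hourly.sum(axis=1)
print(knn_disaggregate(12.0, lib_daily, lib_hourly, rng=rng).round(2))
```

Resampling observed patterns rather than inventing them is what keeps the disaggregated series physically plausible at the sub-daily scale.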
Regional flood frequency analysis in Triveneto (Italy): climate and scale controls
NASA Astrophysics Data System (ADS)
Persiano, Simone; Castellarin, Attilio; Domeneghetti, Alessio; Brath, Armando
2016-04-01
The growing concern about the possible effects of climate change on flood frequency regime is leading authorities to review previously proposed procedures for design-flood estimation, such as national regionalization approaches. Our study focuses on the Triveneto region, a broad geographical area in North-eastern Italy consisting of the administrative regions of Trentino-Alto Adige, Veneto and Friuli-Venezia Giulia. A reference procedure for design flood estimation in Triveneto is available from the Italian NCR research project "VA.PI.", which developed a regional model using annual maximum series (AMS) of peak discharges collected up to the 1980s by the former Italian Hydrometeorological Service. We consider a very detailed AMS database that we recently compiled for ~80 catchments located in Triveneto. Our dataset includes the historical data mentioned above together with more recent data obtained from Regional Services, as well as annual maximum peak streamflows extracted from inflow series to artificial reservoirs and provided by dam managers. All ~80 study catchments are characterized in terms of several geomorphologic and climatic descriptors. The main objectives of our study are: (1) to check whether the climatic and scale controls on the flood frequency regime in Triveneto are similar to the controls recently found in Europe; (2) to verify the possible presence of trends and abrupt changes in the intensity and frequency of flood extremes by looking at changes over time in regional L-moments of annual maximum floods; (3) to assess the reliability and representativeness of the reference procedure for design flood estimation relative to flood data not included in the VA.PI. dataset (i.e. more recent data collected after the 1980s and historical data provided by dam managers); (4) to develop an updated reference procedure for design flood estimation in Triveneto using a focused-pooling approach (i.e. Region of Influence, RoI).
Do regional methods really help reduce uncertainties in flood frequency analyses?
NASA Astrophysics Data System (ADS)
Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric
2013-04-01
Flood frequency analyses are often based on continuous measured series at gauged sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, namely the substantial increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities in the merged sets. The application and comparison of four different flood frequency analysis methods in two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites, and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distribution, number of sites and records) to evaluate the extent to which the results obtained in these case studies can be generalized. The two case studies indicate that even small statistical heterogeneities, not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, the results show that incorporating information on extreme events, either historical flood events at gauged sites or estimated extremes at ungauged sites in the considered region, is an efficient way to reduce uncertainties in flood frequency studies.
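The Monte Carlo logic used to generalize such case studies can be sketched in its simplest, perfectly homogeneous form: draw many synthetic records from a known parent, fit each, and measure how the error of the design-flood estimate shrinks as the (pooled) sample grows. The parent distribution and sample sizes below are illustrative, and the sketch deliberately omits the heterogeneity that the paper shows can erode the regional gain.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
true = stats.genextreme(c=-0.1, loc=100.0, scale=40.0)   # known parent GEV
q100_true = true.ppf(0.99)                               # true 100-year flood

def atsite_q100(n):
    """Fit a GEV to one synthetic record and read off the 100-year flood."""
    sample = true.rvs(n, random_state=rng)
    c, loc, scale = stats.genextreme.fit(sample)
    return stats.genextreme.ppf(0.99, c, loc=loc, scale=scale)

for n in (20, 50, 200):        # 200 mimics a pooled homogeneous regional sample
    est = np.array([atsite_q100(n) for _ in range(100)])
    rmse = np.sqrt(np.mean((est - q100_true) ** 2))
    print(f"n={n:3d}  relative RMSE of Q100 ~ {rmse / q100_true:.2f}")
```

Repeating the experiment with deliberately mixed parents is how one quantifies the point made above: undetected heterogeneity in the pooled sample can offset the benefit of the larger record.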
Quality of life in breast cancer patients--a quantile regression analysis.
Pourhoseingholi, Mohamad Amin; Safaee, Azadeh; Moghimi-Dehkordi, Bijan; Zeighami, Bahram; Faghihzadeh, Soghrat; Tabatabaee, Hamid Reza; Pourhoseingholi, Asma
2008-01-01
Quality of life studies have an important role in health care, especially in chronic diseases, in clinical judgment and in the allocation of medical resources. Statistical tools like linear regression are widely used to assess the predictors of quality of life, but when the response is not normally distributed the results can be misleading. The aim of this study is to determine the predictors of quality of life in breast cancer patients using a quantile regression model and to compare the results to linear regression. A cross-sectional study was conducted on 119 breast cancer patients admitted and treated in the chemotherapy ward of Namazi hospital in Shiraz. We used the QLQ-C30 questionnaire to assess quality of life in these patients. A quantile regression was employed to assess the associated factors and the results were compared to linear regression. All analyses were carried out using SAS. The mean score for global health status for breast cancer patients was 64.92 ± 11.42. Linear regression showed that only grade of tumor, occupational status, menopausal status, financial difficulties and dyspnea were statistically significant. In contrast to linear regression, financial difficulties were not significant in the quantile regression analysis and dyspnea was significant only for the first quartile. In addition, emotional functioning and duration of disease statistically predicted the QOL score in the third quartile. The results demonstrate that using quantile regression leads to better interpretation and richer inference about predictors of quality of life in breast cancer patients.
Probabilistic modelling of flood events using the entropy copula
NASA Astrophysics Data System (ADS)
Li, Fan; Zheng, Qian
2016-11-01
The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible method for analyzing flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. The copula method is now widely used for constructing multivariate dependence structures; however, the copula family must be chosen before application, and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, provides a way to avoid this relatively subjective process by combining the theories of copulas and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and a comparison of accuracy with popular copulas is made. The Gibbs sampling technique is applied for trivariate flood event simulation in order to mitigate the computational difficulties of extending directly to three dimensions. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.
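The general copula workflow, simulating dependent uniforms and pushing them through fitted marginals, can be sketched with a Gaussian copula, which samples directly and so needs no Gibbs step. This is a stand-in for illustration only, not the paper's entropy copula; the correlation matrix, marginal families and parameters are all invented.

```python
import numpy as np
from scipy import stats

# Dependence among (peak, volume, duration) on the normal scale (hypothetical)
R = np.array([[1.0, 0.7, 0.5],
              [0.7, 1.0, 0.6],
              [0.5, 0.6, 1.0]])
rng = np.random.default_rng(5)
z = rng.multivariate_normal(np.zeros(3), R, size=1000)
u = stats.norm.cdf(z)                      # copula sample in (0,1)^3

# Push the uniforms through illustrative fitted marginals
peak = stats.gumbel_r.ppf(u[:, 0], loc=300.0, scale=120.0)   # m3/s
volume = stats.gamma.ppf(u[:, 1], a=2.0, scale=50.0)         # hm3
duration = stats.lognorm.ppf(u[:, 2], s=0.4, scale=5.0)      # days
events = np.column_stack([peak, volume, duration])
print(events[:3].round(1))
```

The entropy copula replaces the Gaussian dependence structure with one derived from maximum-entropy arguments, and its lack of a direct sampler is precisely why the paper resorts to Gibbs sampling.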
A method for mapping flood hazard along roads.
Kalantari, Zahra; Nickman, Alireza; Lyon, Steve W; Olofsson, Bo; Folkeson, Lennart
2014-01-15
A method was developed for estimating and mapping flood hazard probability along roads using road and catchment characteristics as physical catchment descriptors (PCDs). The method uses a Geographic Information System (GIS) to derive candidate PCDs and then identifies those PCDs that significantly predict road flooding using a statistical modelling approach. The method thus allows flood hazards to be estimated and also provides insights into the relative roles of landscape characteristics in determining road-related flood hazards. As a case study to demonstrate its utility, the method was applied to an area in western Sweden where severe road flooding had occurred during an intense rain event. The results suggest that for this case study area three categories of PCDs are useful for predicting critical spots prone to flooding along roads: i) topography, ii) soil type, and iii) land use. The main drivers among the PCDs considered were a topographical wetness index, road density in the catchment, soil properties in the catchment (mainly the amount of gravel substrate) and the local channel slope at the site of a road-stream intersection. These can be proposed as strong indicators for predicting flood probability in ungauged river basins in this region, but some care is needed in generalising the case study results, as other potential factors are also likely to influence the flood hazard probability. Overall, the proposed method represents a straightforward and consistent way to estimate flooding hazards, informing both the planning of future roadways and the maintenance of existing ones.
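A natural form for the statistical modelling step is a binary classifier of flooded versus non-flooded road sites driven by the PCDs. The sketch below uses logistic regression on synthetic descriptors; the paper does not specify this exact model, and the predictor names and coefficients are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(9)
n = 400
twi = rng.normal(8, 2, n)                 # topographic wetness index
road_density = rng.gamma(2, 0.5, n)       # km of road per km2
channel_slope = rng.gamma(2, 0.02, n)     # local slope at road-stream crossing

# Synthetic "truth": wetter, denser, flatter sites flood more often
logit = -9 + 0.8 * twi + 1.2 * road_density - 20 * channel_slope
flooded = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([twi, road_density, channel_slope])
clf = LogisticRegression().fit(X, flooded)
print(clf.coef_.round(2), clf.intercept_.round(2))
# Mapping stage: apply clf.predict_proba to the PCDs of every
# road-stream intersection to rank critical spots along the network.
```

The fitted coefficients double as the "relative roles of landscape characteristics" the abstract refers to, once the predictors are standardised.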
Fifty-year flood-inundation maps for El Progreso, Honduras
Kresch, David L.; Mastin, Mark C.; Olsen, T.D.
2002-01-01
After the devastating floods caused by Hurricane Mitch in 1998, maps of the areas and depths of the 50-year-flood inundation at 15 municipalities in Honduras were prepared as a tool for agencies involved in reconstruction and planning. This report, which is one in a series of 15, presents maps of areas in the municipality of El Progreso that would be inundated by a 50-year flood of Rio Pelo. Geographic Information System (GIS) coverages of the flood inundation are available on a computer in the municipality of El Progreso as part of the Municipal GIS project and on the Internet at the Flood Hazard Mapping Web page (http://mitchnts1.cr.usgs.gov/projects/floodhazard.html). These coverages allow users to view the flood inundation in much more detail than is possible using the maps in this report. Water-surface elevations for a 50-year flood on Rio Pelo at El Progreso were estimated using HEC-RAS, a one-dimensional, steady-flow, step-backwater computer program. The channel and floodplain cross sections used in HEC-RAS were developed from an airborne light-detection-and-ranging (LIDAR) topographic survey of the area. There are no nearby long-term stream-gaging stations on Rio Pelo; therefore, the 50-year-flood discharge for Rio Pelo, 235 cubic meters per second, was estimated using a regression equation that relates the 50-year-flood discharge to drainage area and mean annual precipitation. The drainage area and mean annual precipitation estimated for Rio Pelo at El Progreso are 47.4 square kilometers and 1,920 millimeters, respectively.
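The regression equation itself is not reproduced in the abstract; the sketch below shows only the standard log-linear (power-law) form such regional equations take, with placeholder coefficients back-fitted so the reported inputs return roughly the reported discharge:

    def q50(drainage_area_km2: float, mean_annual_precip_mm: float,
            a: float = 0.0111, b: float = 0.7, c: float = 0.96) -> float:
        """50-year flood discharge (m3/s), Q50 = a * A**b * P**c; a, b, c
        are hypothetical placeholders, not the published coefficients."""
        return a * drainage_area_km2**b * mean_annual_precip_mm**c

    print(q50(47.4, 1920))   # Rio Pelo at El Progreso: ~235 m3/s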
Fifty-year flood-inundation maps for Olanchito, Honduras
Kresch, David L.; Mastin, M.C.; Olsen, T.D.
2002-01-01
After the devastating floods caused by Hurricane Mitch in 1998, maps of the areas and depths of the 50-year-flood inundation at 15 municipalities in Honduras were prepared as a tool for agencies involved in reconstruction and planning. This report, which is one in a series of 15, presents maps of areas in the municipality of Olanchito that would be inundated by a 50-year flood of Rio Uchapa. Geographic Information System (GIS) coverages of the flood inundation are available on a computer in the municipality of Olanchito as part of the Municipal GIS project and on the Internet at the Flood Hazard Mapping Web page (http://mitchnts1.cr.usgs.gov/projects/floodhazard.html). These coverages allow users to view the flood inundation in much more detail than is possible using the maps in this report. Water-surface elevations for a 50-year-flood discharge of 243 cubic meters per second on Rio Uchapa at Olanchito were estimated using HEC-RAS, a one-dimensional, steady-flow, step-backwater computer program. The channel and floodplain cross sections used in HEC-RAS were developed from an airborne light-detection-and-ranging (LIDAR) topographic survey of the area. There are no nearby long-term stream-gaging stations on Rio Uchapa; therefore, the 50-year-flood discharge for Rio Uchapa was estimated using a regression equation that relates the 50-year-flood discharge to drainage area and mean annual precipitation. The drainage area and mean annual precipitation estimated for Rio Uchapa at Olanchito are 97.1 square kilometers and 1,178 millimeters, respectively.
NASA Astrophysics Data System (ADS)
Vittal, H.; Singh, Jitendra; Kumar, Pankaj; Karmakar, Subhankar
2015-06-01
In watershed management, flood frequency analysis (FFA) is performed to quantify the risk of flooding at different spatial locations and to provide guidelines for determining the design periods of flood control structures. Traditional FFA was extensively performed under a univariate scenario for both at-site and regional estimation of return periods. However, because of the inherent mutual dependence of the flood variables [i.e., peak flow (P), flood volume (V) and flood duration (D), which are random in nature], the analysis has been extended to the multivariate scenario, with some restrictive assumptions. To overcome the assumption that all flood variables share the same family of marginal density function, the concept of the copula has been introduced. Although the advancement from univariate to multivariate analyses drew considerable attention from the FFA research community, a basic limitation was that the analyses were performed with only parametric families of distributions. The aim of the current study is to emphasize the importance of nonparametric approaches in multivariate FFA; a nonparametric distribution may not always be a good fit or capable of replacing well-implemented multivariate parametric and copula-based applications, but its potential for providing the best fit is considerable because such distributions reproduce the sample's characteristics, resulting in more accurate estimates of the multivariate return period. Hence, the current study shows the importance of combining the multivariate nonparametric approach with multivariate parametric and copula-based approaches, resulting in a comprehensive framework for complete at-site FFA. Although the proposed framework is designed for at-site FFA, it can also be applied to regional FFA because regional estimations ideally include at-site estimations. The framework is based on the following steps: (i) comprehensive trend analysis to assess nonstationarity in the observed data; (ii) selection of the best-fit univariate marginal distribution from a comprehensive set of parametric and nonparametric distributions for the flood variables; (iii) multivariate frequency analyses with parametric, copula-based and nonparametric approaches; and (iv) estimation of joint and various conditional return periods. The proposed framework is demonstrated using 110 years of observed data from the Allegheny River at Salamanca, New York, USA. The results show that for both the univariate and multivariate cases, the nonparametric Gaussian kernel provides the best estimate. Further, we perform FFA for twenty major rivers across the continental USA; for seven of these rivers all flood variables follow the nonparametric Gaussian kernel, whereas for the other rivers parametric distributions provide the best fit for one or two flood variables. In summary, the results show that the nonparametric method cannot substitute for the parametric and copula-based approaches, but it should be considered in any at-site FFA to provide the broadest choice for the best estimation of flood return periods.
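For the univariate step, the nonparametric Gaussian kernel fit can be sketched with scipy; the record below is synthetic, and the quantile is obtained by numerically inverting the kernel CDF at the non-exceedance probability 1 - 1/T:

    import numpy as np
    from scipy import stats

    peaks = stats.genextreme.rvs(c=-0.1, loc=1000, scale=300,
                                 size=110, random_state=7)   # stand-in record
    kde = stats.gaussian_kde(peaks)

    def kde_quantile(kde, p, lo, hi, tol=1e-8):
        # Bisection on the kernel CDF, solving F(q) = p.
        while hi - lo > tol * hi:
            mid = 0.5 * (lo + hi)
            if kde.integrate_box_1d(-np.inf, mid) < p:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    for T in (10, 50, 100):
        q = kde_quantile(kde, 1 - 1 / T, peaks.min(), 5 * peaks.max())
        print(f"{T}-year peak ~ {q:.0f}")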
Flood impacts on a water distribution network
NASA Astrophysics Data System (ADS)
Arrighi, Chiara; Tarani, Fabio; Vicario, Enrico; Castelli, Fabio
2017-12-01
Floods cause damage to people, buildings and infrastructures. Water distribution systems are particularly exposed, since water treatment plants are often located next to the rivers. Failure of the system leads to both direct losses, for instance damage to equipment and pipework contamination, and indirect impact, since it may lead to service disruption and thus affect populations far from the event through the functional dependencies of the network. In this work, we present an analysis of direct and indirect damages on a drinking water supply system, considering the hazard of riverine flooding as well as the exposure and vulnerability of active system components. The method is based on interweaving, through a semi-automated GIS procedure, a flood model and an EPANET-based pipe network model with a pressure-driven demand approach, which is needed when modelling water distribution networks in highly off-design conditions. Impact measures are defined and estimated so as to quantify service outage and potential pipe contamination. The method is applied to the water supply system of the city of Florence, Italy, serving approximately 380 000 inhabitants. The evaluation of flood impact on the water distribution network is carried out for different events with assigned recurrence intervals. Vulnerable elements exposed to the flood are identified and analysed in order to estimate their residual functionality and to simulate failure scenarios. Results show that in the worst failure scenario (no residual functionality of the lifting station and a 500-year flood), 420 km of pipework would require disinfection with an estimated cost of EUR 21 million, which is about 0.5 % of the direct flood losses evaluated for buildings and contents. Moreover, if flood impacts on the water distribution network are considered, the population affected by the flood is up to 3 times the population directly flooded.
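A minimal sketch of the failure-scenario simulation, assuming the open-source WNTR package as a stand-in for the authors' EPANET-based workflow; the input file and pump name are placeholders:

    import wntr

    wn = wntr.network.WaterNetworkModel('network.inp')   # placeholder .inp file
    wn.options.hydraulic.demand_model = 'PDD'            # pressure-driven demand

    # Simulate loss of the flooded lifting station by closing its pump.
    pump = wn.get_link('LiftingStationPump')             # hypothetical link name
    pump.initial_status = wntr.network.LinkStatus.Closed

    results = wntr.sim.WNTRSimulator(wn).run_sim()
    pressure = results.node['pressure']   # low pressures indicate service outage

Pressure-driven demand matters here because, in heavily off-design conditions, a standard demand-driven formulation can report physically impossible negative pressures while still delivering full demand.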
Probabilistic Flood Mapping using Volunteered Geographical Information
NASA Astrophysics Data System (ADS)
Rivera, S. J.; Girons Lopez, M.; Seibert, J.; Minsker, B. S.
2016-12-01
Flood extent maps are widely used by decision makers and first responders to provide critical information that prevents economic impacts and the loss of human lives. These maps are usually obtained from sensor data and/or hydrologic models, which often have limited coverage in space and time. Recent developments in social media and communication technology have created a wealth of near-real-time, user-generated content during flood events in many urban areas, such as flooded locations, pictures of flooding extent and height, etc. These data could improve decision-making and response operations as events unfold. However, the integration of these data sources has been limited by the need for methods that can extract and translate the data into useful information for decision-making. This study presents an approach that uses volunteered geographic information (VGI) and non-traditional data sources (i.e., Twitter, Flickr, YouTube, and 911 and 311 calls) to generate and update flood extent maps in areas where no models and/or gauge data are operational. The approach combines Web-crawling and computer vision techniques to gather information about the location, extent, and water height of the flood from unstructured textual data, images, and videos. These estimates are then used to provide an updated flood extent map for areas surrounding the geo-coordinates of the VGI through the application of a Hydro Growing Region Algorithm (HGRA). HGRA combines hydrologic and image segmentation concepts to estimate a probabilistic flooding extent along the corresponding creeks. Results obtained for a case study in Austin, TX (i.e., the 2015 Memorial Day flood) were comparable to those obtained by a calibrated hydrologic model and had good spatial correlation with flooding extents estimated by the Federal Emergency Management Agency (FEMA).
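A minimal sketch of the region-growing idea behind HGRA (not the authors' algorithm): starting from a VGI-reported flooded cell, neighbouring DEM cells are added while their elevation stays below the inferred water level.

    import numpy as np
    from collections import deque

    def grow_flood_region(dem, seed, max_rise):
        # Add 4-connected cells whose elevation is within max_rise of the seed.
        water_level = dem[seed] + max_rise
        flooded = np.zeros(dem.shape, dtype=bool)
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            if flooded[r, c]:
                continue
            flooded[r, c] = True
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]
                        and not flooded[rr, cc] and dem[rr, cc] <= water_level):
                    queue.append((rr, cc))
        return flooded

    dem = np.random.default_rng(3).uniform(10.0, 12.0, (50, 50))   # toy DEM
    extent = grow_flood_region(dem, seed=(25, 25), max_rise=0.5)

The probabilistic extent described above can then be approximated by repeating the growth over a range of plausible water heights and averaging the resulting binary masks.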
Storm Duration and Antecedent Moisture Conditions for Flood Discharge Estimation
DOT National Transportation Integrated Search
2003-11-01
Design flows estimated by flood hydrograph simulation can be reasonably accurate or greatly in error, depending upon the modeling procedures and inputs selected. The objectives of this research project were (1) to determine which combinations of mode...
NASA Technical Reports Server (NTRS)
Ahamed, Aakash; Bolten, John; Doyle, C.; Fayne, Jessica
2016-01-01
Floods are the costliest natural disaster (United Nations 2004), causing approximately 6.8 million deaths in the twentieth century alone (Doocy et al. 2013). Worldwide economic flood damage estimates for 2012 exceed $19 billion USD (Munich Re 2013). Extended-duration floods also pose longer-term threats to food security, water, sanitation, hygiene, and community livelihoods, particularly in developing countries (Davies et al. 2014). Projections by the Intergovernmental Panel on Climate Change (IPCC) suggest that precipitation extremes, rainfall intensity, storm intensity, and variability are increasing due to climate change (IPCC 2007). Increasing hydrologic uncertainty will likely lead to unprecedented extreme flood events. As such, there is a vital need to enhance and further develop the traditional techniques used to rapidly assess flooding, and to extend analytical methods to estimate affected population and infrastructure.
Prediction of the flooding of a mining reservoir in NW Spain.
Álvarez, R; Ordóñez, A; De Miguel, E; Loredo, C
2016-12-15
Abandoned and flooded mines constitute underground reservoirs that must be managed. When pumping is stopped in a closed mine, the process of flooding should be anticipated in order to avoid environmentally undesirable or unexpected mine water discharges at the surface, particularly in populated areas. The Candín-Fondón mining reservoir in Asturias (NW Spain) has an estimated void volume of 8 million m³, and some urban areas are susceptible to flooding if the water is freely released from the lowest mine adit/pithead. A conceptual model of this reservoir was developed and the flooding process was numerically modelled in order to estimate the time that the flooding would take. Additionally, the maximum safe height for the filling of the reservoir is discussed.
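The headline output, time to flood, reduces in the simplest case to void volume divided by net inflow; the sketch below is a back-of-envelope version with an assumed inflow, not the study's calibrated numerical model:

    void_volume_m3 = 8e6            # reported void volume
    net_inflow_m3_per_day = 5_000   # hypothetical net recharge minus drainage

    years_to_fill = void_volume_m3 / net_inflow_m3_per_day / 365.25
    print(f"~{years_to_fill:.1f} years to flood at constant net inflow")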
NASA Astrophysics Data System (ADS)
Kourgialas, N. N.; Karatzas, G. P.
2014-03-01
A modeling system for the estimation of flash flood flow velocity and sediment transport is developed in this study. The system comprises three components: (a) a modeling framework based on the hydrological model HSPF, (b) the hydrodynamic module of the hydraulic model MIKE 11 (quasi-2-D), and (c) the advection-dispersion module of MIKE 11 as a sediment transport model. An important parameter in hydraulic modeling is Manning's coefficient, an indicator of channel resistance that depends directly on changes in riparian vegetation. The effect of riparian vegetation on flood propagation parameters such as water depth (inundation), discharge, flow velocity, and sediment transport load is investigated in this study. The results show that as the weed-cutting percentage increases, the flood wave depth decreases while flow discharge, velocity, and sediment transport load increase. The proposed modeling system is used to evaluate and illustrate the flood hazard for different riparian vegetation cutting scenarios. For the estimation of flood hazard, a combination of the flood propagation characteristics of water depth, flow velocity, and sediment load was used. A well-balanced selection of the most appropriate agricultural cutting practices for riparian vegetation was then performed. Ultimately, the model results obtained for the different cutting practice scenarios can be employed to design flood protection measures for flood-prone areas. The proposed methodology was applied to the downstream part of a small Mediterranean river basin in Crete, Greece.
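The role of Manning's coefficient can be made concrete with Manning's equation, V = R^(2/3) S^(1/2) / n (SI units); the hydraulic radius, slope, and roughness values below are illustrative:

    import math

    def manning_velocity(n: float, hydraulic_radius_m: float, slope: float) -> float:
        # Mean flow velocity (m/s) from Manning's equation.
        return hydraulic_radius_m ** (2 / 3) * math.sqrt(slope) / n

    # Denser riparian vegetation -> higher n -> lower velocity, as in the study.
    for n in (0.030, 0.060, 0.100):
        print(n, round(manning_velocity(n, hydraulic_radius_m=1.2, slope=0.004), 2))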
Measuring disparities across the distribution of mental health care expenditures.
Le Cook, Benjamin; Manning, Willard; Alegria, Margarita
2013-03-01
Previous mental health care disparities studies predominantly compare mean mental health care use across racial/ethnic groups, leaving policymakers with little information on disparities among those with higher levels of expenditure. The aims of this study were to identify racial/ethnic disparities among individuals at varying quantiles of mental health care expenditures, and to assess whether disparities in the upper quantiles of expenditure differ by insurance status, income, and education. Data were analyzed from a nationally representative sample of white, black, and Latino adults 18 years and older (n=83,878). The dependent variable was total mental health care expenditure. We measured disparities in any mental health care expenditures; disparities in mental health care expenditure at the 95th, 97.5th, and 99th expenditure quantiles of the full population using quantile regression; and disparities at the 50th, 75th, and 95th quantiles for positive users. In the full population, we tested interaction coefficients between race/ethnicity and income, insurance, and education levels to determine whether racial/ethnic disparities in the upper quantiles differed by income, insurance, and education. Significant Black-White and Latino-White disparities were identified in any mental health care expenditures. In the full population, moving up the quantiles of mental health care expenditures, Black-White and Latino-White disparities were reduced but remained statistically significant. No statistically significant disparities were found in analyses of positive users only. The magnitude of Black-White disparities was smaller among those enrolled in public insurance programs compared with the privately insured and uninsured in the 97.5th and 99th quantiles. Disparities persist in the upper quantiles among those in higher income categories and after excluding psychiatric inpatient and emergency department (ED) visits. Disparities exist in any mental health care and among those who use the most mental health care resources, but much of the disparity seems to be driven by lack of access. The data do not allow us to disentangle whether disparities were related to white respondents' overuse or underuse compared with minority groups. The cross-sectional data allow us to make only associational claims about the role of insurance, income, and education in disparities. With these limitations in mind, we identified a persistence of disparities in overall expenditures even among those in the highest income categories, after controlling for mental health status and observable sociodemographic characteristics. Interventions are needed to equalize resource allocation to racial/ethnic minority patients regardless of their income, with emphasis on outreach interventions to address the disparities in access that are responsible for the no/low expenditures of even those Latinos at higher levels of illness severity. Increased policy efforts are needed to reduce the gap in health insurance for Latinos and to improve outreach programs that enroll those in need into mental health care services. Future studies that conclusively disentangle overuse and appropriate use in these populations are warranted.
Hedgecock, T. Scott
2003-01-01
A two-dimensional finite-element surface-water model was used to study the effects of proposed modifications to the State Highway 203 corridor (proposed Elba Bypass/relocated U.S. Highway 84) on water-surface elevations and flow distributions during flooding in the Pea River and Whitewater Creek Basins at Elba, Coffee County, Alabama. Flooding was first simulated for the March 17, 1990, flood, using the 1990 flood-plain conditions to calibrate the model to match measured data collected by the U.S. Geological Survey and the U.S. Army Corps of Engineers after the flood. After model calibration, the effects of flooding were simulated for four scenarios: (1) floods having the 50- and 100-year recurrence intervals for the existing flood-plain, bridge, highway, and levee conditions; (2) floods having the 50- and 100-year recurrence intervals for the existing flood-plain and levee conditions with the State Highway 203 embankment and bridge removed; (3) floods having the 50- and 100-year recurrence intervals for the existing flood-plain, bridge, and highway conditions with proposed modifications (elevating) to the levee; and (4) floods having the 50- and 100-year recurrence intervals for the proposed conditions reflecting the Elba Bypass and modified levee. The simulation of floodflow for the Pea River and Whitewater Creek flood of March 17, 1990, in the study reach compared closely to flood profile data obtained after the flood. The flood of March 17, 1990, had an estimated peak discharge of 58,000 cubic feet per second at the gage (just below the confluence) and was estimated to be between a 50-year and 100-year flood event. The estimated peak discharges for Pea River and Whitewater Creek were 40,000 and 42,000 cubic feet per second, respectively. Simulation of floodflows for the 50-year flood (51,400 cubic feet per second) at the gage for existing flood-plain, bridge, highway, and levee conditions indicated that about 31 percent of the peak flow was conveyed by the State Highway 203 bridge over Whitewater Creek, approximately 12 percent overtopped the State Highway 203 embankment, and about 57 percent was conveyed by the Pea River flood plain east of State Highway 125. For this simulation, flow from Pea River (2,380 cubic feet per second) overtopped State Highway 125 and crossed over into the Whitewater Creek flood plain north of State Highway 203, creating one common flood plain. The water-surface elevation estimated at the downstream side of the State Highway 203 bridge crossing Whitewater Creek was 202.82 feet. The girders for both the State Highway 203 and U.S. Highway 84 bridges were partially submerged, but U.S. Highway 84 was not overtopped. For the 100-year flood (63,500 cubic feet per second) at the gage, the simulation indicated that about 25 percent of the peak flow was conveyed by the State Highway 203 bridge over Whitewater Creek, approximately 24 percent overtopped the State Highway 203 embankment, and about 51 percent was conveyed by the Pea River flood plain east of State Highway 125. The existing levee adjacent to Whitewater Creek was overtopped by a flow of 3,200 cubic feet per second during the 100-year flood. For this simulation, flow from Pea River (6,710 cubic feet per second) overtopped State Highway 125 and crossed over into the Whitewater Creek flood plain north of State Highway 203. The water-surface elevation estimated at the downstream side of the State Highway 203 bridge crossing Whitewater Creek was 205.60 feet. The girders for both the State Highway 203 and U.S. 
Highway 84 bridges were partially submerged, and the west end of the U.S. Highway 84 bridge was overtopped. Simulation of floodflows for the 50-year flood at the gage for existing flood-plain and levee conditions, but with the State Highway 203 embankment and bridge removed, yielded a lower water-surface elevation (202.90 feet) upstream of this bridge than that computed for the existing conditions. For the 100-year flood, the simulation indi
Shih, Ya-Chen Tina; Konrad, Thomas R
2007-10-01
Physician income is generally high but quite variable; hence, physicians have divergent perspectives regarding health policy initiatives and market reforms that could affect their incomes. We investigated factors underlying the distribution of income within the physician population. The data comprised full-time physicians (N=10,777) from the restricted version of the 1996-1997 Community Tracking Study Physician Survey (CTS-PS), the 1996 Area Resource File, and 1996 health maintenance organization penetration data. We conducted separate analyses for primary care physicians (PCPs) and specialists. We employed least squares and quantile regression models to examine factors associated with physician incomes at the mean and at various points of the income distribution, respectively. We accounted for the complex survey design of the CTS-PS data using appropriately weighted procedures and explored endogeneity using an instrumental variables method. We detected widespread and subtle effects of many variables on physician incomes at different points (10th, 25th, 75th, and 90th percentiles) of the distribution that went undetected in regression estimations focusing only on the means or medians. Our findings show that the effects of managed care penetration are demonstrable at the mean of specialist incomes but are more pronounced at higher levels. Conversely, a gender gap in earnings occurs at all income levels for both PCPs and specialists, but is more pronounced at lower income levels. The quantile regression technique offers an analytical tool for evaluating policy effects beyond the means. A longitudinal application of this approach may enable health policy makers to identify winners and losers among segments of the physician workforce and to assess how market dynamics and health policy initiatives affect the overall physician income distribution over various time intervals.
Bias correction of surface downwelling longwave and shortwave radiation for the EWEMBI dataset
NASA Astrophysics Data System (ADS)
Lange, Stefan
2018-05-01
Many meteorological forcing datasets include bias-corrected surface downwelling longwave and shortwave radiation (rlds and rsds). Methods used for such bias corrections range from multi-year monthly mean value scaling to quantile mapping at the daily timescale. An additional downscaling is necessary if the data to be corrected have a higher spatial resolution than the observational data used to determine the biases. This was the case when EartH2Observe (E2OBS; Calton et al., 2016) rlds and rsds were bias-corrected using more coarsely resolved Surface Radiation Budget (SRB; Stackhouse Jr. et al., 2011) data for the production of the meteorological forcing dataset EWEMBI (Lange, 2016). This article systematically compares various parametric quantile mapping methods designed specifically for this purpose, including those used for the production of EWEMBI rlds and rsds. The methods vary in the timescale at which they operate, in their way of accounting for physical upper radiation limits, and in their approach to bridging the spatial resolution gap between E2OBS and SRB. It is shown how temporal and spatial variability deflation related to bilinear interpolation and other deterministic downscaling approaches can be overcome by downscaling the target statistics of quantile mapping from the SRB to the E2OBS grid such that the sub-SRB-grid-scale spatial variability present in the original E2OBS data is retained. Cross validations at the daily and monthly timescales reveal that it is worthwhile to take empirical estimates of physical upper limits into account when adjusting either radiation component and that, overall, bias correction at the daily timescale is more effective than bias correction at the monthly timescale if sampling errors are taken into account.
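A generic empirical quantile mapping sketch (not the EWEMBI production code) showing the two ingredients discussed above, the mapping itself and a physical upper limit; the gamma-distributed series and the shortwave cap are assumptions:

    import numpy as np

    def quantile_map(model, obs, target):
        # Replace each target value with the observed quantile at its
        # position in the model CDF.
        qs = np.linspace(0.0, 1.0, 101)
        return np.interp(target, np.quantile(model, qs), np.quantile(obs, qs))

    rng = np.random.default_rng(0)
    obs = rng.gamma(4.0, 50.0, 10_000)     # stand-in rsds observations (W/m2)
    model = rng.gamma(4.0, 60.0, 10_000)   # biased model series

    corrected = np.clip(quantile_map(model, obs, model), 0.0, 1361.0)
    # 1361 W/m2 (the solar constant) stands in for the empirical upper limit.

The EWEMBI corrections are parametric rather than empirical and are applied per grid cell, but the CDF-matching principle is the same.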
Environmental influence on mussel (Mytilus edulis) growth - A quantile regression approach
NASA Astrophysics Data System (ADS)
Bergström, Per; Lindegarth, Mats
2016-03-01
The need for methods for the sustainable management and use of coastal ecosystems has increased in the last century. A key aspect of obtaining ecologically and economically sustainable aquaculture in threatened coastal areas is the requirement for geographic information on growth and potential production capacity. Growth varies over time and space and depends on a complex pattern of interactions between the bivalve and a diverse range of environmental factors (e.g. temperature, salinity, food availability). Understanding these processes and modelling the environmental control of bivalve growth has been central to aquaculture. In contrast to most conventional modelling techniques, quantile regression can handle cases where not all factors are measured, and it makes it possible to estimate effects at different levels of the response distribution, therefore giving a more complete picture of the relationship between environmental factors and biological response. Observations of the relationships between environmental factors and growth of the bivalve Mytilus edulis revealed relationships that varied both among levels of growth rate and across the range of environmental variables along the Swedish west coast. The strongest patterns were found for water oxygen concentration, which had a negative effect on growth at all oxygen and growth levels. However, these patterns coincided with differences in growth among periods, and very little of the remaining variability within periods could be explained, indicating that interactive processes masked the importance of the individual variables. By using quantile regression and local regression (LOESS), this study was able to provide valuable information on environmental factors influencing the growth of M. edulis and important insight for the development of ecosystem-based management tools for aquaculture, its use in mitigation efforts, and the successful management of human use of coastal areas.
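A minimal sketch of the two tools named above, quantile regression for an upper-growth bound and LOESS for the local trend, on synthetic data with the reported negative oxygen effect built in; all values are hypothetical:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.nonparametric.smoothers_lowess import lowess

    rng = np.random.default_rng(5)
    oxygen = rng.uniform(4.0, 10.0, 300)                     # mg/l, assumed
    growth = 2.0 - 0.12 * oxygen + rng.normal(0, 0.25, 300)  # synthetic rates
    df = pd.DataFrame({"oxygen": oxygen, "growth": growth})

    # Upper-quantile response: how fast can mussels grow at a given oxygen level?
    q90 = smf.quantreg("growth ~ oxygen", df).fit(q=0.9)

    # LOESS for the local (possibly nonlinear) trend of the same relationship.
    trend = lowess(df["growth"], df["oxygen"], frac=0.5)     # columns: x, fitted y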