NASA Astrophysics Data System (ADS)
Goyal, A.; Yadav, H.; Tyagi, H.; Gosain, A. K.; Khosa, R.
2017-12-01
Increased imperviousness due to rapid urbanization has changed the urban hydrological cycle. As watersheds are urbanized, infiltration and groundwater recharge decrease, and the surface runoff hydrograph shows a higher peak, indicating larger volumes of surface runoff delivered over shorter durations. The remedy is to reduce the peak of the hydrograph or to increase the retention time of surface flow. SWMM is a widely used hydrologic and hydraulic model that simulates urban storm water management and provides options for applying different flood-prevention techniques. A model was set up to simulate the surface runoff and channel flow in a small urban catchment, providing temporal and spatial information on flooding in the catchment. Incorporating detention storages in the drainage network helps achieve reduced flooding. Detention storages governed by predefined algorithms were provided for controlling pluvial flooding in urban watersheds. The algorithm, based on control theory, automated the functioning of the detention storages, ensuring that the storages become active when flooding occurs in the storm water drains and shut down when flooding is over. Detention storages can be implemented either at the source or at several downstream control points. The proposed work helps to mitigate the wastage of rainwater, achieve desirable groundwater recharge and attain a controlled urban storm water management system.
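The control-theoretic activation rule described here can be pictured as a simple on/off (hysteresis) controller. The sketch below is a minimal illustration of that idea, not the authors' implementation; the depth thresholds and time series are hypothetical placeholders.

```python
# Minimal sketch of an on/off (bang-bang) detention-storage rule of the kind
# described above; thresholds and the depth series are hypothetical.

def detention_control(drain_depth, flood_depth=1.2, reset_depth=0.8):
    """Return a gate state (1 = divert to detention, 0 = closed) per time step.

    The gate opens when depth in the storm drain exceeds `flood_depth`
    (flooding occurring) and closes once depth falls below `reset_depth`
    (flooding over). The hysteresis band avoids rapid toggling near the
    threshold.
    """
    state, states = 0, []
    for depth in drain_depth:
        if state == 0 and depth >= flood_depth:
            state = 1          # flood detected: activate detention storage
        elif state == 1 and depth <= reset_depth:
            state = 0          # flooding over: shut the storage down
        states.append(state)
    return states

# Example: depths (m) from a simulated storm hydrograph
print(detention_control([0.5, 0.9, 1.3, 1.5, 1.0, 0.7, 0.4]))
```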
Flood inundation extent mapping based on block compressed tracing
NASA Astrophysics Data System (ADS)
Shen, Dingtao; Rui, Yikang; Wang, Jiechen; Zhang, Yu; Cheng, Liang
2015-07-01
Flood inundation extent, depth, and duration are important factors affecting flood hazard evaluation. At present, flood inundation analysis is based mainly on a seeded region-growing algorithm, which is an inefficient process because it requires excessive recursive computations and is incapable of processing massive datasets. To address this problem, we propose a block compressed tracing algorithm for mapping the flood inundation extent, which reads the DEM data in blocks before transferring them to raster compression storage. This allows a smaller computer memory to process a larger amount of data, overcoming the memory limitation of the regular seeded region-growing algorithm. In addition, the use of a raster boundary tracing technique allows the algorithm to avoid the time-consuming computations required by seeded region growing. Finally, we conduct a comparative evaluation in the Chin-sha River basin; the results show that the proposed method solves the problem of flood inundation extent mapping from massive DEM datasets with higher computational efficiency than the original method, making it suitable for practical applications.
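For context, the seeded region-growing baseline that block compressed tracing improves upon amounts to a queue-based flood fill over the DEM. A minimal sketch (not the paper's code) makes the limitation concrete: the entire DEM must be held in memory and every flooded cell passes through the queue.

```python
# Sketch of the conventional seeded region-growing inundation mapping that the
# block compressed tracing algorithm improves upon. The whole DEM must sit in
# memory, which is exactly the limitation described above.
from collections import deque

import numpy as np

def grow_inundation(dem, seeds, water_level):
    """Mark cells connected to `seeds` whose elevation is below `water_level`."""
    rows, cols = dem.shape
    flooded = np.zeros_like(dem, dtype=bool)
    queue = deque(s for s in seeds if dem[s] <= water_level)
    for s in queue:
        flooded[s] = True
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and not flooded[nr, nc] and dem[nr, nc] <= water_level:
                flooded[nr, nc] = True
                queue.append((nr, nc))
    return flooded

dem = np.array([[3, 2, 4], [2, 1, 2], [4, 2, 3]], dtype=float)
print(grow_inundation(dem, seeds=[(1, 1)], water_level=2.0))
```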
Machine Learning for Flood Prediction in Google Earth Engine
NASA Astrophysics Data System (ADS)
Kuhn, C.; Tellman, B.; Max, S. A.; Schwarz, B.
2015-12-01
With the increasing availability of high-resolution satellite imagery, dynamic flood mapping in near real time is becoming a reachable goal for decision-makers. This talk describes a newly developed framework for predicting biophysical flood vulnerability using public data, cloud computing and machine learning. Our objective is to define an approach to flood inundation modeling using statistical learning methods deployed in a cloud-based computing platform. Traditionally, static flood extent maps grounded in physically based hydrologic models can require hours of human expertise to construct at significant financial cost. In addition, desktop modeling software and limited local server storage can impose constraints on the size and resolution of input datasets. Data-driven, cloud-based processing holds promise for predictive watershed modeling at a wide range of spatio-temporal scales. However, these benefits come with constraints. In particular, parallel computing limits a modeler's ability to simulate the flow of water across a landscape, rendering traditional routing algorithms unusable in this platform. Our project pushes these limits by testing the performance of two machine learning algorithms, Support Vector Machine (SVM) and Random Forests, at predicting flood extent. Constructed in Google Earth Engine, the model mines a suite of publicly available satellite imagery layers to use as algorithm inputs. Results are cross-validated using MODIS-based flood maps created using the Dartmouth Flood Observatory detection algorithm. Model uncertainty highlights the difficulty of deploying unbalanced training data sets based on rare extreme events.
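A minimal offline analogue of the two learners tested can be sketched with scikit-learn; the feature matrix and labels below are synthetic stand-ins for the Earth Engine imagery stack and the DFO-derived flood labels.

```python
# Minimal offline analogue of the two learners tested above, using synthetic
# pixel features in place of the Earth Engine imagery stack and DFO labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))            # e.g. band values / indices per pixel
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000)) > 0  # flooded?

for name, model in [("SVM", SVC(kernel="rbf")),
                    ("Random Forest", RandomForestClassifier(n_estimators=200))]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```

For rare extreme events the classes are heavily unbalanced, as the abstract notes; in practice one would also weight classes (e.g. `class_weight="balanced"`) or resample before trusting accuracy scores.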
NASA Astrophysics Data System (ADS)
Smith, B. K.; Smith, J. A.; Baeck, M. L.; Miller, A. J.
2015-03-01
A physically based model of the 14 km2 Dead Run watershed in Baltimore County, MD was created to test the impacts of detention basin storage and soil storage on the hydrologic response of a small urban watershed during flood events. The Dead Run model was created using the Gridded Surface Subsurface Hydrologic Analysis (GSSHA) algorithms and validated using U.S. Geological Survey stream gaging observations for the Dead Run watershed and 5 subbasins over the 21 largest warm-season flood events during 2008-2012. Removal of the model detention basins resulted in a median peak discharge increase of 11% and a detention efficiency of 0.5, defined as the percent decrease in peak discharge divided by the percent of area controlled by detention. Detention efficiencies generally decreased with increasing basin size. We tested the efficiency of detention basin networks by focusing on the "drainage network order," akin to the stream order but including storm drains, streams, and culverts. The detention efficiency increased dramatically between first-order detention and second-order detention but was similar for second- and third-order detention scenarios. Removal of the compacted soil layer, a common feature in urban soils, resulted in a 7% decrease in flood peak discharges. This decrease was statistically similar to the flood peak decrease caused by existing detention. Current soil storage within the Dead Run watershed decreased flood peak discharges by a median of 60%. Numerical experiment results suggested that detention basin storage and increased soil storage have the potential to substantially decrease flood peak discharges.
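The detention efficiency metric used here can be stated explicitly:

```latex
E_d = \frac{\text{percent decrease in peak discharge}}
           {\text{percent of drainage area controlled by detention}},
\qquad \text{e.g. } E_d = \frac{11\%}{22\%} = 0.5 .
```

The 22% controlled-area figure in the worked example is an inferred illustration consistent with the reported 11% and 0.5, not a number stated in the study.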
A dimension reduction method for flood compensation operation of multi-reservoir system
NASA Astrophysics Data System (ADS)
Jia, B.; Wu, S.; Fan, Z.
2017-12-01
Cooperative compensation operations of multiple reservoirs coping with uncontrolled floods play a vital role in real-time flood mitigation. This paper proposes a reservoir flood compensation operation index (ResFCOI), formed from elements of flood control storage, flood inflow volume, flood transmission time and cooperation operations period. It then establishes a flood cooperation compensation operations model of a multi-reservoir system, in which the ResFCOI determines the computational order of the reservoirs and the differential evolution algorithm computes the flood compensation optimization for each single reservoir in turn, so that a dimension reduction method is formed to reduce computational complexity. The Shiguan River Basin, with two large reservoirs and an extensive uncontrolled flood area, is used as a case study. Results show that (a) the reservoirs' flood discharges and the uncontrolled flood are superimposed at Jiangjiaji Station such that the resulting flood peak flow is as small as possible; (b) cooperation compensation operations slightly increase the usage of flood storage capacity in reservoirs compared with rule-based operations; (c) computing a cooperation compensation operations scheme takes 50 seconds on average. The dimension reduction method for guiding flood compensation operations of a multi-reservoir system allows each reservoir to adjust its flood discharge strategy dynamically according to the magnitude and pattern of the uncontrolled flood, so as to mitigate the downstream flood disaster.
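The dimension-reduction strategy, ranking the reservoirs by the index and then running a single-reservoir differential evolution for each in turn instead of one joint high-dimensional search, can be sketched as follows. SciPy's `differential_evolution` stands in for the paper's DE implementation; the objective, routing and index values are toy placeholders (a real model would add storage and water-supply constraints and flood-wave lags).

```python
# Schematic of the dimension-reduction idea described above: reservoirs are
# ordered by a ResFCOI-like index and optimized one at a time with
# differential evolution, instead of jointly over all release schedules.
import numpy as np
from scipy.optimize import differential_evolution

T = 24  # operating horizon (hours)

def peak_at_control_point(releases_by_res, uncontrolled):
    """Downstream peak after superimposing all releases (no lag, for brevity)."""
    total = uncontrolled + sum(releases_by_res.values())
    return total.max()

uncontrolled = 100 + 80 * np.exp(-((np.arange(T) - 10) / 4.0) ** 2)  # flood wave
res_index = {"A": 0.9, "B": 0.6}       # ResFCOI-like priority (placeholder)
releases = {r: np.full(T, 50.0) for r in res_index}

for r in sorted(res_index, key=res_index.get, reverse=True):
    def objective(x, r=r):
        trial = dict(releases, **{r: x})
        return peak_at_control_point(trial, uncontrolled)
    result = differential_evolution(objective, bounds=[(0.0, 50.0)] * T,
                                    maxiter=50, seed=1, polish=False)
    releases[r] = result.x             # fix this reservoir, move to the next

print("optimized downstream peak:",
      round(peak_at_control_point(releases, uncontrolled), 1))
```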
NASA Astrophysics Data System (ADS)
Zhang, Shuai; Gao, Huilin
2016-08-01
Flood mitigation in developing countries has been hindered by a lack of near real-time reservoir storage information at high temporal resolution. By leveraging satellite passive microwave observations over a reservoir and its vicinity, we present a globally applicable new algorithm to estimate reservoir storage under all-weather conditions at a 4 day time step. A weighted horizontal ratio (WHR) based on the brightness temperatures at 36.5 GHz is introduced, with its coefficients calibrated against an area training data set over each reservoir. Using a predetermined area-elevation (A-H) relationship, these coefficients are then applied to the microwave data to calculate the storage. Validation results over four reservoirs in South Asia indicate that the microwave-based storage estimations (after noise reduction) perform well (with coefficients of determination ranging from 0.41 to 0.74). This is the first time that passive microwave observations have been fused with other satellite data for quantifying the storage of individual reservoirs.
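The estimation chain can be sketched end to end: a weighted ratio of 36.5 GHz brightness temperatures maps to water area through calibrated coefficients, and the predetermined A-H relation converts area to storage. The WHR form, calibration coefficients and A-H curve below are illustrative assumptions, not the paper's calibrated values.

```python
# Sketch of the estimation chain described above: passive microwave ratio ->
# water area -> storage via a predetermined A-H relation. All functional
# forms and coefficients are assumptions for illustration only.
import numpy as np

def weighted_horizontal_ratio(tb_res, tb_land, w):
    """Hypothetical WHR form: 36.5 GHz H-pol Tb over the reservoir pixel
    against a weighted combination of surrounding land pixels."""
    return tb_res / float(np.dot(w, tb_land))

def area_from_whr(whr, a=-120.0, b=150.0):
    """Linear calibration against an area training set (coefficients assumed)."""
    return a * whr + b                                 # km^2

def storage_from_area(area_km2, c=2.0, d=0.5):
    """Storage from an assumed A-H power law h = c*A**d, using
    V = integral of A dh = c*d/(d+1) * A**(d+1); result in 10^6 m^3."""
    return c * d / (d + 1.0) * area_km2 ** (d + 1.0)

tb_land = np.array([265.0, 268.0, 262.0])              # K, neighbouring pixels
w = np.array([0.4, 0.35, 0.25])                        # calibrated weights
area = area_from_whr(weighted_horizontal_ratio(240.0, tb_land, w))
print(f"area = {area:.1f} km^2, storage = {storage_from_area(area):.0f} x 10^6 m^3")
```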
NASA Astrophysics Data System (ADS)
Zhang, J.; Lei, X.; Liu, P.; Wang, H.; Li, Z.
2017-12-01
Flood control operation of multi-reservoir systems such as parallel reservoirs and hybrid reservoirs often suffers from complex interactions and trade-offs among tributaries and the mainstream. The optimization of such systems is computationally intensive due to nonlinear storage curves, numerous constraints and complex hydraulic connections. This paper aims to derive optimal flood control operating rules based on the trade-off among tributaries and the mainstream using a new algorithm known as the weighted non-dominated sorting genetic algorithm II (WNSGA II). WNSGA II can locate the Pareto frontier in the non-dominated region efficiently due to directed searching by a weighted crowding distance, and the results are compared with those of conventional operating rules (COR) and a single-objective genetic algorithm (GA). The Xijiang River basin in China is selected as a case study, with eight reservoirs and five flood control sections within four tributaries and the mainstream. Furthermore, the effects of inflow uncertainty have been assessed. Results indicate that: (1) WNSGA II locates the non-dominated solutions faster and provides a better Pareto frontier than the traditional non-dominated sorting genetic algorithm II (NSGA II) due to the weighted crowding distance; (2) WNSGA II outperforms COR and GA on flood control in the whole basin; (3) the multi-objective operating rules from WNSGA II deal with inflow uncertainties better than COR. Therefore, WNSGA II can be used to derive stable operating rules for large-scale reservoir systems effectively and efficiently.
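The ingredient that distinguishes WNSGA II from plain NSGA II is the weighted crowding distance. One plausible form, scaling each objective's contribution to the crowding measure by a weight (the paper's exact scheme may differ), is sketched below.

```python
# Sketch of a weighted crowding distance: weighting per objective steers the
# crowding-based selection toward preferred regions of the Pareto front.
# This per-objective scaling is illustrative; the paper's scheme may differ.
import numpy as np

def weighted_crowding_distance(F, weights):
    """F: (n_points, n_objectives) objective values of one non-dominated front.
    weights: importance of each objective (larger = spread preserved more)."""
    n, m = F.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(F[:, j])
        span = F[order[-1], j] - F[order[0], j]
        dist[order[0]] = dist[order[-1]] = np.inf   # keep boundary solutions
        if span == 0:
            continue
        gaps = (F[order[2:], j] - F[order[:-2], j]) / span
        dist[order[1:-1]] += weights[j] * gaps
    return dist

F = np.array([[1.0, 9.0], [2.0, 7.0], [4.0, 4.0], [7.0, 2.0], [9.0, 1.0]])
print(weighted_crowding_distance(F, weights=np.array([0.7, 0.3])))
```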
Monitoring Reservoir Storage in South Asia from Satellite Remote Sensing
NASA Astrophysics Data System (ADS)
Zhang, S.; Gao, H.; Naz, B.
2013-12-01
Real-time reservoir storage information is essential for accurate flood monitoring and prediction in South Asia, where the fatality rate (by area) due to floods is among the highest in the world. However, South Asia is dominated by international river basins where communications among neighboring countries about reservoir storage and management are extremely limited. In this study, we use a suite of NASA satellite observations to achieve high quality estimation of reservoir storage and storage variations in near real time in South Asia. The monitoring approach employs vegetation indices from the Moderate Resolution Imaging Spectroradiometer (MODIS) 16-day 250 m MOD13Q1 product and the surface elevation data from the Geoscience Laser Altimeter System (GLAS) on board the Ice, Cloud and land Elevation Satellite (ICESat). This approach contains four steps: 1) identifying the reservoirs with ICESat GLAS overpasses and extracting the elevation data for these locations; 2) using the K-means method for water classification from MODIS and applying a novel post-classification algorithm to enhance water area estimation accuracy; 3) deriving the relationship between the MODIS water surface area and the ICESat elevation; and 4) estimating the storage of reservoirs over time based on the elevation-area relationship and the MODIS water area time series. For evaluation purposes, we compared the satellite-based reservoir storage with gauge observations for 16 reservoirs in South Asia. The storage estimates were highly correlated with observations (R = 0.92 to 0.98), with values for the normalized root mean square error (NRMSE) ranging from 8.7% to 25.2%. Using this approach, storage and storage variations were estimated for 16 South Asia reservoirs from 2000 to 2012.
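Steps 3 and 4 can be sketched directly: regress GLAS elevations on coincident MODIS areas, then accumulate storage change over the MODIS area time series with the standard trapezoidal (coeval area/height) formula. All numbers below are synthetic stand-ins.

```python
# Sketch of steps 3-4 above: fit an elevation-area relation from coincident
# MODIS areas and ICESat GLAS elevations, then accumulate storage change over
# the MODIS area time series. Numbers are synthetic stand-ins.
import numpy as np

areas_icesat = np.array([42.0, 55.0, 63.0, 71.0])      # km^2, at GLAS overpasses
elevs_icesat = np.array([311.2, 313.0, 314.1, 315.2])  # m

slope, intercept = np.polyfit(areas_icesat, elevs_icesat, 1)  # h = s*A + c

areas_modis = np.array([45.0, 52.0, 60.0, 68.0, 70.0])        # 16-day series
h = slope * areas_modis + intercept

# Incremental volume between consecutive observations: dV = (A1+A2)/2 * (h2-h1)
dv = 0.5 * (areas_modis[1:] + areas_modis[:-1]) * np.diff(h)  # km^2*m = 10^6 m^3
storage_variation = np.concatenate([[0.0], np.cumsum(dv)])
print(storage_variation)                                      # relative storage
```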
Automatic Boosted Flood Mapping from Satellite Data
NASA Technical Reports Server (NTRS)
Coltin, Brian; McMichael, Scott; Smith, Trey; Fong, Terrence
2016-01-01
Numerous algorithms have been proposed to map floods from Moderate Resolution Imaging Spectroradiometer (MODIS) imagery. However, most require human input to succeed, either to specify a threshold value or to manually annotate training data. We introduce a new algorithm based on Adaboost which effectively maps floods without any human input, allowing for a truly rapid and automatic response. The Adaboost algorithm combines multiple thresholds to achieve results comparable to state-of-the-art algorithms which do require human input. We evaluate Adaboost, as well as numerous previously proposed flood mapping algorithms, on multiple MODIS flood images, as well as on hundreds of non-flood MODIS lake images, demonstrating its effectiveness across a wide variety of conditions.
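The core idea, boosting many weak single-band thresholds into a strong flood classifier, maps directly onto AdaBoost over depth-1 decision stumps. The sketch below uses synthetic stand-ins for MODIS band values and flood labels and is not the authors' exact configuration.

```python
# Boosting single-band thresholds: each depth-1 stump is one threshold on one
# feature, and Adaboost combines many of them, as described above. Features
# and labels are synthetic stand-ins for MODIS pixels.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 4))                    # e.g. MODIS band reflectances
y = ((0.8 * X[:, 0] - 0.6 * X[:, 2]) > 0.2).astype(int)   # flooded pixel?

clf = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=1),  # each learner = 1 threshold
    n_estimators=100,
)
clf.fit(X[:1500], y[:1500])
print("held-out accuracy:", clf.score(X[1500:], y[1500:]))
```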
Wang, Mingming; Sun, Yuanxiang; Sweetapple, Chris
2017-12-15
Storage is important for flood mitigation and non-point source pollution control. However, seeking a cost-effective design scheme for storage tanks is very complex. This paper presents a two-stage optimization framework to find an optimal scheme for storage tanks using the storm water management model (SWMM). The objectives are to minimize flooding, total suspended solids (TSS) load and storage cost. The framework includes two modules: (i) the analytical module, which evaluates and ranks the flooding nodes with the analytic hierarchy process (AHP) using two indicators (flood depth and flood duration), and then obtains a preliminary scheme by calculating two efficiency indicators (flood reduction efficiency and TSS reduction efficiency); (ii) the iteration module, which obtains an optimal scheme using a generalized pattern search (GPS) method based on the preliminary scheme generated by the analytical module. The proposed approach was applied to a catchment in CZ city, China, to test its capability in choosing design alternatives, and different rainfall scenarios were considered to test its robustness. The results demonstrate that the optimization framework is feasible and that the optimization converges quickly from the preliminary scheme. The optimized scheme is better than the preliminary scheme at reducing runoff and pollutant loads for a given storage cost. The multi-objective optimization framework presented in this paper may be useful in finding the best scheme of storage tanks or low impact development (LID) controls.
NASA Astrophysics Data System (ADS)
Qu, W.; Hu, N.; Fu, J.; Lu, J.; Lu, H.; Lei, T.; Pang, Z.; Li, X.; Li, L.
2018-04-01
The economic value of the Tonle Sap Lake floodplain to Cambodia is among the highest provided to a nation by a single ecosystem around the world. The flow of the Mekong River is the primary factor affecting the Tonle Sap Lake floodplain, and the lake in turn plays a very important role in regulating downstream floods of the Mekong River. Hence, it is necessary to understand the temporal changes of lake surface and water storage and to analyse their relation with the flood processes of the Mekong River. Monthly lake surface and water storage from July 2013 to May 2014 were first monitored based on remote sensing data. The relationship between water surface area and accumulated water storage change was then established. In combination with hydrological modelling results for the Mekong River Basin, the relation between the lake's water storage and the runoff of the Mekong River was analysed. It is found that the water storage increases sharply from September to December and, after reaching its maximum in December, quickly decreases, with a drop of 38.8 billion m3 in only half a month from December to January, while it remains rather stable at a lower level in the other months. There is a two-month time lag between the maximum lake water storage and the Mekong River peak flood, which shows the lake's huge flood-regulation role for the downstream Mekong River. The results show that this remote sensing approach is feasible and reliable for quantitative monitoring of data-scarce lakes.
Natural Flood Management Plus: Scaling Up Nature Based Solutions to Larger Catchments
NASA Astrophysics Data System (ADS)
Quinn, Paul; Nicholson, Alex; Adams, Russ
2017-04-01
It has been established that networks of NFM features, such as ponds and wetlands, can have a significant effect on flood flow and pollution at local scales (less than 10 km2). However, it is much less certain that NFM and NBS can have an impact at larger scales and protect larger cities. This is especially true for recent storms in the UK, such as storm Desmond, that caused devastation across the north of England. It is possible, using observed rainfall and runoff data, to estimate the amounts of storage that would be required to impact on extreme flood events. Here we will show a toolkit that estimates the amount of storage that can be accrued through a dense network of NFM features. The analysis suggests that many hundreds of small NFM features can have a significant impact on peak flow; however, we still require more storage in order to address extreme events and to satisfy flood engineers who may propose more traditional flood defences. We will also show case studies of larger NFM features positioned on flood plains that can store significantly more flood flow. Example designs of NFM Plus features will be shown. The storage aggregation tool will then show the degree to which storing large amounts of flood flow in NFM Plus features can contribute to flood management, and estimate the likely costs. Smaller and larger NFM features, used together, can provide significant flood storage at a much lower cost than traditional schemes.
Securing mobile ad hoc networks using danger theory-based artificial immune algorithm.
Abdelhaq, Maha; Alsaqour, Raed; Abdelhaq, Shawkat
2015-01-01
A mobile ad hoc network (MANET) is a set of mobile, decentralized, and self-organizing nodes that are used in special cases, such as in the military. MANET properties render the environment of this network vulnerable to different types of attacks, including black hole, wormhole and flooding-based attacks. Flooding-based attacks are among the most dangerous, as they aim to consume all network resources and thus paralyze the functionality of the whole network. Therefore, the objective of this paper is to investigate the capability of a danger theory-based artificial immune algorithm called the mobile dendritic cell algorithm (MDCA) to detect flooding-based attacks in MANETs. The MDCA applies the dendritic cell algorithm (DCA) to secure the MANET with additional improvements. The MDCA is tested and validated using the Qualnet v7.1 simulation tool. This work also introduces a new simulation module for a flooding attack, called the resource consumption attack (RCA), using Qualnet v7.1. The results highlight the high efficiency of the MDCA in detecting RCAs in MANETs.
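A heavily simplified sketch of DCA-style signal fusion of the kind the MDCA builds on is shown below: per-node traffic statistics are mapped to PAMP, danger and safe signals, fused through a weight matrix, and a node is flagged when its mature output dominates. The weights, signal definitions and threshold are illustrative, not the MDCA's calibrated values.

```python
# Highly simplified dendritic-cell-style signal fusion for flooding detection.
# All weights and signal mappings are illustrative placeholders.
import numpy as np

# rows: csm, semi-mature, mature; columns: PAMP, danger, safe (illustrative)
W = np.array([[2.0, 1.0, 1.5],
              [0.0, 0.0, 1.0],
              [2.0, 1.0, -1.5]])

def classify_node(signal_history, threshold=0.5):
    """signal_history: (t, 3) array of [pamp, danger, safe] per sampling
    window, e.g. derived from RREQ rate, queue growth and delivery ratio."""
    outputs = signal_history @ W.T                 # (t, 3): csm, semi, mature
    mcav = np.mean(outputs[:, 2] > outputs[:, 1])  # fraction of mature contexts
    return "attacker" if mcav > threshold else "benign"

flooding_node = np.array([[0.9, 0.8, 0.1]] * 10)   # sustained flooding symptoms
normal_node = np.array([[0.05, 0.1, 0.9]] * 10)
print(classify_node(flooding_node), classify_node(normal_node))
```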
Remote-sensing-based rapid assessment of flood crop loss to support USDA flooding decision-making
NASA Astrophysics Data System (ADS)
Di, L.; Yu, G.; Yang, Z.; Hipple, J.; Shrestha, R.
2016-12-01
Floods often cause significant crop loss in the United States. Timely and objective assessment of flood-related crop loss is very important for crop monitoring and risk management in agricultural and disaster-related decision-making in USDA. Among all flood-related information, crop yield loss is particularly important: decisions on proper mitigation, relief, and monetary compensation rely on it. Currently USDA mostly relies on field surveys to obtain crop loss information and compensate farmers' loss claims. Such methods are expensive, labor intensive, and time-consuming, especially for a large flood that affects a large geographic area. Recent studies have demonstrated that Earth observation (EO) data are useful for objective, timely, accurate, and cost-effective post-flood crop loss assessment over large geographic areas. There are three stages of flood damage assessment: rapid assessment, early recovery assessment, and in-depth assessment. EO-based flood assessment methods currently rely on the time series of a vegetation index to assess yield loss. Such methods are suitable for in-depth assessment but less suitable for rapid assessment, since the after-flood vegetation index time series is not yet available. This presentation presents a new EO-based method for the rapid assessment of crop yield loss immediately after a flood event to support USDA flood decision-making. The method is based on historic records of flood severity, flood duration, flood date, crop type, EO-based before- and immediately-after-flood crop conditions, and corresponding crop yield loss. It hypothesizes that a flood of the same severity occurring at the same phenological stage of a crop will cause similar damage to the crop yield regardless of the flood year. With this hypothesis, a regression-based rapid assessment algorithm can be developed by learning from historic records of flood events and the corresponding crop yield loss. In this study, historic records of MODIS-based flood and vegetation products and USDA/NASS crop type and crop yield data are used to train the regression-based rapid assessment algorithm. Validation of the rapid assessment algorithm indicates that it can predict yield loss at 90% accuracy, which is accurate enough to support USDA on flood-related quick response and mitigation.
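The hypothesized learning step can be sketched as an ordinary regression from historic flood records (severity, duration, phenological stage, before/after crop condition) to yield loss; the data below are synthetic stand-ins for the MODIS and USDA/NASS records.

```python
# Sketch of the regression-based rapid assessment: learn yield loss from
# historic flood records, then predict for a new event immediately after
# flooding. All data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 300
severity = rng.uniform(0, 1, n)          # MODIS-based flood severity
duration = rng.uniform(1, 20, n)         # days inundated
stage = rng.uniform(0, 1, n)             # phenological stage at flood date
ndvi_drop = rng.uniform(0, 0.4, n)       # before minus immediately-after NDVI
X = np.column_stack([severity, duration, stage, ndvi_drop])

# Synthetic "truth": loss grows with severity/duration and near-peak stages
y = 20 * severity + 1.5 * duration + 30 * stage * severity \
    + 40 * ndvi_drop + rng.normal(0, 3, n)

model = LinearRegression().fit(X[:240], y[:240])
print("R^2 on held-out events:", round(model.score(X[240:], y[240:]), 2))
print("predicted % yield loss:", round(model.predict([[0.8, 10, 0.6, 0.25]])[0], 1))
```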
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2015-10-01
Physically based distributed hydrological models discretize the terrain of the whole catchment into a number of grid cells at fine resolution, assimilate different terrain data and precipitation to different cells, and are regarded to have the potential to improve the simulation and prediction of catchment hydrological processes. In the early stage, physically based distributed hydrological models were assumed to derive model parameters from the terrain properties directly, so there was no need to calibrate model parameters; unfortunately, the uncertainties associated with this parameter derivation are very high, which has impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using the PSO algorithm, to test its competence and to improve its performance; the second is to explore the possibility of improving the capability of physically based distributed hydrological models in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, an improved Particle Swarm Optimization (PSO) algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting; the improvements include adopting the linearly decreasing inertia weight strategy to change the inertia weight, and the arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments in southern China with different sizes, and the results show that the improved PSO algorithm can be used for Liuxihe model parameter optimization effectively and can largely improve the model capability in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It has also been found that the appropriate particle number and maximum evolution number of the PSO algorithm for Liuxihe model catchment flood forecasting are 20 and 30, respectively.
Lamontagne, Jonathan R.; Stedinger, Jery R.; Berenbrock, Charles; Veilleux, Andrea G.; Ferris, Justin C.; Knifong, Donna L.
2012-01-01
Flood-frequency information is important in the Central Valley region of California because of the high risk of catastrophic flooding. Most traditional flood-frequency studies focus on peak flows, but for the assessment of the adequacy of reservoirs, levees, and other flood control structures, sustained flood flow (flood-duration) frequency data are needed. This study focuses on rainfall or rain-on-snow floods, rather than the annual maximum, because rain events produce the largest floods in the region. A key to estimating flood-duration frequency is determining the regional skew for such data. Of the 50 sites used in this study to determine regional skew, 28 sites were considered to have little to no significant regulated flows, and for the 22 sites considered significantly regulated, unregulated daily flow data were synthesized by using reservoir storage changes and diversion records. The unregulated, annual maximum rainfall flood flows for selected durations (1-day, 3-day, 7-day, 15-day, and 30-day) for all 50 sites were furnished by the U.S. Army Corps of Engineers. Station skew was determined by using the expected moments algorithm program for fitting the Pearson Type 3 flood-frequency distribution to the logarithms of annual flood-duration data. Bayesian generalized least squares regression procedures used in earlier studies were modified to address problems caused by large cross-correlations among concurrent rainfall floods in California and to address the extensive censoring of low outliers at some sites, by using the new expected moments algorithm for fitting the LP3 distribution to rainfall flood-duration data. To properly account for these problems and to develop suitable regional-skew regression models and regression diagnostics, a combination of ordinary least squares, weighted least squares, and Bayesian generalized least squares regressions was adopted. This new methodology determined that a nonlinear model relating regional skew to mean basin elevation was the best model for each flood duration. The regional-skew values ranged from -0.74 for a flood duration of 1 day and a mean basin elevation less than 2,500 feet to values near 0 for a flood duration of 7 days and a mean basin elevation greater than 4,500 feet. This relation between skew and elevation reflects the interaction of snow and rain, which increases with increased elevation. The regional skews are more accurate, and the mean squared errors lower, than those of the Interagency Advisory Committee on Water Data's national skew map in Bulletin 17B.
NASA Astrophysics Data System (ADS)
Chen, Y.; Li, J.; Xu, H.
2016-01-01
Physically based distributed hydrological models (hereafter referred to as PBDHMs) divide the terrain of the whole catchment into a number of grid cells at fine resolution and assimilate different terrain data and precipitation to different cells. They are regarded to have the potential to improve the catchment hydrological process simulation and prediction capability. In the early stage, physically based distributed hydrological models are assumed to derive model parameters from the terrain properties directly, so there is no need to calibrate model parameters. However, unfortunately the uncertainties associated with this model derivation are very high, which impacted their application in flood forecasting, so parameter optimization may also be necessary. There are two main purposes for this study: the first is to propose a parameter optimization method for physically based distributed hydrological models in catchment flood forecasting by using particle swarm optimization (PSO) algorithm and to test its competence and to improve its performances; the second is to explore the possibility of improving physically based distributed hydrological model capability in catchment flood forecasting by parameter optimization. In this paper, based on the scalar concept, a general framework for parameter optimization of the PBDHMs for catchment flood forecasting is first proposed that could be used for all PBDHMs. Then, with the Liuxihe model as the study model, which is a physically based distributed hydrological model proposed for catchment flood forecasting, the improved PSO algorithm is developed for the parameter optimization of the Liuxihe model in catchment flood forecasting. The improvements include adoption of the linearly decreasing inertia weight strategy to change the inertia weight and the arccosine function strategy to adjust the acceleration coefficients. This method has been tested in two catchments in southern China with different sizes, and the results show that the improved PSO algorithm could be used for the Liuxihe model parameter optimization effectively and could improve the model capability largely in catchment flood forecasting, thus proving that parameter optimization is necessary to improve the flood forecasting capability of physically based distributed hydrological models. It also has been found that the appropriate particle number and the maximum evolution number of PSO algorithm used for the Liuxihe model catchment flood forecasting are 20 and 30 respectively.
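A compact sketch of the improved PSO follows, with the linearly decreasing inertia weight and an arccosine-shaped schedule for the acceleration coefficients (one plausible form; the paper's exact expression may differ). The particle number and iteration count match the recommended values of 20 and 30, and a sphere function stands in for the expensive hydrological-model objective.

```python
# Sketch of an improved PSO with (i) linearly decreasing inertia weight and
# (ii) arccosine-scheduled acceleration coefficients, per the description
# above. The arccosine form is one plausible choice, not the paper's exact one.
import numpy as np

def improved_pso(f, dim, n_particles=20, n_iter=30, bounds=(-5.0, 5.0),
                 w_max=0.9, w_min=0.4, c_max=2.5, c_min=0.5, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for t in range(n_iter):
        w = w_max - (w_max - w_min) * t / n_iter          # linear decrease
        s = np.arccos(2.0 * t / n_iter - 1.0) / np.pi     # 1 -> 0, arccos-shaped
        c1 = c_min + (c_max - c_min) * s                  # cognitive: shrinks
        c2 = c_min + (c_max - c_min) * (1.0 - s)          # social: grows
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best, fbest = improved_pso(lambda p: float(np.sum(p ** 2)), dim=4)
print(np.round(best, 3), round(fbest, 5))
```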
NASA Astrophysics Data System (ADS)
Kumar, S.; Kaushal, D. R.; Gosain, A. K.
2017-12-01
Urban hydrology will have an increasing role to play in the sustainability of human settlements. Expansion of urban areas brings significant changes in the physical characteristics of land use. Problems with the administration of urban flooding have their roots in the concentration of population within a relatively small area. As watersheds are urbanized, infiltration decreases and the pattern of surface runoff changes, generating high peak flows and large runoff volumes from urban areas. Conceptual rainfall-runoff models have become a foremost tool for predicting surface runoff and flood forecasting. Manual calibration is often time consuming and tedious because of the subjectivity involved, which makes an automatic approach preferable. The calibration of parameters usually includes numerous criteria for evaluating performance with respect to the observed data, and the derivation of the objective function associated with the calibration of model parameters is quite challenging. Various studies dealing with optimization methods have steered the adoption of evolution-based optimization algorithms. In this paper, a systematic comparison of two evolutionary approaches to multi-objective optimization, namely the shuffled frog leaping algorithm (SFLA) and genetic algorithms (GA), is made. SFLA is a cooperative, population-based search metaphor inspired by natural memetics, while GA is based on the principle of survival of the fittest and natural evolution. SFLA and GA have been employed for optimizing the major parameters, i.e. width, imperviousness, Manning's coefficient and depression storage, for the highly urbanized catchment of Delhi, India. The study summarizes the auto-tuning of a widely used storm water management model (SWMM) by internal coupling of SWMM with SFLA and GA separately. The values of statistical parameters such as the Nash-Sutcliffe efficiency (NSE) and percent bias (PBIAS) were found to lie within acceptable limits, indicating reasonably good model performance. Overall, this study proved promising for assessing risk in urban drainage systems and should prove useful for improving the integrity and reliability of the urban system, and it provides guidance for inundation preparedness.
Keywords: Hydrologic model, SWMM, Urbanization, SFLA and GA.
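The two goodness-of-fit statistics used for the evaluation have standard forms, sketched below; note that more than one sign convention for PBIAS is in use.

```python
# The two calibration statistics named above, in standard form; any calibrated
# SWMM run can be scored this way against observed flows.
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias; with this sign convention, positive = underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

obs = [1.2, 3.4, 8.9, 6.1, 2.0]   # observed runoff (m^3/s)
sim = [1.0, 3.9, 8.1, 5.8, 2.4]   # SWMM-simulated runoff
print(round(nse(obs, sim), 3), round(pbias(obs, sim), 2))
```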
Code of Federal Regulations, 2014 CFR
2014-04-01
... or inoperative during flood and storm events (e.g., data storage centers, generating plants...” (§ 55.2(b)(5)). When FEMA provides interim flood hazard data, such as Advisory Base Flood Elevations... data may be used as “best available information” in accordance with Executive Order 11988. However, a...
NASA Astrophysics Data System (ADS)
ShiouWei, L.
2014-12-01
Reservoirs are the most important water resources facilities in Taiwan. However, due to the steep slopes and fragile geological conditions in the mountain areas, storm events usually cause serious debris flows and floods, and the floods then flush large amounts of sediment into reservoirs. The sedimentation caused by floods has a great impact on reservoir life. Hence, how to operate a reservoir during flood events to increase the efficiency of sediment desilting, without risking reservoir safety or impacting the water supply afterwards, is a crucial issue in Taiwan. This study therefore developed a novel optimization planning model for reservoir flood operation considering flood control and sediment desilting, and proposed easy-to-use operating rules represented by decision trees. The decision-tree rules consider flood mitigation, water supply and sediment desilting. The optimal planning model computes, for each flood event, the optimal reservoir release that minimizes the water supply impact and maximizes sediment desilting without risking reservoir safety. Besides the optimal flood operation planning model, this study also proposed decision-tree-based flood operating rules trained on the optimal reservoir releases for multiple synthetic flood scenarios. The synthetic flood scenarios consist of various synthetic storm events, initial reservoir storages and target storages at the end of flood operation. Comparing the results of the decision tree operating rules (DTOR) with historical operation for Typhoon Krosa in 2007, the DTOR removed 15.4% more sediment than historical operation, with reservoir storage only 8.38×10^6 m3 less than that of historical operation. For Typhoon Jangmi in 2008, the DTOR removed 24.4% more sediment than historical operation, with reservoir storage only 7.58×10^6 m3 less. The results show that the proposed DTOR model can increase sediment desilting efficiency and extend reservoir life.
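The decision-tree rule extraction can be sketched with a standard regression tree: fit the tree to the optimal releases computed offline for many synthetic scenarios, then apply it as an easy-to-use operating rule during real events. All data below are synthetic placeholders for the optimization model's output.

```python
# Sketch of the decision-tree operating-rule idea: learn a tree that maps the
# reservoir state to the offline-optimized release. Data are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
n = 500
storage = rng.uniform(0.4, 1.0, n)        # fraction of capacity
inflow = rng.uniform(100, 3000, n)        # m^3/s
sediment = rng.uniform(0.1, 50.0, n)      # kg/m^3 (flood-borne sediment)

# Stand-in for the optimal planning model: release more when storage and
# sediment are high (desilting), less when water supply is at risk.
release = np.clip(0.6 * inflow * storage + 15 * sediment
                  + rng.normal(0, 40, n), 0, None)

rules = DecisionTreeRegressor(max_depth=4).fit(
    np.column_stack([storage, inflow, sediment]), release)
print("rule-based release (m^3/s):",
      round(rules.predict([[0.85, 2200.0, 35.0]])[0], 0))
```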
Evaluation of a physically based quasi-linear and a conceptually based nonlinear Muskingum methods
NASA Astrophysics Data System (ADS)
Perumal, Muthiah; Tayfur, Gokmen; Rao, C. Madhusudana; Gurarslan, Gurhan
2017-03-01
Two variants of the Muskingum flood routing method, formulated to account for the nonlinearity of the channel routing process, are investigated in this study. These variant methods are: (1) the three-parameter conceptual Nonlinear Muskingum (NLM) method advocated by Gill in 1978, and (2) the Variable Parameter McCarthy-Muskingum (VPMM) method recently proposed by Perumal and Price in 2013. The VPMM method does not require the rigorous calibration and validation procedures required by the NLM method, due to established relationships of its parameters with flow and channel characteristics based on hydrodynamic principles. The parameters of the conceptual nonlinear storage equation used in the NLM method were calibrated using Artificial Intelligence Application (AIA) techniques, such as the Genetic Algorithm (GA), Differential Evolution (DE), Particle Swarm Optimization (PSO) and Harmony Search (HS). The calibration was carried out on a given set of hypothetical flood events obtained by routing a given inflow hydrograph in a set of 40 km length prismatic channel reaches using the Saint-Venant (SV) equations. The validation of the calibrated NLM method was investigated using a different set of hypothetical flood hydrographs obtained in the same set of channel reaches used for the calibration studies. Both the sets of solutions obtained in the calibration and validation cases using the NLM method were compared with the corresponding solutions of the VPMM method based on some pertinent evaluation measures. The results of the study reveal that the physically based VPMM method is capable of accounting for the nonlinear characteristics of flood wave movement better than the conceptually based NLM method, which requires the use of tedious calibration and validation procedures.
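For reference, the nonlinear storage relation underlying the three-parameter NLM method, in the form commonly attributed to Gill, couples a power-law storage with the continuity equation:

```latex
S_t = K\left[\,x\,I_t + (1 - x)\,O_t\,\right]^{m},
\qquad
\frac{\mathrm{d}S}{\mathrm{d}t} = I_t - O_t ,
```

with storage coefficient K, weighting factor x and exponent m the three calibrated parameters (m = 1 recovers the linear Muskingum method); it is these K, x, m that the GA, DE, PSO and HS searches estimate.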
Natural Flood Management in context: evaluating and enhancing the impact.
NASA Astrophysics Data System (ADS)
Metcalfe, Peter; Beven, Keith; Hankin, Barry; Lamb, Rob
2016-04-01
The series of flood events in the UK throughout December 2015 have led to calls for a reappraisal of the country's approach to flood management. In parts of Cumbria, so-called "1 in 100" year floods have occurred three times in the last ten years, leading to significant infrastructure damage. Hard-engineered defences, upgraded to cope with an anticipated 20% increase in peak flows during these 1% AEP events, have been overwhelmed. It has become more widely acknowledged that unsympathetic agricultural and upland management practices, mainly since the Second World War, have led to a significant loss of storage in mid and upper catchments and of their consequent ability to retain and slow storm run-off. Natural Flood Management (NFM) is a nature-based solution for restoring this storage and flood peak attenuation through a network of small-scale features exploiting natural topography and materials. Combined with other "soft" interventions such as restoring flood plain roughness and tree-planting, NFM offers the attractive prospect of an intervention that can target both the ecological and chemical objectives of the Water Framework Directive and the resilience demanded by the Floods Directive. We developed a simple computerised physical routing model that can account for the presence of in-channel and offline features such as would be found in an NFM scheme. These add storage to the channel and floodplain and throttle the downstream discharge at storm flows. The model was applied to the heavily modified channel network of an agricultural catchment in North Yorkshire using the run-off simulated for two storm events that caused flooding downstream in the autumn of 2012. Using up to 60 online features, we demonstrated some gains in channel storage and a small impact on the flood hydrograph, which would, however, have been insufficient to prevent the downstream floods in either of the storms. Complementary research at JBA has applied their hydrodynamic model JFLOW+ to identify areas of the catchment that will naturally retain storm run-off, and quantified the effects of removing this storage on the run-off. It is suggested that enhancing the storage capacity of these areas may be a low-impact approach, in keeping with the ethos of NFM, that has a significant and quantifiable impact on storm flows.
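The effect of a single online feature in such a routing model can be sketched as throttled storage: flow above a throttle rate fills the feature (up to capacity) and is released on the recession. The hydrograph, throttle and capacity below are illustrative, not values from the study.

```python
# Minimal sketch of what one online NFM feature does in a routing model:
# discharge above a throttle rate is stored (up to capacity) and released
# on the recession. A toy hourly hydrograph stands in for simulated run-off.
import numpy as np

def route_through_feature(inflow, throttle, capacity, dt=3600.0):
    """Throttled storage: excess inflow fills the store; spill once full."""
    store, out = 0.0, []
    for q in inflow:
        if q > throttle:                       # storm peak: fill the feature
            spill = max(0.0, (q - throttle) * dt - (capacity - store))
            store = min(capacity, store + (q - throttle) * dt)
            out.append(throttle + spill / dt)
        else:                                  # recession: drain what we stored
            release = min(store / dt, throttle - q)
            store -= release * dt
            out.append(q + release)
    return np.array(out), store

inflow = np.array([1, 2, 5, 9, 12, 8, 4, 2, 1], float)  # m^3/s, hourly
outflow, left = route_through_feature(inflow, throttle=6.0, capacity=30000.0)
print("peak reduced from", inflow.max(), "to", round(outflow.max(), 2), "m^3/s")
```

Once the feature fills mid-storm the excess spills and the remaining peak passes untouched, which mirrors the finding above that 60 small online features were insufficient for the 2012 events.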
Peak reduction for commercial buildings using energy storage
NASA Astrophysics Data System (ADS)
Chua, K. H.; Lim, Y. S.; Morris, S.
2017-11-01
Battery-based energy storage has emerged as a cost-effective solution for peak reduction due to declining battery prices. In this study, a battery-based energy storage system is developed and implemented to achieve an optimal peak reduction for commercial customers given the limited energy capacity of the storage. The energy storage system is formed by three bi-directional power converters rated at 5 kVA and a battery bank with a capacity of 64 kWh. Three control algorithms, namely fixed-threshold, adaptive-threshold, and fuzzy-based control algorithms, have been developed and implemented in the energy storage system of a campus building. The control algorithms are evaluated and compared under different load conditions. The overall experimental results show that the fuzzy-based controller is the most effective of the three at peak reduction. The fuzzy-based control algorithm is capable of incorporating a priori qualitative knowledge and expertise about the load characteristics of the building as well as the usable energy, without over-discharging the batteries.
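The simplest of the three controllers can be sketched directly; the converter and battery ratings follow the prototype (3 × 5 kVA, 64 kWh), while the demand threshold and profile are assumed.

```python
# Sketch of a fixed-threshold peak-shaving rule: discharge whenever demand
# exceeds the threshold, recharge below it, subject to power/energy limits.
# Ratings follow the prototype above; threshold and demand are assumed.
def fixed_threshold_control(demand_kw, threshold_kw=80.0, p_max_kw=15.0,
                            e_max_kwh=64.0, dt_h=0.5, soc0_kwh=64.0):
    soc, net = soc0_kwh, []
    for d in demand_kw:
        if d > threshold_kw:                       # peak: discharge to shave
            p = min(p_max_kw, d - threshold_kw, soc / dt_h)
            soc -= p * dt_h
            net.append(d - p)
        else:                                      # off-peak: recharge
            p = min(p_max_kw, threshold_kw - d, (e_max_kwh - soc) / dt_h)
            soc += p * dt_h
            net.append(d + p)
    return net, soc

demand = [60, 70, 95, 110, 90, 75, 65]             # kW, half-hourly
net, soc = fixed_threshold_control(demand)
print([round(x, 1) for x in net], "final SoC:", round(soc, 1), "kWh")
```

The residual peak in the output illustrates the fixed threshold's weakness: once the power or energy limit binds, the peak survives, which is what the adaptive and fuzzy controllers are designed to manage more strategically.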
76 FR 17019 - List of Approved Spent Fuel Storage Casks: HI-STORM Flood/Wind Addition
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-28
... Storage Casks: HI-STORM Flood/Wind Addition AGENCY: Nuclear Regulatory Commission. ACTION: Direct final... regulations to add the HI-STORM Flood/Wind cask system to the ``List of Approved Spent Fuel Storage Casks... cask designs. Discussion This rule will add the Holtec HI-STORM Flood/Wind (FW) cask system to the list...
33 CFR 208.82 - Hetch Hetchy, Cherry Valley, and Don Pedro Dams and Reservoirs.
Code of Federal Regulations, 2012 CFR
2012-07-01
... flood control all as follows: (a) Storage space in Don Pedro Reservoir shall be kept available for flood-control purposes in accordance with the Flood-Control Storage Reservation Diagram currently in force for... section. The Flood-Control Storage Reservation Diagram in force as of the promulgation of this section is...
33 CFR 208.82 - Hetch Hetchy, Cherry Valley, and Don Pedro Dams and Reservoirs.
Code of Federal Regulations, 2013 CFR
2013-07-01
... flood control all as follows: (a) Storage space in Don Pedro Reservoir shall be kept available for flood-control purposes in accordance with the Flood-Control Storage Reservation Diagram currently in force for... section. The Flood-Control Storage Reservation Diagram in force as of the promulgation of this section is...
NASA Astrophysics Data System (ADS)
Lucey, J.; Reager, J. T., II; Lopez, S. R.
2017-12-01
Floods cause numerous weather-related fatalities and substantial financial losses every year. According to NOAA and FEMA, there were 43 deaths and 18 billion dollars paid out in flood insurance policies during 2005. The goal of this work is to improve flood prediction and flood risk assessment by creating a general model of the predictability of extreme runoff generation using various NASA products. Using satellite-based flood inundation observations, we can relate surface water formation processes to changes in other hydrological variables, such as precipitation, storage and soil moisture, and understand how the runoff generation response to these forcings is modulated by local topography and land cover. Since a flood event causes an abnormal increase in surface water, we examine these underlying physical relationships in comparison with the Dartmouth Flood Observatory archive of historic flood events globally. Using terrestrial water storage observations (GRACE), precipitation (TRMM or GPCP), land use (MODIS), elevation (SRTM) and surface inundation levels (SWAMPS), an assessment of geological and climate conditions can be performed for any location around the world. This project utilizes multiple linear regression analysis evaluating the relationship between surface water inundation, total water storage anomalies and precipitation values, grouped by average slope or land use, to determine their statistical relationships and influences on inundation data. This research demonstrates the potential benefits of using global data products for early flood prediction and will improve our understanding of runoff generation processes.
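The grouped regression described here can be sketched with synthetic stand-ins for the satellite products: for each terrain class, inundation is regressed on storage anomaly and precipitation, and the fitted coefficients show how the run-off generation response varies with slope.

```python
# Sketch of the grouped multiple linear regression described above: for each
# slope class, regress inundation on storage anomaly and precipitation.
# Synthetic data replace the GRACE/TRMM/SWAMPS/SRTM products.
import numpy as np

rng = np.random.default_rng(11)
n = 400
slope_class = rng.integers(0, 3, n)            # 0 flat, 1 moderate, 2 steep
tws = rng.normal(0, 5, n)                      # storage anomaly (cm)
precip = rng.gamma(2.0, 20.0, n)               # mm/month
# Synthetic truth: flat terrain inundates more readily per unit forcing
beta = {0: (0.8, 0.05), 1: (0.4, 0.03), 2: (0.1, 0.01)}
inund = np.array([beta[c][0] * s + beta[c][1] * p
                  for c, s, p in zip(slope_class, tws, precip)]) \
        + rng.normal(0, 0.5, n)

for c in range(3):
    m = slope_class == c
    X = np.column_stack([np.ones(m.sum()), tws[m], precip[m]])
    coef, *_ = np.linalg.lstsq(X, inund[m], rcond=None)
    print(f"slope class {c}: intercept={coef[0]:.2f}, "
          f"b_storage={coef[1]:.2f}, b_precip={coef[2]:.3f}")
```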
NASA Astrophysics Data System (ADS)
Wright, N.
2015-12-01
Hydrologic restoration in urban creeks is increasingly regarded as a more sustainable option than traditional grey infrastructure in many countries including the UK and USA. Hydrologic restoration aims to recreate naturally oriented hydro-morphodynamic processes while adding ecological and amenity value to a river corridor. Nevertheless, the long-term hydraulic performance of river restorations is incompletely understood. The aim of this research was to investigate the long-term effects of river restoration on the water storage, flood attenuation and sediment dynamics of two urban creeks through detailed hydro-morphodynamic modelling. The first case study is based on Johnson Creek in Portland, Oregon, USA, and the second on the Ouseburn River in Newcastle upon Tyne, N.E. England. This study focuses on the downstream reach of Johnson Creek, where the creek is reconnected to the restored East Lents floodplain of 0.28 km2. To offset the increased urban runoff in the Ouseburn catchment, a number of attenuation ponds were implemented along the river. In this study, an integrated 1D and 2D flood model (ISIS - TUFLOW) and the recently updated layer-based hydro-morphodynamic model have been used to understand the long-term impacts of these restorations on flood and sediment dynamics. Event-based simulations (500-year, 100-year, 50-year, 10-year and 5-year), as well as continuous simulations based on historical flow datasets, were systematically undertaken. Simulation results showed that the flood storage created by river restoration attenuates the flood peak by up to 25% downstream. Results also indicated that about 30% of the sediment generated upstream was deposited in the restored regions. The spatial distribution and amount of short- and long-term sediment deposition on the floodplain and pond are demonstrated, and the resulting potential loss of flood storage capacity is analysed and discussed.
NASA Astrophysics Data System (ADS)
Li, Yuanbo; Cui, Xiaoqian; Wang, Hongbei; Zhao, Mengge; Ding, Hongbin
2017-10-01
Digital speckle pattern interferometry (DSPI) can diagnose topography evolution in a real-time, continuous and non-destructive manner, and has been considered one of the most promising techniques for Plasma-Facing Components (PFCs) topography diagnostics under the complicated environment of a tokamak. It is important for digital speckle pattern interferometry to enhance speckle patterns and obtain the real topography of an ablated crater. In this paper, two kinds of numerical models based on the flood-fill algorithm have been developed to obtain the real profile by unwrapping the wrapped phase of the speckle interference pattern, which can be calculated from four intensity images by means of the 4-step phase-shifting technique. During phase unwrapping by means of the flood-fill algorithm, noise pollution and other inevitable factors lead to poor quality of the reconstruction results, which affects the authenticity of the restored topography. The calculation of quality parameters was therefore introduced to obtain a quality map from the wrapped phase map, and this work presents two different methods to calculate the quality parameters. The quality parameters are then used to guide the path of the flood-fill algorithm, and pixels with good quality parameters are given priority in the calculation, so that the quality of the speckle interference pattern reconstruction results is improved. A comparison between the flood-fill algorithm suited to speckle pattern interferometry and the quality-guided flood-fill algorithm (with the two different calculation approaches) shows that the errors caused by noise pollution and fringe discontinuities were successfully reduced.
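The two ingredients can be sketched compactly: the 4-step phase-shifting formula that recovers the wrapped phase from four intensity frames, and a quality-guided flood fill that unwraps high-quality pixels first via a priority queue. The quality measure and demo data are illustrative; the paper's two quality-parameter calculations are not reproduced here.

```python
# (i) 4-step phase shifting, and (ii) a quality-guided flood fill that
# unwraps the highest-quality candidate pixel first via a priority queue.
import heapq

import numpy as np

def wrapped_phase(i1, i2, i3, i4):
    return np.arctan2(i4 - i2, i1 - i3)          # 4-step phase shifting

def quality_guided_unwrap(phi, quality, seed):
    unwrapped = np.full(phi.shape, np.nan)
    unwrapped[seed] = phi[seed]
    heap = []

    def push_neighbors(r, c):
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < phi.shape[0] and 0 <= nc < phi.shape[1] \
                    and np.isnan(unwrapped[nr, nc]):
                heapq.heappush(heap, (-quality[nr, nc], (nr, nc), (r, c)))

    push_neighbors(*seed)
    while heap:
        _, (r, c), (pr, pc) = heapq.heappop(heap)
        if not np.isnan(unwrapped[r, c]):
            continue                                 # already unwrapped
        d = phi[r, c] - phi[pr, pc]
        d -= 2 * np.pi * np.round(d / (2 * np.pi))   # remove 2*pi jumps
        unwrapped[r, c] = unwrapped[pr, pc] + d
        push_neighbors(r, c)
    return unwrapped

# Demo on a synthetic phase ramp: four phase-shifted frames -> wrapped phase
y, x = np.mgrid[0:32, 0:32]
true_phase = 0.35 * x
frames = [2.0 + np.cos(true_phase + k * np.pi / 2) for k in range(4)]
phi = wrapped_phase(*frames)
q = np.ones_like(phi)                                # uniform quality for demo
rec = quality_guided_unwrap(phi, q, seed=(16, 16))
print(np.allclose(rec - rec[16, 16], true_phase - true_phase[16, 16]))
```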
76 FR 33121 - List of Approved Spent Fuel Storage Casks: HI-STORM Flood/Wind Addition
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-08
... Storage Casks: HI-STORM Flood/Wind Addition AGENCY: Nuclear Regulatory Commission. ACTION: Direct final... regulations to add the Holtec HI-STORM Flood/Wind cask system to the ``List of Approved Spent Fuel Storage... Title 10 of the Code of Federal Regulations Section 72.214 to add the Holtec HI- STORM Flood/Wind cask...
NASA Astrophysics Data System (ADS)
Matgen, Patrick; Giustarini, Laura; Hostache, Renaud
2012-10-01
This paper introduces an automatic flood mapping application that is hosted on the Grid Processing on Demand (GPOD) Fast Access to Imagery (Faire) environment of the European Space Agency. The main objective of the online application is to deliver flooded areas operationally using both recent and historical acquisitions of SAR data. Having as a short-term target the flooding-related exploitation of data generated by the upcoming ESA SENTINEL-1 SAR mission, the flood mapping application consists of two building blocks: i) a set of query tools for selecting the "crisis image" and the optimal corresponding "reference image" from the G-POD archive and ii) an algorithm for extracting flooded areas via change detection using the previously selected "crisis image" and "reference image". Stakeholders in flood management and service providers are able to log onto the flood mapping application to get support for the retrieval, from the rolling archive, of the most appropriate reference image. Potential users will also be able to apply the implemented flood delineation algorithm. The latter combines histogram thresholding, region growing and change detection as an approach enabling automatic, objective and reliable flood extent extraction from SAR images. Both algorithms are computationally efficient and operate with minimum data requirements. The case study of the high-magnitude flooding event that occurred in July 2007 on the Severn River, UK, and that was observed with a moderate-resolution SAR sensor as well as airborne photography, highlights the performance of the proposed online application. The flood mapping application on G-POD can be used sporadically, i.e. whenever a major flood event occurs and there is a demand for SAR-based flood extent maps. In the long term, a potential extension of the application could consist in systematically extracting flooded areas from all SAR images acquired on a daily, weekly or monthly basis.
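The delineation logic, low backscatter in the crisis image combined with change detection against the reference image, can be sketched in a few lines; in the real application the thresholds come from histogram analysis (followed by region growing) rather than the assumed constants below.

```python
# Compact sketch of SAR flood delineation by change detection: keep pixels
# that look like open water now AND darkened relative to the reference image,
# so permanent water drops out of the map. Thresholds are assumed constants.
import numpy as np

def flood_extent(crisis_db, reference_db, water_thresh=-15.0, change_thresh=-3.0):
    dark = crisis_db < water_thresh                        # water-like now
    darkened = (crisis_db - reference_db) < change_thresh  # new since reference
    return dark & darkened

rng = np.random.default_rng(5)
reference = rng.normal(-8, 2, (4, 4))                      # backscatter (dB)
crisis = reference.copy()
crisis[1:3, 1:3] = -18.0                                   # newly flooded patch
print(flood_extent(crisis, reference))
```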
STREET SURFACE STORAGE FOR CONTROL OF COMBINED SEWER SURCHARGE
One type of Best Management Practices (BMPs) available is the use of street storage systems to prevent combined sewer surcharging and to mitigate basement flooding. A case study approach, based primarily on two largely implemented street storage systems, will be used to explain ...
The use of Natural Flood Management to mitigate local flooding in the rural landscape
NASA Astrophysics Data System (ADS)
Wilkinson, Mark; Quinn, Paul; Ghimire, Sohan; Nicholson, Alex; Addy, Steve
2014-05-01
The past decade has seen increases in the occurrence of flood events across Europe, putting a growing number of settlements of varying sizes at risk. The issue of flooding in smaller villages is usually not well publicised. In these small communities, the cost of constructing and maintaining traditional flood defences often outweigh the potential benefits, which has led to a growing quest for more cost effective and sustainable approaches. Here we aim to provide such an approach that alongside flood risk reduction, also has multipurpose benefits of sediment control, water quality amelioration, and habitat creation. Natural flood management (NFM) aims to reduce flooding by working with natural features and characteristics to slow down or temporarily store flood waters. NFM measures include dynamic water storage ponds and wetlands, interception bunds, channel restoration and instream wood placement, and increasing soil infiltration through soil management and tree planting. Based on integrated monitoring and modelling studies, we demonstrate the potential to manage runoff locally using NFM in rural systems by effectively managing flow pathways (hill slopes and small channels) and by exploiting floodplains and buffers strips. Case studies from across the UK show that temporary storage ponds (ranging from 300 to 3000m3) and other NFM measures can reduce peak flows in small catchments (5 to 10 km2) by up to 15 to 30 percent. In addition, increasing the overall effective storage capacity by a network of NFM measures was found to be most effective for total reduction of local flood peaks. Hydraulic modelling has shown that the positioning of such features within the catchment, and how they are connected to the main channel, may also affect their effectiveness. Field evidence has shown that these ponds can collect significant accumulations of fine sediment during flood events. On the other hand, measures such as wetlands could also play an important role during low flow conditions, by providing base flows during drought conditions. Ongoing research using hydrological datasets aims to assess how these features function during low flow conditions and how storage ponds could be used as irrigation ponds in arable areas. To allow for effective implementation and upkeep of NFM measures on the ground, demonstration sites have been developed through a process of iterative stakeholder engagement. Coupled with the use of novel visualisation techniques, results are currently being communicated to a wider community of local landowners and catchment managers. The approach of using networks of interception bunds and offline storage areas in the rural landscape could potentially provide a cost effective means to reduce flood risk in small responsive catchments across Europe. As such it could provide an alternative or addition to traditional engineering techniques, while also effectively managing catchments to achieve multiple environmental objectives.
Predicting Coastal Flood Severity using Random Forest Algorithm
NASA Astrophysics Data System (ADS)
Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.
2017-12-01
Coastal floods have become more common recently and are predicted to further increase in frequency and severity due to sea level rise. Predicting floods in coastal cities can be difficult due to the number of environmental and geographic factors which can influence flooding events. Built stormwater infrastructure and irregular urban landscapes add further complexity. This paper demonstrates the use of machine learning algorithms in predicting street flood occurrence in an urban coastal setting. The model is trained and evaluated using data from Norfolk, Virginia USA from September 2010 - October 2016. Rainfall, tide levels, water table levels, and wind conditions are used as input variables. Street flooding reports made by city workers after named and unnamed storm events, ranging from 1-159 reports per event, are the model output. Results show that Random Forest provides predictive power in estimating the number of flood occurrences given a set of environmental conditions with an out-of-bag root mean squared error of 4.3 flood reports and a mean absolute error of 0.82 flood reports. The Random Forest algorithm performed much better than Poisson regression. From the Random Forest model, total daily rainfall was by far the most important factor in flood occurrence prediction, followed by daily low tide and daily higher high tide. The model demonstrated here could be used to predict flood severity based on forecast rainfall and tide conditions and could be further enhanced using more complete street flooding data for model training.
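The reported setup translates naturally into a Random Forest regressor scored with the out-of-bag estimate; the features and flood-report counts below are synthetic stand-ins for the Norfolk records.

```python
# Sketch of the reported setup: a Random Forest regressor on rainfall, tide
# and wind features predicting flood-report counts, scored out-of-bag.
# All data are synthetic stand-ins for the Norfolk records.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2016)
n = 200                                            # storm events
rain = rng.gamma(2.0, 15.0, n)                     # total daily rainfall (mm)
low_tide = rng.normal(0.3, 0.2, n)                 # daily low tide (m)
high_tide = low_tide + rng.uniform(0.5, 1.5, n)    # daily higher high tide (m)
wind = rng.uniform(0, 20, n)                       # wind speed (m/s)
X = np.column_stack([rain, low_tide, high_tide, wind])
y = np.maximum(0, 0.08 * rain + 20 * low_tide + 5 * high_tide
               + rng.normal(0, 3, n)).round()      # flood reports per event

rf = RandomForestRegressor(n_estimators=500, oob_score=True).fit(X, y)
print("OOB R^2:", round(rf.oob_score_, 2))
print("feature importances:", np.round(rf.feature_importances_, 2))
```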
NASA Astrophysics Data System (ADS)
Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.
2016-12-01
There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increase. Risk models from insurance companies and flood models from hydrologists must be calibrated based on flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, to improve these models, both traditional methods of flood prediction from physically based models and data-driven techniques, such as machine learning, require spatial flood observations to validate model outputs and quantify uncertainty. A key dataset that is missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO), which has catalogued 4320 floods (1985-2015) but has mapped only 5% of them. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7, and 8 imagery in Google Earth Engine to map discrete flood events. The created database will be available in the Earth Engine Catalogue for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
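For illustration, a per-pixel water-detection sketch in the Earth Engine Python API is given below. It requires an authenticated Earth Engine account, and the dataset ID, band choice, and threshold are assumptions made for the example; the actual DFO detection algorithm uses its own band ratios and cleanup steps.

```python
# Sketch: composite a week of MODIS surface reflectance and flag pixels where
# a simple (red - NIR)/(red + NIR) index indicates open water (NIR is strongly
# absorbed by water). Dataset ID and threshold are illustrative assumptions.
import ee

ee.Initialize()

composite = (ee.ImageCollection("MODIS/061/MOD09GA")
             .filterDate("2015-08-01", "2015-08-08")
             .median())

# sur_refl_b01 = red, sur_refl_b02 = NIR in MOD09GA.
water_index = composite.normalizedDifference(["sur_refl_b01", "sur_refl_b02"])
water_mask = water_index.gt(0.0).selfMask()

# Inspect the result (in practice: export it or add it to an interactive map).
print(water_mask.getInfo()["bands"])
```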
NASA Astrophysics Data System (ADS)
Brakenridge, G. R.; Birkett, C. M.
2013-12-01
Presently operating satellite-based radar altimeters have the ability to monitor variations in surface water height for large lakes and reservoirs, and future sensors will expand observational capabilities to many smaller water bodies. Such remote sensing provides objective, independent information where in situ data are lacking or access is restricted. A USDA/NASA (http://www.pecad.fas.usda.gov/cropexplorer/global_reservoir/) program is performing operational altimetric monitoring of the largest lakes and reservoirs around the world using data from the NASA/CNES, NRL, and ESA missions. Public lake-level products from the Global Reservoir and Lake Monitor (GRLM) are a combination of archived and near real time information. The USDA/FAS utilizes the products for assessing international irrigation potential and for crop production estimates; other end-users study climate trends, observe anthropogenic effects, and/or are involved in other water resources management and regional water security issues. At the same time, the Dartmouth Flood Observatory (http://floodobservatory.colorado.edu/), its NASA GSFC partners (http://oas.gsfc.nasa.gov/floodmap/home.html), and associated MODIS data and automated processing algorithms are providing public access to a growing GIS record of the Earth's changing surface water extent, including changes related to floods and droughts. The Observatory's web site also provides both archival and near real time information, and is based mainly on the highest spatial resolution (250 m) MODIS bands. Therefore, it is now possible to provide reservoir and lake storage change measurements on an international basis entirely from remote sensing, updated frequently. The volume change values are based on standard numerical procedures used for many decades for analysis of coeval lake area and height data. We provide first results of this combination, including prototype displays for public access and data retrieval of water storage volume changes. Ground-based data can, in some cases, test the remote sensing accuracy and precision. Data accuracy requirements vary for different applications: reservoir management for flood control, agriculture, or power generation may need more accurate and timely information than (for example) regional assessments of water and food security issues. Thus, the long-term goal for the hydrological sciences community should be to efficiently mesh both types of information with as extensive geographic coverage as possible.
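The "standard numerical procedures" for combining coeval area and height observations reduce, at their simplest, to integrating area over height. A minimal trapezoidal sketch with illustrative numbers:

```python
# Storage change from paired (height, area) observations: each increment of
# volume is the mean of the two coeval areas times the height change.
import numpy as np

heights_m = np.array([101.2, 101.8, 102.5, 103.1])   # altimetric lake levels
areas_km2 = np.array([850.0, 872.0, 901.0, 930.0])   # coeval MODIS-derived areas

areas_m2 = areas_km2 * 1e6
dv = 0.5 * (areas_m2[1:] + areas_m2[:-1]) * np.diff(heights_m)  # trapezoids
volume_change_km3 = np.concatenate([[0.0], np.cumsum(dv)]) / 1e9

print(volume_change_km3)   # storage change relative to the first observation
```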
NASA Astrophysics Data System (ADS)
Hostache, Renaud; Chini, Marco; Matgen, Patrick; Giustarini, Laura
2013-04-01
There is a clear need for developing innovative processing chains based on earth observation (EO) data to generate products supporting emergency response and flood management at a global scale. Here an automatic flood mapping application is introduced, currently hosted on the Grid Processing on Demand (G-POD) Fast Access to Imagery (Faire) environment of the European Space Agency. The main objective of the online application is to deliver flooded areas using both recent and historical acquisitions of SAR data in an operational framework. It is worth mentioning that the method can be applied to both medium and high resolution SAR images. The flood mapping application consists of two main blocks: 1) a set of query tools for selecting the "crisis image" and the optimal corresponding pre-flood "reference image" from the G-POD archive; 2) an algorithm for extracting flooded areas using the previously selected "crisis image" and "reference image". The proposed method is a hybrid methodology combining histogram thresholding, region growing and change detection, enabling automatic, objective and reliable flood extent extraction from SAR images. The method is based on the calibration of a statistical distribution of "open water" backscatter values inferred from SAR images of floods. Change detection with respect to a pre-flood reference image helps reduce over-detection of inundated areas. The algorithms are computationally efficient and operate with minimum data requirements, considering as input data a flood image and a reference image. Stakeholders in flood management and service providers are able to log onto the flood mapping application to get support for the retrieval, from the rolling archive, of the most appropriate pre-flood reference image. Potential users will also be able to apply the implemented flood delineation algorithm. Case studies of several recent high magnitude flooding events (e.g., the July 2007 Severn River flood, UK, and the March 2010 Red River flood, US) observed by high-resolution SAR sensors as well as airborne photography highlight advantages and limitations of the online application. A mid-term target is the exploitation of ESA SENTINEL 1 SAR data streams. In the long term, a potential extension of the application is foreseen for systematically extracting flooded areas from all SAR images acquired on a daily, weekly or monthly basis. On-going research activities investigate the usefulness of the method for mapping flood hazard at global scale using databases of historic SAR remote sensing-derived flood inundation maps.
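The hybrid chain (thresholding, region growing, change detection) can be caricatured in a few lines. The sketch below is a simplified stand-in rather than the G-POD implementation: region growing is approximated by keeping connected components of a looser threshold that touch a confident open-water seed, and all dB thresholds are assumed values.

```python
# Toy hybrid flood extraction from a crisis/reference SAR image pair (dB).
import numpy as np
from scipy import ndimage

def map_flood(crisis_db, reference_db,
              seed_thresh=-18.0, grow_thresh=-15.0, min_drop=3.0):
    seeds = crisis_db < seed_thresh                  # confident open water
    candidates = crisis_db < grow_thresh             # looser growing mask
    labels, _ = ndimage.label(candidates)            # connected components
    keep = np.unique(labels[seeds])                  # components with a seed
    grown = np.isin(labels, keep[keep > 0])
    dropped = (reference_db - crisis_db) > min_drop  # change detection
    return grown & dropped

rng = np.random.default_rng(0)
reference = rng.normal(-8.0, 2.0, (200, 200))        # land backscatter
crisis = reference.copy()
crisis[60:120, 40:160] = rng.normal(-20.0, 1.5, (60, 120))  # synthetic flood
print(map_flood(crisis, reference).sum(), "flooded pixels")
```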
Nature-based flood risk management - challenges in implementing catchment-wide management concepts
NASA Astrophysics Data System (ADS)
Thaler, Thomas; Fuchs, Sven
2017-04-01
Traditionally, flood risk management focused on coping with the flow at a given point by, for example, building dikes or straightening the watercourse. Increasingly, the emphasis has shifted to measures within the flood plain that delay the flow through storage; the constraints imposed by the behaviour of the catchment at a given point are thereby relocated upstream by human intervention. Therefore, the implementation of flood storage and the use of natural retention areas are promoted as mitigation measures to support sustainable flood risk management, aiming to reduce flood flows on the floodplain by increasing retention upstream. However, beyond the simple change of practices, it is often a question of land use change which is at stake in water management. As such, it poses the question of how to govern both water and land to satisfy the different stakeholders. Nature-based strategies are often implemented through voluntary agreements, which are promoted as an alternative instrument to traditional top-down command and control regulation. Voluntary agreements aim to bring more efficiency, participation and transparency to solving problems between different social groups. In natural hazard risk management, voluntary agreements are now receiving considerable interest as a complement to existing policy instruments for achieving the objectives of the EU Water Framework Directive and of the Floods Directive. This paper investigates the use of voluntary agreements as an alternative instrument to traditional top-down command and control regulation in the implementation of flood storage in Austria. The paper provides a framework of analysis to reveal barriers and opportunities associated with such an approach. The paper concludes that institutions and power are the central elements that must be addressed for voluntary agreements to succeed.
NASA Astrophysics Data System (ADS)
ChePa, Noraziah; Hashim, Nor Laily; Yusof, Yuhanis; Hussain, Azham
2016-08-01
A flood evacuation centre is a temporary location or area used to shelter people displaced by disaster, particularly flood, as a rescue or precautionary measure. Gazetted evacuation centres are normally located at secure places with a small chance of being inundated by flood. However, due to extreme flooding, several evacuation centres in Kelantan were unexpectedly inundated. Currently, there is no study proposing a decision support aid to relocate victims and resources of an evacuation centre when the situation worsens. Therefore, this study proposes a decision aid model to be utilized in realizing an adaptive emergency evacuation centre management system. The study comprises two main phases: development of the algorithm and models, and development of a web-based and mobile app. The proposed model uses a Firefly multi-objective optimization algorithm that creates an optimal schedule for the relocation of victims and resources of an evacuation centre. The proposed decision aid model and the adaptive system can support the National Security Council's response mechanisms for handling disaster management at level II (state level), especially by providing better management of flood evacuation centres.
Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.
Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen
2015-01-01
Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data using their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on the analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability approximately 10-15% higher than that of other algorithms, and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
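A hedged sketch of the two ingredients, with a toy access log: build a co-access correlation matrix from requests, then place strongly co-accessed files on different storage nodes so they can be served in parallel. The greedy rule below is an illustration, not the paper's heuristic.

```python
# Co-access matrix from a request log, then greedy declustering placement.
import numpy as np

def correlation_matrix(sessions, n_files):
    """sessions: lists of file ids requested together (from the access log)."""
    C = np.zeros((n_files, n_files))
    for s in sessions:
        for i in s:
            for j in s:
                if i != j:
                    C[i, j] += 1
    return C

def place(C, n_nodes):
    """Assign each file to the node where it is least co-accessed."""
    node_of = np.full(len(C), -1)
    for f in np.argsort(-C.sum(axis=1)):        # most-correlated files first
        penalty = [C[f, node_of == k].sum() for k in range(n_nodes)]
        node_of[f] = int(np.argmin(penalty))
    return node_of

log = [[0, 1, 2], [1, 2, 3], [0, 2, 3], [4, 5]]  # toy access sessions
print(place(correlation_matrix(log, 6), n_nodes=3))
```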
Optimal Hedging Rule for Reservoir Refill Operation
NASA Astrophysics Data System (ADS)
Wan, W.; Zhao, J.; Lund, J. R.; Zhao, T.; Lei, X.; Wang, H.
2015-12-01
This paper develops an optimal reservoir Refill Hedging Rule (RHR) for combined water supply and flood operation using mathematical analysis. A two-stage model is developed to formulate the trade-off between operations for conservation benefit and flood damage in the reservoir refill season. Based on the probability distribution of the maximum refill water availability at the end of the second stage, three zones are characterized according to the relationship among storage capacity, expected storage buffer (ESB), and maximum safety excess discharge (MSED). The Karush-Kuhn-Tucker conditions of the model show that the optimality of the refill operation involves making the expected marginal loss of conservation benefit from unfilling (i.e., ending storage of the refill period less than storage capacity) as nearly equal to the expected marginal flood damage from levee overtopping downstream as possible while maintaining all constraints. This principle follows and combines the hedging rules for water supply and flood management. A RHR curve is drawn analogously to water supply hedging and flood hedging rules, showing the trade-off between the two objectives. The release decision has a linear relationship with the current water availability, implying the linearity of the RHR for a wide range of water conservation functions (linear, concave, or convex). A demonstration case shows the impacts of the main factors. Larger downstream flood conveyance capacity and empty reservoir capacity allow a smaller current release, so more water can be conserved. Economic indicators of conservation benefit and flood damage compete with each other on release: the greater the economic importance of flood damage, the more water should be released in the current stage, and vice versa. Below a critical value, improving forecasts yields less water release, but an opposing effect occurs beyond this critical value. Finally, the Danjiangkou Reservoir case study shows that the RHR together with a rolling horizon decision approach can lead to a gradual dynamic refilling, indicating its potential for practical use.
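The optimality principle lends itself to a toy marginal-balance calculation: pick the current release at which the marginal conservation value of stored water equals the expected marginal flood damage of holding it back. Both functions and all numbers below are hypothetical illustrations, not the paper's formulation.

```python
# Solve B'(stored) = E[D'](release) for the release that balances marginal
# conservation value against expected marginal flood damage (toy functions).
from scipy.optimize import brentq

water_available = 90.0                       # refill-season water availability

def marginal_conservation_value(release):
    stored = water_available - release
    return 2.0 - 0.01 * stored               # concave benefit: B'(s) falls with s

def expected_marginal_flood_damage(release):
    return 5.0 * max(0.0, 1.0 - release / 60.0)   # falls as more is released now

optimal_release = brentq(
    lambda r: marginal_conservation_value(r) - expected_marginal_flood_damage(r),
    0.0, 60.0)
print(f"balancing release ~ {optimal_release:.1f} units")
```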
American River Watershed Investigation, California. Reconnaisance Report
1988-01-01
studies, and (4) identification of a non-federal sponsor for the feasibility study. The primary study area included the lower American River between Nimbus... FEMA) is responsible for administering the National Flood Insurance Program (NFIP). A basic goal of the NFIP is the identification of flood plain... [table fragment: reservoir required flood-control space (1,000 ac-ft) by level of protection (return period), with columns for total flood storage, Folsom flood storage, and new upstream storage]
Understanding Flood Seasonality and Its Temporal Shifts within the Contiguous United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ye, Sheng; Li, Hong-Yi; Leung, L. Ruby
2017-07-01
Understanding the causes of flood seasonality is critical for better flood management. This study examines the seasonality of annual maximum floods (AMF) and its changes before and after 1980 at over 250 natural catchments across the contiguous United States. Using circular statistics to define a seasonality index, our analysis focuses on the variability of the flood occurrence date. Generally, catchments with more synchronized seasonal water and energy cycles largely inherit their seasonality of AMF from that of annual maximum rainfall (AMR). In contrast, the seasonality of AMF in catchments with loosely synchronized water and energy cycles is more influenced by high antecedent storage, which is responsible for the amplification of the seasonality of AMF over that of AMR. This understanding then effectively explains a statistically significant shift of flood seasonality detected in some catchments in recent decades. Catchments where the antecedent soil water storage has increased since 1980 exhibit increasing flood seasonality, while catchments that have experienced increases in storm rainfall before the floods have shifted towards floods occurring more variably across the seasons. In the eastern catchments, a concurrent widespread increase in event rainfall magnitude and reduced soil water storage have led to a more variable timing of floods. Our findings on the role of antecedent storage and event rainfall in flood seasonality provide useful insights for understanding future changes in flood seasonality as climate models project changes in extreme precipitation and aridity over land.
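The circular-statistics seasonality index used in such analyses is compact enough to sketch: map each flood date to an angle on the annual cycle, average the unit vectors, and read the mean occurrence date from the resultant direction and the seasonality strength from its length R (1 = all floods on the same date, 0 = timing uniform through the year).

```python
# Circular mean date and concentration R of annual-maximum-flood timing.
import numpy as np

def flood_seasonality(days_of_year, year_length=365.25):
    theta = 2 * np.pi * np.asarray(days_of_year, dtype=float) / year_length
    x, y = np.cos(theta).mean(), np.sin(theta).mean()
    mean_day = (np.arctan2(y, x) % (2 * np.pi)) * year_length / (2 * np.pi)
    return mean_day, np.hypot(x, y)          # (mean date, seasonality index R)

print(flood_seasonality([70, 85, 92, 101, 110]))   # tightly clustered timing
print(flood_seasonality([10, 100, 190, 280]))      # spread through the year
```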
Research on classified real-time flood forecasting framework based on K-means cluster and rough set.
Xu, Wei; Peng, Yong
2015-01-01
This research presents a new classified real-time flood forecasting framework. In this framework, historical floods are classified by K-means clustering according to the spatial and temporal distribution of precipitation, the time variance of precipitation intensity and other hydrological factors. Based on the classification results, rough set theory is used to extract the identification rules for real-time flood forecasting. Then, the parameters of different categories within the conceptual hydrological model are calibrated using a genetic algorithm. In real-time forecasting, the corresponding category of parameters is selected for flood forecasting according to the obtained flood information. This research tests the new classified framework on Guanyinge Reservoir and compares the framework with the traditional flood forecasting method. It finds that the performance of the new classified framework is significantly better in terms of accuracy. Furthermore, the framework can also be applied in catchments with fewer historical floods.
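A minimal sketch of the classification step, with illustrative event descriptors and toy per-class parameter sets (the rough-set rule extraction and genetic-algorithm calibration of the paper are not reproduced here):

```python
# Cluster historical floods on precipitation descriptors, then pick the
# calibrated parameter set of the matching class at forecast time.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Rows: [total precip (mm), max intensity (mm/h), duration (h), areal spread]
floods = np.array([[120, 30, 10, 0.6], [45, 8, 20, 0.9],
                   [130, 35, 8, 0.5], [50, 10, 18, 0.8],
                   [200, 60, 6, 0.4]], dtype=float)

scaler = StandardScaler().fit(floods)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scaler.transform(floods))

params_by_class = {0: {"recession_k": 1.2}, 1: {"recession_k": 2.4}}  # toy sets
new_event = scaler.transform([[140, 40, 9, 0.5]])
print("use parameters:", params_by_class[int(km.predict(new_event)[0])])
```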
Topography-based Flood Planning and Optimization Capability Development Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judi, David R.; Tasseff, Byron A.; Bent, Russell W.
2014-02-26
Globally, water-related disasters are among the most frequent and costly natural hazards. Flooding inflicts catastrophic damage on critical infrastructure and population, resulting in substantial economic and social costs. NISAC is developing LeveeSim, a suite of nonlinear and network optimization models, to predict optimal barrier placement to protect critical regions and infrastructure during flood events. LeveeSim currently includes a high-performance flood model to simulate overland flow, as well as a network optimization model to predict optimal barrier placement during a flood event. The LeveeSim suite models the effects of flooding in predefined regions. By manipulating a domain's underlying topography, developers altered flood propagation to reduce detrimental effects in areas of interest. This numerical altering of a domain's topography is analogous to building levees, placing sandbags, etc. To induce optimal changes in topography, NISAC used a novel application of an optimization algorithm to minimize flooding effects in regions of interest. To develop LeveeSim, NISAC constructed and coupled hydrodynamic and optimization algorithms. NISAC first adapted its existing flood modeling software to use massively parallel graphics processing units (GPUs), which allowed for the simulation of larger domains and longer timescales. NISAC then implemented a network optimization model to predict optimal barrier placement based on output from flood simulations. As proof of concept, NISAC developed five simple test scenarios, and optimized topographic solutions were compared with intuitive solutions. Finally, as an early validation example, barrier placement was optimized to protect an arbitrary region in a simulation of the historic Taum Sauk dam breach.
An R package for the design, analysis and operation of reservoir systems
NASA Astrophysics Data System (ADS)
Turner, Sean; Ng, Jia Yi; Galelli, Stefano
2016-04-01
We present a new R package - named "reservoir" - which has been designed for rapid and easy routing of runoff through storage. The package comprises well-established tools for capacity design (e.g., the sequent peak algorithm), performance analysis (storage-yield-reliability and reliability-resilience-vulnerability analysis) and release policy optimization (Stochastic Dynamic Programming). Operating rules can be optimized for water supply, flood control and amenity objectives, as well as for maximum hydropower production. Storage-depth-area relationships are in-built, allowing users to incorporate evaporation from the reservoir surface. We demonstrate the capabilities of the software for global studies using thousands of reservoirs from the Global Reservoir and Dam (GRanD) database fed by historical monthly inflow time series from a 0.5 degree gridded global runoff dataset. The package is freely available through the Comprehensive R Archive Network (CRAN).
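The sequent peak algorithm used for capacity design is short enough to sketch directly; below is the textbook form in Python (the package itself is written in R), with an illustrative inflow series: required capacity is the largest cumulative deficit that a constant yield accumulates against the inflows.

```python
# Sequent peak: track the running storage deficit implied by a constant yield.
def sequent_peak_capacity(inflows, yield_per_period):
    deficit, capacity = 0.0, 0.0
    for q in inflows:
        deficit = max(0.0, deficit + yield_per_period - q)  # draw minus refill
        capacity = max(capacity, deficit)                   # worst deficit so far
    return capacity

monthly_inflow = [12, 8, 5, 3, 2, 2, 4, 9, 15, 18, 16, 13]  # illustrative units
print(sequent_peak_capacity(monthly_inflow, yield_per_period=9.0))  # -> 30.0
```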
NASA Astrophysics Data System (ADS)
Leitão, J. P.; Carbajal, J. P.; Rieckermann, J.; Simões, N. E.; Sá Marques, A.; de Sousa, L. M.
2018-01-01
The activation of available in-sewer storage volume has been suggested as a low-cost flood and combined sewer overflow mitigation measure. However, it is currently unknown which attributes make objective functions suitable for identifying the best locations for flow control devices, or how those attributes affect the results. In this study, we present a novel location model and an efficient algorithm to identify the best location(s) to install flow limiters. The model is a screening tool that does not require full hydraulic simulations, considering steady-state rather than simplistic static flow conditions. It maximises in-sewer storage according to different reward functions that also consider the potential impact of flow control device failure. We demonstrate its usefulness on two real sewer networks, for which in-sewer storage potentials of approximately 2,000 m3 and 500 m3, respectively, were estimated with five flow control devices installed.
Hydrological controls on transient aquifer storage in a karst watershed
NASA Astrophysics Data System (ADS)
Spellman, P.; Martin, J.; Gulley, J. D.
2017-12-01
While surface storage of floodwaters is well known to attenuate flood peaks, transient storage of floodwaters in aquifers is a less recognized mechanism of flood peak attenuation. The hydraulic gradient from aquifer to river controls the magnitude of transient aquifer storage and is ultimately a function of aquifer hydraulic conductivity and effective porosity. Because bedrock and granular aquifers tend to have lower hydraulic conductivities and porosities, their ability to attenuate flood peaks is generally small. In karst aquifers, however, extensive cave systems create high hydraulic conductivities and porosities that create low antecedent hydraulic gradients between aquifers and rivers. Cave springs can reverse flow during high discharges in rivers, temporarily storing floodwaters in the aquifer and thus reducing the magnitude of flood discharge downstream. To date, however, very few studies have quantified the magnitude or controls of transient aquifer storage in karst watersheds. We therefore investigate controls on transient aquifer storage using 10 years of river and groundwater data from the Suwannee River Basin, which flows over the karstic upper Floridan aquifer in north-central Florida. We use multiple linear regression to compare the effects of three hydrological controls on the magnitude of transient aquifer storage: antecedent stage, recharge and slope of hydrograph rise. We show the dominant control on transient aquifer storage is antecedent stage, whereby lower stages result in greater magnitudes of transient aquifer storage. Our results suggest that measures of groundwater levels prior to an event can be useful in determining whether transient aquifer storage will occur and may provide a useful metric for improving predictions of flood magnitudes.
Flood characteristics of urban watersheds in the United States
Sauer, Vernon B.; Thomas, W.O.; Stricker, V.A.; Wilson, K.V.
1983-01-01
A nationwide study of flood magnitude and frequency in urban areas was made for the purpose of reviewing available literature, compiling an urban flood data base, and developing methods of estimating urban floodflow characteristics in ungaged areas. The literature review contains synopses of 128 recent publications related to urban floodflow. A data base of 269 gaged basins in 56 cities and 31 States, including Hawaii, contains a wide variety of topographic and climatic characteristics, land-use variables, indices of urbanization, and flood-frequency estimates. Three sets of regression equations were developed to estimate flood discharges for ungaged sites for recurrence intervals of 2, 5, 10, 25, 50, 100, and 500 years. Two sets of regression equations are based on seven independent parameters and the third is based on three independent parameters. The only difference in the two sets of seven-parameter equations is the use of basin lag time in one and lake and reservoir storage in the other. Of primary importance in these equations is an independent estimate of the equivalent rural discharge for the ungaged basin. The equations adjust the equivalent rural discharge to an urban condition. The primary adjustment factor, or index of urbanization, is the basin development factor, a measure of the extent of development of the drainage system in the basin. This measure includes evaluations of storm drains (sewers), channel improvements, and curb-and-gutter streets. The basin development factor is statistically very significant and offers a simple and effective way of accounting for drainage development and runoff response in urban areas. Percentage of impervious area is also included in the seven-parameter equations as an additional measure of urbanization and apparently accounts for increased runoff volumes. This factor is not highly significant for large floods, which supports the generally held concept that imperviousness is not a dominant factor when soils become more saturated during large storms. Other parameters in the seven-parameter equations include drainage area size, channel slope, rainfall intensity, lake and reservoir storage, and basin lag time. These factors are all statistically significant and provide logical indices of basin conditions. The three-parameter equations include only the three most significant parameters: rural discharge, basin-development factor, and drainage area size. All three sets of regression equations provide unbiased estimates of urban flood frequency. The seven-parameter regression equations without basin lag time have average standard errors of regression varying from ±37 percent for the 5-year flood to ±44 percent for the 100-year flood and ±49 percent for the 500-year flood. The other two sets of regression equations have similar accuracy. Several tests for bias, sensitivity, and hydrologic consistency are included which support the conclusion that the equations are useful throughout the United States. All estimating equations were developed from data collected on drainage basins where temporary in-channel storage, due to highway embankments, was not significant. Consequently, estimates made with these equations do not account for the reducing effect of this temporary detention storage.
Disk storage management for LHCb based on Data Popularity estimator
NASA Astrophysics Data System (ADS)
Hushchyn, Mikhail; Charpentier, Philippe; Ustyuzhanin, Andrey
2015-12-01
This paper presents an algorithm providing recommendations for optimizing the LHCb data storage. The LHCb data storage system is a hybrid system. All datasets are kept as archives on magnetic tapes. The most popular datasets are kept on disks. The algorithm takes the dataset usage history and metadata (size, type, configuration etc.) to generate a recommendation report. This article presents how we use machine learning algorithms to predict future data popularity. Using these predictions it is possible to estimate which datasets should be removed from disk. We use regression algorithms and time series analysis to find the optimal number of replicas for datasets that are kept on disk. Based on the data popularity and the number of replicas optimization, the algorithm minimizes a loss function to find the optimal data distribution. The loss function represents all requirements for data distribution in the data storage system. We demonstrate how our algorithm helps to save disk space and to reduce waiting times for jobs using this data.
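As a hedged sketch of the popularity-prediction step (the paper's actual features, models, and loss function are not reproduced), one can regress next-period access counts on trailing usage-history windows and flag datasets predicted to go cold as disk-removal candidates; the data below are synthetic.

```python
# Learn next-week dataset accesses from 26 weeks of usage history.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
rates = rng.gamma(1.0, 2.0, 500)                       # per-dataset mean usage
weekly = rng.poisson(lam=rates[:, None], size=(500, 27))

X, y = weekly[:, :26], weekly[:, 26]                   # history -> next week
model = GradientBoostingRegressor().fit(X[:400], y[:400])

pred = model.predict(X[400:])
cold = np.flatnonzero(pred < 0.5) + 400                # removal candidates
print(len(cold), "datasets predicted to need no disk replica")
```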
Global SWOT Data Assimilation of River Hydrodynamic Model; the Twin Simulation Test of CaMa-Flood
NASA Astrophysics Data System (ADS)
Ikeshima, D.; Yamazaki, D.; Kanae, S.
2016-12-01
CaMa-Flood is a global-scale model for simulating hydrodynamics in large rivers. It simulates river hydrodynamics such as river discharge, flooded area and water depth from water runoff supplied by a land surface model. Recently, many improvements to its parameters and terrain data have been made to enhance its reproduction of natural phenomena, but errors remain between nature and simulated results due to uncertainties in the model. SWOT (Surface Water and Ocean Topography), a satellite to be launched in 2021, will measure open water surface elevation. SWOT observations can be used to calibrate hydrodynamic models for river flow forecasting and are expected to improve model accuracy. Combining observations with a model in this way is called data assimilation. In this research, we developed a data-assimilated river flow simulation system at global scale, using CaMa-Flood as the river hydrodynamics model and simulated SWOT data as observations. In data assimilation generally, updating the "model value" with the "observation value" produces the "assimilated value". However, SWOT observations will not be available until the satellite's launch in 2021. Instead, we simulated the SWOT observations using CaMa-Flood: putting "pure input" into CaMa-Flood produces the "true water storage", and extracting the actual daily swath of SWOT from the "true water storage" yields the simulated observations. For the "model value", we made a "disturbed water storage" by feeding noise-disturbed input to CaMa-Flood. Since both "model value" and "observation value" are made by the same model, we call this a twin simulation. In the twin simulation, the simulated observation of the "true water storage" is combined with the "disturbed water storage" to make the "assimilated value". As the data assimilation method, we used the ensemble Kalman filter. If the "assimilated value" is closer to the "true water storage" than the "disturbed water storage" is, the data assimilation can be judged effective. By varying the input disturbance of the "disturbed water storage", the acceptable level of input uncertainty can also be explored.
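The ensemble Kalman filter update at the heart of such a twin experiment fits in a few lines. The sketch below is a generic stochastic EnKF analysis step on toy dimensions with one synthetic SWOT-like observation; it is not the authors' CaMa-Flood implementation.

```python
# Stochastic EnKF analysis: update an ensemble of modelled storages with one
# water-level observation taken at a single reach.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_ens, obs_err = 50, 32, 0.1

ensemble = rng.normal(1.0, 0.3, (n_state, n_ens))   # "disturbed" model states
H = np.zeros((1, n_state)); H[0, 10] = 1.0          # observe reach 10 only
y = np.array([1.4])                                 # simulated SWOT observation

X = ensemble - ensemble.mean(axis=1, keepdims=True) # state anomalies
HX = H @ ensemble
HXp = HX - HX.mean(axis=1, keepdims=True)           # obs-space anomalies
P_hh = HXp @ HXp.T / (n_ens - 1) + obs_err**2 * np.eye(1)
K = (X @ HXp.T / (n_ens - 1)) @ np.linalg.inv(P_hh) # Kalman gain

perturbed = y[:, None] + rng.normal(0.0, obs_err, (1, n_ens))
analysis = ensemble + K @ (perturbed - HX)
print("reach 10 prior/analysis mean:",
      ensemble[10].mean().round(3), analysis[10].mean().round(3))
```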
Mohanasundaram, Ranganathan; Periasamy, Pappampalayam Sanmugam
2015-01-01
The current high-profile debate over data storage and its growth has made storage placement a strategic task in the world of networking. Storage depends on the sensor nodes called producers, on base stations, and on the consumers (users and sensor nodes) that retrieve and use the data. The main concern dealt with here is to find optimal data storage positions in wireless sensor networks. Earlier works did not utilize swarm-intelligence-based optimization approaches to find optimal data storage positions. To achieve this goal, an efficient swarm intelligence approach is used to choose suitable positions for a storage node: a hybrid particle swarm optimization algorithm finds suitable positions for storage nodes while minimizing the total energy cost of data transmission. Clustering-based distributed data storage is utilized, with the clustering problem solved by the fuzzy-C-means algorithm. This work also considers the data rates and locations of multiple producers and consumers when locating storage. The algorithm is implemented in a network simulator, and the experimental results show that the proposed clustering and swarm intelligence based ODS strategy is more effective than earlier approaches.
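A rough illustration of the continuous core of the approach: a plain particle swarm (not the paper's hybrid PSO, and without the fuzzy-C-means clustering step) positions a single storage node so that a rate-weighted distance proxy for transmission energy is minimized. Coordinates and rates are synthetic.

```python
# PSO over (x, y): minimize rate-weighted distances to producers/consumers.
import numpy as np

rng = np.random.default_rng(2)
producers = rng.uniform(0, 100, (8, 2)); prod_rate = rng.uniform(1, 5, 8)
consumers = rng.uniform(0, 100, (3, 2)); cons_rate = rng.uniform(1, 5, 3)

def energy_cost(p):      # proxy: data rate times distance, summed
    return (np.linalg.norm(producers - p, axis=1) @ prod_rate
            + np.linalg.norm(consumers - p, axis=1) @ cons_rate)

n, w, c1, c2 = 30, 0.7, 1.5, 1.5                     # swarm size, PSO weights
pos = rng.uniform(0, 100, (n, 2)); vel = np.zeros((n, 2))
pbest, pbest_val = pos.copy(), np.array([energy_cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()]

for _ in range(200):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 100)
    vals = np.array([energy_cost(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()]

print("storage node position ~", gbest.round(1))
```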
Mapping the Recent US Hurricanes Triggered Flood Events in Near Real Time
NASA Astrophysics Data System (ADS)
Shen, X.; Lazin, R.; Anagnostou, E. N.; Wanik, D. W.; Brakenridge, G. R.
2017-12-01
Synthetic Aperture Radar (SAR) observations are the only reliable remote sensing data source for mapping flood inundation during severe weather events. Unfortunately, since state-of-the-art data processing algorithms cannot meet the automation and quality standards of a near-real-time (NRT) system, quality-controlled inundation mapping by SAR currently depends heavily on manual processing, which limits our capability to quickly issue flood inundation maps at global scale. Specifically, most SAR-based inundation mapping algorithms are not fully automated, while those that are automated exhibit severe over- and/or under-detection errors that limit their potential. These detection errors are primarily caused by the strong overlap among the SAR backscattering probability density functions (PDF) of different land cover types. In this study, we tested a newly developed NRT SAR-based inundation mapping system, named Radar Produced Inundation Diary (RAPID), using Sentinel-1 dual polarized SAR data over recent flood events caused by Hurricanes Harvey, Irma, and Maria (2017). The system consists of 1) self-optimized multi-threshold classification, 2) over-detection removal using land-cover information and change detection, 3) under-detection compensation, and 4) machine-learning based correction. Algorithm details are introduced in another poster, H53J-1603. Good agreement was obtained by comparing the result from RAPID with visual interpretation of SAR images and manual processing from the Dartmouth Flood Observatory (DFO) (see Figure 1). Specifically, the over- and under-detection typically noted in automated methods is reduced to negligible levels. This performance indicates that RAPID can address the automation and accuracy issues of current state-of-the-art algorithms and has the potential to be applied operationally to a number of satellite SAR missions, such as SWOT, ALOS, and Sentinel. RAPID data can support many applications, such as rapid assessment of damage losses and disaster alleviation/rescue at global scale.
NASA Astrophysics Data System (ADS)
Metcalfe, Peter; Beven, Keith; Hankin, Barry; Lamb, Rob
2018-04-01
Enhanced hillslope storage is utilised in natural flood management in order to retain overland storm run-off and to reduce connectivity between fast surface flow pathways and the channel. Examples include excavated ponds, deepened or bunded accumulation areas, and gullies and ephemeral channels blocked with wooden barriers or debris dams. The performance of large, distributed networks of such measures is poorly understood. Extensive schemes can potentially retain large quantities of run-off, but there are indications that much of their effectiveness can be attributed to desynchronisation of sub-catchment flood waves. Inappropriately sited measures may therefore increase, rather than mitigate, flood risk. Fully distributed hydrodynamic models have been applied in limited studies but introduce significant computational complexity. The longer run times of such models also restrict their use for uncertainty estimation or for evaluation of the many potential configurations and storm sequences that may influence the timings and magnitudes of flood waves. Here a simplified overland flow-routing module and a semi-distributed representation of enhanced hillslope storage are developed. They are applied to the headwaters of a large rural catchment in Cumbria, UK, where the use of an extensive network of storage features is proposed as a flood mitigation strategy. The models were run within a Monte Carlo framework against data for a 2-month period of extreme flood events that caused significant damage in areas downstream. Acceptable realisations and likelihood weightings were identified using the GLUE uncertainty estimation framework. Behavioural realisations were rerun against the catchment model modified with the addition of the hillslope storage. Three different drainage rate parameters were applied across the network of hillslope storage. The study demonstrates that schemes comprising widely distributed hillslope storage can be modelled effectively within such a reduced-complexity framework. It shows the importance of drainage rates from storage features operating through a sequence of events. We discuss limitations in the simplified representation of overland flow-routing and of storage, and how these could be improved using experimental evidence. We suggest ways in which features could be grouped more strategically to improve the performance of such schemes.
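The GLUE step can be illustrated compactly: sample parameter sets, score each realisation with an informal likelihood, reject non-behavioural runs, and weight the rest. In the sketch below, run_model is a stand-in for the catchment model and the Nash-Sutcliffe threshold of 0.7 is purely illustrative.

```python
# Monte Carlo sampling with GLUE-style behavioural selection and weighting.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 6, 200)
observed = np.sin(t) + 1.5                      # toy observed flow series

def run_model(theta):                           # hypothetical 2-parameter model
    gain, lag = theta
    return gain * (np.sin(t - lag) + 1.5)

samples = rng.uniform([0.5, -0.5], [1.5, 0.5], size=(2000, 2))
nse = np.array([1 - np.sum((run_model(th) - observed) ** 2)
                    / np.sum((observed - observed.mean()) ** 2)
                for th in samples])             # Nash-Sutcliffe efficiency

behavioural = nse > 0.7                         # acceptability threshold
weights = nse[behavioural] - 0.7                # informal likelihood weights
weights /= weights.sum()
print(behavioural.sum(), "behavioural realisations retained")
```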
33 CFR 208.82 - Hetch Hetchy, Cherry Valley, and Don Pedro Dams and Reservoirs.
Code of Federal Regulations, 2014 CFR
2014-07-01
...-control purposes in accordance with the Flood-Control Storage Reservation Diagram currently in force for... section. The Flood-Control Storage Reservation Diagram in force as of the promulgation of this section is...-Control Storage Reservation Diagram may be developed from time to time as necessary by the Corps of...
Large-scale runoff generation - parsimonious parameterisation using high-resolution topography
NASA Astrophysics Data System (ADS)
Gong, L.; Halldin, S.; Xu, C.-Y.
2011-08-01
World water resources have primarily been analysed by global-scale hydrological models in the last decades. Runoff generation in many of these models is based on process formulations developed at catchment scales. The division between slow runoff (baseflow) and fast runoff is primarily governed by slope and the spatial distribution of effective water storage capacity, both acting at very small scales. Many hydrological models, e.g. VIC, account for the spatial storage variability in terms of statistical distributions; such models are generally proven to perform well. The statistical approaches, however, use the same runoff-generation parameters everywhere in a basin. The TOPMODEL concept, on the other hand, links the effective maximum storage capacity with real-world topography. Recent availability of global high-quality, high-resolution topographic data makes TOPMODEL attractive as a basis for a physically-based runoff-generation algorithm at large scales, even if its assumptions are not valid in flat terrain or for deep groundwater systems. We present a new runoff-generation algorithm for large-scale hydrology based on TOPMODEL concepts intended to overcome these problems. The TRG (topography-derived runoff generation) algorithm relaxes the TOPMODEL equilibrium assumption so baseflow generation is not tied to topography. TRG only uses the topographic index to distribute average storage to each topographic index class. The maximum storage capacity is proportional to the range of the topographic index and is scaled by one parameter. The distribution of storage capacity within large-scale grid cells is obtained numerically through topographic analysis. The new topography-derived distribution function is then inserted into a runoff-generation framework similar to VIC's. Different basin parts are parameterised by different storage capacities, and different shapes of the storage-distribution curves depend on their topographic characteristics. The TRG algorithm is driven by the HydroSHEDS dataset with a resolution of 3" (around 90 m at the equator). The TRG algorithm was validated against the VIC algorithm in a common model framework in three river basins in different climates. The TRG algorithm performed equally well or marginally better than the VIC algorithm with one less parameter to be calibrated. The TRG algorithm also lacked equifinality problems and offered a realistic spatial pattern for runoff generation and evaporation.
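The topographic index ln(a / tan β) that TRG distributes storage over can be sketched with a crude D8-style accumulation on a toy DEM; production use would start from HydroSHEDS and a proper flow-routing library, so everything below is a simplification.

```python
# Topographic index on a toy DEM: slope from gradients, upslope area from a
# naive highest-to-lowest pass that sends each cell's area to its lowest
# 8-neighbour.
import numpy as np

def topographic_index(dem, cellsize=90.0):
    gy, gx = np.gradient(dem, cellsize)
    tan_beta = np.hypot(gx, gy) + 1e-6                  # floored to avoid /0
    acc = np.ones_like(dem)                             # own-cell area units
    rows, cols = np.unravel_index(np.argsort(-dem, axis=None), dem.shape)
    for i, j in zip(rows, cols):                        # high to low
        win = dem[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
        di, dj = np.unravel_index(np.argmin(win), win.shape)
        ni, nj = max(i - 1, 0) + di, max(j - 1, 0) + dj
        if (ni, nj) != (i, j):                          # skip pits
            acc[ni, nj] += acc[i, j]
    a = acc * cellsize                                  # area per contour width
    return np.log(a / tan_beta)

dem = np.add.outer(np.linspace(50, 0, 40), np.linspace(30, 0, 40))  # tilted plane
ti = topographic_index(dem)
print("topographic index range:", ti.min().round(2), "to", ti.max().round(2))
```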
NASA Astrophysics Data System (ADS)
Brogan, D. J.; Nelson, P. A.; MacDonald, L. H.
2016-12-01
Considerable advances have been made in understanding post-wildfire runoff, erosion, and mass wasting at the hillslope and small watershed scale, but the larger-scale effects on flooding, water quality, and sedimentation are often the most significant impacts. The problem is that we have virtually no watershed-specific tools to quantify the proportion of eroded sediment that is stored or delivered from watersheds larger than about 2-5 km2. In this study we are quantifying how channel and valley bottom characteristics affect post-wildfire sediment storage and delivery. Our research is based on intensive monitoring of sediment storage over time in two 15 km2 watersheds (Skin Gulch and Hill Gulch) burned in the 2012 High Park Fire, using repeated cross section and longitudinal surveys from fall 2012 through summer 2016, five airborne laser scanning (ALS) datasets from fall 2012 through summer 2015, and both radar and ground-based precipitation measurements. We have computed changes in sediment storage by differencing successive cross sections, and spatially explicit changes between successive ALS point clouds using the multiscale model-to-model cloud comparison (M3C2) algorithm. These channel changes are being related to potential morphometric controls, including valley width, valley slope, confinement, contributing area, valley expansion or contraction, topographic curvature (planform and profile), and estimated sediment inputs. We hypothesize that maximum rainfall intensity and lateral confinement will be the primary independent variables that describe observed patterns of erosion and deposition, and that the results can help predict post-wildfire sediment delivery and identify high priority areas for restoration.
Cohn, T.A.; Lane, W.L.; Baier, W.G.
1997-01-01
This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
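A heavily simplified EMA-style iteration illustrates the procedure. The sketch substitutes a normal distribution for log-Pearson type III to stay short: the censored below-threshold historical years contribute their expected moments under the current parameter estimates, and the moments are re-solved until convergence. All numbers are illustrative.

```python
# EMA-flavoured iteration with a normal model: systematic record, measured
# historical peaks above a perception threshold T, and (h - k) historical
# years known only to lie below T.
import numpy as np
from scipy.stats import truncnorm

systematic = np.array([3.1, 3.4, 2.9, 3.8, 3.2, 3.5])  # log-peaks, gaged years
hist_peaks = np.array([4.2, 4.5])                       # measured historical peaks
T, h = 4.0, 100                                         # threshold, historical years
n_below = h - len(hist_peaks)                           # censored years (< T)

mu, sigma = systematic.mean(), systematic.std(ddof=1)   # initial estimates
for _ in range(200):
    below = truncnorm(-np.inf, (T - mu) / sigma, loc=mu, scale=sigma)
    e1 = below.mean()
    e2 = below.var() + e1 ** 2                          # E[Z], E[Z^2] given Z < T
    n = len(systematic) + len(hist_peaks) + n_below
    s1 = systematic.sum() + hist_peaks.sum() + n_below * e1
    s2 = (systematic ** 2).sum() + (hist_peaks ** 2).sum() + n_below * e2
    mu_new = s1 / n
    sigma_new = np.sqrt(max(s2 / n - mu_new ** 2, 1e-12))
    if abs(mu_new - mu) < 1e-10 and abs(sigma_new - sigma) < 1e-10:
        break
    mu, sigma = mu_new, sigma_new

print(f"EMA-style estimates: mu={mu:.3f}, sigma={sigma:.3f}")
```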
General Quantum Meet-in-the-Middle Search Algorithm Based on Target Solution of Fixed Weight
NASA Astrophysics Data System (ADS)
Fu, Xiang-Qun; Bao, Wan-Su; Wang, Xiang; Shi, Jian-Hong
2016-10-01
Similar to the classical meet-in-the-middle algorithm, storage and computation complexity are the key factors that decide the efficiency of the quantum meet-in-the-middle algorithm. Aiming at target vectors of fixed weight, and building on the quantum meet-in-the-middle algorithm, an algorithm for searching all n-product vectors with the same weight is presented whose complexity is better than that of the exhaustive search algorithm and which reduces the storage complexity of the quantum meet-in-the-middle search algorithm. Then, based on this algorithm and the knapsack vector of the Chor-Rivest public-key cryptosystem of fixed weight d, we present a general quantum meet-in-the-middle search algorithm based on a target solution of fixed weight, whose computational complexity is $\sum_{j=0}^{d}\left(O\big(\sqrt{C_{n-k+1}^{d-j}}\big)+O\big(C_{k}^{j}\log C_{k}^{j}\big)\right)$ with $\sum_{i=0}^{d} C_{k}^{i}$ memory cost, and the optimal value of k is given. Compared with the quantum meet-in-the-middle search algorithm for the knapsack problem and the quantum algorithm for searching a target solution of fixed weight, the computational complexity of the algorithm is lower, and its storage complexity is smaller than that of the quantum meet-in-the-middle algorithm. Supported by the National Basic Research Program of China under Grant No. 2013CB338002 and the National Natural Science Foundation of China under Grant No. 61502526.
NASA Astrophysics Data System (ADS)
Adler, R. F.; Wu, H.
2016-12-01
The Global Flood Monitoring System (GFMS) (http://flood.umd.edu) has been developed and used in recent years to provide real-time flood detection, streamflow estimates and inundation calculations for most of the globe. The GFMS is driven by satellite-based precipitation, with the accuracy of the flood estimates being primarily dependent on the accuracy of the precipitation analyses and the land surface and routing models used. The routing calculations are done at both 12 km and 1 km resolution. Users of GFMS results include international and national flood response organizations. The devastating floods of October 2015 in South Carolina are analyzed, showing that the GFMS streamflow estimates are accurate and useful, indicating significant flooding in the upstream basins. Further downstream, the GFMS underestimates streamflow due to the presence of dams, which are not accounted for in the GFMS. Other examples are given for Yemen and Somalia and for Sri Lanka and southern India. A forecast flood event associated with a typhoon hitting Taiwan is also examined. One-kilometer resolution inundation mapping from GFMS holds the promise of highly useful information for flood disaster response. The algorithm is briefly described and examples are shown for recent cases where inundation estimates from optical and Synthetic Aperture Radar (SAR) satellite sensors are available. For a case of significant flooding along the Brazos River in Texas in May and June, the GFMS calculated streamflow compares favorably with observations. Available Landsat-based (May 28) and MODIS-based (June 2) inundation analyses from the University of Colorado show generally good agreement with the GFMS inundation calculation in most of the area where skies were clear and the optical techniques could be applied. The GFMS provides very useful disaster response information on a timely basis. However, there is still significant room for improvement, including improved precipitation information from NASA's Global Precipitation Measurement (GPM) mission, inclusion of dam algorithms in the routing model and integration with or assimilation of observed flood extent from satellite optical and SAR sensors.
Rapid Mapping Of Floods Using SAR Data: Opportunities And Critical Aspects
NASA Astrophysics Data System (ADS)
Pulvirenti, Luca; Pierdicca, Nazzareno; Chini, Marco
2013-04-01
The potential of spaceborne Synthetic Aperture Radar (SAR) for flood mapping was demonstrated by several past investigations. The synoptic view, the capability to operate in almost all weather conditions and during both day and night, and the sensitivity of the microwave band to water are the key features that make SAR data useful for monitoring inundation events. In addition, their high spatial resolution, which can reach 1 m with the new generation of X-band instruments such as TerraSAR-X and COSMO-SkyMed (CSK), allows emergency managers to use flood maps at very high spatial resolution. CSK also offers the possibility of performing frequent observations of regions hit by floods, thanks to its four-satellite constellation. Current research on flood mapping using SAR is focused on the development of automatic algorithms to be used in near real time applications. The approaches are generally based on the low radar return from smooth open water bodies, which behave as specular reflectors and appear dark in SAR images. The major advantage of automatic algorithms is the computational efficiency that makes them suitable for rapid mapping purposes. The choice of the threshold value that, in this kind of algorithm, separates flooded from non-flooded areas is a critical aspect because it depends on the characteristics of the observed scenario and on system parameters. To deal with this aspect, an algorithm for automatic detection of regions of low backscatter has been developed. It basically accomplishes three steps: 1) division of the SAR image into a set of non-overlapping sub-images or splits; 2) selection of inhomogeneous sub-images that contain (at least) two populations of pixels, one of which is formed by dark pixels; 3) application in sequence of an automatic thresholding algorithm and a region growing algorithm to produce a homogeneous map of flooded areas. Besides the aforementioned choice of the threshold, rapid mapping of floods may present other critical aspects. Searching for low SAR backscatter areas only may cause inaccuracies because flooded soils do not always act as smooth open water bodies. The presence of wind or of vegetation emerging above the water surface may give rise to an increase of the radar backscatter. In particular, mapping flooded vegetation using SAR data may represent a difficult task, since backscattering phenomena in the volume between canopy, trunks and floodwater are quite complex. A typical phenomenon is the double-bounce effect involving soil and stems or trunks, which is generally enhanced by floodwater, so that flooded vegetation may appear very bright in a SAR image. Even in the absence of dense vegetation or wind, some regions may appear dark because of artefacts due to topography (shadowing), absorption caused by wet snow, and attenuation caused by heavy precipitating clouds (X-band SARs). Examples of the aforementioned effects that may limit the reliability of flood maps will be presented at the conference, and some indications on how to deal with these effects (e.g., the presence of vegetation and of artefacts) will be provided.
NASA Astrophysics Data System (ADS)
Moon, Y. I.; Kim, M. S.; Choi, J. H.; Yuk, G. M.
2017-12-01
Heavy rainfall has become a major recent cause of urban flooding due to climate change and urbanization. To prevent property damage and casualties, a system which can forecast and warn of urban flooding must be developed. Urban drainage facilities can be expected to reduce flood damage most effectively when they are operated for smaller rainfall events rather than only extreme ones. Thus, the purpose of this study is to implement: A) a flood forecasting system using runoff analysis based on short-term rainfall; and B) a flood warning system which operates based on data from pump stations and rainwater storage in urban basins. The analysis shows that operating urban drainage facilities using short-term rainfall forecast data from radar is more effective at reducing urban flood damage than using only the inflow data of the facility. Keywords: Heavy Rainfall, Urban Flood, Short-term Rainfall Forecasting, Optimal operation of urban drainage facilities. Acknowledgments: This research was supported by a grant (17AWMP-B066744-05) from the Advanced Water Management Research Program (AWMP) funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
Developing a New Wireless Sensor Network Platform and Its Application in Precision Agriculture
Aquino-Santos, Raúl; González-Potes, Apolinar; Edwards-Block, Arthur; Virgen-Ortiz, Raúl Alejandro
2011-01-01
Wireless sensor networks are gaining greater attention from the research community and industrial professionals because these small pieces of "smart dust" offer great advantages due to their small size, low power consumption, easy integration and support for "green" applications. Green applications are considered a hot topic in intelligent environments, ubiquitous and pervasive computing. This work evaluates a new wireless sensor network platform and its application in precision agriculture, including its embedded operating system and its routing algorithm. To validate the technological platform and the embedded operating system, two different routing strategies were compared: hierarchical and flat. Both of these routing algorithms were tested in a small-scale network applied to a watermelon field. However, we strongly believe that this technological platform can also be applied to precision agriculture because it incorporates a modified version of LORA-CBF, a wireless location-based routing algorithm that uses cluster-based flooding. Cluster-based flooding addresses the scalability concerns of wireless sensor networks, while the modified LORA-CBF routing algorithm includes a metric to monitor residual battery energy. Furthermore, results show that the modified version of LORA-CBF functions well with both the flat and hierarchical algorithms, although it functions better with the flat algorithm in a small-scale agricultural network. PMID:22346622
Confidence intervals for expected moments algorithm flood quantile estimates
Cohn, Timothy A.; Lane, William L.; Stedinger, Jery R.
2001-01-01
Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient “weighting” procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed-form method has been available for quantifying the uncertainty of EMA-based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood-quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25- to 100-year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.
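For the reader, the confidence intervals described take the standard normal-theory shape; the paper's contribution is the analytical variance expression for the EMA estimator, which is not reproduced here:

```latex
% Generic asymptotic confidence interval for a flood quantile estimate
% \hat{Q}_p; the EMA-specific piece is the variance expression, obtained
% from the asymptotic covariance of the moment estimators via the delta
% method (only the general shape is shown).
\[
  \hat{Q}_p \;\pm\; z_{1-\alpha/2}\,
  \sqrt{\widehat{\operatorname{Var}}\bigl(\hat{Q}_p\bigr)}
\]
```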
76 FR 17037 - List of Approved Spent Fuel Storage Casks: HI-STORM Flood/Wind Addition
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-28
...-0007] RIN 3150-AI90 List of Approved Spent Fuel Storage Casks: HI-STORM Flood/Wind Addition AGENCY... or the Commission) is proposing to amend its spent fuel storage cask regulations to add the HI-STORM...: June 13, 2011. SAR Submitted by: Holtec International, Inc. SAR Title: Safety Analysis Report on the HI...
NASA Astrophysics Data System (ADS)
Yassin, F.; Anis, M. R.; Razavi, S.; Wheater, H. S.
2017-12-01
Water management through reservoirs, diversions, and irrigation has significantly changed river flow regimes and basin-wide energy and water balance cycles. Failure to represent these effects limits the performance of land surface-hydrology models not only for streamflow prediction but also for the estimation of soil moisture, evapotranspiration, and feedbacks to the atmosphere. Despite recent research to improve the representation of water management in land surface models, there remains a need to develop improved modeling approaches that work in complex and highly regulated basins such as the 406,000 km2 Saskatchewan River Basin (SaskRB). A particular challenge for regional and global application is a lack of local information on reservoir operational management. To this end, we implemented a reservoir operation, water abstraction, and irrigation algorithm in the MESH land surface-hydrology model and tested it over the SaskRB. MESH is Environment Canada's land surface-hydrology modeling system, which couples the Canadian Land Surface Scheme (CLASS) with a hydrological routing model. The implemented reservoir algorithm uses an inflow-outflow relationship that accounts for the physical characteristics of reservoirs (e.g., storage-area-elevation relationships) and includes simplified operational characteristics based on local information (e.g., monthly target volume and release under the limited, normal, and flood storage zones). The irrigation algorithm uses the difference between actual and potential evapotranspiration to estimate irrigation water demand. This irrigation demand is supplied from the neighboring reservoirs/diversions in the river system. We calibrated the model, enabled with the new reservoir and irrigation modules, in a multi-objective optimization setting. Results showed that the reservoir and irrigation modules significantly improved the MESH model's performance in generating streamflow and evapotranspiration across the SaskRB, and that our approach provides a basis for improved large-scale hydrological modelling.
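A zone-based release rule of the general kind described can be sketched in a few lines. Everything below (parameter names, the steering rule, the irrigation efficiency) is an illustrative assumption rather than MESH's actual code:

```python
def reservoir_release(storage, inflow, month, params):
    """Toy zone-based release: limited, normal, and flood storage zones
    with monthly target volumes, as described in the abstract."""
    target = params["monthly_target"][month]       # target storage (m3)
    dead, flood = params["dead"], params["flood"]  # zone boundaries (m3)
    if storage <= dead:                            # limited zone
        release = params["min_release"]
    elif storage < flood:                          # normal zone: steer to target
        release = params["min_release"] + max(0.0, storage - target) / params["dt"]
    else:                                          # flood zone: pass inflow + spill
        release = inflow + (storage - flood) / params["dt"]
    return min(release, params["max_release"])

def irrigation_demand(pet, aet, efficiency=0.7):
    # Demand estimated from the PET-AET gap, per the described algorithm.
    return max(0.0, pet - aet) / efficiency

params = {"monthly_target": [8e8] * 12, "dead": 2e8, "flood": 9e8,
          "min_release": 50.0, "max_release": 3000.0, "dt": 86400.0}
print(reservoir_release(9.5e8, 1200.0, 5, params))  # flood-zone spill case
print(irrigation_demand(pet=5.2, aet=3.1))          # mm/day of demand
```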
1983-12-01
therefore, any possible changes in floodplain regulation would be independent of project implementation. The existing regulation affects properties...to 0.4. Based on engineering experience there is a tendency toward independence as tributary drainage area size decreases. Frequency-discharge...stages on the Wisconsin River. Similarly the storage areas are analyzed as independent systems; thereby, reduction in flood elevations (routing) and
Process-based model with flood control measures towards more realistic global flood modeling
NASA Astrophysics Data System (ADS)
Tang, Q.; Zhang, X.; Wang, Y.; Mu, M.; Lv, A.; Li, Z.
2017-12-01
In the profoundly human-influenced era, the Anthropocene, an increasing amount of land has been developed in flood plains, and many flood control measures have been implemented to protect people and infrastructure placed in flood-prone areas. These human influences (for example, dams and dykes) have altered peak streamflow and flood risk, and are already an integral part of floods. However, most process-based flood models have yet to take these human influences into account. In this study, we used a hydrological model together with an advanced hydrodynamic model to assess flood risk in the Baiyangdian catchment. The Baiyangdian Lake is the largest shallow freshwater lake in North China, and it was used as a flood storage area in the past. A new development hub for the Beijing-Tianjin-Hebei economic triangle, namely the Xiongan new area, was recently established in the flood-prone area around the lake. The shuttle radar topography mission (SRTM) digital elevation model (DEM) was used to parameterize the hydrodynamic model simulation, and the inundation estimates were compared with published flood maps and observed inundation areas during extreme historical flood events. A simple scheme was implemented to consider the impacts of flood control measures, including the reservoirs in the headwaters and the dykes to be built. By comparing model simulations with and without the influences of flood control measures, we demonstrated the importance of human influences in altering the inundated area and depth under design flood conditions. Based on the SRTM DEM and dam and reservoir data in the Global Reservoir and Dam (GRanD) database, we further discuss the potential to develop a global flood model with human influences.
Validation of satellite-based operational flood monitoring in Southern Queensland, Australia
NASA Astrophysics Data System (ADS)
Gouweleeuw, Ben; Ticehurst, Catherine; Lerat, Julien; Thew, Peter
2010-05-01
The integration of remote sensing observations with stage data and flood modeling has the potential to provide improved support to a number of disciplines, such as flood warning, emergency response, and operational water resources management. The ability of remote sensing technology to monitor the dynamics of hydrological events lies in its capacity to map surface water. For flood monitoring, remote sensing imagery needs to be available sufficiently frequently to capture subsequent inundation stages. MODIS optical data are available at a moderately high spatial and temporal resolution (250 m-1 km, twice daily), but are affected by cloud cover. AMSR-E passive microwave observations are available at comparable temporal resolution but coarse spatial resolution (5-70 km), where the smaller footprints correspond to the higher frequency bands, which are affected by precipitating clouds. A novel operational technique to monitor flood extent combines MODIS reflectance and AMSR-E passive microwave imagery to optimize data continuity. Flood extent is subsequently combined with a DEM to obtain total flood water volume. The flood extent and volume product is operational for the lower Balonne floodplain in Southern Queensland, Australia. For validation purposes, two moderate flood events coinciding with the MODIS and AMSR-E sensor lifetimes are evaluated. The flood volume estimated from MODIS/AMSR-E images gives an accurate indication of both the timing and the magnitude of the flood peak compared to the net volume from recorded flow. In the flood recession, however, the satellite-derived water volume declines rapidly, while the net flow volume remains level. This may be explained by a combination of ungauged outflows, soil infiltration, evaporation, and diversion of flood water into many large open reservoirs for irrigation purposes. With the open water storage extent unchanged, the water volume product is not sensitive enough to capture the change in storage water level. Additional information on the latter, e.g. via telemetered buoys, may circumvent this limitation.
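At its simplest, combining an extent mask with a DEM to get a volume is a sum of depths over the inundated cells. A minimal sketch, assuming a single water-surface elevation for the reach (which the operational product does not literally assume):

```python
import numpy as np

def flood_volume(extent_mask, dem, water_level, cell_area_m2):
    """Floodwater volume implied by an extent mask, a DEM, and an assumed
    water-surface elevation; illustrative only."""
    depth = np.where(extent_mask, np.maximum(water_level - dem, 0.0), 0.0)
    return depth.sum() * cell_area_m2

dem = np.array([[1.0, 1.2], [0.8, 2.0]])        # elevations (m)
mask = np.array([[True, True], [True, False]])  # observed flood extent
print(flood_volume(mask, dem, water_level=1.5, cell_area_m2=250.0 ** 2))
```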
Risk-trading in flood management: An economic model.
Chang, Chiung Ting
2017-09-15
Although flood management is no longer exclusively a topic of engineering, flood mitigation continues to be associated with hard engineering options. Flood adaptation or the capacity to adapt to flood risk, as well as a demand for internalizing externalities caused by flood risk between regions, complicate flood management activities. Even though integrated river basin management has long been recommended to resolve the above issues, it has proven difficult to apply widely, and sometimes even to bring into existence. This article explores how internalization of externalities as well as the realization of integrated river basin management can be encouraged via the use of a market-based approach, namely a flood risk trading program. In addition to maintaining efficiency of optimal resource allocation, a flood risk trading program may also provide a more equitable distribution of benefits by facilitating decentralization. This article employs a graphical analysis to show how flood risk trading can be implemented to encourage mitigation measures that increase infiltration and storage capacity. A theoretical model is presented to demonstrate the economic conditions necessary for flood risk trading. Copyright © 2017 Elsevier Ltd. All rights reserved.
The August 1975 Flood over Central China
NASA Astrophysics Data System (ADS)
Yang, Long; Smith, James; Liu, Maofeng; Baeck, MaryLynn
2016-04-01
The August 1975 flood in Central China was one of the most destructive floods in history, resulting in 26 000 fatalities, leaving about 10 million people with insufficient shelter, and producing long-lasting famine and disease. The extreme rainfall responsible for this flood event was associated with typhoon Nina during 5-7 August 1975. Despite the prominence of the August 1975 flood, analyses of the storms producing the flood and of the resulting flood are sparse, and even fewer attempts have been made from the perspective of numerical simulation. We examine details of the extreme rainfall for the August 1975 flood based on downscaling simulations using the Weather Research and Forecasting (WRF) model driven by 20th Century Reanalysis fields. We further place key hydrometeorological features of the flood event in a climatological context through analyses of the 20th Century Reanalysis fields. Results indicate the interrelated roles of multiple mesoscale ingredients for deep, moist convection in producing extreme rainfall for the August 1975 flood, superimposed on an anomalous synoptic environment. Attribution analyses of the source of water vapor for this flood event will be conducted based on the Lagrangian parcel tracking algorithm LAGRANTO. The analytical framework developed in this study aims to explore the use of hydrometeorological approaches in flood-control engineering design by providing details on key elements of flood-producing storms.
A hybrid deep neural network and physically based distributed model for river stage prediction
NASA Astrophysics Data System (ADS)
Hitokoto, Masayuki; Sakuraba, Masaaki
2016-04-01
We developed a real-time river stage prediction model using a hybrid of a deep neural network and a physically based distributed model. As the basic model, a 4-layer feed-forward artificial neural network (ANN) was used. As the network training method, the deep learning technique was applied: to optimize the network weights, the stochastic gradient descent method based on back propagation was used, and as a pre-training method, the denoising autoencoder was used. Inputs of the ANN model are the hourly change of water level and hourly rainfall; the output is the water level at the downstream station. In general, the desirable input of an ANN has a strong correlation with the output. In conceptual hydrological models such as the tank model and the storage-function model, river discharge is governed by the catchment storage. Therefore, the change of the catchment storage, rainfall minus downstream discharge, can be a potent input candidate for the ANN model instead of rainfall. From this point of view, the hybrid of the deep neural network and the physically based distributed model was developed. The prediction procedure of the hybrid model is as follows: first, downstream discharge is calculated by the distributed model; then the hourly change of catchment storage is estimated from rainfall and the calculated discharge and used as input to the ANN model; finally, the ANN model is run. In the training phase, the hourly change of catchment storage can be calculated from the observed rainfall and discharge data. The developed model was applied to one catchment of the OOYODO River, one of the first-grade rivers in Japan. The modeled catchment is 695 square km. For the training data, 5 water level gauging stations and 14 rain-gauge stations in the catchment were used. The training floods, the 24 largest events, were selected from the period 2005-2014. Prediction was made up to 6 hours ahead, and 6 models were developed, one for each prediction time. To set proper learning parameters and the network architecture of the ANN model, a sensitivity analysis was done using a case study approach. The prediction results were evaluated on the 4 largest flood events by leave-one-out cross validation. The prediction result of the basic 4-layer ANN was better than that of the conventional 3-layer ANN model. However, it did not reproduce the biggest flood event well, presumably because of the lack of sufficiently high water level flood events in the training data. The hybrid model outperforms the basic ANN model and the distributed model, and especially improves on the basic ANN model's performance for the biggest flood event.
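The storage-change input is straightforward to construct. The sketch below derives the catchment-storage feature from synthetic rainfall and modeled discharge and fits a small feed-forward network with scikit-learn; the data, lag lengths, and layer sizes are assumptions for illustration, and scikit-learn does not reproduce the paper's denoising-autoencoder pre-training:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hourly rainfall (mm) and discharge (m3/s) for a toy 695 km2 catchment;
# in the described scheme the discharge would come from the physically
# based distributed model, here it is synthetic.
rng = np.random.default_rng(1)
rain = rng.gamma(0.5, 2.0, 500)
q_model = np.convolve(rain, np.exp(-np.arange(24) / 6.0), mode="full")[:500]

area_km2 = 695.0
# Hourly change of catchment storage: rainfall in minus discharge out,
# with discharge converted to mm over the catchment per hour.
dS = rain - q_model * 3600.0 / (area_km2 * 1e6) * 1000.0

# Features: last 6 hours of storage change and of stage change; target:
# stage 6 hours ahead (stage here is a synthetic proxy series).
stage = np.cumsum(dS) * 0.01
lag, lead = 6, 6
X = np.stack([np.stack([dS[t - lag:t],
                        np.diff(stage[t - lag - 1:t])], axis=0).ravel()
              for t in range(lag + 1, len(stage) - lead)])
y = stage[lag + 1 + lead:len(stage)]
model = MLPRegressor(hidden_layer_sizes=(32, 32), solver="sgd",
                     max_iter=2000, random_state=0).fit(X, y)
```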
NASA Astrophysics Data System (ADS)
Boyko, Oleksiy; Zheleznyak, Mark
2015-04-01
The original numerical code TOPKAPI-IMMS of the distributed rainfall-runoff model TOPKAPI (Todini et al., 1996-2014) was developed and implemented in Ukraine. A parallel version of the code has recently been developed for use on multiprocessor systems, including multicore PCs and clusters. The algorithm is based on a binary-tree decomposition of the watershed that balances the amount of computation across processors/cores. The Message Passing Interface (MPI) protocol is used as the parallel computing framework. The numerical efficiency of the parallelization algorithms is demonstrated in case studies of flood predictions for mountain watersheds of the Ukrainian Carpathian region. The modeling results are compared with predictions based on lumped-parameter models.
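The load-balancing idea can be sketched without MPI. The toy below recursively halves a list of subbasins by computational weight into 2**depth groups, one per rank; a real decomposition would also respect the drainage-network topology, which this sketch ignores:

```python
def split_balanced(subbasins, depth):
    """Recursively split (id, work) pairs into 2**depth groups of roughly
    equal total work: the binary-tree balancing idea, sans MPI."""
    if depth == 0:
        return [subbasins]
    total = sum(w for _, w in subbasins)
    acc, cut = 0.0, len(subbasins)
    for i, (_, w) in enumerate(subbasins):
        acc += w
        if acc >= total / 2.0:
            cut = i + 1
            break
    left, right = subbasins[:cut], subbasins[cut:]
    return split_balanced(left, depth - 1) + split_balanced(right, depth - 1)

work = [("sb%d" % i, w) for i, w in enumerate([5, 1, 3, 7, 2, 2, 4, 6])]
for rank, group in enumerate(split_balanced(work, depth=2)):
    print(rank, group, sum(w for _, w in group))
```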
NASA Technical Reports Server (NTRS)
Reager, John T.; Thomas, Alys C.; Sproles, Eric A.; Rodell, Matthew; Beaudoing, Hiroko K.; Li, Bailing; Famiglietti, James S.
2015-01-01
We evaluate performance of the Catchment Land Surface Model (CLSM) under flood conditions after the assimilation of observations of the terrestrial water storage anomaly (TWSA) from NASA's Gravity Recovery and Climate Experiment (GRACE). Assimilation offers three key benefits for the viability of GRACE observations to operational applications: (1) near-real time analysis; (2) a downscaling of GRACE's coarse spatial resolution; and (3) state disaggregation of the vertically-integrated TWSA. We select the 2011 flood event in the Missouri river basin as a case study, and find that assimilation generally made the model wetter in the months preceding flood. We compare model outputs with observations from 14 USGS groundwater wells to assess improvements after assimilation. Finally, we examine disaggregated water storage information to improve the mechanistic understanding of event generation. Validation establishes that assimilation improved the model skill substantially, increasing regional groundwater anomaly correlation from 0.58 to 0.86. For the 2011 flood event in the Missouri river basin, results show that groundwater and snow water equivalent were contributors to pre-event flood potential, providing spatially-distributed early warning information.
Detection and attribution of flood change across the United States
NASA Astrophysics Data System (ADS)
Archfield, Stacey
2017-04-01
In the United States, there has been an increasing number of studies quantifying trends in the annual maximum flood; yet few studies examine trends in floods that may occur more than once in a given year, and even fewer assess trends in floods on rivers that have undergone substantial changes due to urbanization, land-cover change, and agricultural drainage practices. Previous research has shown that, for streamgages having minimal direct human intervention, trends in the peak magnitude, frequency, duration, and volume of frequent floods (floods occurring at an average of two events per year relative to a base period) across the United States show large changes; however, few trends are found to be statistically significant. This study extends previous research to include streamgages that have experienced confounding alterations to streamflow (urbanization, storage, and land-cover changes), providing a comprehensive assessment of flood change across the United States. Attribution of these changes is also explored.
Prioritizing the Components of Vulnerability: A Genetic Algorithm Minimization of Flood Risk
NASA Astrophysics Data System (ADS)
Bongolan, Vena Pearl; Ballesteros, Florencio; Baritua, Karessa Alexandra; Junne Santos, Marie
2013-04-01
We define a flood-resistant city as an optimal arrangement of communities according to their traits, with the goal of minimizing flooding vulnerability via a genetic algorithm. We prioritize the different components of flooding vulnerability, giving each component a weight and thus expressing vulnerability as a weighted sum. This serves as the fitness function for the genetic algorithm. We also allow non-linear interactions among related but independent components, viz., poverty and mortality rate, and literacy and radio/TV penetration. The designs produced reflect the relative importance of the components, and we observed a synchronicity between the interacting components, giving us a more consistent design.
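The weighted-sum fitness with interaction terms, and the permutation GA around it, fit in a short sketch; the weights, interaction coefficients, and GA settings below are invented for illustration:

```python
import random

WEIGHTS = {"poverty": 0.3, "mortality": 0.25, "literacy": 0.2,
           "radio_tv": 0.1, "exposure": 0.15}  # illustrative priorities

def vulnerability(community, site_hazard):
    # Weighted sum plus simple nonlinear interactions between related
    # components (poverty x mortality, literacy x radio/TV), as described.
    v = sum(WEIGHTS[k] * community[k] for k in WEIGHTS)
    v += 0.1 * community["poverty"] * community["mortality"]
    v -= 0.1 * community["literacy"] * community["radio_tv"]
    return v * site_hazard

def fitness(arrangement, communities, hazards):
    # Total vulnerability of placing community arrangement[i] at site i.
    return sum(vulnerability(communities[c], hazards[i])
               for i, c in enumerate(arrangement))

def ga(communities, hazards, pop=50, gens=200):
    n = len(hazards)
    population = [random.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda a: fitness(a, communities, hazards))
        survivors = population[:pop // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = random.sample(range(n), 2)  # swap keeps permutation valid
            child[i], child[j] = child[j], child[i]
            children.append(child)
        population = survivors + children
    return min(population, key=lambda a: fitness(a, communities, hazards))

random.seed(0)
communities = [dict(poverty=random.random(), mortality=random.random(),
                    literacy=random.random(), radio_tv=random.random(),
                    exposure=random.random()) for _ in range(8)]
hazards = [random.random() for _ in range(8)]
print(ga(communities, hazards))
```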
Speer, Paul R.; Gamble, Charles R.
1965-01-01
This report presents a means of determining the probable magnitude and frequency of floods of any recurrence interval from 1.1 to 50 years at most points on streams in the Ohio River basin except the Cumberland and Tennessee River basins. Curves are defined that show the relation between the drainage area and the mean annual flood in eight hydrologic areas, and composite frequency curves define the relation of a flood of any recurrence interval from 1.1 to 50 years to the mean annual flood. These two relations are based upon gaging-station records having 10 or more years of record not materially affected by storage or diversion, and the results obtainable from them will represent the magnitude and frequency of natural floods within the range and recurrence intervals defined by the base data. The report also contains a compilation of flood records at all sites in the area at which records have been collected for 5 or more consecutive years. As far as was possible, at each location for which discharge has been determined, the tabulations include all floods above a selected base. Where only gage heights have been obtained or where the data did not warrant computation of peak discharges above a selected base, only annual peaks are shown. The maximum known flood discharges for the streamflow stations and miscellaneous points except Ohio River main stem stations, together with areal floods of 10- and 50-year recurrence intervals, are plotted against the size of drainage area for each flood region and hydrologic area to provide a convenient means of judging the frequency of the maximum known floods that have been recorded for these points.
The serial message-passing schedule for LDPC decoding algorithms
NASA Astrophysics Data System (ADS)
Liu, Mingshan; Liu, Shanshan; Zhou, Yuan; Jiang, Xue
2015-12-01
The conventional message-passing schedule for LDPC decoding algorithms is the so-called flooding schedule. It has the disadvantage that updated messages cannot be used until the next iteration, thus reducing the convergence speed. To address this, the layered decoding algorithm (LBP), based on a serial message-passing schedule, was proposed. In this paper the decoding principle of the LBP algorithm is briefly introduced, and two improved algorithms are then proposed: the grouped serial decoding algorithm (Grouped LBP) and the semi-serial decoding algorithm. They improve the LBP algorithm's decoding speed while maintaining good decoding performance.
NASA Astrophysics Data System (ADS)
Tang, Li; Liu, Jing-Ning; Feng, Dan; Tong, Wei
2008-12-01
Existing security solutions in network storage environments perform poorly because cryptographic operations (encryption and decryption) implemented in software can dramatically reduce system performance. In this paper we propose a cryptographic hardware accelerator on a dynamically reconfigurable platform for the security of high-performance network storage systems. We employ a dynamically reconfigurable platform based on an FPGA to implement a PowerPC-based embedded system, which executes cryptographic algorithms. To reduce the reconfiguration latency, we apply prefetch scheduling. Moreover, the processing elements can be dynamically configured to support different cryptographic algorithms according to the requests received by the accelerator. In the experiments, we implemented the AES (Rijndael) and 3DES cryptographic algorithms in the reconfigurable accelerator. Our proposed reconfigurable cryptographic accelerator can dramatically increase performance compared with traditional software-based network storage systems.
NASA Astrophysics Data System (ADS)
Zhang, Xiaolei; Song, Yuqin
2014-11-01
Wetland restoration in floodplains is an ecological solution that can address basin-wide flooding issues and minimize flooding and damages to riverine and downstream areas. High population densities, large economic outputs, and heavy reliance on water resources make flood retention and management pressing issues in China. To balance flood control and sustainable development economically, socially, and politically, flood retention areas have been established to increase watershed flood storage capacities and enhance the public welfare for the populace living in the areas. However, conflicts between flood storage functions and human habitation appear irreconcilable. We developed a site-specific methodology for identifying potential sites and functional zones for wetland restoration in a flood retention area in middle and eastern China, optimizing the spatial distribution and functional zones to maximize flood control and human and regional development. This methodology was applied to Mengwa, one of 21 flood retention areas in China's Huaihe River Basin, using nine scenarios that reflected different flood, climatic, and hydraulic conditions. The results demonstrated improved flood retention and ecological functions, as well as increased economic benefits.
Designing and operating infrastructure for nonstationary flood risk management
NASA Astrophysics Data System (ADS)
Doss-Gollin, J.; Farnham, D. J.; Lall, U.
2017-12-01
Climate exhibits organized low-frequency and regime-like variability at multiple time scales, causing the risk associated with climate extremes such as floods and droughts to vary in time. Despite broad recognition of this nonstationarity, there has been little theoretical development of ideas for the design and operation of infrastructure considering the regime structure of such changes and their potential predictability. We use paleo streamflow reconstructions to illustrate an approach to the design and operation of infrastructure to address nonstationary flood and drought risk. Specifically, we consider the tradeoff between flood control and conservation storage, and develop design and operation principles for allocating these storage volumes considering both an m-year project planning period and an n-year historical sampling record. As n increases, the potential uncertainty in probabilistic estimates of the return periods associated with the T-year extreme event decreases. As the duration m of the future operation period decreases, the uncertainty associated with the occurrence of the T-year event increases. Finally, given the quasi-periodic nature of the system, it may be possible to offer probabilistic predictions of the conditions in the m-year future period, especially if m is small. In the context of such predictions, one can consider that an m-year prediction may have lower bias, but higher variance, than would be associated with using a stationary estimate from the preceding n years. This bias-variance trade-off, and the potential for considering risk management for multiple values of m, provides an interesting system design challenge. We use wavelet-based simulation models in a Bayesian framework to estimate these biases and uncertainty distributions and devise a risk-optimized decision rule for the allocation of flood and conservation storage. The associated theoretical development also provides a methodology for the sizing of storage for new infrastructure under nonstationarity, and an examination of risk adaptation measures which consider both short-term and long-term options simultaneously.
NASA Astrophysics Data System (ADS)
Lotfata, A.; Ambinakudige, S.
2017-12-01
Coastal regions face a higher risk of flooding. A rise in sea level increases flooding chances in low-lying areas. A major concern is the effect of sea-level rise on the depth of the fresh water/salt water interface in the aquifers of coastal regions. Sea-level rise impacts the hydrological system of the aquifers: salt water intrusion into fresh water aquifers increases water table levels, and flood-prone areas on the coast are at a higher risk of salt water intrusion. The Gulf coast is one of the most vulnerable flood areas due to its natural weather patterns. There is not yet a local assessment of the relation between groundwater level and sea-level rise. This study investigates the projected sea-level rise models and the anomalous groundwater level during January 2002 to December 2016. We used NASA Gravity Recovery and Climate Experiment (GRACE) and Global Land Data Assimilation System (GLDAS) satellite data in the analysis, accounting for the leakage error and the measurement error in the GRACE data. GLDAS data were used to calculate the groundwater storage from the total water storage estimated using GRACE data: ΔGW = ΔTWS (soil moisture, surface water, groundwater, and canopy water) − ΔGLDAS (soil moisture, surface water, and canopy water). The preliminary results indicate that the total water storage is increasing in parts of the Gulf of Mexico. GRACE data show high soil wetness and groundwater levels on the Mississippi, Alabama, and Texas coasts. Because sea-level rise increases the probability of flooding on the Gulf coast and affects the groundwater, we will analyze probable interactions between sea-level rise and groundwater in the study area. To understand regional sea-level rise patterns, we will investigate GRACE ocean data along the Gulf coasts. We will quantify ocean total water storage, its salinity, and its relationship with groundwater level variations on the Gulf coast.
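The storage decomposition is a one-line budget once the component grids are co-registered; a minimal sketch with illustrative array names:

```python
import numpy as np

def groundwater_anomaly(tws, soil_moisture, surface_water, canopy_water):
    """ΔGW = ΔTWS − ΔGLDAS(soil moisture + surface water + canopy water);
    all inputs are co-registered anomaly grids (cm equivalent water height)."""
    return tws - (soil_moisture + surface_water + canopy_water)

# Toy 2x2 anomaly grids:
tws = np.array([[5.0, 4.0], [3.5, 6.0]])
sm, sw, cw = np.full((2, 2), 2.0), np.full((2, 2), 0.5), np.full((2, 2), 0.1)
print(groundwater_anomaly(tws, sm, sw, cw))
```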
Damming the genomic data flood using a comprehensive analysis and storage data structure
Bouffard, Marc; Phillips, Michael S.; Brown, Andrew M.K.; Marsh, Sharon; Tardif, Jean-Claude; van Rooij, Tibor
2010-01-01
Data generation, driven by rapid advances in genomic technologies, is fast outpacing our analysis capabilities. Faced with this flood of data, more hardware and software resources are added to accommodate data sets whose structure has not specifically been designed for analysis. This leads to unnecessarily lengthy processing times and excessive data handling and storage costs. Current efforts to address this have centered on developing new indexing schemas and analysis algorithms, whereas the root of the problem lies in the format of the data itself. We have developed a new data structure for storing and analyzing genotype and phenotype data. By leveraging data normalization techniques, database management system capabilities and the use of a novel multi-table, multidimensional database structure we have eliminated the following: (i) unnecessarily large data set size due to high levels of redundancy, (ii) sequential access to these data sets and (iii) common bottlenecks in analysis times. The resulting novel data structure horizontally divides the data to circumvent traditional problems associated with the use of databases for very large genomic data sets. The resulting data set required 86% less disk space and performed analytical calculations 6248 times faster compared to a standard approach without any loss of information. Database URL: http://castor.pharmacogenomics.ca PMID:21159730
Efficient Retrieval of Massive Ocean Remote Sensing Images via a Cloud-Based Mean-Shift Algorithm.
Yang, Mengzhao; Song, Wei; Mei, Haibin
2017-07-23
The rapid development of remote sensing (RS) technology has resulted in the proliferation of high-resolution images. There are challenges involved not only in storing large volumes of RS images but also in rapidly retrieving the images for ocean disaster analysis, such as for storm surges and typhoon warnings. In this paper, we present an efficient retrieval method for massive ocean RS images via a Cloud-based mean-shift algorithm. A distributed construction method via the pyramid model is proposed, based on the maximum hierarchical layer algorithm, and used to realize an efficient storage structure for RS images on the Cloud platform. We achieve high-performance processing of massive RS images in the Hadoop system. Based on the pyramid Hadoop distributed file system (HDFS) storage method, an improved mean-shift algorithm for RS image retrieval is presented by fusion with the canopy algorithm via Hadoop MapReduce programming. The results show that the new method achieves better performance for data storage than HDFS alone and WebGIS-based HDFS. Speedup and scaleup are very close to linear with an increase in RS images, which shows that image retrieval using our method is efficient. PMID:28737699
A framework for global river flood risk assessments
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.
2012-08-01
There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate. The framework estimates hazard at high resolution (~1 km2) using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood routing model, and importantly, a flood extent downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied to Bangladesh as a first case-study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard and damage estimates has been performed using the Dartmouth Flood Observatory database and damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
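Once hazard (exceedance probabilities) and impact (damages) are on the same grid, the risk indicator reduces to an integral of damage over probability. A minimal sketch of the expected-annual-damage calculation, with an invented event set:

```python
import numpy as np

def expected_annual_damage(exceed_prob, damage):
    """Expected annual damage: integral of damage over exceedance
    probability, approximated with the trapezoidal rule."""
    p = np.asarray(exceed_prob, dtype=float)
    d = np.asarray(damage, dtype=float)
    order = np.argsort(p)            # integrate over increasing probability
    p, d = p[order], d[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# Illustrative damages (million $) for the 1000-, 100-, 10- and 2-year floods:
print(expected_annual_damage([0.001, 0.01, 0.1, 0.5], [900, 300, 40, 0]))
```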
Magnitude and frequency of flooding on the Myakka River, Southwest Florida
Hammett, K.M.; Turner, J.F.; Murphy, W.R.
1978-01-01
Increasing numbers of urban and agricultural developments are being located on waterfront property in the Myakka River flood plain in southwest Florida. Under natural conditions, a large depression, Tatum Sawgrass, was available as a flood storage area in the upper Myakka River basin. Construction of dikes across the lower part of Tatum Sawgrass has restricted use of the depression for temporary storage of Myakka River flood water overflow, and has resulted in increased flood-peak discharges and flood heights in downstream reaches of the Myakka River. The difference between natural and diked condition flood-peak discharges and flood heights is presented to illustrate the effects of the dikes. Flood-peak discharges, water-surface elevations and flood profiles also are provided for diked conditions. Analytical procedures used to evaluate diking effects are described in detail. The study reach includes Myakka River main stem upstream from U.S. Highway 41, near Myakka Shores in Sarasota County, to State Road 70 near Myakka City in Manatee County (including Tatum Sawgrass and Clay Gully), and Blackburn Canal from Venice By-Way to Myakka River. (Woodard-USGS)
NASA Astrophysics Data System (ADS)
Chen, Y.
2017-12-01
Urbanization has been the dominant global development trend of the past century, and developing countries have experienced much more rapid urbanization in recent decades. Urbanization brings many benefits to human beings, but it also causes negative impacts, such as increased flood risk. The impact of urbanization on flood response has long been observed, but studying this effect quantitatively still faces great challenges. For example, setting up an appropriate hydrological model representing the changed flood responses and determining accurate model parameters are very difficult in an urbanized or urbanizing watershed. In the Pearl River Delta area, the most rapid urbanization in China has been observed over the past decades, and dozens of highly urbanized watersheds have appeared. In this study, a physically based distributed watershed hydrological model, the Liuxihe model, is employed and revised to simulate the hydrological processes of highly urbanized watershed floods in the Pearl River Delta area. A virtual soil type is defined in the terrain properties dataset, and its runoff production and routing algorithms are added to the Liuxihe model. Based on a parameter sensitivity analysis, the key hydrological processes of a highly urbanized watershed are identified, which provides insight into the hydrological processes and into parameter optimization. The model is then set up in the Songmushan watershed, where hydrological observations are available. A model parameter optimization and updating strategy is proposed based on remotely sensed land use/cover (LUC) types, which optimizes model parameters with the PSO algorithm and updates them based on the changed LUC types. The model parameters in the Songmushan watershed are regionalized to the other Pearl River Delta watersheds based on their LUC types. A dozen watersheds in the highly urbanized area of Dongguan City in the Pearl River Delta were studied for flood response changes due to urbanization, and the results show that urbanization has a large impact on watershed flood responses: peak flow increased severalfold after urbanization, a much larger change than previously reported.
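Parameter optimization with PSO, as mentioned above, follows the usual velocity-update recipe. A generic minimal sketch (not the Liuxihe model's calibration code; the toy objective stands in for a streamflow error metric):

```python
import numpy as np

def pso(objective, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer for model-parameter calibration."""
    rng = np.random.default_rng(0)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))   # positions
    v = np.zeros_like(x)                              # velocities
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()                # global best
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        g = pbest[pbest_f.argmin()].copy()
    return g

# Calibrate two hypothetical runoff parameters against a toy error surface:
bounds = np.array([[0.0, 1.0], [0.0, 10.0]])
print(pso(lambda p: (p[0] - 0.3) ** 2 + (p[1] - 4.0) ** 2, bounds))
```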
Estimating floodwater depths from flood inundation maps and topography
Cohen, Sagy; Brakenridge, G. Robert; Kettner, Albert; Bates, Bradford; Nelson, Jonathan M.; McDonald, Richard R.; Huang, Yu-Fen; Munasinghe, Dinuke; Zhang, Jiaqi
2018-01-01
Information on flood inundation extent is important for understanding societal exposure, water storage volumes, flood wave attenuation, future flood hazard, and other variables. A number of organizations now provide flood inundation maps based on satellite remote sensing. These data products can efficiently and accurately provide the areal extent of a flood event, but do not provide floodwater depth, an important attribute for first responders and damage assessment. Here we present a new methodology and a GIS-based tool, the Floodwater Depth Estimation Tool (FwDET), for estimating floodwater depth based solely on an inundation map and a digital elevation model (DEM). We compare the FwDET results against water depth maps derived from hydraulic simulations of two flood events: a large-scale event, for which we use medium-resolution (10 m) input layers, and a small-scale event, for which we use high-resolution (LiDAR; 1 m) input. Further testing is performed for two inundation maps with a number of challenging features, including a narrow valley, a large reservoir, and an urban setting. The results show that FwDET can accurately calculate floodwater depth for diverse flooding scenarios but also leads to considerable bias in locations where the inundation extent does not align well with the DEM. In these locations, manual adjustment or higher-spatial-resolution input is required.
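The core idea, sampling the water-surface elevation along the inundation boundary and subtracting the DEM inside it, can be sketched with a distance transform; this is a simplified reading of the approach, not the published tool:

```python
import numpy as np
from scipy import ndimage

def floodwater_depth(inundation, dem):
    """Assign each wet cell the DEM elevation of its nearest inundation-
    boundary cell (taken as the local water surface), then subtract the DEM."""
    dry = ~inundation
    # Boundary cells: wet cells touching at least one dry cell.
    boundary = ndimage.binary_dilation(dry) & inundation
    # Nearest boundary cell for every pixel via a Euclidean distance transform.
    dist, inds = ndimage.distance_transform_edt(~boundary, return_indices=True)
    iy, ix = inds
    water_surface = dem[iy, ix]
    return np.where(inundation, np.clip(water_surface - dem, 0.0, None), 0.0)

# Toy valley sloping toward row 5; cells below 1.0 m are mapped as wet.
dem = np.repeat(np.linspace(2.0, 0.0, 6)[:, None], 6, axis=1)
wet = dem < 1.0
print(floodwater_depth(wet, dem).round(2))
```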
Mapping Flood Protection Benefits from Restored Wetlands at the Urban-Suburban Interface
Urbanization exacerbates flooding by increasing runoff and decreasing surface water storage. Restoring wetlands can enhance flood protection while providing a suite of co-benefits such as temperature regulation and access to open space. Spatial modeling of the delivery of flood p...
Computational approaches for the classification of seed storage proteins.
Radhika, V; Rao, V Sree Hari
2015-07-01
Seed storage proteins comprise a major part of the protein content of the seed and play an important role in its quality. These storage proteins are important because they determine the total protein content and affect the nutritional quality and functional properties for food processing. Transgenic plants are being used to develop improved lines for incorporation into plant breeding programs, and the nutrient composition of seeds is a major target of molecular breeding programs. Hence, classification of these proteins is crucial for the development of superior varieties with improved nutritional quality. In this study we applied machine learning algorithms to the classification of seed storage proteins. We present an algorithm based on a nearest neighbor approach for classification of seed storage proteins and compare its performance with the J48 decision tree, a multilayer perceptron (MLP) neural network, and the support vector machine implementation libSVM. The model based on our algorithm gives higher classification accuracy than the other methods.
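As a flavor of the comparison baseline with stock tooling (the features below are synthetic stand-ins for, e.g., amino-acid composition vectors, and the paper's own nearest-neighbor variant is not reproduced):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic 4-class problem standing in for seed-storage-protein classes.
X, y = make_classification(n_samples=300, n_features=20, n_classes=4,
                           n_informative=8, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5)
print(cross_val_score(knn, X, y, cv=5).mean())  # mean CV accuracy
```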
Analysis of flood inundation in ungauged basins based on multi-source remote sensing data.
Gao, Wei; Shen, Qiu; Zhou, Yuehua; Li, Xin
2018-02-09
Floods are among the most expensive natural hazards experienced in many places of the world and can result in heavy losses of life and economic damage. The objective of this study is to analyze flood inundation in ungauged basins by performing near-real-time detection of flood extent and depth based on multi-source remote sensing data. Via spatial distribution analysis of flood extent and depth in a time series, the inundation conditions and the characteristics of the flood disaster can be described. The results show that multi-source remote sensing data can make up for the lack of hydrological data in ungauged basins, which helps to reconstruct the hydrological sequence; the combination of MODIS (moderate-resolution imaging spectroradiometer) surface reflectance products and the DFO (Dartmouth Flood Observatory) flood database can achieve macro-dynamic monitoring of flood inundation in ungauged basins, and the differencing of high-resolution optical and microwave images before and after floods can then be used to calculate flood extent and reflect spatial changes of inundation; the monitoring algorithm for flood depth combining RS and GIS is simple and can quickly calculate depth from a known flood extent obtained from remote sensing images in ungauged basins. These results can provide effective help for the disaster relief work performed by government departments.
Lead/acid batteries in systems to improve power quality
NASA Astrophysics Data System (ADS)
Taylor, P.; Butler, P.; Nerbun, W.
Increasing dependence on computer technology is driving needs for extremely high-quality power to prevent loss of information, material, and workers' time that represent billions of dollars annually. This cost has motivated commercial and Federal research and development of energy storage systems that detect and respond to power-quality failures in milliseconds. Electrochemical batteries are among the storage media under investigation for these systems. Battery energy storage systems that employ either flooded lead/acid or valve-regulated lead/acid battery technologies are becoming commercially available to capture a share of this emerging market. Cooperative research and development between the US Department of Energy and private industry have led to installations of lead/acid-based battery energy storage systems to improve power quality at utility and industrial sites and commercial development of fully integrated, modular battery energy storage system products for power quality. One such system by AC Battery Corporation, called the PQ2000, is installed at a test site at Pacific Gas and Electric Company (San Ramon, CA, USA) and at a customer site at Oglethorpe Power Corporation (Tucker, GA, USA). The PQ2000 employs off-the-shelf power electronics in an integrated methodology to control the factors that affect the performance and service life of production-model, low-maintenance, flooded lead/acid batteries. This system, and other members of this first generation of lead/acid-based energy storage systems, will need to compete vigorously for a share of an expanding, yet very aggressive, power quality market.
Floods in south-central Oklahoma and north-central Texas, October 1981
Buckner, Harold D.; Kurklin, Joanne K.
1984-01-01
Substantial reductions in peak stages and discharges on the West Fork Trinity River downstream from Eagle Mountain Reservoir were attained as a result of reservoir storage. All floodwater on the Elm Fork Trinity River was contained by reservoir storage thus preventing a potentially devastating flood downstream on the Trinity River. Maximum stages and discharges and/or contents were recorded during and after this major flood at 83 gaging stations, crest-stage stations, reservoir stations, and a miscellaneous site.
Charge scheduling of an energy storage system under time-of-use pricing and a demand charge.
Yoon, Yourim; Kim, Yong-Hyuk
2014-01-01
A real-coded genetic algorithm is used to schedule the charging of an energy storage system (ESS), operated in tandem with renewable power by an electricity consumer who is subject to time-of-use pricing and a demand charge. Simulations based on load and generation profiles of typical residential customers show that an ESS scheduled by our algorithm can reduce electricity costs by approximately 17% compared to a system without an ESS, and by 8% compared to a scheduling algorithm based on net power. PMID:25197720
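The objective such a GA minimizes is easy to write down. A sketch of a daily cost function under TOU pricing and a demand charge, with assumed field names and demand-charge convention; a real scheduler would also enforce state-of-charge and power limits:

```python
def electricity_cost(load, pv, charge, tou_price, demand_rate):
    """Cost of one day under time-of-use pricing plus a demand charge.
    charge[t] is ESS charging power (negative = discharge); all series
    are hourly. Illustrative of the GA's objective, not the paper's code."""
    net = [load[t] - pv[t] + charge[t] for t in range(len(load))]
    energy_cost = sum(max(n, 0.0) * tou_price[t] for t, n in enumerate(net))
    demand_cost = demand_rate * max(max(net), 0.0)  # charge on peak draw
    return energy_cost + demand_cost

load = [1.0] * 24
pv = [0.0] * 6 + [2.0] * 12 + [0.0] * 6
tou = [0.1] * 8 + [0.3] * 12 + [0.1] * 4
charge = [0.5] * 6 + [-0.2] * 12 + [0.0] * 6   # charge at night, discharge midday
print(electricity_cost(load, pv, charge, tou, demand_rate=5.0))
```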
SeqCompress: an algorithm for biological sequence compression.
Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz; Bajwa, Hassan
2014-10-01
The growth of Next Generation Sequencing technologies presents significant research challenges, specifically in designing bioinformatics tools that handle massive amounts of data efficiently. Biological sequence data storage cost has become a noticeable proportion of the total cost of generation and analysis. In particular, the increase in DNA sequencing rate is significantly outstripping the rate of increase in disk storage capacity and may eventually exceed available storage. It is essential to develop algorithms that handle large data sets via better memory management. This article presents a DNA sequence compression algorithm, SeqCompress, that copes with the space complexity of biological sequences. The algorithm is based on lossless data compression and uses a statistical model as well as arithmetic coding to compress DNA sequences. The proposed algorithm is compared with recent specialized compression tools for biological sequences. Experimental results show that the proposed algorithm achieves better compression gain than other existing algorithms. Copyright © 2014 Elsevier Inc. All rights reserved.
Pixel-based flood mapping from SAR imagery: a comparison of approaches
NASA Astrophysics Data System (ADS)
Landuyt, Lisa; Van Wesemael, Alexandra; Van Coillie, Frieke M. B.; Verhoest, Niko E. C.
2017-04-01
Due to their all-weather, day-and-night capabilities, SAR sensors have been shown to be particularly suitable for flood mapping applications. Thus, they can provide spatially distributed flood extent data which are valuable for calibrating, validating, and updating flood inundation models. These models are an invaluable tool for water managers when taking appropriate measures in times of high water levels. Image analysis approaches to delineate flood extent on SAR imagery are numerous. They can be classified into two categories, i.e. pixel-based and object-based approaches. Pixel-based approaches, e.g. thresholding, are abundant and in general computationally inexpensive. However, large discrepancies between these techniques exist, and subjective user intervention is often needed. Object-based approaches require more processing but allow for the integration of additional object characteristics, like contextual information and object geometry, and thus have significant potential to provide an improved classification result. As a benchmark, a selection of pixel-based techniques is applied to an ERS-2 SAR image of the 2006 flood event on the River Dee, United Kingdom. This selection comprises Otsu thresholding, Kittler & Illingworth thresholding, the Fine To Coarse segmentation algorithm, and active contour modelling. The different classification results are evaluated and compared by means of several accuracy measures, including binary performance measures.
NASA Astrophysics Data System (ADS)
Guo, Aijun; Chang, Jianxia; Wang, Yimin; Huang, Qiang; Zhou, Shuai
2018-05-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on regional flood control systems. This work advances traditional flood risk analysis by proposing a univariate and copula-based bivariate hydrological risk framework which incorporates both flood control and sediment transport. In developing the framework, the conditional probabilities of different flood events under various extreme precipitation scenarios are estimated with the copula-based model. Moreover, a Monte Carlo-based algorithm is designed to quantify the sampling uncertainty associated with univariate and bivariate hydrological risk analyses. Two catchments located on the Loess Plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The univariate and bivariate return periods, risk, and reliability in the context of uncertainty are assessed for the study regions for the purposes of flood control and sediment transport. The results indicate that, in the UCX and UCH, sedimentation triggers higher risks to the safety of local flood control systems than does the event that the annual maximum flood (AMF) exceeds the design flood of downstream hydraulic structures. Moreover, there is considerable sampling uncertainty affecting the univariate and bivariate hydrologic risk evaluation, which greatly challenges measures of future flood mitigation. In addition, the results confirm that the developed framework can estimate conditional probabilities associated with different flood events under various extreme precipitation scenarios for flood control and sediment transport. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
18 CFR 1304.407 - Development within flood control storage zones of TVA reservoirs.
Code of Federal Regulations, 2010 CFR
2010-04-01
... zones of TVA reservoirs. (a) Activities involving development within the flood control storage zone on TVA reservoirs will be reviewed to determine if the proposed activity qualifies as a repetitive action... (v) The nature and significance of any economic and/or natural resource benefits that would be...
NASA Astrophysics Data System (ADS)
Herman, J. D.; Steinschneider, S.; Nayak, M. A.
2017-12-01
Short-term weather forecasts are not codified into the operating policies of federal, multi-purpose reservoirs, despite their potential to improve service provision. This is particularly true for facilities that provide flood protection and water supply, since the potential flood damages are often too severe to accept the risk of inaccurate forecasts. Instead, operators must maintain empty storage capacity to mitigate flood risk, even if the system is currently in drought, as occurred in California from 2012-2016. This study investigates the potential for forecast-informed operating rules to improve water supply efficiency while maintaining flood protection, combining state-of-the-art weather hindcasts with a novel tree-based policy optimization framework. We hypothesize that forecasts need only accurately predict the occurrence of a storm, rather than its intensity, to be effective in regions like California where wintertime, synoptic-scale storms dominate the flood regime. We also investigate the potential for downstream groundwater injection to improve the utility of forecasts. These hypotheses are tested in a case study of Folsom Reservoir on the American River. Because available weather hindcasts are relatively short (10-20 years), we propose a new statistical framework to develop synthetic forecasts to assess the risk associated with inaccurate forecasts. The efficiency of operating policies is tested across a range of scenarios that include varying forecast skill and additional groundwater pumping capacity. Results suggest that the combined use of groundwater storage and short-term weather forecasts can substantially improve the tradeoff between water supply and flood control objectives in large, multi-purpose reservoirs in California.
NASA Astrophysics Data System (ADS)
Park, J. H.; Jun, S. M.; Park, C. G.
2014-12-01
Abnormal climate phenomena and urbanization have recently changed the hydrological environment. To restore the hydrological cycle in urban areas, fundamental solutions such as decentralized rainwater management systems and Low Impact Development (LID) techniques may be chosen. In this study, SWMM 5 was used to analyze the effects of decentralized stormwater retention on preventing urban floods and securing instream flow. The Chunggyechun stream watershed (21.29 km²), which is located in Seoul, Korea, and fully developed as an urban area, was selected as the study watershed, and the runoff characteristics of the urban stream were analyzed under various LID techniques (permeable pavement, small rainwater storage tanks, large rainwater storage tanks). The simulation results show that the permeability of pavement materials and detention storage in the surface soil layer strongly affect the flood discharge, and that initial rainfall retention in the rainwater storage tanks helps reduce the flood peak. The peak discharge decreased by 22% for the design precipitation, and the instream flow increased by 55% when adequate LID techniques were used. These kinds of data could serve as base data for designing urban flood prevention facilities and for urban regeneration planning from the viewpoint of integrated watershed management.
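A minimal sketch of how such a SWMM 5 scenario can be driven from Python with the open-source pyswmm wrapper, tracking the peak outfall discharge; the input file name and node ID are placeholders, since the study's actual model files are not given.

# Sketch: running a SWMM 5 scenario with pyswmm and tracking peak outfall flow.
# "chunggyechun_lid.inp" and the node name "OUTFALL1" are placeholders.
from pyswmm import Simulation, Nodes

peak_flow = 0.0
with Simulation("chunggyechun_lid.inp") as sim:
    outfall = Nodes(sim)["OUTFALL1"]
    for _ in sim:                        # advances the simulation step by step
        peak_flow = max(peak_flow, outfall.total_inflow)

print(f"Peak discharge at outfall: {peak_flow:.2f} cms")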
NASA Astrophysics Data System (ADS)
Galantowicz, J. F.; Picton, J.; Root, B.
2017-12-01
Passive microwave remote sensing can provide a distinct perspective on flood events by virtue of wide sensor fields of view, frequent observations from multiple satellites, and sensitivity through clouds and vegetation. During Hurricanes Harvey and Irma, we used AMSR2 (Advanced Microwave Scanning Radiometer 2, JAXA) data to map flood extents starting from the first post-storm rain-free sensor passes. Our standard flood mapping algorithm (FloodScan) derives flooded fraction from 22-km microwave data (AMSR2 or NASA's GMI) in near real time and downscales it to 90-m resolution using a database built from topography, hydrology, and Global Surface Water Explorer data and normalized to microwave data footprint shapes. During Harvey and Irma we tested experimental versions of the algorithm designed to map the maximum post-storm flood extent rapidly and made a variety of map products available immediately for use in storm monitoring and response. The maps have several unique features, including spanning the entire storm-affected area and providing multiple post-storm updates as flood water shifted and receded. From the daily maps we derived secondary products such as flood duration, maximum flood extent (Figure 1), and flood depth. In this presentation, we describe flood extent evolution, maximum extent, and local details as detected by the FloodScan algorithm in the wake of Harvey and Irma. We compare FloodScan results to other available flood mapping resources, note observed shortcomings, and describe improvements made in response. We also discuss how best-estimate maps could be updated in near real time by merging FloodScan products and data from other remote sensing systems and hydrological models.
NASA Astrophysics Data System (ADS)
Bianchi, Thomas S.; Butman, David; Raymond, Peter A.; Ward, Nicholas D.; Kates, Rory J. S.; Flessa, Karl W.; Zamora, Hector; Arellano, Ana R.; Ramirez, Jorge; Rodriguez, Eliana
2017-03-01
Here we report on the effects of an experimental flood on the carbon cycling dynamics in the dry watercourse of the Colorado River in Mexico. We observed post-flood differences in the degree of decay, age, and concentration of dissolved organic carbon (DOC), as well as dissolved CH4 and CO2 concentrations throughout the study site. Our results indicate that this flooded waterway was a limited source of CH4 and CO2 to the atmosphere during the event and that DOC age increased with time of flooding. Based on our findings, we suggest that the interplay between storage and mobilization of carbon and greenhouse gases in arid and semiarid regions is potentially sensitive to changing climate conditions, particularly hydrologic variability. Changes in the radiocarbon age of DOC throughout the flooding event suggest that organic matter (OM) that had been stored for long periods (e.g., millennial) was mobilized by the flooding event along with CO2. The OM residing in the dry riverbed that was mobilized into floodwaters had a signature reflective of degraded vascular plant OM and microbial biomass. Whether this microbial OM was living or dead, our findings support previous work in soils and natural waters showing that microbial OM can remain stable and stored in ecosystems for long time periods. As human appropriation of water resources continues to increase, the episodic drying and rewetting of once natural riverbeds and deltas may fundamentally alter the processing and storage of carbon in such systems.
NASA Astrophysics Data System (ADS)
Wilkinson, Mark; Quinn, Paul; Hewett, Caspar; Stutter, Marc
2017-04-01
Over the past decade economic losses from fluvial floods have greatly increased, and it is becoming less viable to rely solely on traditional measures for managing flooding. This has given rise to increasing interest in alternative, nature based solutions (NBS) for reducing flood risk that aim to manage runoff at the catchment source and deliver multiple benefits. In many cases these measures need to work with current agricultural practices. Intensive agriculture often results in increases in local runoff rates, water quality issues, soil erosion/loss and local flooding problems. However, there is potential for agriculture to play a part in reducing flood risk. This requires knowledge on the effectiveness of NBS at varying scales and tools to communicate the risk of runoff associated with farming. This paper assesses the placement, management and effectiveness of a selection of nature-based measures in the rural landscape. Measures which disconnect overland flow pathways and improve soil infiltration are discussed. Case study examples are presented from the UK, where a large number of nature-based measures have been constructed as part of flood protection schemes in catchments varying in scale from 50 ha to 25 km2. Practical tools to help locate measures in agricultural landscapes are highlighted, including the Floods and Agriculture Risk Matrix (FARM), an interactive communication/visualization tool, and FARMPLOT, a GIS mapping tool. These have been used to promote such measures by showing how and where temporary ponded areas can be located to reduce flood and erosion risk whilst minimising disruption to farming practices. In most cases land managers prefer small (100-1000 m3) temporary ponding areas which fill during moderate to large storm events, since they incur minimal loss of land. They also provide greater resilience to multi-day storm events, as they are designed to drain over 1-2 days and therefore allow storage capacity for subsequent events. However, the performance of isolated temporary storage areas can be limited during extreme events. At larger scales, taking a treatment-train approach using a network of measures has been shown to achieve greater benefits, e.g. by reducing local flood peaks and capturing sediments. Current local-scale evidence presented here has been used to inform environmental policy on the correct placement and design of flood reduction measures. Further long-term data collection is required to assess the larger-scale impact of these measures. These data can be used to inform scenario-based modelling approaches. By holding and attenuating runoff in rural landscapes, benefits for local flood peak reduction, water quality improvement and sediment management can be achieved. However, there is still a need to examine the sustainability of such measures through long-term environmental payment schemes, considering how they could be funded across generational timescales rather than political cycles, and to monitor these measures over longer timescales and in multiple settings.
Real-time updating of the flood frequency distribution through data assimilation
NASA Astrophysics Data System (ADS)
Aguilar, Cristina; Montanari, Alberto; Polo, María-José
2017-07-01
We explore the memory properties of catchments for predicting the likelihood of floods based on observations of average flows in pre-flood seasons. Our approach assumes that flood formation is driven by the superimposition of short- and long-term perturbations. The former is given by the short-term meteorological forcing leading to infiltration and/or saturation excess, while the latter is originated by higher-than-usual storage in the catchment. To exploit the above sensitivity to long-term perturbations, a meta-Gaussian model and a data assimilation approach are implemented for updating the flood frequency distribution a season in advance. Accordingly, the peak flow in the flood season is predicted in probabilistic terms by exploiting its dependence on the average flow in the antecedent seasons. We focus on the Po River at Pontelagoscuro and the Danube River at Bratislava. We found that the shape of the flood frequency distribution is noticeably impacted by higher-than-usual flows occurring up to several months earlier. The proposed technique may allow one to reduce the uncertainty associated with the estimation of flood frequency.
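A minimal sketch of the kind of meta-Gaussian update described above: both margins are mapped to standard normal scores, and the conditional law of the flood-season peak given the antecedent-season flow follows from the bivariate normal. The lognormal marginals and the correlation value are illustrative assumptions.

# Sketch: updating the flood frequency distribution given the observed average
# flow of the antecedent season, using a meta-Gaussian dependence model.
import numpy as np
from scipy import stats

F_peak = stats.lognorm(s=0.6, scale=2000.0)   # assumed marginal of seasonal peak flow
F_ante = stats.lognorm(s=0.4, scale=800.0)    # assumed marginal of antecedent mean flow
rho = 0.5                                     # assumed Gaussian-space correlation

def updated_cdf(q, x_obs):
    """P(peak <= q | antecedent flow = x_obs) under the meta-Gaussian model."""
    zq = stats.norm.ppf(F_peak.cdf(q))        # normal quantile transform of both margins
    zx = stats.norm.ppf(F_ante.cdf(x_obs))
    return stats.norm.cdf((zq - rho * zx) / np.sqrt(1.0 - rho**2))

q = 5000.0
print("Unconditional P(peak <= q):", F_peak.cdf(q))
print("After a wet pre-flood season (x = 1500):", updated_cdf(q, 1500.0))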
NASA Astrophysics Data System (ADS)
Xuejiao, M.; Chang, J.; Wang, Y.
2017-12-01
Flood risk reduction with non-engineering measures has become the main idea in flood management, and combining various non-engineering measures makes flood risk management more effective. In this paper, a flood control operation model for cascade reservoirs in the Upper Yellow River was proposed to lower the flood risk of a multi-reservoir water system by combining reservoir flood control operation (RFCO) with flood early warning. Specifically, a discharge control chart was employed to build the joint RFCO simulation model for cascade reservoirs in the Upper Yellow River, and an entropy-weighted fuzzy comprehensive evaluation method was adopted to establish a multi-factorial risk assessment model for the flood warning grade. Furthermore, after determining the implementation mode of countermeasures based on future inflow, an intelligent optimization algorithm was used to solve the optimization model for an applicable water release scheme. In addition, a second model without any countermeasures was set up as a comparative experiment. The results show that the model developed in this paper can further decrease the flood risk of a water system with cascade reservoirs. It provides a new approach to flood risk management by coupling the flood control operation and flood early warning of cascade reservoirs.
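The entropy-weighting step can be sketched as follows: factors whose values vary strongly across alternatives (low entropy) receive larger weights. The 4 x 3 decision matrix holds made-up indicator values; only the weighting scheme itself is taken from the abstract.

# Sketch: entropy weighting of flood-warning risk factors, as used in
# entropy-weighted fuzzy comprehensive evaluation.
import numpy as np

X = np.array([[0.8, 120.0, 0.3],
              [0.5,  90.0, 0.6],
              [0.9, 150.0, 0.2],
              [0.4,  60.0, 0.7]])          # alternatives x factors (synthetic)

P = X / X.sum(axis=0)                      # normalize each factor column
n = X.shape[0]
eps = 1e-12                                # guards against log(0)
E = -(P * np.log(P + eps)).sum(axis=0) / np.log(n)   # entropy of each factor
weights = (1.0 - E) / (1.0 - E).sum()      # discriminating factors weigh more

print("entropy:", np.round(E, 3))
print("weights:", np.round(weights, 3))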
Harvey, J.W.; Drummond, J.D.; Martin, R.L.; McPhillips, L.E.; Packman, A.I.; Jerolmack, D.J.; Stonedahl, S.H.; Aubeneau, A.F.; Sawyer, A.H.; Larsen, L.G.; Tobias, C.R.
2012-01-01
Hyporheic flow in streams has typically been studied separately from geomorphic processes. We investigated interactions between bed mobility and dynamic hyporheic storage of solutes and fine particles in a sand-bed stream before, during, and after a flood. A conservatively transported solute tracer (bromide) and a fine particles tracer (5 μm latex particles), a surrogate for fine particulate organic matter, were co-injected during base flow. The tracers were differentially stored, with fine particles penetrating more shallowly in hyporheic flow and retained more efficiently due to the high rate of particle filtration in bed sediment compared to solute. Tracer injections lasted 3.5 h, and one hour after they ended we released a small flood from an upstream dam. Due to shallower storage in the bed, fine particles were rapidly entrained during the rising limb of the flood hydrograph. Rather than being flushed by the flood, solutes were stored longer due to expansion of hyporheic flow paths beneath the temporarily enlarged bedforms. Three important timescales determined the fate of solutes and fine particles: (1) flood duration, (2) relaxation time of flood-enlarged bedforms back to base flow dimensions, and (3) resulting adjustments and lag times of hyporheic flow. Recurrent transitions between these timescales explain why we observed a peak accumulation of natural particulate organic matter between 2 and 4 cm deep in the bed, i.e., below the scour layer of mobile bedforms but above the maximum depth of particle filtration in hyporheic flow paths. Thus, physical interactions between bed mobility and hyporheic transport influence how organic matter is stored in the bed and how long it is retained, which affects the decomposition rate and metabolism of this southeastern Coastal Plain stream. In summary, we found that dynamic interactions between hyporheic flow, bed mobility, and flow variation had strong but differential influences on base flow retention and flood mobilization of solutes and fine particulates. These hydrogeomorphic relationships have implications for microbial respiration of organic matter, carbon and nutrient cycling, and the fate of contaminants in streams.
Evaluating the effect of online data compression on the disk cache of a mass storage system
NASA Technical Reports Server (NTRS)
Pentakalos, Odysseas I.; Yesha, Yelena
1994-01-01
A trace-driven simulation of the disk cache of a mass storage system was used to evaluate the effect of an online compression algorithm on various performance measures. Traces from the system at NASA's Center for Computational Sciences were used to run the simulation, and disk cache hit ratios and the number of files and bytes migrating to tertiary storage were measured. The measurements were performed for both an LRU and a size-based migration algorithm. In addition to showing the effect of online data compression on the disk cache performance measures, the simulation provided insight into the characteristics of the interactive references, suggesting that hint-based prefetching algorithms are the only alternative for any future improvements to the disk cache hit ratio.
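A minimal sketch of such a trace-driven cache simulation, comparing LRU and size-based victim selection and tallying the bytes migrated to tertiary storage; the trace and capacity are synthetic, and compression is omitted for brevity.

# Sketch: replaying a (file_id, size_MB) reference trace against a disk cache
# with either LRU or size-based eviction to tertiary storage.
from collections import OrderedDict

def simulate(trace, capacity_mb, size_based=False):
    cache, used, hits, migrated_mb = OrderedDict(), 0.0, 0, 0.0
    for fid, size in trace:
        if fid in cache:
            hits += 1
            cache.move_to_end(fid)       # refresh recency on a hit
            continue
        while used + size > capacity_mb and cache:
            # Victim selection: largest file (size-based) or least recently used.
            victim = max(cache, key=cache.get) if size_based else next(iter(cache))
            migrated_mb += cache[victim]
            used -= cache.pop(victim)
        cache[fid] = size                # miss: stage the file into the cache
        used += size
    return hits / len(trace), migrated_mb

trace = [("a", 10), ("b", 40), ("a", 10), ("c", 60), ("b", 40), ("a", 10)]
for flag in (False, True):
    ratio, out = simulate(trace, capacity_mb=80, size_based=flag)
    print(f"size_based={flag}: hit ratio {ratio:.2f}, migrated {out} MB")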
Mueller, Erich R.; Grams, Paul E.; Schmidt, John C.; Hazel, Joseph E.; Alexander, Jason S.; Kaplinski, Matt
2014-01-01
Prior to the construction of large dams on the Green and Colorado Rivers, annual floods aggraded sandbars in lateral flow-recirculation eddies with fine sediment scoured from the bed and delivered from upstream. Flows greater than normal dam operations may be used to mimic this process in an attempt to increase time-averaged sandbar size. These controlled floods may rebuild sandbars, but sediment deficit conditions downstream from the dams restrict the frequency with which controlled floods produce beneficial results. Here, we integrate complementary, long-term monitoring data sets from the Colorado River in Marble and Grand Canyons downstream from Glen Canyon dam and the Green River in the Canyon of Lodore downstream from Flaming Gorge dam. Since the mid-1990s, several controlled floods have occurred in these canyon rivers. These controlled floods scour fine sediment from the bed and build sandbars in eddies, thus increasing channel relief. These changes are short-lived, however, as interflood dam operations erode sandbars within several months to years. Controlled flood response and interflood changes in bed elevation are more variable in Marble Canyon and Grand Canyon, likely reflecting more variable fine sediment supply and stronger transience in channel bed sediment storage. Despite these differences, neither system shows a trend in fine-sediment storage during the period in which controlled floods were monitored. These results demonstrate that controlled floods build eddy sandbars and increase channel relief for short interflood periods, and this response may be typical in other dam-influenced canyon rivers. The degree to which these features persist depends on the frequency of controlled floods, but careful consideration of sediment supply is necessary to avoid increasing the long-term sediment deficit.
Calibration of HEC-Ras hydrodynamic model using gauged discharge data and flood inundation maps
NASA Astrophysics Data System (ADS)
Tong, Rui; Komma, Jürgen
2017-04-01
The estimation of floods is essential for disaster alleviation. Hydrodynamic models are implemented to predict the occurrence and variation of floods at different scales. In practice, the calibration of hydrodynamic models aims to find the best possible parameters for representing the natural flow resistance. Recent years have seen the calibration of hydrodynamic models become faster and more practical, following advances in earth observation products and computer-based optimization techniques. In this study, the Hydrologic Engineering River Analysis System (HEC-Ras) model was set up with a high-resolution digital elevation model from laser scanning for the river Inn in Tyrol, Austria. The 10 largest flood events from 19 hourly discharge gauges, together with flood inundation maps, were selected to calibrate the HEC-Ras model. Manning roughness values and lateral inflow factors were automatically optimized as parameters with the Shuffled Complex with Principal Component Analysis (SP-UCI) algorithm, developed from Shuffled Complex Evolution (SCE-UA). Different objective functions (Nash-Sutcliffe model efficiency coefficient, timing of peak, peak value and root-mean-square deviation) were used singly or in combination. It was found that the lateral inflow factor was the most sensitive parameter. The SP-UCI algorithm avoided local optima and achieved efficient and effective parameter sets in the calibration of the HEC-Ras model using flood extent images. The results showed that calibration by means of gauged discharge data and flood inundation maps, together with the Nash-Sutcliffe model efficiency coefficient as the objective function, was very robust, yielding more reliable flood simulations and better reproducing the peak value and the timing of the peak.
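One of the objective functions named above, the Nash-Sutcliffe efficiency, can be sketched in a few lines; the simulated/observed arrays are placeholders.

# Sketch: the Nash-Sutcliffe efficiency used as a calibration objective.
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe model efficiency: 1 is perfect, 0 matches the obs mean."""
    sim, obs = np.asarray(sim), np.asarray(obs)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([1.0, 2.5, 4.8, 3.1, 1.9])   # placeholder observed stages
sim = np.array([1.1, 2.2, 4.5, 3.4, 2.0])   # placeholder simulated stages
print(f"NSE = {nse(sim, obs):.3f}")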
NASA Astrophysics Data System (ADS)
Liu, Li; Xu, Yue-Ping
2017-04-01
Ensemble flood forecasting driven by numerical weather prediction products is becoming more commonly used in operational flood forecasting applications. In this study, a hydrological ensemble flood forecasting system based on the Variable Infiltration Capacity (VIC) model and quantitative precipitation forecasts from the TIGGE dataset is constructed for the Lanjiang Basin, Southeast China. The impacts of calibration strategies and ensemble methods on the performance of the system are then evaluated. The hydrological model is optimized by the parallel-programmed ε-NSGA-II multi-objective algorithm, and two separately parameterized models are determined to simulate daily flows and peak flows, coupled with a modular approach. The results indicate that the ε-NSGA-II algorithm permits more efficient optimization and a rational determination of parameter settings. It is demonstrated that the multi-model ensemble streamflow mean has better skill than the best single-model ensemble mean (ECMWF), and that multi-model ensembles weighted by members and skill scores outperform other multi-model ensembles. For a typical flood event, the flood can be predicted 3-4 days in advance, but flows on the rising limb can be captured only 1-2 days ahead owing to their flashy character. With respect to peak flows selected by the Peaks Over Threshold approach, the ensemble means from either single models or multiple models generally underestimate the peaks, as the extreme values are smoothed out by the ensemble process.
NASA Astrophysics Data System (ADS)
Aierken, A.; Lee, H.; Hossain, F.; Bui, D. D.; Nguyen, L. D.
2016-12-01
The Mekong Delta, home to almost 20 million inhabitants, is considered one of the most important regions for Vietnam, as it is the agricultural and industrial production base of the nation. However, in recent decades the region has been seriously threatened by a variety of environmental hazards, such as floods, saline water intrusion, arsenic contamination, and land subsidence, which raise its vulnerability to sea level rise due to global climate change. All these hazards are related to groundwater depletion, the result of dramatically increased over-exploitation. Therefore, monitoring groundwater is critical to sustainable development and, most importantly, to people's lives in the region. In most countries, groundwater is monitored using well observations. However, because of spatial and temporal gaps and cost, it is typically difficult to obtain large-scale, continuous observations. Since 2002, the Gravity Recovery and Climate Experiment (GRACE) satellite gravimetry mission has delivered freely available data on Earth's gravity variations, which can be used to obtain terrestrial water storage (TWS) changes. In this study, the TWS anomalies over the Mekong Delta, which are the integrated sum of anomalies of soil moisture storage (SMS), surface water storage (SWS), canopy water storage (CWS), and groundwater storage (GWS), have been obtained using GRACE CSR RL05 data. The leakage error introduced by GRACE signal processing has been corrected using several different approaches. The groundwater storage anomalies were then derived from the TWS anomalies by removing the SMS and CWS anomalies simulated by four land surface models (NOAH, CLM, VIC and MOSAIC) in the Global Land Data Assimilation System (GLDAS), as well as the SWS anomalies estimated using ENVISAT satellite altimetry and MODIS imagery (see the sketch below). Then, the optimal GRACE signal restoration method for the Mekong Delta is determined with available in-situ well data. The estimated GWS anomalies revealed a continuously decreasing trend, as well as the flood and drought that occurred in 2004 and 2012, respectively. Our study reveals the ability of GRACE to monitor groundwater depletion, as well as floods and droughts, at the regional scale.
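A minimal sketch of the storage decomposition described above, with placeholder monthly anomalies in cm of equivalent water height.

# Sketch: isolating groundwater storage (GWS) anomalies from GRACE terrestrial
# water storage (TWS) by subtracting the other storage components.
import numpy as np

tws = np.array([5.0, 3.2, -1.5, -4.0])   # GRACE TWS anomaly (placeholder)
sms = np.array([3.0, 2.0, -0.5, -2.0])   # soil moisture (GLDAS land surface models)
cws = np.array([0.2, 0.1, 0.0, -0.1])    # canopy water (GLDAS)
sws = np.array([1.0, 0.5, -0.3, -0.9])   # surface water (altimetry + MODIS)

gws = tws - sms - cws - sws              # residual groundwater storage anomaly
print("GWS anomaly:", gws)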
Modelling maximum river flow by using Bayesian Markov Chain Monte Carlo
NASA Astrophysics Data System (ADS)
Cheong, R. Y.; Gabda, D.
2017-09-01
Analysis of flood trends is vital since flooding threatens human well-being in financial, environmental and security terms. Annual maximum river flows in Sabah were fitted to the generalized extreme value (GEV) distribution. The maximum likelihood estimator (MLE) arises naturally when working with the GEV distribution. However, previous research showed that the MLE provides unstable results, especially for small sample sizes. In this study, we used different Bayesian Markov chain Monte Carlo (MCMC) schemes based on the Metropolis-Hastings algorithm to estimate the GEV parameters. Bayesian MCMC is a statistical inference method that estimates parameters from the posterior distribution obtained via Bayes' theorem. The Metropolis-Hastings algorithm is used to overcome the high-dimensional state space faced by plain Monte Carlo methods. This approach also accounts for more of the uncertainty in parameter estimation, which then yields better predictions of maximum river flow in Sabah.
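A minimal sketch of a random-walk Metropolis-Hastings sampler for the three GEV parameters; the flat priors, proposal scales, and synthetic data are illustrative assumptions (note scipy's shape convention c = -xi).

# Sketch: random-walk Metropolis-Hastings sampling of GEV parameters
# (location mu, scale sigma, shape xi) for annual maximum flows.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = stats.genextreme(c=-0.1, loc=100.0, scale=30.0).rvs(size=30, random_state=rng)

def log_post(theta):
    mu, log_sigma, xi = theta
    # Flat priors assumed; scipy's shape c equals -xi in the usual GEV convention.
    ll = stats.genextreme.logpdf(data, c=-xi, loc=mu, scale=np.exp(log_sigma)).sum()
    return ll if np.isfinite(ll) else -np.inf

theta = np.array([data.mean(), np.log(data.std()), 0.0])   # crude starting point
lp = log_post(theta)
step = np.array([3.0, 0.1, 0.05])                          # proposal std devs
samples = []
for _ in range(20000):
    prop = theta + step * rng.standard_normal(3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:                # MH acceptance rule
        theta, lp = prop, lp_prop
    samples.append(theta)

mu_s, logsig_s, xi_s = np.array(samples[5000:]).T          # discard burn-in
print("posterior means:", mu_s.mean(), np.exp(logsig_s).mean(), xi_s.mean())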
33 CFR 208.32 - Sanford Dam and Lake Meredith, Canadian River, Tex.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Lake Meredith in the interest of flood control as follows: (a) Flood control storage in the reservoir... control pool) initially amounts to 462,100 acre-feet. Whenever the reservoir level is within this... as much as practicable the flood damage below the reservoir. All flood control releases shall be made...
An assessment of flood mitigation measures - "room for the river"
NASA Astrophysics Data System (ADS)
Komma, J.; Blöschl, G.; Habereder, C.
2009-04-01
In this paper we analyse the relative effect of different flood mitigation measures for the example of the Kamp catchment in Austria. The main idea is to decrease flood peaks through (a) retaining water in the landscape and (b) providing additional inundation areas along the main stream (room for the river). To increase the retention of excess rainfall in the landscape we introduced two different measures. One measure is increasing the water storage capacity of the study catchment through a change of land use from agriculture to forest. The second measure is the installation of many small retention basins without an outlet (micro ponds). The micro ponds are situated on the hill slopes to intercept surface runoff. In the room-for-the-river scenario, the additional retention volume is gained through the installation of retention basins along the Kamp river and its tributary, the Zwettl. Three flood retention basins with culverts at each river are envisaged. The geometry of the bottom outlets is defined for design discharges in a way that gains the greatest flood peak reduction for large flood events (above a 100-year flood). The study catchment at the Kamp river, with a size of 622 km², is located in north-eastern Austria. For the simulation of the different scenarios (retaining water in the landscape) a well-calibrated continuous hydrologic model is available. The hydrological model consists of a spatially distributed soil moisture accounting scheme and a flood routing component. To analyse the effect of the room-for-the-river scenario with retention basins along the river reaches, a linked 1D/2D hydrodynamic model (TUFLOW) is used. In the river channels a one-dimensional simulation is carried out. The flow conditions in the flood plains are represented by two-dimensional model elements. The model domain incorporates 18 km of the Kamp and 12 km of the Zwettl river valley. For the assessment of the land use change scenario, the hydrologic model parameters for wooded areas are transferred to areas that are currently not forested. Through higher storage capacities in the wooded areas, the scenario of afforestation helps to reduce flood peaks. The micro ponds are represented in the hydrological model by a bucket storage component: it is filled by a fraction of the simulated direct runoff and drains into the groundwater at a constant percolation rate (see the sketch after this paragraph). For the scenarios of flood mitigation with retention basins along the river reaches, three locations at the Kamp and three locations at the Zwettl river have been chosen for hypothetical retention basins or polders with bottom outlets. The main difference between the "room for the river" method and the "retaining water in the landscape" methods is the magnitude of the flood event for which the retention is maximised. For the case of retaining water in the landscape (either by land use change or micro ponds), the storage capacity obtained by these measures is filled at the beginning of the event. For small event magnitudes, the flood peak reduction is hence maximised. In the Kamp catchment, significant reductions in the flood peaks can be obtained when retention basins along the main stream are constructed and the flood plains are inundated. The main advantage of the room-for-the-river methodology is that the polders/retention basins can be designed in a way that there is no retention for small flood discharges, which leaves the full storage capacity for larger floods at the time of peak.
In contrast, for the retaining water in the landscape measures, the storage is exhausted at an early stage of medium and large events, resulting in very small flood peak reductions.
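A minimal sketch of the micro-pond bucket component described above: a fixed fraction of direct runoff fills the bucket, which drains to groundwater at a constant percolation rate and spills once full. All numbers are illustrative.

# Sketch: bucket storage for micro ponds -- fill by a runoff fraction, drain by
# constant percolation, spill when capacity is exceeded.
capacity = 10.0        # pond capacity, mm over the contributing area (assumed)
intercept_frac = 0.3   # fraction of direct runoff routed into the ponds (assumed)
percolation = 0.5      # mm per time step drained to groundwater (assumed)

storage, routed = 0.0, []
runoff = [0.0, 2.0, 12.0, 25.0, 8.0, 1.0, 0.0]   # direct runoff per step (mm)
for q in runoff:
    inflow = intercept_frac * q
    storage = max(0.0, storage - percolation)     # constant drainage to groundwater
    spill = max(0.0, storage + inflow - capacity) # overflow once the ponds are full
    storage = min(capacity, storage + inflow)
    routed.append(q - inflow + spill)             # runoff reaching the stream

print([round(v, 2) for v in routed])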
NASA Astrophysics Data System (ADS)
LI, J.; Chen, Y.; Wang, H. Y.
2016-12-01
In large basin flood forecasting, the forecasting lead time is very important. Advances in numerical weather prediction (NWP) in the past decades provide new inputs for extending the flood forecasting lead time in large rivers. The current challenge is that the uncertainty of quantitative precipitation forecasts (QPF) from such NWP models is still high, so controlling QPF uncertainty is an emerging technical requirement. The Weather Research and Forecasting (WRF) model is one of these NWPs, and how to control the QPF uncertainty of WRF is a research topic for many in the meteorological community. In this study, the QPF products for the Liujiang river basin, a large river with a drainage area of 56,000 km2, were first compared with ground-observed precipitation from a rain gauge network, and the results show that the uncertainty of the WRF QPF is relatively high. A post-processing algorithm that correlates the QPF with the observed precipitation is therefore proposed to remove the systematic bias in the QPF. With this algorithm, the post-processed WRF QPF is close to the ground-observed precipitation in terms of area-averaged precipitation. The precipitation is then coupled with the Liuxihe model, a physically based distributed hydrological model that is widely used in small watershed flash flood forecasting. The Liuxihe model readily ingests gridded NWP precipitation, can optimize model parameters even when only a few observed hydrological data are available, has very high model resolution to improve performance, and runs on high-performance supercomputers with a parallel algorithm when applied to large rivers. Two flood events in the Liujiang River were collected; one was used to optimize the model parameters and the other to validate the model. The results show that the river flow simulation is improved substantially and could be used in real-time flood forecasting trials to extend the forecasting lead time.
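A minimal sketch of one possible post-processing step of the kind described: regressing the WRF area-average QPF on gauge observations and applying the linear correction. The study's actual algorithm is not specified beyond "correlating QPF with observed precipitation", so this is an assumption-laden stand-in.

# Sketch: removing systematic QPF bias with a least-squares linear correction
# fitted against gauge-network area averages. All values are synthetic.
import numpy as np

qpf = np.array([12.0, 30.0, 55.0, 8.0, 70.0])    # WRF area-average QPF (mm)
obs = np.array([9.0, 21.0, 40.0, 6.0, 52.0])     # gauge area-average rainfall (mm)

a, b = np.polyfit(qpf, obs, 1)                   # least-squares slope and intercept
corrected = np.clip(a * qpf + b, 0.0, None)      # keep rainfall non-negative
print(f"slope={a:.2f}, intercept={b:.2f}")
print("corrected QPF:", np.round(corrected, 1))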
Pump Hydro Energy Storage systems (PHES) in groundwater flooded quarries
NASA Astrophysics Data System (ADS)
Poulain, Angélique; de Dreuzy, Jean-Raynald; Goderniaux, Pascal
2018-04-01
Pump storage hydroelectricity is an efficient way to temporarily store energy. This technique requires temporarily storing a large volume of water in an upper reservoir and releasing it through turbines to the lower reservoir to produce electricity. Recently, the idea of using old flooded quarries as lower reservoirs has been raised. However, these flooded quarries are generally connected to unconfined aquifers. Consequently, pumping or injecting large volumes of water within short time intervals will have an impact on the adjacent aquifers. Conversely, water exchanges between the quarry and the aquifer may also influence the water level fluctuations in the lower reservoir. Using numerical modelling, this study investigates the interactions between generic flooded open-pit quarries and adjacent unconfined aquifers during various pump-storage cyclic stresses. The propagation of sinusoidal stresses in the adjacent porous media and the amplitude of water level fluctuations in the quarry are studied. Homogeneous rock media and the presence of fractures in the vicinity of the quarry are considered. Results show that hydrological quarry-rock interactions must be considered with caution when implementing pump-storage systems. For rock media characterized by high hydraulic conductivity and porosity values, water volume exchanges during cycles may significantly affect the amplitude of the water level fluctuations in the quarry and, as a consequence, the instantaneous electricity production. Regarding the impact of the pump-storage cyclic stresses on the surrounding environment, the distance of influence is potentially high under specific conditions and is enhanced by the occurrence of rock heterogeneities, such as fractures. The impact around the quarry used as a lower reservoir thus appears to be an important constraining factor for the feasibility of pump-storage systems, to be assessed carefully if groundwater level fluctuations around the quarry are expected to cause adverse effects. Results highlight the opportunities and challenges to be faced in implementing pump-storage hydroelectricity systems in old flooded open-pit quarries.
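For a first-order feel of how far such a cyclic stress propagates, the classical Ferris (1951) solution for a sinusoidal head boundary gives an amplitude decaying as exp(-x*sqrt(pi*S/(t0*T))); the sketch below evaluates it with illustrative aquifer parameters (the study itself relies on full numerical models).

# Sketch: amplitude damping of a daily pump-storage cycle in the adjacent
# aquifer, Ferris (1951) sinusoidal-boundary solution. Parameters are assumed.
import numpy as np

h0 = 2.0        # amplitude of the water-level cycle in the quarry (m)
t0 = 86400.0    # cycle period: one day (s)
T = 1e-3        # aquifer transmissivity (m^2/s)
S = 0.05        # storativity / specific yield (-)

x = np.array([10.0, 50.0, 100.0, 200.0])          # distance from quarry wall (m)
damping = np.exp(-x * np.sqrt(np.pi * S / (t0 * T)))
print("amplitude (m) at", x, "m:", np.round(h0 * damping, 3))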
Rainfall estimation for real time flood monitoring using geostationary meteorological satellite data
NASA Astrophysics Data System (ADS)
Veerakachen, Watcharee; Raksapatcharawong, Mongkol
2015-09-01
Rainfall estimation from geostationary meteorological satellite data provides good spatial and temporal resolution. This is advantageous for real-time flood monitoring and warning systems. However, a rainfall estimation algorithm developed in one region needs to be adjusted for another climatic region. This work proposes computationally efficient rainfall estimation algorithms based on an Infrared Threshold Rainfall (ITR) method calibrated with regional ground truth. Hourly rain gauge data collected from 70 stations around the Chao-Phraya river basin were used for calibration and validation of the algorithms. The algorithm inputs were derived from FY-2E satellite observations consisting of infrared and water vapor imagery. The results were compared with the Global Satellite Mapping of Precipitation (GSMaP) near-real-time product (GSMaP_NRT) using the probability of detection (POD), root mean square error (RMSE) and linear correlation coefficient (CC) as performance indices. Comparison with the GSMaP_NRT product for real-time monitoring purposes shows that hourly rain estimates from the proposed algorithm with the error adjustment technique (ITR_EA) offer higher POD and approximately the same RMSE and CC, with less data latency.
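A minimal sketch of the three verification indices used above, computed on made-up hourly estimate/gauge pairs with an assumed 0.1 mm/h rain/no-rain threshold.

# Sketch: POD from a rain/no-rain contingency table, plus RMSE and linear CC.
import numpy as np

est = np.array([0.0, 1.2, 3.5, 0.0, 6.0, 0.4])   # satellite estimate (mm/h)
obs = np.array([0.0, 2.0, 3.0, 0.5, 5.0, 0.0])   # gauge observation (mm/h)
thr = 0.1                                        # rain/no-rain threshold (assumed)

hits = np.sum((est >= thr) & (obs >= thr))
misses = np.sum((est < thr) & (obs >= thr))
pod = hits / (hits + misses)
rmse = np.sqrt(np.mean((est - obs) ** 2))
cc = np.corrcoef(est, obs)[0, 1]
print(f"POD={pod:.2f}, RMSE={rmse:.2f} mm/h, CC={cc:.2f}")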
Evaluation of the wind pumped hydropower storage integrated flood mitigation system
NASA Astrophysics Data System (ADS)
Safi, Aishah; Basrawi, Firdaus
2018-04-01
As Wind Pumped Hydropower Storage (WPHS) systems are costly to construct, it is important to study their economic and environmental impacts. This research therefore aims to evaluate their economic and environmental performance. First, the Hybrid Optimization Model for Electric Renewables (HOMER) was used to simulate the power generation system with and without the flood reservoir. Next, the total amount of emitted air pollutants was used to evaluate the environmental impacts. It was found that the wind-diesel system with reservoir storage (A-III) has a much lower Net Present Cost (NPC) than systems that do not include a reservoir for flood mitigation, when the cost of flood losses is included in the total NPC. The NPC was RM 1.52 million for system A-III and RM 10.8 million for the standalone diesel system (A-I) when the cost of flood losses was included. The A-III system also emitted only 408 kg of CO2 per year, much less than the A-I system's 99,754 kg of CO2 per year. To conclude, the WPHS integrated with a flood mitigation system appears promising from both economic and environmental standpoints.
NASA Astrophysics Data System (ADS)
Molinari, Daniela; Ballio, Francesco; Mazuran, Mirjana; Arias, Carolina; Minucci, Guido; Atun, Funda; Ardagna, Danilo
2015-04-01
According to a recent JRC report (De Groeve et al., Recording disaster losses, 2013), no measure better than loss over time can provide objective understanding of the path towards resilience. Moreover, damage data collected in the aftermath of floods supply the knowledge base on which a blend of actions can be performed, both in the short and mid term after the occurrence of a flood, among them: the identification of priorities for intervention during emergencies, the definition of compensation schemes, and the understanding of damage mechanisms and of the fragilities of the flooded areas so as to improve/reform current risk mitigation strategies (also by means of improved flood damage models). Objective "measurement" of flood losses remains inadequate to meet the above objectives. This is due to a number of reasons that include: the diversity of intent for data collection, the lack of standardization in how to collect and store data (including the lack of agreed definitions) among responsible subjects, and, last but not least, a lack of legislation to support the collection process. In such a context, the aim of this contribution is to discuss the results of the Poli-RISPOSTA (stRumentI per la protezione civile a Supporto delle POpolazioni nel poST Alluvione) project, a research project funded by Politecnico di Milano which is intended to develop tools and procedures for the collection and storage of high-quality, consistent and reliable flood damage data. Specific objectives of Poli-RISPOSTA are: - Develop an operational procedure for collecting, storing and analyzing all damage data in the aftermath of flood events. Collected data are intended to support a variety of actions, namely: loss accounting, disaster forensics, damage compensation and flood risk modelling; - Develop educational material and modules for training practitioners in the use of the procedure; - Develop enhanced IT tools to support the procedure, easing as much as possible the collection of field data, the creation of databases and the connection between the latter and different regional and municipal databases that already exist for different purposes (from cadastral data to satellite images, etc.), and the processing of collected data. A key principle of Poli-RISPOSTA is developing tools with the direct involvement of all interested parties so as to reach a two-fold objective: producing feasible solutions that re-organise existing practices and integrate them with new ones (where they are lacking) and, directly linked to the previous point, supplying the legislative context in which the developed tools can be implemented.
NASA Astrophysics Data System (ADS)
Chow, Candace; Twele, André; Martinis, Sandro
2016-10-01
Flood extent maps derived from Synthetic Aperture Radar (SAR) data can communicate spatially-explicit information in a timely and cost-effective manner to support disaster management. Automated processing chains for SAR-based flood mapping have the potential to substantially reduce the critical time delay between the delivery of post-event satellite data and the subsequent provision of satellite-derived crisis information to emergency management authorities. However, the accuracy of SAR-based flood mapping can vary drastically due to the prevalent land cover and topography of a given scene. While expert-based image interpretation with the consideration of contextual information can effectively isolate flood surface features, a fully-automated feature differentiation algorithm based mainly on the grey levels of a given pixel is comparatively more limited for features with similar SAR-backscattering characteristics. The inclusion of ancillary data in the automatic classification procedure can effectively reduce instances of misclassification. In this work, a near-global 'Height Above Nearest Drainage' (HAND) index [10] was calculated with digital elevation data and drainage directions from the HydroSHEDS mapping project [2]. The index can be used to separate flood-prone regions from areas with a low probability of flood occurrence. Based on the HAND index, an exclusion mask was computed to reduce water look-alikes with respect to the hydrologic-topographic setting. The applicability of this near-global ancillary data set for the thematic improvement of Sentinel-1 and TerraSAR-X based services for flood and surface water monitoring has been validated both qualitatively and quantitatively. Application of a HAND-based exclusion mask resulted in improvements to the classification accuracy of SAR scenes with high amounts of water look-alikes and considerable elevation differences.
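A minimal sketch of the HAND-based exclusion step: SAR-classified water pixels lying well above the nearest drainage are discarded as look-alikes. The 15 m threshold and the toy rasters are illustrative assumptions.

# Sketch: refining a SAR water classification with a HAND exclusion mask.
import numpy as np

hand = np.array([[0.5, 2.0, 18.0],
                 [1.0, 7.5, 25.0],
                 [0.2, 3.0, 40.0]])              # HAND index (m), synthetic
sar_water = np.array([[1, 1, 1],
                      [1, 0, 1],
                      [1, 1, 0]], dtype=bool)    # initial SAR water class, synthetic

flood_prone = hand <= 15.0                       # low HAND: plausible inundation
refined = sar_water & flood_prone                # mask out water look-alikes
print(refined.astype(int))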
Demand analysis of flood insurance by using logistic regression model and genetic algorithm
NASA Astrophysics Data System (ADS)
Sidi, P.; Mamat, M. B.; Sukono; Supian, S.; Putra, A. S.
2018-03-01
Citarum River floods in the South Bandung area of Indonesia often result in damage to buildings belonging to the people living in the vicinity. One effort to alleviate the risk of building damage is to have flood insurance. The main obstacle is that not all people in the Citarum basin decide to buy flood insurance. In this paper, we analyse the decision to buy flood insurance. Eight variables are assumed to influence the decision to purchase flood insurance: income level, education level, distance of the house from the river, elevation of the building relative to the road, experienced flood frequency, flood prediction, perception of the insurance company, and perception of government efforts in handling floods. The analysis was done using a logistic regression model, with the model parameters estimated by a genetic algorithm. The results of the analysis show that the eight variables analysed significantly influence the demand for flood insurance. These results are expected to be of use to insurance companies seeking to influence the community's decision to buy flood insurance.
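A minimal sketch of the model/estimator pairing described above: a genetic algorithm searching logistic-regression coefficients by maximizing the Bernoulli log-likelihood. The synthetic data use only two of the eight explanatory variables, and the GA settings are illustrative.

# Sketch: fitting logistic regression with a simple genetic algorithm
# (selection, uniform crossover, Gaussian mutation). All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200),
                     rng.normal(size=200),       # e.g. income level (standardized)
                     rng.normal(size=200)])      # e.g. distance of house from river
true_beta = np.array([-0.5, 1.2, -0.8])
y = rng.random(200) < 1.0 / (1.0 + np.exp(-X @ true_beta))

def log_lik(beta):
    z = X @ beta
    return np.sum(y * z - np.log1p(np.exp(z)))   # Bernoulli log-likelihood

pop = rng.normal(scale=2.0, size=(50, 3))        # initial population of candidates
for _ in range(300):
    fit = np.array([log_lik(b) for b in pop])
    parents = pop[np.argsort(fit)[-20:]]         # selection: keep the fittest
    pa = parents[rng.integers(0, 20, 50)]
    pb = parents[rng.integers(0, 20, 50)]
    mask = rng.random((50, 3)) < 0.5             # uniform crossover
    pop = np.where(mask, pa, pb) + rng.normal(scale=0.1, size=(50, 3))  # + mutation

best = pop[np.argmax([log_lik(b) for b in pop])]
print("GA estimate:", np.round(best, 2), "true:", true_beta)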
Flood area and damage estimation in Zhejiang, China.
Liu, Renyi; Liu, Nan
2002-09-01
A GIS-based method to estimate flood area and damage is presented in this paper, oriented to developing countries like China, where labor is readily available for GIS data collection and tools such as HEC-GeoRAS might not be readily available. At present, local authorities in developing countries are often not predisposed to pay for commercial GIS platforms. To calculate the flood area, two cases, non-source flooding and source flooding, are distinguished, and a seed-spread algorithm suitable for source flooding is described. The flood damage estimate is calculated in raster format by overlaying the flood area with thematic maps and relating this to other socioeconomic data. Several measures used to improve the geometric accuracy and computing efficiency are presented. The management issues related to the application of this method, including the cost-effectiveness of the approximate method in practice and the mutual complementarity of two technical approaches (self-programming and adopting commercial GIS software), are also discussed. The applications show that this approach has practical significance for flood fighting and control in developing countries like China.
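A minimal sketch of a seed-spread (region-growing) fill for the source-flooding case: a cell floods only if it is below the water level and reachable from the seed, unlike non-source flooding where the elevation test alone suffices. The DEM, water level and seed are made-up.

# Sketch: breadth-first seed-spread flood fill over a toy DEM.
from collections import deque
import numpy as np

dem = np.array([[3.0, 2.5, 5.0, 1.0],
                [2.8, 2.2, 4.8, 1.2],
                [2.6, 2.0, 4.5, 1.5]])
water_level, seed = 2.4, (2, 1)

flooded = np.zeros(dem.shape, dtype=bool)
queue = deque([seed])
while queue:
    r, c = queue.popleft()
    if flooded[r, c] or dem[r, c] >= water_level:
        continue
    flooded[r, c] = True
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # 4-connected spread
        rr, cc = r + dr, c + dc
        if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
            queue.append((rr, cc))

print(flooded.astype(int))   # the low cells at right stay dry: disconnected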
NASA Astrophysics Data System (ADS)
Alexakis, Dimitris; Hadjimitsis, Diofantos; Agapiou, Athos; Themistocleous, Kyriacos; Retalis, Adrianos
2011-11-01
The increase in flood inundation occurring in different regions all over the world has enhanced the need for effective flood risk management. As flood frequency increases at a steady rate due to ever-increasing human activity on physical floodplains, the destructive financial impact of floods increases correspondingly. A flood can be defined as a mass of water that produces runoff on land that is not normally covered by water. Earth observation techniques such as satellite remote sensing can contribute toward more efficient flood risk mapping in accordance with EU Directive 2007/60. This study highlights the need for digital mapping of urban sprawl in a catchment area in Cyprus and for assessing its contribution to flood risk. The Yialias river (Nicosia, Cyprus) was selected as the case study; devastating flash flood events took place there in 2003 and 2009. To examine the diachronic land cover regime of the study area, multi-temporal satellite imagery (e.g. Landsat TM/ETM+, ASTER) was processed and analyzed. The land cover regime was examined in detail by using sophisticated post-processing classification algorithms such as Maximum Likelihood, Parallelepiped, Minimum Distance, Spectral Angle and Isodata. Texture features were calculated using the Grey Level Co-Occurrence Matrix. In addition, three classification techniques were compared: multispectral classification, texture-based classification and a combination of both. The classification products were compared and evaluated for their accuracy. Moreover, a knowledge-rule method based on spectral, texture and shape features is proposed in order to create efficient land use and land cover maps of the study area. Morphometric parameters such as stream frequency, drainage density and elongation ratio were calculated in order to extract the basic watershed characteristics. In terms of the impacts of land use/cover on flooding, GIS and the Fragstats tool were used with spatial metrics to identify trends, both visually and statistically, resulting from land use changes in a flood-prone area such as Yialias. The results indicated a considerable increase in urban cover over the last 30 years, denoting that one of the main driving forces of the increasing flood risk in catchment areas in Cyprus is generally associated with human activities.
Assessing the efficiency of different CSO positions based on network graph characteristics.
Sitzenfrei, R; Urich, C; Möderl, M; Rauch, W
2013-01-01
The technical design of urban drainage systems comprises two major aspects: first, the spatial layout of the sewer system and, second, the pipe-sizing process. Usually, engineers determine the spatial layout of the sewer network manually, taking into account physical features and future planning scenarios. Before the pipe-sizing process starts, it is important to determine the locations of possible weirs and combined sewer overflows (CSOs) based on, e.g., distance to receiving water bodies or to a wastewater treatment plant and available space for storage units. However, positions of CSOs are also determined by topological characteristics of the sewer networks. In order to better understand the impact of placement choices for CSOs and storage units in new systems, this work aims to determine case-unspecific, general rules. Therefore, based on numerous stochastically generated virtual alpine sewer systems of different sizes, it is investigated how placement choices for CSOs and storage units affect the pipe-sizing process (hence, also investment costs) and technical performance (CSO efficiency and flooding). To describe the impact of the topological positions of these elements in the sewer networks, graph characteristics are used. From an evaluation of 2,000 different alpine combined sewer systems, it was found that, as expected, CSOs at more downstream positions in the network result in greater construction costs and better performance regarding CSO efficiency. Beyond a specific point (i.e. topological network position), no significant further increase in construction costs can be identified. Conversely, the flooding efficiency increases with more upstream positions of the CSOs. Therefore, CSO efficiency and flooding efficiency are in a trade-off conflict and a compromise is required.
Zhou, Yuting; Xiao, Xiangming; Qin, Yuanwei; Dong, Jinwei; Zhang, Geli; Kou, Weili; Jin, Cui; Wang, Jie; Li, Xiangping
2016-01-01
Accurate and up-to-date information on the spatial distribution of paddy rice fields is necessary for studies of trace gas emissions, water source management, and food security. The phenology-based paddy rice mapping algorithm, which identifies the unique flooding stage of paddy rice, has been widely used. However, identification and mapping of paddy rice in rice-wetland coexistent areas is still a challenging task. In this study, we found that the flooding/transplanting periods of paddy rice and natural wetlands were different. The natural wetlands flood earlier and have a shorter duration than paddy rice in the Panjin Plain, a temperate region in China. We used this asynchronous flooding stage to extract the paddy rice planting area from the rice-wetland coexistent area. MODIS Land Surface Temperature (LST) data were used to derive the temperature-defined plant growing season. Landsat 8 OLI imagery was used to detect the flooding signal, and paddy rice was then extracted using the difference in flooding stages between paddy rice and natural wetlands. The resultant paddy rice map was evaluated with in-situ ground-truth data and Google Earth images. The estimated overall accuracy and Kappa coefficient were 95% and 0.90, respectively. The spatial pattern of the OLI-derived paddy rice map agrees well with the paddy rice layer from the National Land Cover Dataset from 2010 (NLCD-2010). The differences between the Landsat-derived and NLCD-derived rice areas are in the range of ±20% for most 1-km grid cells. The results of this study demonstrate the potential of the phenology-based paddy rice mapping algorithm, via integrating MODIS and Landsat 8 OLI images, to map paddy rice fields in complex landscapes of paddy rice and natural wetland in the temperate region. PMID:27688742
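A minimal sketch of the timing-based separation described above: pixels whose flooding starts within the temperature-defined transplanting window and persists long enough are labeled paddy rice, while early, short-lived flooding indicates natural wetland. The window dates, duration threshold and arrays are illustrative assumptions.

# Sketch: classifying pixels as rice vs. wetland from flooding timing.
import numpy as np

flood_start = np.array([95, 140, 150, 100, 160])   # per-pixel first flooded DOY
flood_end = np.array([120, 210, 220, 130, 215])    # per-pixel last flooded DOY

transplant_window = (135, 175)    # assumed temperature-defined transplanting period
is_rice = ((flood_start >= transplant_window[0]) &
           (flood_start <= transplant_window[1]) &
           (flood_end - flood_start > 40))         # rice stays flooded longer
print(is_rice.astype(int))        # pixels 1, 2, 4 classified as paddy rice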
Floods, floodplains, delta plains — A satellite imaging approach
NASA Astrophysics Data System (ADS)
Syvitski, James P. M.; Overeem, Irina; Brakenridge, G. Robert; Hannon, Mark
2012-08-01
Thirty-three lowland floodplains and their associated delta plains are characterized with data from three remote sensing systems (AMSR-E, SRTM and MODIS). These data provide new quantitative information to characterize Late Quaternary floodplain landscapes and their penchant for flooding over the last decade. Daily proxy records for discharge since 2002 and for each of the 33 river systems can be derived with novel Advanced Microwave Scanning Radiometer (AMSR-E) methods. A descriptive framework based on analysis of Shuttle Radar Topography Mission (SRTM) data is used to capture the major landscape-scale floodplain elements or zones: 1) container valleys with their long and narrow pathways of largely sediment transit and bypass, 2) floodplain depressions that act as loci for frequent flooding and sediment storage, 3) zones of nodal avulsions common to many continental-scale rivers, and often located seaward of container valleys, and 4) coastal floodplains and delta plains that offer both sediment bypass and storage but under the influence of marine processes. The SRTM data allow mapping of smaller-scale architectural elements in an unprecedented, systematic manner. Floodplain depressions were found to play a major role, which may largely be overlooked in conceptual floodplain models. Lastly, MODIS data (independently and combined with AMSR-E) allow the tracking of flood hydrographs, pathways and sedimentation patterns on a near-daily timescale worldwide. These remote-sensing data show that 85% of the studied major river systems experienced extensive flooding in the last decade. A new quantitative paradigm of floodplain processes, honoring the frequency and extent of floods, can be developed by careful analysis of these new remotely sensed data.
NASA Astrophysics Data System (ADS)
Wang, Y.; Chang, J.; Guo, A.
2017-12-01
Traditional flood risk analysis focuses on the probability of flood events exceeding the design flood of downstream hydraulic structures while neglecting the influence of sedimentation in river channels on flood control systems. To address this limitation, a univariate and copula-based bivariate hydrological risk framework focusing on flood control and sediment transport is proposed in the current work. Additionally, the conditional probabilities of occurrence of different flood events under various extreme precipitation scenarios are estimated by exploiting the copula model. Moreover, a Monte Carlo-based algorithm is used to evaluate the uncertainties of univariate and bivariate hydrological risk. Two catchments located on the Loess plateau are selected as study regions: the upper catchments of the Xianyang and Huaxian stations (denoted as UCX and UCH, respectively). The results indicate that (1) 2-day and 3-day consecutive rainfall are highly correlated with the annual maximum flood discharge (AMF) in UCX and UCH, respectively; and (2) univariate and bivariate return periods, risk and reliability for the purposes of flood control and sediment transport are successfully estimated. Sedimentation triggers higher risks to the safety of local flood control systems than does the AMF exceeding the design flood of downstream hydraulic structures in the UCX and UCH. Most importantly, there was considerable sampling uncertainty in the univariate and bivariate hydrologic risk analyses, which would greatly challenge measures of future flood mitigation. The proposed hydrological risk framework offers a promising technical reference for flood risk analysis in sandy regions worldwide.
Development of a 3D Stream Network and Topography for Improved Large-Scale Hydraulic Modeling
NASA Astrophysics Data System (ADS)
Saksena, S.; Dey, S.; Merwade, V.
2016-12-01
Most digital elevation models (DEMs) used for hydraulic modeling do not include channel bed elevations. As a result, the DEMs are complemented with additional bathymetric data for accurate hydraulic simulations. Existing methods to acquire bathymetric information through field surveys or through conceptual models are limited to reach-scale applications. With an increasing focus on large-scale hydraulic modeling of rivers, a framework to estimate and incorporate bathymetry for an entire stream network is needed. This study proposes an interpolation-based algorithm to estimate bathymetry for a stream network by modifying the reach-based empirical River Channel Morphology Model (RCMM). The effect of a 3D stream network that includes river bathymetry is then investigated by creating a 1D hydraulic model (HEC-RAS) and a 2D hydrodynamic model (Integrated Channel and Pond Routing) for the Upper Wabash River Basin in Indiana, USA. Results show improved simulation of flood depths and storage in the floodplain. Similarly, the impact of incorporating river bathymetry is more significant in the 2D model than in the 1D model.
NASA Astrophysics Data System (ADS)
Gilfedder, Benjamin; Hofmann, Harald; Cartwright, Ian
2014-05-01
Groundwater-surface water interactions are often conceptually and numerically modeled as a two-component system: a groundwater system connected to a stream, river or lake. However, transient storage zones such as hyporheic exchange, bank storage, parafluvial flow and flood plain storage complicate the two-component model by delaying the release of flood water from the catchment. Bank storage occurs when high river levels associated with flood water reverse the hydraulic gradient between surface water and groundwater. River water flows into the riparian zone, where it is stored until the flood waters recede. The water held in the banks then drains back into the river over time scales ranging from days to months as the hydraulic gradient returns to pre-flood levels. If the frequency and amplitude of flood events are high enough, water held in bank storage can remain perpetually between the regional groundwater system and the river. In this work we focus on the role of bank storage in buffering river salinity levels against saline regional groundwater on lowland sections of the Avon River, Victoria, Australia. We hypothesize that the frequency and magnitude of floods strongly influence the salinity of the stream water as the banks fill and drain. A bore transect (5 bores) was installed perpendicular to the river and instrumented with head and electrical conductivity loggers measuring for two years. We also installed a continuous 222Rn system in one bore. These data were augmented with long-term monthly EC from the river. During high rainfall events, very fresh flood waters from the headwaters infiltrated into the gravel river banks, leading to a dilution in EC and 222Rn in the bores. Following the events, the fresh water drained back into the river as head gradients reversed. However, the bank water salinities remained ~10x lower than regional groundwater salinities during most of the time series, and only slightly above river water. During 2012, SE Australia experienced a prolonged summer drought. A significant increase in EC was observed in the bores towards the end of the summer, which suggests that the lack of bank recharge from the river resulted in draining of the banks and connection between the regional groundwater and the river. The long-term river salinity dataset showed that when flow events are infrequent and of low magnitude (i.e. drought conditions), salinities increase significantly. Again, this is thought to be due to draining of the banks and connection with the regional groundwater system. Thus, an increase in extended dry periods is expected to result in higher salinities in Australian waterways as the climate changes.
High-Performance Integrated Control of water quality and quantity in urban water reservoirs
NASA Astrophysics Data System (ADS)
Galelli, S.; Castelletti, A.; Goedbloed, A.
2015-11-01
This paper contributes a novel High-Performance Integrated Control framework to support the real-time operation of urban water supply storages affected by water quality problems. We use a 3-D, high-fidelity simulation model to predict the main water quality dynamics and inform a real-time controller based on Model Predictive Control. The integration of the simulation model into the control scheme is performed by a model reduction process that identifies a low-order, dynamic emulator running 4 orders of magnitude faster. The model reduction, which relies on a semiautomatic procedural approach integrating time series clustering and variable selection algorithms, generates a compact and physically meaningful emulator that can be coupled with the controller. The framework is used to design the hourly operation of Marina Reservoir, a 3.2 Mm3 storm-water-fed reservoir located in the center of Singapore, operated for drinking water supply and flood control. Because of its recent formation from a former estuary, the reservoir suffers from high salinity levels, whose behavior is modeled with Delft3D-FLOW. Results show that our control framework reduces the minimum salinity levels by nearly 40% and cuts the average annual deficit of drinking water supply by about 2 times the active storage of the reservoir (about 4% of the total annual demand).
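As a rough sketch of how a reduced-order emulator can sit inside a receding-horizon (Model Predictive Control) loop of the kind described above, consider the following. Here `emulator` is a stand-in for the identified low-order model, and the cost weights, bounds, and horizon are invented for illustration; the paper's actual objective function and constraints may differ.

```python
import numpy as np
from scipy.optimize import minimize

def mpc_step(state, inflow_forecast, emulator, horizon=24):
    """One receding-horizon step: choose hourly releases minimising a
    salinity-plus-supply-deficit cost predicted by a low-order emulator,
    then apply only the first control and re-optimise next hour."""
    def cost(releases):
        s, total = state, 0.0
        for u, q in zip(releases, inflow_forecast[:horizon]):
            s, salinity = emulator(s, u, q)   # user-supplied reduced model
            total += salinity ** 2 + 0.1 * max(0.0, 5.0 - u) ** 2  # toy weights
        return total
    res = minimize(cost, x0=np.full(horizon, 5.0),
                   bounds=[(0.0, 20.0)] * horizon, method="L-BFGS-B")
    return res.x[0]
```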
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Wei; Li, Hong-Yi; Leung, L. Ruby
Anthropogenic activities, e.g., reservoir operation, may alter the characteristics of the Flood Frequency Curve (FFC) and challenge the basic assumption of stationarity used in flood frequency analysis. This paper presents a combined data-modeling analysis of the nonlinear filtering effects of reservoirs on FFCs over the contiguous United States. A dimensionless Reservoir Impact Index (RII), defined as the total upstream reservoir storage capacity normalized by the annual streamflow volume, is used to quantify reservoir regulation effects. Analyses are performed for 388 river stations with an average record length of 50 years. The first two moments of the FFC, mean annual maximum flood (MAF) and coefficient of variation (CV), are calculated for the pre- and post-dam periods and compared to elucidate the reservoir regulation effects as a function of RII. It is found that MAF generally decreases with increasing RII but stabilizes when RII exceeds a threshold value, and CV increases with RII until a threshold value beyond which CV decreases with RII. The processes underlying the nonlinear threshold behavior of MAF and CV are investigated using three reservoir models with different levels of complexity. All models capture the nonlinear relationships of MAF and CV with RII, suggesting that the basic flood control function of reservoirs is key to the nonlinear relationships. The relative roles of reservoir storage capacity, operation objectives, available storage prior to a flood event, and reservoir inflow pattern are systematically investigated. Our findings may help improve flood-risk assessment and mitigation in regulated river systems at the regional scale.
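The index and the two FFC moments named above are straightforward to compute; a minimal sketch follows, with hypothetical station data.

```python
import numpy as np

def reservoir_impact_index(upstream_capacity_m3, annual_flow_m3):
    """RII = total upstream reservoir storage capacity / mean annual
    streamflow volume (dimensionless), as defined in the abstract."""
    return upstream_capacity_m3 / annual_flow_m3

def ffc_moments(annual_max_flows):
    """First two moments of the flood frequency curve: mean annual
    maximum flood (MAF) and coefficient of variation (CV)."""
    q = np.asarray(annual_max_flows, dtype=float)
    maf = q.mean()
    return maf, q.std(ddof=1) / maf

# Compare pre- and post-dam periods at one hypothetical station (m3/s)
pre = np.array([900.0, 1200.0, 750.0, 1600.0])
post = np.array([620.0, 700.0, 580.0, 810.0])
print(ffc_moments(pre), ffc_moments(post),
      reservoir_impact_index(upstream_capacity_m3=2.5e9, annual_flow_m3=8.0e9))
```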
Flood management on the lower Yellow River: hydrological and geomorphological perspectives
NASA Astrophysics Data System (ADS)
Shu, Li; Finlayson, Brian
1993-05-01
The Yellow River, known also as "China's Sorrow", has a long history of channel changes and disastrous floods in its lower reaches. Past channel positions can be identified from historical documentary records and geomorphological and sedimentological evidence. Since 1947, government policy has been aimed at containing the floods within artificial levees and preventing the river from changing its course. Flood control is based on flood-retarding dams and off-stream retention basins as well as artificial levees lining the channel. The design flood for the system has a recurrence interval of only around 60 years, and floods of this and larger magnitudes can be generated downstream of the main flood control dams at Sanmenxia and Xiaolangdi. Rapid sedimentation along the river causes problems for storage and has raised the bed of the river some 10 m above the surrounding floodplain. The present management strategy is probably not viable in the long term, and to avoid a major disaster a new management approach is required. The most viable option would appear to be to breach the levees at predetermined points, coupled with advance warning and evacuation of the population thus put at risk.
Development of web-based services for an ensemble flood forecasting and risk assessment system
NASA Astrophysics Data System (ADS)
Yaw Manful, Desmond; He, Yi; Cloke, Hannah; Pappenberger, Florian; Li, Zhijia; Wetterhall, Fredrik; Huang, Yingchun; Hu, Yuzhong
2010-05-01
Flooding is a widespread and devastating natural disaster worldwide. Floods that took place in the last decade in China were ranked the worst amongst recorded floods worldwide in terms of the number of human fatalities and economic losses (Munich Re-Insurance). Rapid economic development and population expansion into low-lying flood plains have worsened the situation. Current conventional flood prediction systems in China are suited neither to the perceptible climate variability nor to the rapid pace of urbanization sweeping the country. Flood prediction, from short-term (a few hours) to medium-term (a few days), needs to be revisited and adapted to changing socio-economic and hydro-climatic realities. The state of the art requires implementation of multiple numerical weather prediction systems. The availability of twelve global ensemble weather prediction systems through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a good opportunity for an effective state-of-the-art early forecasting system. A prototype of a Novel Flood Early Warning System (NEWS) using the TIGGE database is tested in the Huai River basin in east-central China. It is the first early flood warning system in China that uses the massive TIGGE database, cascaded with river catchment models, the Xinanjiang hydrologic model and a 1-D hydraulic model, to predict river discharge and flood inundation. The NEWS algorithm is also designed to provide web-based services to a broad spectrum of end-users. The latter presents challenges, as both databases and proprietary codes reside in different locations and converge at dissimilar times. NEWS will thus make use of a ready-to-run grid system that makes distributed computing and data resources available in a seamless and secure way. An ability to run on different operating systems and provide a front end accessible to a broad spectrum of end-users is an additional requirement. The aim is to achieve robust interoperability through strong security and workflow capabilities. A physical network diagram and a workflow scheme of all the models, codes and databases used to realise the NEWS algorithm are presented. They constitute a first step in the development of a platform for providing real-time flood forecasting services on the web to mitigate the impacts of 21st century weather extremes.
Efficient sparse matrix multiplication scheme for the CYBER 203
NASA Technical Reports Server (NTRS)
Lambiotte, J. J., Jr.
1984-01-01
This work has been directed toward the development of an efficient algorithm for performing sparse matrix multiplication on the CYBER-203. The desire to provide software which gives the user the choice between the often conflicting goals of minimizing central processing (CPU) time or storage requirements has led to a diagonal-based algorithm in which one of three types of storage is selected for each diagonal. For each storage type, an initialization subroutine estimates the CPU and storage requirements based upon results from previously performed numerical experimentation. These requirements are adjusted by weights provided by the user which reflect the relative importance the user places on the resources. The three storage types employed were chosen to be efficient on the CYBER-203 for diagonals which are sparse, moderately sparse, or dense; however, for many densities, no single storage type is most efficient with respect to both resource requirements. The user-supplied weights dictate the choice.
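A modern analogue of diagonal-based storage is the DIA sparse format; the sketch below shows a diagonal-wise matrix-vector product, which vectorises well on pipelined machines and is the general idea behind such schemes. This is an illustrative reconstruction, not the CYBER-203 code, and it omits the per-diagonal storage-type selection the abstract describes.

```python
import numpy as np

def dia_matvec(diagonals, offsets, x):
    """y = A @ x for a matrix stored by diagonals (DIA-style storage).

    diagonals[k] holds the entries of the diagonal with offset offsets[k]
    (0 = main, +1 = first superdiagonal, -1 = first subdiagonal), each
    padded to length n."""
    n = len(x)
    y = np.zeros(n)
    for diag, k in zip(diagonals, offsets):
        if k >= 0:          # superdiagonal: y[i] += A[i, i+k] * x[i+k]
            y[: n - k] += diag[: n - k] * x[k:]
        else:               # subdiagonal:   y[j-k] += A[j-k, j] * x[j]
            y[-k:] += diag[: n + k] * x[: n + k]
    return y

# Tridiagonal example: main diagonal 2, off-diagonals -1
n = 5
diags = [np.full(n, 2.0), np.full(n, -1.0), np.full(n, -1.0)]
print(dia_matvec(diags, [0, 1, -1], np.ones(n)))
```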
NASA Astrophysics Data System (ADS)
Ovando, A.; Martinez, J. M.; Tomasella, J.; Rodriguez, D. A.; von Randow, C.
2018-07-01
The Bolivian Amazon wetlands are extensive floodplains distributed over the Mamore, Beni, Madre de Dios and Guapore Rivers. Located within the upper Madeira River Basin, the wetlands play important roles in regulating the biogeochemical processes and hydrological cycle of the region. In addition, they have major ecological and hydrological relevance for the entire Amazon Basin. These wetlands are characterized by the occurrence of episodic floods that result from contrasting hydro-meteorological processes in the Andean mountain region, the piedmont area and the Amazon lowlands. In this study, we characterized the flood dynamics of the region using multi-temporal flood mapping based on optical imagery (MODIS - Moderate Resolution Imaging Spectroradiometer - M*D09A1) and satellite altimetry (ENVISAT RA-2 and SARAL AltiKa altimeters). This study provides new insights regarding the frequency, magnitude and spatial distribution of exogenous floods, which are created by flood waves from the Andes, and endogenous floods, which result from runoff originating in the lowlands. The maximum extent of flooding during 2001-2014 was 43,144 km2 in the Mamore Basin and 34,852 km2 in the Guapore Basin, and the total surface water storage in these floodplains reached 94 km3. The regionalization of flood regimes based on water-stage time series signatures allowed regions exposed to frequent floods, generally located along rivers without a direct connection to the Andes, to be distinguished from floodplains that are more dependent on flood waves originating in the Andes and its piedmont. This information is of great importance for understanding the roles of these wetlands in the provision of ecosystem services.
NASA Astrophysics Data System (ADS)
Delaney, C.; Hartman, R. K.; Mendoza, J.; Evans, K. M.; Evett, S.
2016-12-01
Forecast informed reservoir operations (FIRO) is a methodology that incorporates short- to mid-range precipitation or flow forecasts to inform the flood operations of reservoirs. Previous research and modeling for flood control reservoirs has shown that FIRO can reduce flood risk and increase water supply for many reservoirs. The risk-based method of FIRO presents a unique approach that incorporates flow forecasts made by NOAA's California-Nevada River Forecast Center (CNRFC) to model and assess the risk of meeting or exceeding identified management targets or thresholds. Forecasted risk is evaluated against set risk tolerances to set reservoir flood releases. A water management model was developed for Lake Mendocino, a 116,500 acre-foot reservoir located near Ukiah, California. Lake Mendocino is a dual-use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated by the Sonoma County Water Agency for water supply. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has been plagued with water supply reliability issues since 2007. FIRO is applied to Lake Mendocino by simulating daily hydrologic conditions from 1985 to 2010 in the Upper Russian River from Lake Mendocino to the City of Healdsburg approximately 50 miles downstream. The risk-based method is simulated using a 15-day, 61-member streamflow hindcast from the CNRFC. Model simulation results of risk-based flood operations demonstrate a 23% increase in average end-of-water-year (September 30) storage levels over current operations. Model results show no increase in occurrence of flood damages for points downstream of Lake Mendocino. This investigation demonstrates that FIRO may be a viable flood control operations approach for Lake Mendocino and warrants further investigation through additional modeling and analysis.
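The core of such a risk-based release rule can be sketched in a few lines: raise the scheduled release until the forecast probability of exceeding a storage threshold drops below the tolerance. The function below is a schematic reading of that logic; `forecast_storage`, the step size, and the bounds are placeholders, not the operational rule.

```python
import numpy as np

def firo_release(forecast_storage, threshold, risk_tolerance,
                 base_release, max_release, step=50.0):
    """Increase the flood release until the forecast probability of
    exceeding the storage threshold falls below the risk tolerance.
    forecast_storage(release) -> end-of-horizon storage for each
    ensemble member (a stand-in for routing an ensemble hindcast,
    such as the 61-member CNRFC product, through a reservoir model)."""
    release = base_release
    while release < max_release:
        risk = np.mean(forecast_storage(release) > threshold)
        if risk <= risk_tolerance:
            break
        release += step   # make room for the forecast flood volume
    return release
```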
Reduced Complexity Modelling of Urban Floodplain Inundation
NASA Astrophysics Data System (ADS)
McMillan, H. K.; Brasington, J.; Mihir, M.
2004-12-01
Significant recent advances in floodplain inundation modelling have been achieved by directly coupling 1d channel hydraulic models with a raster storage cell approximation for floodplain flows. The strengths of this reduced-complexity model structure derive from its explicit dependence on a digital elevation model (DEM) to parameterize flows through riparian areas, providing a computationally efficient algorithm to model heterogeneous floodplains. Previous applications of this framework have generally used mid-range grid scales (10^1-10^2 m), showing the capacity of the models to simulate long reaches (10^3-10^4 m). However, the increasing availability of precision DEMs derived from airborne laser altimetry (LIDAR) enables their use at very high spatial resolutions (10^0-10^1 m). This spatial scale offers the opportunity to incorporate the complexity of the built environment directly within the floodplain DEM and simulate urban flooding. This poster describes a series of experiments designed to explore model functionality at these reduced scales. Important questions are considered, raised by this new approach, about the reliability and representation of the floodplain topography and built environment, and the resultant sensitivity of inundation forecasts. The experiments apply a raster floodplain model to reconstruct a 1:100 year flood event on the River Granta in eastern England, which flooded 72 properties in the town of Linton in October 2001. The simulations use a nested-scale model to maintain efficiency. A 2km by 4km urban zone is represented by a high-resolution DEM derived from single-pulse LIDAR data supplied by the UK Environment Agency, together with surveyed data and aerial photography. Novel methods of processing the raw data to provide the individual structure detail required are investigated and compared. This is then embedded within a lower-resolution model application at the reach scale which provides boundary conditions based on recorded flood stage. The high resolution predictions on a scale commensurate with urban structures make possible a multi-criteria validation which combines verification of reach-scale characteristics such as downstream flow and inundation extent with internal validation of flood depth at individual sites.
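A minimal sketch of one explicit update of such a raster storage-cell model is given below, in the spirit of the coupled 1d/raster approach: flux across each cell face follows a Manning-type law on the water-surface slope. Grid sizes, the roughness value, and the boundary treatment are simplifications, not the model used in the poster.

```python
import numpy as np

def storage_cell_step(h, z, dx, dt, n_manning=0.06):
    """One explicit time step of a raster storage-cell floodplain model.

    h : water-depth grid (m), z : DEM grid (m). Flow per unit width
    across a face is q = hflow^(5/3) * sqrt(|S|) / n, signed by the
    water-surface slope S."""
    eta = z + h                                   # water-surface elevation
    dh = np.zeros_like(h)
    for axis in (0, 1):                           # faces in both grid directions
        a = [slice(None)] * 2
        b = [slice(None)] * 2
        a[axis], b[axis] = slice(None, -1), slice(1, None)
        a, b = tuple(a), tuple(b)
        # depth available to convey flow across the shared face
        hflow = np.clip(np.maximum(eta[a], eta[b]) - np.maximum(z[a], z[b]),
                        0.0, None)
        slope = (eta[a] - eta[b]) / dx
        q = hflow ** (5.0 / 3.0) * np.sign(slope) * np.sqrt(np.abs(slope)) / n_manning
        dh[a] -= q * dt / dx                      # water leaves the higher cell
        dh[b] += q * dt / dx
    return np.clip(h + dh, 0.0, None)             # no negative depths
```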
Application of Multi-Satellite Precipitation Analysis to Floods and Landslides
NASA Technical Reports Server (NTRS)
Adler, Robert; Hong, Yang; Huffman, George
2007-01-01
Satellite data acquired and processed in real time now have the potential to provide the space-time information on rainfall needed to monitor flood and landslide events around the world. This can be achieved by integrating the satellite-derived forcing data with hydrological models and landslide algorithms. Progress in using the TRMM Multi-satellite Precipitation Analysis (TMPA) as input to flood and landslide forecasts is outlined, with a focus on understanding limitations of the rainfall data and the impacts of those limitations on flood/landslide analyses. Case studies of both successes and failures will be shown, as well as comparisons with ground-based data sets, both in terms of rainfall and in terms of flood/landslide events. In addition to potential uses in real time, the nearly ten years of TMPA data allow retrospective running of the models to examine variations in extreme events. The flood determination algorithm consists of four major components: 1) multi-satellite precipitation estimation; 2) characterization of the land surface, including digital elevation from the NASA SRTM (Shuttle Radar Topography Mission) and topography-derived hydrologic parameters such as flow direction, flow accumulation, basins, and river networks; 3) a hydrological model to infiltrate rainfall and route overland runoff; and 4) an implementation interface to relay the input data to the models and display the flood inundation results to potential users and decision-makers. In terms of landslides, the satellite rainfall information is combined with a global landslide susceptibility map, derived from a combination of global surface characteristics (elevation, slope, soil types, soil texture, and land cover classification) using a weighted linear combination approach. In those areas identified as "susceptible" (based on the surface characteristics), landslides are forecast where and when a rainfall intensity/duration threshold is exceeded. Results are described indicating general agreement with landslide occurrences. However, difficulties in comparing landslide event information (mostly from news reports) with the satellite-based forecasts are analyzed.
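The intensity/duration thresholding step mentioned above is a one-line test; a sketch follows. The coefficients shown are the often-cited global values of Caine (1980), used purely as an illustration — the abstract does not specify which threshold the system employs.

```python
def landslide_alert(intensity_mm_h, duration_h, a=14.82, b=-0.39,
                    susceptible=True):
    """Trigger an alert when rainfall intensity exceeds an
    intensity-duration threshold I = a * D**b over a susceptible cell.
    a, b default to Caine's (1980) global coefficients (illustrative)."""
    return susceptible and intensity_mm_h >= a * duration_h ** b

print(landslide_alert(20.0, 3.0))   # 20 mm/h sustained for 3 h -> True
```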
Real-time flood extent maps based on social media
NASA Astrophysics Data System (ADS)
Eilander, Dirk; van Loenen, Arnejan; Roskam, Ruud; Wagemaker, Jurjen
2015-04-01
During a flood event it is often difficult to get accurate information about the flood extent and the people affected. This information is very important for disaster risk reduction management and crisis relief organizations. In the post-flood phase, information about the flood extent is needed for damage estimation and calibrating hydrodynamic models. Currently, flood extent maps are derived from a few sources such as satellite images, aerial images and post-flooding flood marks. However, getting accurate real-time or maximum flood extent maps remains difficult. With the rise of social media, we now have a new source of information with large numbers of observations. In the city of Jakarta, Indonesia, the intensity of unique flood-related tweets peaked at 8 tweets per second during floods in early 2014. A fair amount of these tweets also contains observations of water depth and location. Our hypothesis is that based on the large numbers of tweets it is possible to generate real-time flood extent maps. In this study we use tweets from the city of Jakarta, Indonesia, to generate these flood extent maps. The data-mining procedure looks for tweets with a mention of 'banjir', the Bahasa Indonesia word for flood. It then removes modified and retweeted messages in order to keep unique tweets only. Since tweets are not always sent directly from the location of observation, the geotag in the tweets is unreliable. We therefore extract location information using mentions of names of neighborhoods and points of interest. Finally, where encountered, a mention of a length measure is extracted as water depth. These tweets containing a location reference and a water level are considered to be flood observations. The strength of this method is that it can easily be extended to other regions and languages. Based on the intensity of tweets in Jakarta during a flood event we can provide a rough estimate of the flood extent. To provide more accurate flood extent information, we project the water depth observations in tweets on a digital elevation model using a flood-fill algorithm. Based on statistical methods we combine the large numbers of observations in order to create time series of flood extent maps. Early results indicate this method is very promising.
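A flood-fill on a DEM, as used in the projection step above, can be written as a breadth-first search from the observation point; a minimal sketch with a toy DEM follows (the exact variant used in the study may differ).

```python
from collections import deque

def flood_fill(dem, seed, water_level):
    """Mark cells connected to `seed` whose ground elevation lies below
    the water level implied by one observation, e.g. a geolocated tweet
    reporting a water depth at that location."""
    rows, cols = len(dem), len(dem[0])
    flooded, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in flooded or not (0 <= r < rows and 0 <= c < cols):
            continue
        if dem[r][c] >= water_level:
            continue                      # dry: ground above the water level
        flooded.add((r, c))
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return flooded

dem = [[1.0, 1.2, 3.0], [0.8, 1.1, 3.2], [0.9, 2.9, 3.1]]
# reported depth of 0.5 m at the seed cell -> water level 1.5 m
print(flood_fill(dem, seed=(0, 0), water_level=1.0 + 0.5))
```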
A framework for global river flood risk assessment
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Bouwman, A.; Ward, P. J.; Jongman, B.
2012-04-01
There is an increasing need for strategic global assessments of flood risks. Such assessments may be required by: (a) International Financing Institutes and Disaster Management Agencies to evaluate where, when, and which investments in flood risk mitigation are most required; (b) (re-)insurers, who need to determine their required coverage capital; and (c) large companies to account for risks of regional investments. In this contribution, we propose a framework for global river flood risk assessment. The framework combines coarse-resolution hazard probability distributions, derived from global hydrological model runs (typical scale about 0.5 degree resolution), with high-resolution estimates of exposure indicators. The high resolution is required because floods typically occur at a much smaller scale than the typical resolution of global hydrological models, and exposure indicators such as population, land use and economic value generally are strongly variable in space and time. The framework therefore estimates hazard at a high resolution (~1 km2) by using a) global forcing data sets of the current (or in scenario mode, future) climate; b) a global hydrological model; c) a global flood routing model, and d) importantly, a flood spatial downscaling routine. This results in probability distributions of annual flood extremes as an indicator of flood hazard, at the appropriate resolution. A second component of the framework combines the hazard probability distribution with classical flood impact models (e.g. damage, affected GDP, affected population) to establish indicators for flood risk. The framework can be applied with a large number of datasets and models, and the sensitivities of such choices can be evaluated by the user. The framework is applied using the global hydrological model PCR-GLOBWB, combined with a global flood routing model. Downscaling of the hazard probability distributions to 1 km2 resolution is performed with a new downscaling algorithm, applied on a number of target regions. We demonstrate the use of impact models in these regions based on global GDP, population, and land use maps. In this application, we show sensitivities of the estimated risks with regard to the use of different climate input datasets, decisions made in the downscaling algorithm, and different approaches to establish distributed estimates of GDP and asset exposure to flooding.
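Collapsing a hazard probability distribution and an impact model into a single risk indicator is classically done by integrating damage over exceedance probability; a minimal sketch, with hypothetical damage figures, is shown below.

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """Expected annual damage: integrate damage over exceedance
    probability (p = 1/T) with the trapezoidal rule."""
    p = 1.0 / np.asarray(return_periods, dtype=float)
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)                 # integrate from low to high p
    return float(np.trapz(d[order], p[order]))

# Hypothetical damages (million USD) for 10-, 50-, 100- and 500-year floods
print(expected_annual_damage([10, 50, 100, 500],
                             [120.0, 450.0, 700.0, 1500.0]))
```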
Evaluation of flash-flood discharge forecasts in complex terrain using precipitation
Yates, D.; Warner, T.T.; Brandes, E.A.; Leavesley, G.H.; Sun, Jielun; Mueller, C.K.
2001-01-01
Operational prediction of flash floods produced by thunderstorm (convective) precipitation in mountainous areas requires accurate estimates or predictions of the precipitation distribution in space and time. The details of the spatial distribution are especially critical in complex terrain because the watersheds are generally small in size, and small position errors in the forecast or observed placement of the precipitation can distribute the rain over the wrong watershed. In addition to the need for good precipitation estimates and predictions, accurate flood prediction requires a surface-hydrologic model that is capable of predicting stream or river discharge based on the precipitation-rate input data. Different techniques for the estimation and prediction of convective precipitation will be applied to the Buffalo Creek, Colorado flash flood of July 1996, where over 75 mm of rain from a thunderstorm fell on the watershed in less than 1 h. The hydrologic impact of the precipitation was exacerbated by the fact that a significant fraction of the watershed experienced a wildfire approximately two months prior to the rain event. Precipitation estimates from the National Weather Service's operational Weather Surveillance Radar-1988 Doppler (WSR-88D) and the National Center for Atmospheric Research S-band, research, dual-polarization radar, both located to the east of Denver, are compared. In addition, very short range forecasts from a convection-resolving dynamic model, which is initialized variationally using the radar reflectivity and Doppler winds, are compared with forecasts from an automated-algorithmic forecast system that also employs the radar data. The radar estimates of rain rate, and the two forecasting systems that employ the radar data, have degraded accuracy because they are applied in complex terrain. Nevertheless, the radar data and forecasts from the dynamic model and the automated algorithm could be operationally useful for input to surface-hydrologic models employed for flood warning. Precipitation data provided by these various techniques at short time scales and at fine spatial resolutions are employed as detailed input to a distributed-parameter hydrologic model for flash-flood prediction and analysis. With the radar-based precipitation estimates employed as input, the simulated flood discharge was similar to that observed. The dynamic-model precipitation forecast showed the most promise in providing a significant discharge-forecast lead time. The algorithmic system's precipitation forecast did not demonstrate as much skill, but the associated discharge forecast would still have been sufficient to have provided an alert of impending flood danger.
Integration of Grid and Sensor Web for Flood Monitoring and Risk Assessment from Heterogeneous Data
NASA Astrophysics Data System (ADS)
Kussul, Nataliia; Skakun, Sergii; Shelestov, Andrii
2013-04-01
Over the last decades we have witnessed an upward global trend in natural disaster occurrence. Hydrological and meteorological disasters such as floods are the main contributors to this pattern. In recent years flood management has shifted from protection against floods to managing the risks of floods (the European Flood Risk Directive). In order to enable operational flood monitoring and assessment of flood risk, it is required to provide an infrastructure with standardized interfaces and services. Grid and Sensor Web can meet these requirements. In this paper we present a general approach to flood monitoring and risk assessment based on heterogeneous geospatial data acquired from multiple sources. To enable operational flood risk assessment, integration of Grid and Sensor Web approaches is proposed [1]. Grid represents a distributed environment that integrates heterogeneous computing and storage resources administrated by multiple organizations. Sensor Web is an emerging paradigm for integrating heterogeneous satellite and in situ sensors and data systems into a common informational infrastructure that produces products on demand. The basic Sensor Web functionality includes sensor discovery, triggering of events by observed or predicted conditions, remote data access, and processing capabilities to generate and deliver data products. Sensor Web is governed by a set of standards, called Sensor Web Enablement (SWE), developed by the Open Geospatial Consortium (OGC). Different practical issues regarding the integration of Sensor Web with Grids are discussed in the study. We show how the Sensor Web can benefit from using Grids and vice versa. For example, Sensor Web services such as SOS, SPS and SAS can benefit from integration with a Grid platform like the Globus Toolkit. The proposed approach is implemented within the Sensor Web framework for flood monitoring and risk assessment, and a case study of exploiting this framework, namely the Namibia SensorWeb Pilot Project, is described. The project was created as a testbed for evaluating and prototyping key technologies for rapid acquisition and distribution of data products for decision support systems to monitor floods and enable flood risk assessment. The system provides access to real-time products on rainfall estimates and flood potential forecast derived from the Tropical Rainfall Measuring Mission (TRMM) with a lag time of 6 h, alerts from the Global Disaster Alert and Coordination System (GDACS) with a lag time of 4 h, and the Coupled Routing and Excess STorage (CREST) model to generate alerts. These alerts are used to trigger satellite observations. With the deployed SPS service for NASA's EO-1 satellite it is possible to automatically task the sensor, with a re-imaging capability of less than 8 h. Therefore, with the computational and storage services provided by Grid and cloud infrastructure it was possible to generate flood maps within 24-48 h after the trigger was alerted. To enable interoperability between system components and services, OGC-compliant standards are utilized. [1] Hluchy L., Kussul N., Shelestov A., Skakun S., Kravchenko O., Gripich Y., Kopp P., Lupian E., "The Data Fusion Grid Infrastructure: Project Objectives and Achievements," Computing and Informatics, 2010, vol. 29, no. 2, pp. 319-334.
Operating rules for multireservoir systems
NASA Astrophysics Data System (ADS)
Oliveira, Rodrigo; Loucks, Daniel P.
1997-04-01
Multireservoir operating policies are usually defined by rules that specify either individual reservoir desired (target) storage volumes or desired (target) releases based on the time of year and the existing total storage volume in all reservoirs. This paper focuses on the use of genetic search algorithms to derive these multireservoir operating policies. The genetic algorithms use real-valued vectors containing information needed to define both system release and individual reservoir storage volume targets as functions of total storage in each of multiple within-year periods. Elitism, arithmetic crossover, mutation, and "en bloc" replacement are used in the algorithms to generate successive sets of possible operating policies. Each policy is then evaluated using simulation to compute a performance index for a given flow series. The better performing policies are then used as a basis for generating new sets of possible policies. The process of improved policy generation and evaluation is repeated until no further improvement in performance is obtained. The proposed algorithm is applied to example reservoir systems used for water supply and hydropower.
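The genetic search described above can be sketched compactly; the snippet below echoes the elements the abstract names (real-valued policy vectors, elitism, arithmetic crossover, mutation, "en bloc" replacement). Population sizes, rates, and the simulate interface are illustrative, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def evolve(simulate, n_params, pop_size=50, generations=200,
           elite=2, p_mut=0.1):
    """Genetic search over real-valued policy vectors; simulate(policy)
    returns a performance index to maximise, e.g. from routing a flow
    series through the multireservoir system."""
    pop = rng.uniform(0.0, 1.0, (pop_size, n_params))
    for _ in range(generations):
        fitness = np.array([simulate(p) for p in pop])
        order = np.argsort(fitness)[::-1]
        new_pop = [pop[i].copy() for i in order[:elite]]     # elitism
        while len(new_pop) < pop_size:
            a, b = pop[rng.choice(order[: pop_size // 2], 2)]
            w = rng.uniform()                                # arithmetic crossover
            child = w * a + (1.0 - w) * b
            mask = rng.uniform(size=n_params) < p_mut        # mutation
            child[mask] = rng.uniform(0.0, 1.0, mask.sum())
            new_pop.append(child)
        pop = np.array(new_pop)                              # "en bloc" replacement
    return pop[np.argmax([simulate(p) for p in pop])]
```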
Flood risk assessment of land pollution hotspots
NASA Astrophysics Data System (ADS)
Masi, Matteo; Arrighi, Chiara; Iannelli, Renato
2017-04-01
Among the risks caused by extreme events, the potential spread of pollutants stored in land hotspots due to floods is an aspect that has rarely been examined with a risk-based approach. In this contribution, an attempt was made to estimate the pollution risks that flood events pose to land pollution hotspots. Flood risk has been defined as the combination of river flood hazard, hotspot exposure and the vulnerability to contamination of the area, i.e. the expected severity of the environmental impacts. The assessment was performed on a geographical basis, using geo-referenced open data available from databases of land management institutions, authorities and agencies. The list of land pollution hotspots included landfills and other waste handling facilities (e.g., temporary storage, treatment and recycling sites), municipal wastewater treatment plants, liquid waste treatment facilities and contaminated sites. The assessment was carried out by combining geo-referenced data of pollution hotspots with flood hazard maps. We derived maps of land pollution risk based on geographical and geological properties and source characteristics available from environmental authorities. These included information about soil particle size, soil hydraulic conductivity, terrain slope, type of stored pollutants, the type of facility, capacity, size of the area, land use, etc. The analysis was carried out at catchment scale. The case study of the Arno river basin in Tuscany (central Italy) is presented.
Floods in mountain environments: A synthesis
NASA Astrophysics Data System (ADS)
Stoffel, Markus; Wyżga, Bartłomiej; Marston, Richard A.
2016-11-01
Floods are a crucial agent of geomorphic change in the channels and valley floors of mountain watercourses. At the same time, they can be highly damaging to property, infrastructure, and life. Because of their high energy, mountain watercourses are highly vulnerable to environmental changes affecting their catchments and channels. Many factors have modified and frequently still tend to modify the environmental conditions in mountain areas, with impacts on geomorphic processes and the frequency, magnitude, and timing of floods in mountain watercourses. The ongoing climate changes vary between regions but may affect floods in mountain areas in many ways. In many mountain regions of Europe, widespread afforestation took place over the twentieth century, considerably increasing the amounts of large wood delivered to the channels and the likelihood of jamming bridges. At the same time, deforestation continues in other mountain areas, accelerating runoff and amplifying the magnitude and frequency of floods in foreland areas. In many countries, in-channel gravel mining has been a common practice during recent decades; the resultant deficit of bed material in the affected channels may suddenly manifest during flood events, resulting in the failure of scoured bridges or catastrophic channel widening. During the past century many rivers in mountain and foreland areas incised deeply; the resultant loss of floodplain water storage has decreased attenuation of flood waves, hence increasing flood hazard to downstream river reaches. On the other hand, a large amount of recent river restoration activities worldwide may provide examples of beneficial changes to flood risk, attained as a result of increased channel storage or reestablished floodplain water storage. Relations between geomorphic processes and floods operate in both directions, which means that changes in flood probability or the character of floods (e.g., increased wood load) may significantly modify the morphology of mountain rivers, but morphological changes of rivers can also affect hydrological properties of floods and the associated risk for societies. This paper provides a review of research in the field of floods in mountain environments and puts the papers of this special issue dedicated to the same topic into context. It also provides insight into innovative studies, methods, or emerging aspects of the relations between environmental changes, geomorphic processes, and the occurrence of floods in mountain rivers.
NASA Astrophysics Data System (ADS)
Hortos, William S.
2009-05-01
In previous work by the author, parameters across network protocol layers were selected as features in supervised algorithms that detect and identify certain intrusion attacks on wireless ad hoc sensor networks (WSNs) carrying multisensor data. The algorithms improved the residual performance of the intrusion prevention measures provided by any dynamic key-management schemes and trust models implemented among network nodes. The approach of this paper does not train algorithms on the signature of known attack traffic, but is instead based on unsupervised anomaly detection techniques that learn the signature of normal network traffic. Unsupervised learning does not require the data to be labeled or to be purely of one type, i.e., normal or attack traffic. The approach can be augmented to add any security attributes and quantified trust levels, established during data exchanges among nodes, to the set of cross-layer features from the WSN protocols. A two-stage framework is introduced for the security algorithms to overcome the problems of input size and resource constraints. The first stage is an unsupervised clustering algorithm which reduces the payload of network data packets to a tractable size. The second stage is a traditional anomaly detection algorithm based on a variation of support vector machines (SVMs), whose efficiency is improved by the availability of data in the packet payload. In the first stage, selected algorithms are adapted to WSN platforms to meet system requirements for simple parallel distributed computation, distributed storage and data robustness. A set of mobile software agents, acting like an ant colony in securing the WSN, are distributed at the nodes to implement the algorithms. The agents move among the layers involved in the network response to the intrusions at each active node and trustworthy neighborhood, collecting parametric values and executing assigned decision tasks. This minimizes the need to move large amounts of audit-log data through resource-limited nodes and locates routines closer to that data. Performance of the unsupervised algorithms is evaluated against the network intrusions of black hole, flooding, Sybil and other denial-of-service attacks in simulations of published scenarios. Results for scenarios with intentionally malfunctioning sensors show the robustness of the two-stage approach to intrusion anomalies.
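A two-stage pipeline of this general shape — unsupervised clustering to compress normal traffic into prototypes, followed by a one-class anomaly detector — can be sketched as below. This is an illustrative stand-in, not the paper's implementation: the feature data, cluster count, and SVM parameters are invented.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import OneClassSVM

# Stage 1: cluster normal cross-layer feature vectors down to a
# tractable set of prototypes (here 64 centroids from 5000 samples).
normal_traffic = np.random.default_rng(1).normal(size=(5000, 12))
prototypes = KMeans(n_clusters=64, n_init=10).fit(normal_traffic).cluster_centers_

# Stage 2: fit a one-class SVM on the prototypes so that traffic far
# from learned normal behaviour is flagged as anomalous.
detector = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(prototypes)

suspect = np.r_[np.zeros((1, 12)), np.full((1, 12), 8.0)]  # normal-ish vs extreme
print(detector.predict(suspect))   # +1 = consistent with normal, -1 = anomaly
```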
Risk factors of diarrhoea among flood victims: a controlled epidemiological study.
Mondal, N C; Biswas, R; Manna, A
2001-01-01
The concept and practice of 'disaster preparedness and response', instead of traditional casualty relief, is relatively new. Vulnerability analysis and health risk assessment of disaster-prone communities are important prerequisites of meaningful preparedness and effective response against any calamity. In this community-based study, the risk of diarrhoeal disease and its related epidemiological factors were analysed by collecting data from two selected flood-prone blocks of Midnapur district of West Bengal. The information was compared with that of another population living in two non-flood-prone blocks of the same district. The study showed that diarrhoeal disease was the commonest morbidity in the flood-prone population. Some behaviours, like use of pond water for utensil washing and kitchen purposes, hand washing after defecation without soap, improper hand washing before eating, open field defecation, and storage of drinking water in wide-mouth vessels, were found to be associated with a high attack rate of diarrhoea in both study and control populations during the flood season compared to the pre-flood season. Attack rates were also significantly higher in the flood-prone population than in the population of the non-flood-prone area during the same season. The necessity of community education on proper water use behaviour and personal hygiene, along with ensuring safe water and sanitation facilities for flood-affected communities, was emphasized.
A fast method for optical simulation of flood maps of light-sharing detector modules.
Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W; Peng, Qiyu
2015-12-01
Optical simulation at the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast number of photons needs to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200-600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials.
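The core geometric operation of a simplified specular reflectance model is the mirror reflection of a photon direction about the surface normal; a minimal sketch follows (the paper's full model and tracking structure are not reproduced here).

```python
import numpy as np

def specular_reflect(d, n):
    """Reflect photon direction d about surface normal n (unit vectors):
    r = d - 2 (d . n) n."""
    d, n = np.asarray(d, float), np.asarray(n, float)
    return d - 2.0 * np.dot(d, n) * n

d = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)   # photon heading down-right
print(specular_reflect(d, np.array([0.0, 1.0, 0.0])))  # -> [0.707, 0.707, 0]
```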
NASA Astrophysics Data System (ADS)
Niayifar, A.; Perona, P.
2015-12-01
River impoundment by dams is known to strongly affect the natural flow regime and, in turn, the river attributes and the related ecosystem biodiversity. Making hydropower sustainable implies seeking innovative operational policies able to generate dynamic environmental flows while maintaining economic efficiency. For dammed systems, we build the ecological and economic efficiency plot for non-proportional flow redistribution rules compared to minimal flow operation. As for the case of small hydropower plants (e.g., see the companion paper by Gorla et al., this session), we use a four-parameter Fermi-Dirac statistical distribution to mathematically formulate non-proportional redistribution rules. These rules allocate a fraction of water to the riverine environment depending on current reservoir inflows and storage. Riverine ecological benefits associated with dynamic environmental flows are computed by integrating the Weighted Usable Area (WUA) for fishes with Richter's hydrological indicators. Then, we apply the nondominated sorting genetic algorithm II (NSGA-II) to an ensemble of non-proportional and minimal flow redistribution rules in order to generate the Pareto frontier showing the system performances in the ecological and economic space. This fast and elitist multiobjective optimization method is eventually applied to a case study. It is found that non-proportional dynamic flow releases ensure maximal power production on the one hand, while conciliating ecological sustainability on the other. Much of the improvement in the environmental indicator arises from a better use of the reservoir storage dynamics, which makes it possible to capture and attenuate flood events while recovering part of their volume for energy production. In conclusion, adopting such new operational policies would unravel a spectrum of globally efficient performances of the dammed system when compared with those resulting from policies based on constant minimum flow releases.
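A redistribution rule with a Fermi-Dirac shape can be sketched as below: the fraction of inflow released to the river rises smoothly between two plateaus around a half-saturation inflow. This is a schematic reading of the four-parameter family named in the abstract; the exact parameterization in the paper may differ, and all numbers are illustrative.

```python
import numpy as np

def environmental_release(inflow, q_half, slope, e_min, e_max):
    """Fraction of inflow released to the river follows a Fermi-Dirac
    curve in inflow: e_min for small flows, e_max for large floods,
    transitioning around q_half with smoothness set by `slope`."""
    frac = e_min + (e_max - e_min) / (1.0 + np.exp((q_half - inflow) / slope))
    return frac * inflow

for q in (2.0, 10.0, 40.0):   # inflows in m3/s
    print(q, environmental_release(q, q_half=15.0, slope=5.0,
                                   e_min=0.1, e_max=0.9))
```

Note how the rule passes most of a large flood downstream (attenuating it through storage) while retaining most of small inflows for production.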
Snow mass and river flows modelled using GRACE total water storage observations
NASA Astrophysics Data System (ADS)
Wang, S.
2017-12-01
Snow mass and river flow measurements are difficult and less accurate in cold regions due to the harsh environment. Floods in cold regions are commonly a result of snowmelt during the spring break-up. Flooding is projected to increase with climate change in many parts of the world. Forecasting floods from snowmelt remains a challenge due to the scarcity and quality issues of basin-scale snow observations and gaps in our knowledge of cold-region hydrological processes. This study developed a model for estimating basin-level snow mass (snow water equivalent, SWE) and river flows using the total water storage (TWS) observations from the Gravity Recovery and Climate Experiment (GRACE) satellite mission. The SWE estimation is based on a mass balance approach which is independent of in situ snow gauge observations, thus largely eliminating the limitations and uncertainties of traditional in situ or remote sensing snow estimates. The model forecasts river flows by simulating surface runoff from snowmelt and the corresponding baseflow from groundwater discharge. Snowmelt is predicted using a temperature-index model. Baseflow is predicted using a modified linear reservoir model. The model also quantifies the hysteresis between the snowmelt and streamflow rates, i.e., the lumped time for water to travel through the basin. The model was applied to the Red River Basin, the Mackenzie River Basin, and the Hudson Bay Lowland Basins in Canada. The predicted river flows were compared with the observed values at downstream hydrometric stations. The results were also compared to those for the Lower Fraser River obtained in a separate study to help better understand the roles of environmental factors in determining floods and their variations under different hydroclimatic conditions. This study advances the applications of space-based time-variable gravity measurements in cold-region snow mass estimation, river flow and flood forecasting. It demonstrates a relatively simple method that only needs GRACE TWS and temperature data for river flow or flood forecasting. The model can be particularly useful for regions with sparse observation networks, and can be used in combination with other available methods to help improve the accuracy of river flow and flood forecasting over cold regions.
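The two process components named above, a temperature-index melt model and a linear reservoir, are compact enough to sketch together. The snippet below routes all melt through a single linear storage as a deliberate simplification of the surface-runoff/baseflow split; parameter values (degree-day factor in mm/C/day, recession constant in 1/day) are illustrative only.

```python
def simulate_flows(temps_c, swe0, storage0, ddf=3.0, t0=0.0, k=0.05):
    """Daily flows from a temperature-index melt model feeding a linear
    reservoir: melt = ddf * max(T - t0, 0), outflow = k * storage."""
    swe, storage, flows = swe0, storage0, []
    for t in temps_c:
        melt = min(swe, ddf * max(t - t0, 0.0))  # temperature-index melt (mm)
        swe -= melt
        storage += melt                           # melt recharges storage
        outflow = k * storage                     # linear reservoir release
        storage -= outflow
        flows.append(outflow)
    return flows

print(simulate_flows([-2.0, 1.0, 4.0, 6.0, 3.0], swe0=120.0, storage0=10.0))
```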
NASA Astrophysics Data System (ADS)
Yang, Z. L.; Wu, W. Y.; Lin, P.; Maidment, D. R.
2017-12-01
Extreme water events such as catastrophic floods and severe droughts have increased in recent decades. Mitigating the risk to lives, food security, infrastructure, energy supplies, and numerous other industries posed by these extreme events requires informed decision-making and planning based on sound science. We are developing a global water modeling capability by building models that will provide total operational water predictions (evapotranspiration, soil moisture, groundwater, channel flow, inundation, snow) at unprecedented spatial resolutions and update frequencies. Toward this goal, this talk presents an integrated global hydrological modeling framework that takes advantage of gridded meteorological forcing, land surface modeling, channel flow modeling, ground observations, and satellite remote sensing. Launched in August 2016, the National Water Model successfully incorporates weather forecasts to predict river flows for more than 2.7 million rivers across the continental United States, operationally translating a "synoptic weather map" into a "synoptic river flow map". In this study, we apply a similar framework, feeding a high-resolution global river network database developed from a hierarchical Dominant River Tracing (DRT) algorithm and runoff output from the Global Land Data Assimilation System (GLDAS) into a vector-based river routing model (the Routing Application for Parallel Computation of Discharge, RAPID) to produce river flows from 2001 to 2016 using the Message Passing Interface (MPI) on the Texas Advanced Computing Center's Stampede system. In this simulation, global river discharges for more than 177,000 rivers are computed every 30 minutes. The modeling framework's performance is evaluated with various observations including river flows at more than 400 gauge stations globally. Overall, the model exhibits a reasonably good performance in simulating the averaged patterns of terrestrial water storage, evapotranspiration and runoff. The system is appropriate for monitoring and studying floods and droughts. Directions for future research will be outlined and discussed.
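Vector-based river routing models of this kind build on storage-based schemes such as the Muskingum method; a single-reach sketch follows for orientation (this is the textbook formulation, not the RAPID code, and the inflow series and parameters are invented).

```python
def muskingum_route(inflows, k_hours, x, dt_hours=0.5):
    """Route an inflow series through one reach with the Muskingum
    method: O2 = c1*I2 + c2*I1 + c3*O1, where k is the reach travel
    time and x the storage weighting factor (0-0.5)."""
    denom = k_hours * (1.0 - x) + 0.5 * dt_hours
    c1 = (0.5 * dt_hours - k_hours * x) / denom
    c2 = (0.5 * dt_hours + k_hours * x) / denom
    c3 = (k_hours * (1.0 - x) - 0.5 * dt_hours) / denom
    outflows = [inflows[0]]                 # assume an initial steady state
    for i_prev, i_now in zip(inflows, inflows[1:]):
        outflows.append(c1 * i_now + c2 * i_prev + c3 * outflows[-1])
    return outflows

print(muskingum_route([10, 10, 40, 80, 60, 30, 15, 10], k_hours=2.0, x=0.2))
```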
Integrating Physical and Topographic Information Into a Fuzzy Scheme to Map Flooded Area by SAR.
Pierdicca, Nazzareno; Chini, Marco; Pulvirenti, Luca; Macina, Flavia
2008-07-10
A flood mapping procedure based on fuzzy set theory has been developed. The method is based on the integration of Synthetic Aperture Radar (SAR) measurements with additional data on the inundated area, such as a land cover map and a digital elevation model (DEM). The information on land cover has allowed us to account for both specular reflection, typical of open water, and double-bounce backscattering, typical of forested and urban areas. The DEM has been exploited to include simple hydraulic considerations on the dependence of inundation probability on surface characteristics. Contextual information has been taken into account too. The proposed algorithm has been tested on a flood that occurred in Italy in November 1994. A pair of ERS-1 images, collected before and after (three days later) the flood, has been used. The results have been compared with the data provided by a ground survey carried out when the flood reached its maximum extension. Despite the temporal mismatch between the survey and the post-inundation SAR image, the comparison has yielded encouraging results, with 87% of the pixels correctly classified as inundated.
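A minimal sketch of fuzzy evidence fusion in this spirit is shown below: each cue is mapped to a membership in [0, 1] and the memberships are aggregated. The two cues (low backscatter for open water, low elevation relative to drainage), the membership shape, and all thresholds are invented for illustration and are not the paper's membership functions.

```python
import numpy as np

def s_membership(x, low, high):
    """Smoothstep-style fuzzy membership rising from 0 below `low`
    to 1 above `high`."""
    t = np.clip((x - low) / (high - low), 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

sigma0_db, elev_above_drain_m = -14.0, 1.2      # hypothetical pixel values
mu_dark = s_membership(-sigma0_db, low=10.0, high=16.0)     # darker -> wetter
mu_low = s_membership(-elev_above_drain_m, low=-5.0, high=0.0)  # lower -> wetter
flood_score = min(mu_dark, mu_low)              # conjunctive ("AND") aggregation
print(mu_dark, mu_low, flood_score)
```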
18 CFR 11.16 - Filing requirements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... generating capacity separately designated. (3) A description of the total storage capacity of the reservoir..., irrigation storage, and flood control storage. Identification, by reservoir elevation, of the portion of the reservoir assigned to each of its respective storage functions. (4) An elevation-capacity curve, or a...
18 CFR 11.16 - Filing requirements.
Code of Federal Regulations, 2011 CFR
2011-04-01
... generating capacity separately designated. (3) A description of the total storage capacity of the reservoir..., irrigation storage, and flood control storage. Identification, by reservoir elevation, of the portion of the reservoir assigned to each of its respective storage functions. (4) An elevation-capacity curve, or a...
Volumes of recent floods and potential for storage in upland watershed areas of Iowa
Buchmiller, Robert C.; Eash, David A.; Harvey, Craig A.
2000-01-01
During the autumn of 1997, the U.S. Geological Survey (USGS), in cooperation with the U.S. Environmental Protection Agency, began a study to determine the volume of water associated with recent flood events in parts of the Midwestern United States and to make a preliminary evaluation of the potential of upland areas for storage of floodwaters in selected watersheds. This analysis, although preliminary, may be useful in determining the feasibility of conducting additional, more detailed studies into the role of upland areas in a watershed management strategy. The methods and results of this preliminary hydrologic study are presented in this report.
A framework for global river flood risk assessments
NASA Astrophysics Data System (ADS)
Winsemius, H. C.; Van Beek, L. P. H.; Jongman, B.; Ward, P. J.; Bouwman, A.
2013-05-01
There is an increasing need for strategic global assessments of flood risks in current and future conditions. In this paper, we propose a framework for global flood risk assessment for river floods, which can be applied in current conditions, as well as in future conditions due to climate and socio-economic changes. The framework's goal is to establish flood hazard and impact estimates at a high enough resolution to allow for their combination into a risk estimate, which can be used for strategic global flood risk assessments. The framework estimates hazard at a resolution of ~ 1 km2 using global forcing datasets of the current (or in scenario mode, future) climate, a global hydrological model, a global flood-routing model, and more importantly, an inundation downscaling routine. The second component of the framework combines hazard with flood impact models at the same resolution (e.g. damage, affected GDP, and affected population) to establish indicators for flood risk (e.g. annual expected damage, affected GDP, and affected population). The framework has been applied using the global hydrological model PCR-GLOBWB, which includes an optional global flood routing model DynRout, combined with scenarios from the Integrated Model to Assess the Global Environment (IMAGE). We performed downscaling of the hazard probability distributions to 1 km2 resolution with a new downscaling algorithm, applied on Bangladesh as a first case study application area. We demonstrate the risk assessment approach in Bangladesh based on GDP per capita data, population, and land use maps for 2010 and 2050. Validation of the hazard estimates has been performed using the Dartmouth Flood Observatory database. This was done by comparing a high return period flood with the maximum observed extent, as well as by comparing a time series of a single event with Dartmouth imagery of the event. Validation of modelled damage estimates was performed using observed damage estimates from the EM-DAT database and World Bank sources. We discuss and show sensitivities of the estimated risks with regard to the use of different climate input sets, decisions made in the downscaling algorithm, and different approaches to establish impact models.
Enhancing water supply through reservoir reoperation
NASA Astrophysics Data System (ADS)
Rajagopal, S.; Sterle, K. M.; Jose, L.; Coors, S.; Pohll, G.; Singletary, L.
2017-12-01
Snowmelt is a significant contributor to water supply in the western U.S., where it is stored in reservoirs for use during peak summer demand. The reservoirs were built to satisfy multiple objectives, but primarily to enhance water supply and/or mitigate floods. The operating rules for these water supply reservoirs are based on historical assumptions of climate stationarity: peak snowmelt is assumed to occur after April 1, so water arriving earlier must be passed through. Using the Truckee River as an example, a river that originates in the eastern Sierra Nevada, has seven reservoirs, and is shared between California and Nevada, we show enhanced water storage by altering reservoir operating rules. These results are based on a coupled hydrologic model (GSFLOW, which couples groundwater and surface-water flow) and a water management model (RiverWare) developed for the river system. All the reservoirs in the system benefit from altering the reservoir rules, but some benefit more than others. Prosser Creek Reservoir, for example, historically averaged 76% of capacity, which dropped to 46% of capacity in future simulations as the climate warms and shifts snowmelt to earlier in the year. This reduction in storage can be mitigated by altering the reservoir operation rules, increasing storage to 64-76% of capacity. There are limits to altering operating rules, as reservoirs operated primarily for flood control are required to maintain lower storage to absorb a flood pulse; yet using modeling we show that there are water supply benefits to adopting more flexible rules of operation. In the future, due to changing climate, we anticipate that reservoirs in the western U.S., which typically capture spring-summer snowmelt, will have to be managed more actively as the water stored in the snowpack becomes more variable. This study presents a framework for understanding, modeling and quantifying the consequences of such a shift in hydrology and water management.
1983-07-01
storage areas were taken into account during the flood routings. The computer program REVPULS, developed for this report, reverse Modified Puls...routed the hydrograph at Batavia through the storage upstream of the LVRR embankment. Subtracting this reverse-routed hydrograph from the combined...segments to form a more accurate reconstitution. The hydrographs upstream of Batavia were derived by reverse-routing and prorating by drainage area.
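For orientation, Modified Puls (level-pool) routing solves the storage-indication relation (2S2/dt + O2) = (I1 + I2) + (2S1/dt - O1) each step; reverse routing, as REVPULS apparently did, runs the same relation backwards to recover an upstream hydrograph. The sketch below shows the forward direction with an invented storage-outflow rating; it is not the REVPULS code.

```python
import numpy as np

def modified_puls(inflows, storages, outflows_rating, dt, o0=0.0):
    """Level-pool routing: invert the storage-indication curve
    2S/dt + O at each step to find the outflow, given a monotone
    storage-outflow rating (parallel arrays storages/outflows_rating)."""
    ind = 2.0 * np.asarray(storages) / dt + np.asarray(outflows_rating)
    out, o = [o0], o0
    s = np.interp(o, outflows_rating, storages)      # initial storage
    for i1, i2 in zip(inflows, inflows[1:]):
        rhs = i1 + i2 + 2.0 * s / dt - o             # = 2S2/dt + O2
        o = float(np.interp(rhs, ind, outflows_rating))
        s = (rhs - o) * dt / 2.0
        out.append(o)
    return out

print(modified_puls([0, 20, 50, 40, 20, 5, 0],
                    storages=[0, 4e4, 1.2e5, 3e5],      # m3 (hypothetical)
                    outflows_rating=[0, 5, 15, 40],      # m3/s (hypothetical)
                    dt=3600.0))
```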
Zhang, Geli; Xiao, Xiangming; Dong, Jinwei; Kou, Weili; Jin, Cui; Qin, Yuanwei; Zhou, Yuting; Wang, Jie; Menarguez, Michael Angelo; Biradar, Chandrashekhar
2015-08-01
Knowledge of the area and spatial distribution of paddy rice is important for assessment of food security, management of water resources, and estimation of greenhouse gas (methane) emissions. Paddy rice agriculture has expanded rapidly in northeastern China in the last decade, but there are no updated maps of paddy rice fields in the region. Existing algorithms for identifying paddy rice fields are based on the unique physical features of paddy rice during the flooding and transplanting phases and use vegetation indices that are sensitive to the dynamics of the canopy and surface water content. However, flooding signals in high-latitude areas can also result from spring snowmelt. We used land surface temperature (LST) data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor to determine the temporal window of flooding and rice transplantation over a year to improve the existing phenology-based approach. Other land cover types (e.g., evergreen vegetation, permanent water bodies, and sparse vegetation) with potential influences on paddy rice identification were removed (masked out) due to their different temporal profiles. The accuracy assessment using high-resolution images showed that the resultant MODIS-derived paddy rice map of northeastern China in 2010 had a high accuracy (producer and user accuracies of 92% and 96%, respectively). The MODIS-based map also had a comparable accuracy to the 2010 Landsat-based National Land Cover Dataset (NLCD) of China in terms of both area and spatial pattern. This study demonstrated that our improved algorithm, which uses both thermal and optical MODIS data, provides a robust, simple and automated approach to identify and map paddy rice fields in temperate and cold temperate zones, the northern frontier of rice planting.
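Phenology-based rice mapping of this family typically flags a flooding/transplanting signal when a water index approaches or exceeds a vegetation index; the sketch below combines that rule (the commonly cited LSWI + 0.05 >= EVI form from the literature) with an LST screen in the spirit of the abstract. The temperature cutoff is an illustrative placeholder, not the study's calibrated value.

```python
def flooding_signal(lswi, evi, lst_celsius, t_transplant_min=5.0):
    """Flag a pixel/date as flooded-and-transplanting when the water
    index rivals the vegetation index AND land surface temperature
    indicates the growing season (screening out snowmelt flooding)."""
    return (lswi + 0.05 >= evi) and (lst_celsius >= t_transplant_min)

print(flooding_signal(lswi=0.21, evi=0.18, lst_celsius=14.0))  # True
print(flooding_signal(lswi=0.30, evi=0.10, lst_celsius=-3.0))  # snowmelt: False
```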
Integration of Remote Sensing Data In Operational Flood Forecast In Southwest Germany
NASA Astrophysics Data System (ADS)
Bach, H.; Appel, F.; Schulz, W.; Merkel, U.; Ludwig, R.; Mauser, W.
Methods to accurately assess and forecast flood discharge are mandatory to minimise the impact of hydrological hazards. However, existing rainfall-runoff models rarely consider the spatial characteristics of the watershed accurately, which is essential for a suitable, physics-based description of the processes relevant to runoff formation. Spatial information with low temporal variability, such as elevation, slopes and land use, can be mapped or extracted from remote sensing data. However, land surface parameters of high temporal variability, like soil moisture and snow properties, are hardly available and rarely used in operational forecasts. Remote sensing methods can improve flood forecasts by providing information on the actual water retention capacities in the watershed and facilitate the regionalisation of hydrological models. To prove and demonstrate this, the project 'InFerno' (Integration of remote sensing data in operational water balance and flood forecast modelling) has been set up, funded by DLR (50EE0053). Within InFerno, remote sensing data (optical and microwave) are thoroughly processed to deliver spatially distributed parameters of snow properties and soil moisture. Especially during the onset of a flood, this information is essential to estimate the initial conditions of the model. At the flood forecast centres of Baden-Württemberg and Rheinland-Pfalz (Southwest Germany), the remote sensing based maps of soil moisture and snow properties will be integrated into the continuously operated water balance and flood forecast model LARSIM. The concept is to transfer the developed methodology from the Neckar to the Mosel basin. The major challenges lie, on the one hand, in the implementation of algorithms developed for multisensoral synergy and the creation of robust, operationally applicable remote sensing products; on the other hand, the operational flood forecast must be adapted to make full use of the new data sources. In the operational phase of the project, ESA's ENVISAT satellite, to be launched in 2002, will serve as the remote sensing data source. Until ENVISAT data are available, algorithm retrieval, software development and product generation are performed using existing sensors with ENVISAT-like specifications. Based on these data sets, test cases and demonstration runs are conducted and will be presented to prove the advantages of the approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ye, Sheng; Covino, Timothy P.; Sivapalan, Murugesu
In this paper, we use a dynamic network flow model, coupled with a transient storage zone biogeochemical model, to simulate dissolved nutrient removal processes at the channel network scale. We explored several scenarios combining rainfall variability with the biological and geomorphic characteristics of the catchment to understand the dominant controls on removal and delivery of dissolved nutrients (e.g., nitrate). These model-based theoretical analyses suggested that while nutrient removal efficiency is lower during flood events than during baseflow periods, flood events contribute significantly more to bulk nutrient removal than baseflow periods do. This is because nutrient supply is larger during flood events; this trend is even stronger in large rivers. The efficiency of removal during both periods decreases in larger rivers, however, due to (i) increasing flow velocities and thus decreasing residence time, and (ii) increasing flow depth and thus decreasing nutrient uptake rates. Nutrient removal can further be divided into two parts: removal in the main channel and removal in the hyporheic transient storage zone. In assessing their relative contributions, the size of the transient storage zone is the dominant control, followed by the uptake rates in the main channel and in the transient storage zone. Increasing size of the transient storage zone with downstream distance affects the relative contributions of the water column and the transient storage zone to nutrient removal, which also impacts the way nutrient removal rates scale with increasing river size. Intra-annual hydrologic variability has a significant impact on removal rates at all scales: the more variable the streamflow is relative to mean discharge, the less nutrient is removed in the channel network. A scale-independent first-order uptake coefficient, ke, estimated from model simulations, is highly dependent on the relative size of the transient storage zone and how it changes in the downstream direction, as well as on the nature of hydrologic variability.
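A minimal sketch of the first-order uptake formulation that underlies such models: with an uptake velocity vf and flow depth h, the rate coefficient is k = vf/h, and the removal fraction over a reach follows from the residence time. The parameter values below are assumed purely for illustration; the two calls show why efficiency drops in larger rivers (deeper, faster flow).

```python
import numpy as np

def removal_fraction(vf, depth, length, velocity):
    """First-order nutrient removal over one reach: k = vf / h (uptake
    velocity over flow depth), R = 1 - exp(-k * L / v)."""
    k = vf / depth                    # first-order uptake rate [1/s]
    tau = length / velocity           # residence time in the reach [s]
    return 1.0 - np.exp(-k * tau)

# Same reach length, contrasting a small stream with a large river:
print(removal_fraction(vf=5e-5, depth=0.3, length=5e3, velocity=0.3))  # ~0.94
print(removal_fraction(vf=5e-5, depth=3.0, length=5e3, velocity=1.0))  # ~0.08
```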
NASA Astrophysics Data System (ADS)
Monnier, J.; Couderc, F.; Dartus, D.; Larnier, K.; Madec, R.; Vila, J.-P.
2016-11-01
The 2D shallow water equations adequately model some geophysical flows with wet-dry fronts (e.g. flood plain or tidal flows); nevertheless, deriving accurate, robust and conservative numerical schemes for dynamic wet-dry fronts over complex topographies remains a challenge. Furthermore, for these flows, data are generally complex, multi-scale and uncertain. Robust variational inverse algorithms, providing sensitivity maps and data assimilation processes, may contribute to a breakthrough in modelling shallow wet-dry front dynamics. The present study aims at deriving an accurate, positive and stable finite volume scheme in the presence of dynamic wet-dry fronts, together with corresponding inverse computational algorithms (variational approach). The schemes and algorithms are assessed on classical and original benchmarks plus a real flood plain test case (Lèze river, France). Original sensitivity maps with respect to the (friction, topography) pair are computed and discussed. The identification of inflow discharges (time series) and friction coefficients (spatially distributed parameters) demonstrates the efficiency of the algorithms.
On identifying relationships between the flood scaling exponent and basin attributes.
Medhi, Hemanta; Tripathi, Shivam
2015-07-01
Floods are known to exhibit self-similarity and follow scaling laws that form the basis of regional flood frequency analysis. However, the relationship between basin attributes and the scaling behavior of floods is still not fully understood. Identifying these relationships is essential for drawing connections between hydrological processes in a basin and the flood response of the basin. Existing studies mostly rely on simulation models to draw these connections. This paper proposes a new methodology that draws connections between basin attributes and flood scaling exponents by using observed data. In the proposed methodology, a region-of-influence approach is used to delineate homogeneous regions for each gaging station. Ordinary least squares regression is then applied to estimate the flood scaling exponent for each homogeneous region, and finally stepwise regression is used to identify the basin attributes that affect the flood scaling exponents. The effectiveness of the proposed methodology is tested by applying it to data from river basins in the United States. The results suggest that the flood scaling exponent is small for regions having (i) large abstractions from precipitation in the form of large soil moisture storages and high evapotranspiration losses, and (ii) large fractions of overland flow compared to base flow, i.e., fast-responding basins. Analysis of simple scaling and multiscaling of floods showed evidence of simple scaling for regions in which snowfall dominates total precipitation.
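The core estimation step, fitting the scaling exponent by ordinary least squares in log space, can be sketched as follows; the peak-discharge and drainage-area values are invented for illustration.

```python
import numpy as np

# Hypothetical sample: flood peaks Q (m^3/s) and drainage areas A (km^2)
# for gauges in one homogeneous region (values are illustrative only).
A = np.array([120.0, 450.0, 980.0, 2300.0, 5100.0])
Q = np.array([35.0, 90.0, 160.0, 310.0, 560.0])

# The scaling law Q = c * A**theta becomes linear in log space:
# log Q = log c + theta * log A, so theta is the OLS slope.
theta, log_c = np.polyfit(np.log(A), np.log(Q), deg=1)
print(f"scaling exponent theta = {theta:.2f}, coefficient c = {np.exp(log_c):.2f}")
```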
Real-Time Application of Multi-Satellite Precipitation Analysis for Floods and Landslides
NASA Technical Reports Server (NTRS)
Adler, Robert; Hong, Yang; Huffman, George
2007-01-01
Satellite data acquired and processed in real time now have the potential to provide the space-time information on rainfall needed to monitor flood and landslide events around the world. This can be achieved by integrating the satellite-derived forcing data with hydrological models and landslide algorithms. Progress in using the TRMM Multi-satellite Precipitation Analysis (TMPA) as input to flood and landslide forecasts is outlined, with a focus on understanding the limitations of the rainfall data and the impacts of those limitations on flood/landslide analyses. Case studies of both successes and failures are shown, as well as comparisons with ground data sets, both in terms of rainfall and in terms of flood/landslide events. In addition to potential uses in real time, the nearly ten years of TMPA data allow retrospective running of the models to examine variations in extreme events. The flood determination algorithm consists of four major components: 1) multi-satellite precipitation estimation; 2) characterization of the land surface, including digital elevation from NASA SRTM (Shuttle Radar Topography Mission) and topography-derived hydrologic parameters such as flow direction, flow accumulation, basins, and the river network; 3) a hydrological model to infiltrate rainfall and route overland runoff; and 4) an implementation interface to relay the input data to the models and display the flood inundation results to potential users and decision-makers. In terms of landslides, the satellite rainfall information is combined with a global landslide susceptibility map, derived from a combination of global surface characteristics (digital elevation topography, slope, soil types, soil texture, land cover classification, etc.) using a weighted linear combination approach. In those areas identified as "susceptible" (based on the surface characteristics), landslides are forecast where and when a rainfall intensity/duration threshold is exceeded. Results are described indicating general agreement with landslide occurrences.
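A minimal sketch of the kind of rainfall intensity-duration trigger described above, of the form I = alpha * D^(-beta); the coefficients here are assumed for illustration and are not those of the operational system.

```python
def landslide_alert(intensity_mm_hr: float, duration_hr: float,
                    alpha: float = 12.0, beta: float = 0.6) -> bool:
    """Return True when rainfall exceeds the intensity-duration threshold
    I = alpha * D**(-beta); alpha and beta are illustrative values."""
    threshold = alpha * duration_hr ** (-beta)
    return intensity_mm_hr > threshold

# 8 mm/hr sustained for 6 hours exceeds the assumed threshold (~4.1 mm/hr).
print(landslide_alert(intensity_mm_hr=8.0, duration_hr=6.0))   # True
```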
BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods
NASA Astrophysics Data System (ADS)
Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.
2017-12-01
Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in frequency and intensity of heavy rainfall events in many areas and ongoing urbanization may increase pluvial flood losses in the future. For an efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Only a few loss models have been developed specifically for pluvial floods. These models usually use simple water-level- or rainfall-loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data were gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models, were used to identify the most important loss influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian Networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made through Bayesian inference using Markov chain Monte Carlo (MCMC) sampling. With the ability to cope with incomplete information and use expert knowledge, as well as inherently providing quantitative uncertainty information, it is shown that loss models based on BNs are superior to deterministic approaches for pluvial flood risk assessment.
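The first-step variable screening could look roughly like the sketch below, using random forest importances as one example of a tree-based screen; the synthetic data and variable count merely stand in for the survey database, and this is not the authors' exact pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# Stand-in for the survey data: 783 records, 55 candidate predictors
# (hydrological, warning, precaution, building, socio-economic variables).
X = rng.normal(size=(783, 55))
loss = X[:, 0] * 0.8 + X[:, 3] * 0.4 + rng.normal(scale=0.5, size=783)

rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, loss)
top = np.argsort(rf.feature_importances_)[::-1][:10]
print("ten most important candidate variables:", top)
```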
Prediction of Flood Warning in Taiwan Using Nonlinear SVM with Simulated Annealing Algorithm
NASA Astrophysics Data System (ADS)
Lee, C.
2013-12-01
Flooding is an important issue in Taiwan because the island's narrow, mountainous topography makes many of its rivers steep, and tropical cyclones such as typhoons regularly cause them to flood. Predicting river flow under extreme rainfall is therefore important for the government when announcing flood warnings; every time a typhoon passes through Taiwan, floods occur along some rivers. Warnings are classified into three levels according to warning water levels in Taiwan. The purpose of this study is to predict the level of flood warning from information on precipitation, rainfall duration and the slope of the riverbed. To classify the level of flood warning from this information, a machine learning model, the nonlinear support vector machine (SVM), is formulated. In addition, simulated annealing (SA), a probabilistic heuristic algorithm, is used to determine the optimal parameters of the SVM model. A case study of flood-prone rivers of different gradients in Taiwan is conducted. The contribution of this SVM model with simulated annealing is that it can make efficient flood warning announcements and help keep residents along the rivers out of danger.
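A hedged sketch of the described combination: simulated annealing searching the SVM hyperparameters with cross-validated accuracy as the objective. The training data, search ranges and cooling schedule are illustrative assumptions, not the study's settings.

```python
import math, random
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = random.Random(0)
# Stand-in training data: warning level derived from (precipitation,
# duration, riverbed slope); the real study uses observed flood records.
X = np.random.default_rng(0).normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] - X[:, 2] > 0).astype(int) + (X[:, 0] > 1.0)

def cv_accuracy(log_c, log_g):
    clf = SVC(C=10.0 ** log_c, gamma=10.0 ** log_g, kernel="rbf")
    return cross_val_score(clf, X, y, cv=5).mean()

state = (0.0, -1.0)                       # (log10 C, log10 gamma)
energy = -cv_accuracy(*state)
best_state, best_energy = state, energy
temperature = 1.0
for _ in range(100):
    cand = (state[0] + rng.gauss(0, 0.5), state[1] + rng.gauss(0, 0.5))
    e = -cv_accuracy(*cand)
    # accept downhill moves always, uphill moves with Boltzmann probability
    if e < energy or rng.random() < math.exp((energy - e) / temperature):
        state, energy = cand, e
        if e < best_energy:
            best_state, best_energy = cand, e
    temperature *= 0.95                   # geometric cooling schedule
print("best (log10 C, log10 gamma):", best_state, "CV accuracy:", -best_energy)
```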
NASA Astrophysics Data System (ADS)
Fang, Kuai; Shen, Chaopeng
2017-09-01
Interannual changes in low, median, and high regimes of streamflow have important implications for flood control, irrigation, and ecologic and human health. The Gravity Recovery and Climate Experiment (GRACE) satellites record global terrestrial water storage anomalies (TWSA), providing an opportunity to observe, interpret, and potentially utilize the complex relationships between storage and full-flow-regime streamflow. Here we show that utilizable storage-streamflow correlations exist throughout vastly different climates in the continental US (CONUS) across low- to high-flow regimes. A panoramic framework, the storage-streamflow correlation spectrum (SSCS), is proposed to examine macroscopic gradients in these relationships. SSCS helps form, corroborate or reject hypotheses about basin hydrologic behaviors. SSCS patterns vary greatly over CONUS with climate, land surface, and geologic conditions. Data mining analysis suggests that for catchments with hydrologic settings that favor storage over runoff, e.g., a large fraction of precipitation as snow and thick, highly permeable soil, SSCS values tend to be high. Based on our results, we form the hypotheses that groundwater flow dominates streamflows in the southeastern CONUS and the Great Plains, while thin soils in a belt along the Appalachian Plateau impose a limit on water storage. SSCS also suggests that a shallow water table, caused by high bulk-density soil and flat terrain, induces rapid runoff in several regions. Our results highlight the importance of subsurface properties and groundwater flow in capturing flood and drought. We propose that SSCS can be used as a fundamental hydrologic signature to constrain models and to provide insights that lead us to better understand hydrologic functioning.
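One plausible reading of the SSCS computation, correlating annual storage anomalies with annual flow percentiles per regime, can be sketched as follows; the data are synthetic and the paper's exact estimator may differ.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
# Stand-in annual series for one catchment: GRACE TWSA anomalies and
# daily streamflow (values illustrative; real inputs are observations).
years = 15
twsa = rng.normal(size=years)                        # annual mean TWSA
flows = rng.lognormal(mean=1.0, sigma=0.6, size=(years, 365)) + twsa[:, None]

# One correlation per flow regime: low (Q5), median (Q50), high (Q95).
for pct in (5, 50, 95):
    q = np.percentile(flows, pct, axis=1)            # annual flow percentile
    rho, p = spearmanr(twsa, q)
    print(f"SSCS rho(TWSA, Q{pct}) = {rho:.2f} (p = {p:.2f})")
```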
Likelihood-based confidence intervals for estimating floods with given return periods
NASA Astrophysics Data System (ADS)
Martins, Eduardo Sávio P. R.; Clarke, Robin T.
1993-06-01
This paper discusses aspects of the calculation of likelihood-based confidence intervals for T-year floods, with particular reference to (1) the two-parameter gamma distribution; (2) the Gumbel distribution; (3) the two-parameter log-normal distribution, and other distributions related to the normal by Box-Cox transformations. Calculation of the confidence limits is straightforward using the Nelder-Mead algorithm with a constraint incorporated, although care is necessary to ensure convergence either of the Nelder-Mead algorithm, or of the Newton-Raphson calculation of maximum-likelihood estimates. Methods are illustrated using records from 18 gauging stations in the basin of the River Itajai-Acu, State of Santa Catarina, southern Brazil. A small and restricted simulation compared likelihood-based confidence limits with those given by use of the central limit theorem; for the same confidence probability, the confidence limits of the simulation were wider than those of the central limit theorem, which failed more frequently to contain the true quantile being estimated. The paper discusses possible applications of likelihood-based confidence intervals in other areas of hydrological analysis.
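A minimal sketch of a likelihood-based (profile likelihood) interval for the T-year Gumbel quantile using Nelder-Mead, in the spirit of the constrained optimisation described above; the data are synthetic and the grid search is a simplification.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2

rng = np.random.default_rng(2)
x = rng.gumbel(loc=100.0, scale=30.0, size=40)   # synthetic annual maxima
T = 100.0
y = -np.log(-np.log(1.0 - 1.0 / T))              # Gumbel reduced variate

def nll(params):
    """Negative log-likelihood of the Gumbel distribution."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    z = (x - mu) / sigma
    return np.sum(np.log(sigma) + z + np.exp(-z))

fit = minimize(nll, x0=[np.mean(x), np.std(x)], method="Nelder-Mead")
mu_hat, sig_hat = fit.x
q_hat = mu_hat + sig_hat * y                     # MLE of the T-year flood

def profile_nll(q):
    # Hold the quantile fixed and maximise over sigma (mu = q - sigma * y).
    res = minimize(lambda s: nll([q - s[0] * y, s[0]]),
                   x0=[sig_hat], method="Nelder-Mead")
    return res.fun

# 95% interval: quantiles whose deviance stays below the chi-square cutoff.
cut = fit.fun + 0.5 * chi2.ppf(0.95, df=1)
grid = np.linspace(q_hat * 0.7, q_hat * 1.6, 200)
inside = [q for q in grid if profile_nll(q) <= cut]
print(f"q_{int(T)} = {q_hat:.1f}, 95% CI = ({min(inside):.1f}, {max(inside):.1f})")
```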
The study on the control strategy of micro grid considering the economy of energy storage operation
NASA Astrophysics Data System (ADS)
Ma, Zhiwei; Liu, Yiqun; Wang, Xin; Li, Bei; Zeng, Ming
2017-08-01
To optimize the operation of a microgrid, guarantee the balance of electricity supply and demand, and promote the utilization of renewable energy, the control strategy of the microgrid energy storage system is studied. Firstly, a mixed integer linear programming model is established based on receding horizon control. Secondly, a modified cuckoo search algorithm is proposed to solve the model. Finally, a case study is carried out to examine the signal characteristics of the microgrid and batteries under the optimal control strategy, and the convergence of the modified cuckoo search algorithm is compared with other algorithms to verify the validity of the proposed model and method. The results show that different microgrid operating targets affect the control strategy of the energy storage system, which in turn affects the signal characteristics of the microgrid. Meanwhile, the convergence speed, computing time and economy of the modified cuckoo search algorithm are improved compared with the traditional cuckoo search algorithm and the differential evolution algorithm.
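For reference, a standard (unmodified) cuckoo search skeleton is sketched below; the paper's modifications are not reproduced, and the sphere-function objective is only a stand-in for the microgrid dispatch model.

```python
import math
import numpy as np

def cuckoo_search(objective, dim, n_nests=15, iters=200, pa=0.25, seed=0):
    """Standard cuckoo search: Levy-flight steps around the best nest,
    plus abandonment of a fraction pa of the worst nests each generation."""
    rng = np.random.default_rng(seed)
    nests = rng.uniform(-5.0, 5.0, size=(n_nests, dim))
    fitness = np.array([objective(n) for n in nests])
    beta = 1.5                                   # Levy exponent (Mantegna scheme)
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    for _ in range(iters):
        best = nests[np.argmin(fitness)]
        u = rng.normal(0.0, sigma, size=(n_nests, dim))
        v = rng.normal(size=(n_nests, dim))
        step = u / np.abs(v) ** (1 / beta)       # heavy-tailed Levy steps
        trial = nests + 0.01 * step * (nests - best)
        trial_fit = np.array([objective(n) for n in trial])
        better = trial_fit < fitness
        nests[better], fitness[better] = trial[better], trial_fit[better]
        # abandon the worst nests and rebuild them at random positions
        worst = fitness.argsort()[-max(1, int(pa * n_nests)):]
        nests[worst] = rng.uniform(-5.0, 5.0, size=(len(worst), dim))
        fitness[worst] = [objective(n) for n in nests[worst]]
    i = np.argmin(fitness)
    return nests[i], fitness[i]

x_best, f_best = cuckoo_search(lambda x: float(np.sum(x ** 2)), dim=4)
print(x_best, f_best)
```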
Reservoir operations under climate change: Storage capacity options to mitigate risk
NASA Astrophysics Data System (ADS)
Ehsani, Nima; Vörösmarty, Charles J.; Fekete, Balázs M.; Stakhiv, Eugene Z.
2017-12-01
Observed changes in precipitation patterns, rising surface temperature, increases in the frequency and intensity of floods and droughts, widespread melting of ice, and reduced snow cover are some of the documented hydrologic changes associated with global climate change. Climate change is therefore expected to affect the water supply-demand balance in the Northeast United States and challenge existing water management strategies. The hydrological implications of future climate will affect the design capacity and operating characteristics of dams. The vulnerability of water resources systems to floods and droughts will increase, and the trade-offs between reservoir releases to maintain flood control storage, drought resilience, ecological flow, human water demand, and energy production should be reconsidered. We used a Neural Networks based General Reservoir Operation Scheme to estimate the implications of climate change for dams on a regional scale. This dynamic daily reservoir module automatically adapts to changes in climate and re-adjusts the operation of dams based on water storage level, timing, and magnitude of incoming flows. Our findings suggest that the importance of dams in providing water security in the region will increase. We create an indicator of the Effective Degree of Regulation (EDR) by dams on water resources and show that it is expected to increase, particularly during the drier months of the year, simply as a consequence of projected climate change. The results also indicate that increasing the size and number of dams, in addition to modifying their operations, may become necessary to offset the vulnerabilities of water resources systems to future climate uncertainties. This is the case even without considering the likely increase in future water demand, especially in the most densely populated regions of the Northeast.
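The abstract does not spell out the EDR formula; one plausible degree-of-regulation style indicator, the ratio of usable storage to monthly flow volume, is sketched below with invented numbers purely to show the mechanics (this is an assumption, not the paper's definition).

```python
import numpy as np

storage_m3 = 5.0e8                                  # usable storage (assumed)
monthly_flow_m3 = np.array([9e8, 8e8, 1.2e9, 1.5e9, 9e8, 5e8,
                            3e8, 2e8, 2.5e8, 4e8, 6e8, 8e8])   # assumed flows

# A degree-of-regulation style ratio: storage relative to monthly inflow.
edr = storage_m3 / monthly_flow_m3
print("indicator peaks in the driest months:", edr.round(2))
```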
A pipelined FPGA implementation of an encryption algorithm based on genetic algorithm
NASA Astrophysics Data System (ADS)
Thirer, Nonel
2013-05-01
With the evolution of digital data storage and exchange, it is essential to protect confidential information from any unauthorized access. High performance encryption algorithms have been developed and implemented in software and hardware, and many methods of attacking cipher texts have been developed as well. In recent years, the genetic algorithm has gained much interest in the cryptanalysis of cipher texts and also in encryption ciphers. This paper analyses the possibility of using the genetic algorithm as a multiple key sequence generator for an AES (Advanced Encryption Standard) cryptographic system, and of using a three-stage pipeline (with four main blocks: Input data, AES Core, Key generator, Output data) to provide fast encryption and storage/transmission of a large amount of data.
Mueller, Erich R.; Grams, Paul E.; Hazel, Joseph E.; Schmidt, John C.
2018-01-01
Sandbars are iconic features of the Colorado River in the Grand Canyon, Arizona, U.S.A. Following completion of Glen Canyon Dam in 1963, sediment deficit conditions caused erosion of eddy sandbars throughout much of the 360 km study reach downstream from the dam. Controlled floods in 1996, 2004, and 2008 demonstrated that sand on the channel bed could be redistributed to higher elevations, and that floods timed to follow tributary sediment inputs would increase suspended sand concentrations during floods. Since 2012, a new management protocol has resulted in four controlled floods timed to follow large inputs of sand from a major tributary. Monitoring of 44 downstream eddy sandbars, initiated in 1990, shows that each controlled flood deposited significant amounts of sand and increased the size of subaerial sandbars. However, the magnitude of sandbar deposition varied from eddy to eddy, even over relatively short distances where main-stem suspended sediment concentrations were similar. Here, we characterize spatial and temporal trends in sandbar volume and site-scale (i.e., individual eddy) sediment storage as a function of flow, channel, and vegetation characteristics that reflect the reach-scale (i.e., kilometer-scale) hydraulic environment. We grouped the long-term monitoring sites based on geomorphic setting and used a principal component analysis (PCA) to correlate differences in sandbar behavior to changes in reach-scale geomorphic metrics. Sites in narrow reaches are less-vegetated, stage changes markedly with discharge, sandbars tend to remain dynamic, and sand storage change dominantly occurs in the eddy compared to the main channel. In wider reaches, where stage-change during floods may be half that of narrow sites, sandbars are more likely to be stabilized by vegetation, and floods tend to aggrade the vegetated sandbar surfaces. In these locations, deposition during controlled floods is more akin to floodplain sedimentation, and the elevation of sandbar surfaces increases with successive floods. Because many sandbars are intermediate to the end members described above, high-elevation bar surfaces stabilized by vegetation often have a more dynamic unvegetated sandbar on the channel-ward margin that aggrades and erodes in response to controlled flood cycles. Ultimately, controlled floods have been effective at increasing averaged sandbar volumes, and, while bar deposition during floods decreases through time where vegetation has stabilized sandbars, future controlled floods are likely to continue to result in deposition in a majority of the river corridor.
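The PCA step can be sketched as follows; the site-by-metric table is synthetic, and the metric choice is only indicative of the reach-scale variables named above.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
# Stand-in table: 44 monitoring sites x 4 reach-scale metrics (e.g. channel
# width, stage change during floods, vegetation cover, eddy size).
metrics = rng.normal(size=(44, 4))

# Standardize, then project sites onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(metrics))
print(scores[:3])   # site coordinates on the first two components
```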
Monitoring on The Quality and Quantity of DIY Rainwater Harvesting System
NASA Astrophysics Data System (ADS)
Kasmin, H.; Bakar, N. H.; Zubir, M. M.
2016-07-01
Rainwater harvesting is an alternative source of water supply and can be used for potable and non-potable uses. It can help store treated rainwater for more beneficial use and also contribute to flood mitigation. A sustainable approach to reducing flooding problems in urban areas is to slow the rate of surface runoff at the source by providing more storage area/tanks. In order to understand the performance of a rainwater harvesting (RWH) system, preliminary monitoring of a 'do it yourself' (DIY) RWH model with an additional first-flush strategy for water quality treatment was carried out. The main concept behind first-flush diversion is to prevent the initially polluted rainwater from entering the storage tank. Based on seven rainfall events observed in Parit Raja, both the quality and quantity of the rainfall were analysed. For rainwater quality, samples from the first-flush diverter and the storage tank were taken and their performance assessed in terms of pH, dissolved oxygen (DO), turbidity, total dissolved solids (TDS), total suspended solids (TSS), chemical oxygen demand (COD) and biochemical oxygen demand (BOD). For rainwater quantity, hydrograph analyses were performed on total rainfall and runoff, peak rainfall and runoff flows, and delay-time parameters. Based on the Interim National Water Quality Standard (INWQS) and the National Drinking Water Quality Standard (NDWQS), the first-flush diverter apparently helps improve water quality in the storage tank: pH, DO, TDS, TSS and turbidity were classified as Class I (INWQS), allowable for drinking, but BOD and COD were classified as Class III (INWQS). Hence, the harvested water has potential for potable use but will need extensive treatment to reduce its poor microbial quality. The maximum observed rainfall event, with a total volume of 3195.5 liters, showed a peak flow reduction from 0.00071 m3/s to 0.00034 m3/s and a runoff delay of between 5 and 10 minutes after rainfall started. It is concluded that the water retention performance depends on total rainfall and tank capacity. Therefore, RWH has potential for potable use and, at the same time, for reducing local urban flooding.
The Need for Modernized Operational Snow Models: A Tale of Two Years
NASA Astrophysics Data System (ADS)
Winstral, A. H.; Marks, D. G.
2014-12-01
The Boise River Basin (BRB) in southwest Idaho, USA contains three major reservoirs totaling nearly 1,000,000 acre-feet of storage capacity. The primary goals for water managers are water supply and flood protection. In terms of observed SWE at monitoring sites throughout the basin, water years 2012 and 2014 were similar and near normal. In WY 2014, inflows into the BRB reservoir system followed historic patterns and reservoir releases were ideally controlled for management goals. WY 2012, however, was warmer than average, and the winter snowpack had uncharacteristically high melt susceptibility. Subsequent energy fluxes produced late-winter inflows much higher than normally encountered. The uncharacteristic flow patterns, and the inability of traditional operational modeling tools to handle this situation, challenged water managers. Through late March and early April 2012, near-flood-stage flows were pushed through the city of Boise in order to increase storage and prevent more catastrophic flooding. While in this case a greater catastrophe was narrowly averted, the shortcomings of the traditional modeling approaches taken by operational agencies were exposed. "Uncharacteristic" events such as these are becoming more and more frequent as the effects of climate change are realized. The need for modernized methods, ones based on the physical controlling processes rather than historic patterns, is imperative. This presentation outlines the latest developments in the application of a physically based, high-resolution spatial snow model to aid operational water management decisions.
Flood Resilient Systems and their Application for Flood Resilient Planning
NASA Astrophysics Data System (ADS)
Manojlovic, N.; Gabalda, V.; Antanaskovic, D.; Gershovich, I.; Pasche, E.
2012-04-01
Following the paradigm shift in flood management from traditional to more integrated approaches, and considering the uncertainties of future development due to drivers such as climate change, one of the main emerging tasks of flood managers becomes the development of (flood) resilient cities. This can be achieved by the application of non-structural flood resilience measures, summarised in the 4As: assistance, alleviation, awareness and avoidance (FIAC, 2007). As part of this strategy, the key aspect of the development of resilient cities, a resilient built environment, can be reached by efficient application of Flood Resilience Technology (FReT) and its meaningful combination into flood resilient systems (FRS). FRS are defined as "an interconnecting network of FReT which facilitates resilience (including both restorative and adaptive capacity) to flooding, addressing physical and social systems and considering different flood typologies" (SMARTeST, http://www.floodresilience.eu/). Applying the system approach (e.g. Zevenbergen, 2008), FRS can be developed at different scales from the building to the city level. A method to define and systematise different FRS across those scales is still a matter of research. Further, the decision on which resilient system is to be applied for the given conditions and given scale is a complex task, calling for the utilisation of decision support tools. This process of decision-making should follow the steps of flood risk assessment (1) and development of a flood resilience plan (2) (Manojlovic et al, 2009). The key problem in (2) is how to match the input parameters that describe the physical and social systems and the flood typology to the appropriate flood resilient system. Additionally, an open issue is how to integrate the advances in FReT and findings on its efficiency into decision support tools. This paper presents a way to define, systematise and make decisions on FRS at different scales of an urban system, developed within the 7th FP project SMARTeST. A web-based three-tier advisory system, FLORETO-KALYPSO (http://floreto.wb.tu-harburg.de/, Manojlovic et al, 2009), devoted to supporting the decision-making process at the building level, has been further developed to support multi-scale decision-making on resilient systems, improving the existing data mining algorithms of the Business Logic tier. Further tuning of the algorithms is to be performed based on new developments and findings on the applicability and efficiency of different FReT for different flood typologies. The first results obtained at the case studies in Greater Hamburg, Germany indicate the potential of this approach to contribute to multi-scale resilient planning on the road to flood resilient cities. FIAC (2007): "Final report from the Awareness and Assistance Sub-committee", FIAC, Scottish Government. Zevenbergen, C. et al (2008): "Challenges in urban flood management: travelling across spatial and temporal scales", Journal of Flood Risk Management, Volume 1, Issue 2, p 81-88. Manojlovic, N., et al (2009): "Capacity Building in FRM through a DSS Utilising Data Mining Approach", Proceedings of the 8th HIC, Concepcion, Chile, January 2009.
Development of Hydrological Model of Klang River Valley for flood forecasting
NASA Astrophysics Data System (ADS)
Mohammad, M.; Andras, B.
2012-12-01
This study reviews the impact of climate change and land use on flooding along the Klang River and compares the changes in the existing river system in the Klang River Basin with the Stormwater Management and Road Tunnel (SMART), which is now operating in the city centre of Kuala Lumpur. The Klang River Basin is the most urbanized region in Malaysia. More than half of the basin has been urbanized on land that is prone to flooding. Numerous flood mitigation projects and studies have been carried out to enhance the existing flood forecasting and mitigation schemes. The objective of this study is to develop a hydrological model for flood forecasting in the Klang Basin, Malaysia. Hydrological modelling generally requires a large set of input data, and this is often a challenge for a developing country. Due to this limitation, rainfall measurements from the Tropical Rainfall Measuring Mission (TRMM), initiated by the US space agency NASA and the Japanese space agency JAXA, were used in this study. The TRMM data were transformed and corrected by quantile-to-quantile transformation. However, transforming the data based on ground measurements did not yield any significant improvement, and the statistical comparison shows only a 10% difference. The conceptual HYMOD model was used in this study and calibrated using the ROPE algorithm. Using the whole time series of the observation period in this area, however, resulted in insufficient performance. The depth function used in the ROPE algorithm was therefore applied to identify unusual events, and the model was calibrated using only these events to assess the improvement in model efficiency.
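A minimal sketch of a quantile-to-quantile (quantile mapping) correction of satellite rainfall against gauge data, under the assumption of empirical CDF matching; the gamma-distributed series below are synthetic stand-ins for the TRMM and gauge records.

```python
import numpy as np

def quantile_map(satellite, gauge):
    """Map each satellite rainfall value onto the gauge climatology by
    matching empirical quantiles (quantile-to-quantile correction)."""
    sat_sorted = np.sort(satellite)
    gauge_q = np.quantile(gauge, np.linspace(0, 1, sat_sorted.size))
    # Look up each value's empirical quantile, then read off the gauge value.
    ranks = np.searchsorted(sat_sorted, satellite, side="right") / satellite.size
    return np.interp(ranks, np.linspace(0, 1, gauge_q.size), gauge_q)

rng = np.random.default_rng(4)
trmm = rng.gamma(shape=0.8, scale=6.0, size=1000)    # synthetic TRMM daily rain
gauge = rng.gamma(shape=0.9, scale=8.0, size=1000)   # synthetic gauge record
corrected = quantile_map(trmm, gauge)
print(np.mean(trmm), np.mean(gauge), np.mean(corrected))
```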
Voice-enabled Knowledge Engine using Flood Ontology and Natural Language Processing
NASA Astrophysics Data System (ADS)
Sermet, M. Y.; Demir, I.; Krajewski, W. F.
2015-12-01
The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts, flood-related data, information and interactive visualizations for communities in Iowa. The IFIS is designed for use by the general public, often people with no domain knowledge and a limited general science background. To improve effective communication with such an audience, we have introduced a voice-enabled knowledge engine on flood-related issues in IFIS. Instead of requiring navigation through the many features and interfaces of the information system and web-based sources, the system provides dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to real-time stream gauges, in-house data sources, and analysis and visualization tools to answer natural language questions. Our goal is the systematization of data and modeling results on flood-related issues in Iowa, and an interface for definitive answers to factual queries. The goal of the knowledge engine is to make all flood-related knowledge in Iowa easily accessible to everyone, and to support voice-enabled natural language input. We aim to integrate and curate all flood-related data, implement analytical and visualization tools, and make it possible to compute answers from questions. The IFIS explicitly implements analytical methods and models as algorithms, and curates all flood-related data and resources so that all these resources are computable. The IFIS Knowledge Engine computes the answer by deriving it from its computational knowledge base. The knowledge engine processes the statement, accesses the data warehouse, runs complex database queries on the server side and returns outputs in various formats. This presentation provides an overview of the IFIS Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources. The IFIS Knowledge Engine provides an alternative access method to the comprehensive set of tools and data resources available in IFIS. The current implementation of the system accepts free-form input and offers voice recognition capabilities within browser and mobile applications.
Inventory and mapping of flood inundation using interactive digital image analysis techniques
Rohde, Wayne G.; Nelson, Charles A.; Taranik, J.V.
1979-01-01
LANDSAT digital data and color infrared photographs were used in a multiphase sampling scheme to estimate the area of agricultural land affected by a flood. The LANDSAT data were classified with a maximum likelihood algorithm. Stratification of the LANDSAT data prior to classification greatly reduced misclassification errors. The classification results were used to prepare a map overlay showing the areal extent of flooding. These data also provided the statistics required to estimate sample size in a two-phase sampling scheme, and provided quick, accurate estimates of flooded areas for the first phase. The measurements made in the second phase, based on ground data and photo-interpretation, were used with two-phase sampling statistics to estimate the area of agricultural land affected by flooding. These results show that LANDSAT digital data can be used to prepare map overlays showing the extent of flooding on agricultural land and, with two-phase sampling procedures, can provide acreage estimates with sampling errors of about 5 percent. This procedure provides a technique for rapidly assessing the areal extent of flood conditions on agricultural land and would provide a basis for designing a sampling framework to estimate the impact of flooding on crop production.
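The two-phase estimator can be sketched with a simple ratio correction: the cheap phase-1 classification mean is scaled by the phase-2 ratio of accurate to classified values. The data below are synthetic and purely illustrative of the mechanics.

```python
import numpy as np

rng = np.random.default_rng(5)
# Phase 1 (large sample): LANDSAT-classified flooded fraction per segment.
landsat = rng.uniform(0.0, 0.6, size=400)
# Phase 2 (small subsample): ground/photo-interpreted flooded fraction,
# correlated with the classification but more accurate (synthetic here).
idx = rng.choice(400, size=40, replace=False)
ground = np.clip(landsat[idx] * 0.9 + rng.normal(0, 0.03, 40), 0, 1)

# Two-phase ratio estimator: scale the phase-1 mean by the phase-2 ratio.
ratio = ground.mean() / landsat[idx].mean()
estimate = ratio * landsat.mean()
print(f"estimated flooded fraction = {estimate:.3f}")
```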
Predicting Flood in Perlis Using Ant Colony Optimization
NASA Astrophysics Data System (ADS)
Nadia Sabri, Syaidatul; Saian, Rizauddin
2017-06-01
Flood forecasting is widely studied in order to reduce the effects of floods such as loss of property, loss of life and contamination of water supplies. Floods usually occur after continuous heavy rainfall. This study used a variant of the Ant Colony Optimization (ACO) algorithm, named Ant-Miner, to develop a classification model for flood prediction. However, since Ant-Miner only accepts discrete data, while rainfall data is a time series, a pre-processing step is needed to discretize the rainfall data first. This study used a technique called Symbolic Aggregate Approximation (SAX) to convert the rainfall time series into discrete data. In addition, the Simple K-Means algorithm was used to cluster the data produced by SAX. The findings show that the predictive accuracy of the classification model is more than 80%.
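A minimal SAX sketch: z-normalisation, piecewise aggregate approximation (PAA), then symbol assignment via equiprobable Gaussian breakpoints. The segment count and alphabet size are illustrative choices, not the study's settings.

```python
import numpy as np
from scipy.stats import norm

def sax(series, n_segments=8, alphabet="abcd"):
    """Convert a numeric series into a SAX word: z-normalize, reduce with
    PAA, then map segment means to symbols using Gaussian breakpoints."""
    z = (series - series.mean()) / series.std()
    # PAA: truncate to a multiple of n_segments, then average per segment.
    paa = z[: len(z) // n_segments * n_segments].reshape(n_segments, -1).mean(axis=1)
    # Breakpoints cutting the standard normal into len(alphabet) equal areas.
    cuts = norm.ppf(np.linspace(0, 1, len(alphabet) + 1)[1:-1])
    return "".join(alphabet[np.searchsorted(cuts, v)] for v in paa)

rainfall = np.array([0.0, 0.2, 1.5, 7.8, 12.4, 6.1, 2.0, 0.5,
                     0.0, 0.0, 3.2, 9.6, 15.0, 4.4, 1.1, 0.3])
print(sax(rainfall))   # a short discrete word ready for Ant-Miner-style rules
```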
Long-range prediction of Indian summer monsoon rainfall using data mining and statistical approaches
NASA Astrophysics Data System (ADS)
H, Vathsala; Koolagudi, Shashidhar G.
2017-10-01
This paper presents a hybrid model to better predict Indian summer monsoon rainfall. The algorithm considers suitable techniques for processing dense datasets. The proposed three-step algorithm comprises closed-itemset-generation-based association rule mining for feature selection, cluster membership for dimensionality reduction, and a simple logistic function for prediction. An application classifying rainfall into flood, excess, normal, deficit, and drought categories, based on 36 predictors consisting of land and ocean variables, is presented. Results show good accuracy over the 37-year study period considered (1969-2005).
PCM-Based Durable Write Cache for Fast Disk I/O
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zhuo; Wang, Bin; Carpenter, Patrick
2012-01-01
Flash based solid-state devices (FSSDs) have been adopted within the memory hierarchy to improve the performance of hard disk drive (HDD) based storage systems. However, with the fast development of storage-class memories, new storage technologies with better performance and higher write endurance than FSSDs are emerging, e.g., phase-change memory (PCM). Understanding how to leverage these state-of-the-art storage technologies for modern computing systems is important to solve challenging data intensive computing problems. In this paper, we propose to leverage PCM for a hybrid PCM-HDD storage architecture. We identify the limitations of traditional LRU caching algorithms for PCM-based caches, and develop a novel hash-based write caching scheme called HALO to improve the random write performance of hard disks. To address the limited durability of PCM devices and solve the degraded spatial locality in traditional wear-leveling techniques, we further propose novel PCM management algorithms that provide effective wear-leveling while maximizing access parallelism. We have evaluated this PCM-based hybrid storage architecture using applications with a diverse set of I/O access patterns. Our experimental results demonstrate that the HALO caching scheme leads to an average reduction of 36.8% in execution time compared to the LRU caching scheme, and that the SFC wear leveling extends the lifetime of PCM by a factor of 21.6.
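A toy sketch of the hash-bucketed idea (HALO's actual data structures and policies are not reproduced here, and all names below are hypothetical): dirty blocks hash to disk-region buckets so that a flush emits nearly sequential writes instead of recency-ordered ones.

```python
from collections import defaultdict

class HashWriteCache:
    """Toy write cache in the spirit of a hash-based scheme: dirty blocks
    are grouped into buckets by disk region, so a bucket flush issues
    writes that are close together on disk rather than evicting strictly
    by recency as LRU would."""

    def __init__(self, region_blocks=1024, capacity=4096):
        self.region_blocks = region_blocks
        self.capacity = capacity
        self.buckets = defaultdict(dict)   # region id -> {block: data}
        self.size = 0

    def write(self, block, data):
        region = block // self.region_blocks
        if block not in self.buckets[region]:
            self.size += 1
        self.buckets[region][block] = data
        if self.size > self.capacity:
            self.flush_largest_bucket()

    def flush_largest_bucket(self):
        # Pick the fullest region so one flush drains many nearby blocks.
        region = max(self.buckets, key=lambda r: len(self.buckets[r]))
        victims = sorted(self.buckets.pop(region))   # ascending block order
        self.size -= len(victims)
        print(f"flush region {region}: {len(victims)} nearly-sequential blocks")
```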
A satellite and model based flood inundation climatology of Australia
NASA Astrophysics Data System (ADS)
Schumann, G.; Andreadis, K.; Castillo, C. J.
2013-12-01
To date there is no coherent and consistent database on observed or simulated flood event inundation and magnitude at large scales (continental to global). The only compiled data set showing a consistent history of flood inundation area and extent at a near global scale is provided by the MODIS-based Dartmouth Flood Observatory. However, MODIS satellite imagery is only available from 2000 and is hampered by a number of issues associated with flood mapping using optical images (e.g. classification algorithms, cloud cover, vegetation). Here, we present for the first time a proof-of-concept study in which we employ a computationally efficient 2-D hydrodynamic model (LISFLOOD-FP) complemented with a sub-grid channel formulation to generate a complete flood inundation climatology of the past 40 years (1973-2012) for the entire Australian continent. The model was built completely from freely available SRTM-derived data, including channel widths, bank heights and floodplain topography, which was corrected for vegetation canopy height using a global ICESat canopy dataset. Channel hydraulics were resolved using actual channel data and bathymetry was estimated within the model using hydraulic geometry. On the floodplain, the model simulated the flow paths and inundation variables at a 1 km resolution. The developed model was run over a period of 40 years and a floodplain inundation climatology was generated and compared to satellite flood event observations. Our proof-of-concept study demonstrates that this type of model can reliably simulate past flood events with reasonable accuracies both in time and space. The Australian model was forced with both observed flow climatology and VIC-simulated flows in order to assess the feasibility of a model-based flood inundation climatology at the global scale.
Aqil, M; Kita, I; Yano, A; Nishiyama, S
2006-01-01
It is widely accepted that an efficient flood alarm system may significantly improve public safety and mitigate economic damage caused by inundations. In this paper, a modified adaptive neuro-fuzzy system is proposed to improve the traditional neuro-fuzzy model. The new method employs a rule-correction-based algorithm to replace the error back-propagation algorithm used by the traditional neuro-fuzzy method in the backward-pass calculation. The final value obtained during the backward-pass calculation using the rule-correction algorithm is then taken as the mapping function of the learning mechanism of the modified neuro-fuzzy system. The effectiveness of the proposed identification technique is demonstrated through a simulation study of the flood series of the Citarum River in Indonesia. The first four years of data (1987 to 1990) were used for model training/calibration, while the remaining data (1991 to 2002) were used for testing. The number of antecedent flows to include among the input variables was determined by two statistical methods, i.e. autocorrelation and partial autocorrelation between the variables. The performance accuracy of the model was evaluated in terms of two statistical indices, i.e. mean average percentage error and root mean square error. The algorithm was developed in a decision support system environment to enable users to process the data. The decision support system is found to be useful due to its interactive nature, flexibility in approach, and evolving graphical features, and can be adopted for any similar situation to predict streamflow. The main data processing includes gauging station selection, input generation, lead-time selection/generation, and length of prediction. The program enables users to process the flood data, to train/test the model using various input options, and to visualize results. The program code consists of a set of files which can be modified to match other purposes. The results indicate that the modified neuro-fuzzy model applied to flood prediction achieves encouraging results for the river basin under examination. The comparison of the modified neuro-fuzzy predictions with the observed data was satisfactory, with errors over the testing period varying between 2.632% and 5.560%. Thus, this program may also serve as a tool for real-time flood monitoring and process control.
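Lag selection via the autocorrelation function can be sketched as follows; the AR(1)-style synthetic series stands in for the Citarum record, and the 1.96/sqrt(n) band is the usual large-sample significance approximation.

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation for lags 1..max_lag."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(6)
# Synthetic daily flow with persistence, standing in for the observed record.
flow = np.zeros(1000)
for t in range(1, 1000):
    flow[t] = 0.8 * flow[t - 1] + rng.normal()

rho = acf(flow, max_lag=10)
conf = 1.96 / np.sqrt(flow.size)            # approximate 95% band
n_inputs = int(np.sum(rho > conf))          # lags with significant memory
print("antecedent-flow lags to include as inputs:", n_inputs)
```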
33 CFR 208.33 - Cheney Dam and Reservoir, North Fork of Ninnescah River, Kans.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Cheney Dam and Reservoir, North..., DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE FLOOD CONTROL REGULATIONS § 208.33 Cheney Dam and Reservoir... the Cheney Dam and Reservoir in the interest of flood control as follows: (a) Flood control storage in...
33 CFR 208.33 - Cheney Dam and Reservoir, North Fork of Ninnescah River, Kans.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Cheney Dam and Reservoir, North..., DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE FLOOD CONTROL REGULATIONS § 208.33 Cheney Dam and Reservoir... the Cheney Dam and Reservoir in the interest of flood control as follows: (a) Flood control storage in...
NASA Astrophysics Data System (ADS)
Yoon, S.; Lee, B.; Nakakita, E.; Lee, G.
2016-12-01
Recent climate change and abnormal weather phenomena have resulted in increased occurrences of localized torrential rainfall. Urban areas in Korea have suffered from localized heavy rainfall, including the notable Seoul flood disasters in 2010 and 2011. The urban hydrological environment has changed with respect to precipitation, with reduced concentration times, decreased storage rates, and increased peak discharge. These changes have altered and accelerated the severity of damage to urban areas. In order to prevent such urban flash flood damage, the lead time for evacuation must be secured through improvements in radar-based quantitative precipitation forecasting (QPF). The purpose of this research is to improve QPF products using a spatial-scale decomposition method that accounts for the lifetime of storms, and to compare the accuracy of the traditional QPF method and the proposed method in terms of urban flood management. The research proceeds as follows. First, image filtering is applied to separate the spatial scales of the rainfall field. Second, the separated small- and large-scale rainfall fields are extrapolated with different forecasting methods. Third, the forecasted rainfall fields are combined at each lead time. Finally, the results of this method are evaluated and compared with the results of a uniform advection model for urban flood modeling. It is expected that urban flood information based on the improved QPF will help to reduce the casualties and property damage caused by urban flooding.
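The scale-separation step can be sketched with a Gaussian low-pass filter, assuming that is an acceptable stand-in for the paper's image filtering; the rainfall field below is synthetic.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(7)
rain = rng.gamma(shape=0.5, scale=4.0, size=(128, 128))   # synthetic field

# The low-pass component captures long-lived, large-scale rainfall;
# the residual holds small-scale cells with shorter lifetimes.
large = gaussian_filter(rain, sigma=8.0)
small = rain - large
# Each component can then be extrapolated separately and recombined per
# lead time, as the abstract describes.
print(rain.mean(), large.mean(), small.mean())   # small-scale mean ~ 0
```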
NASA Astrophysics Data System (ADS)
Chen, Y. W.; Chang, L. C.
2012-04-01
Typhoons, which normally bring large amounts of precipitation, are the primary natural hazard in Taiwan during the flooding season. Because the plentiful rainfall brought by typhoons is normally stored for use in the following drought period, determining release strategies for flood operation of reservoirs must simultaneously consider not only reservoir safety and flooding damage in the plain area, but also the water resources stored in the reservoir after the typhoon. This study proposes a two-step process. First, it develops an optimal flood operation model (OFOM) for flood control planning and applies the OFOM to the Tseng-wun reservoir and its related downstream plain. Second, integrating a typhoon event database with the OFOM enables the planning model to handle real-time flood control problems; this extension is named the real-time flood operation model (RTFOM). Three conditions are considered in the proposed models, OFOM and RTFOM: the safety of the reservoir itself, the reservoir storage after typhoons, and the impact of flooding in the plain area. Besides, the flood operation guideline announced by the government is also considered in the proposed models. These conditions and the guideline are formed into an optimization problem which is solved by a genetic algorithm (GA) in this study. Furthermore, a distributed runoff model, the kinematic-wave geomorphic instantaneous unit hydrograph (KW-GIUH), and a river flow simulation model, HEC-RAS, are used to simulate river water levels in the Tseng-wun basin plain area, and the simulated level serves as an index of the flooding impact. Because the simulated levels must be re-calculated iteratively in the optimization model, applying a recursive artificial neural network (recursive ANN) instead of the HEC-RAS model significantly reduces the computational burden of the entire optimization problem. This study applies the developed methodology to Tseng-wun Reservoir. Forty typhoon events are collected as the historical database and six typhoon events are used to verify the proposed model: Typhoon Sepat and Typhoon Krosa in 2007, and Typhoon Kalmaegi, Typhoon Fung-Wong, Typhoon Sinlaku and Typhoon Jangmi in 2008. The results show that the proposed model can reduce the flood duration in the downstream area; for example, the real-time flood control model reduces the flood duration by four and three hours for Typhoon Krosa and Typhoon Sinlaku, respectively. These results indicate that the developed model can be a very useful tool for real-time flood control operation of reservoirs.
HUGO: Hierarchical mUlti-reference Genome cOmpression for aligned reads
Li, Pinghao; Jiang, Xiaoqian; Wang, Shuang; Kim, Jihoon; Xiong, Hongkai; Ohno-Machado, Lucila
2014-01-01
Background and objective: Short-read sequencing is becoming the standard of practice for the study of structural variants associated with disease. However, with the growth of sequence data largely surpassing reasonable storage capability, the biomedical community is challenged with the management, transfer, archiving, and storage of sequence data. Methods: We developed Hierarchical mUlti-reference Genome cOmpression (HUGO), a novel compression algorithm for aligned reads in the sorted Sequence Alignment/Map (SAM) format. We first aligned short reads against a reference genome and stored exactly mapped reads for compression. For the inexactly mapped or unmapped reads, we realigned them against different reference genomes using an adaptive scheme that gradually shortens the read length. Regarding the base quality values, we offer lossy and lossless compression mechanisms. The lossy compression mechanism for the base quality values uses k-means clustering, where a user can adjust the balance between decompression quality and compression rate. Lossless compression is produced by setting k (the number of clusters) to the number of distinct quality values. Results: The proposed method produced a compression ratio in the range 0.5–0.65, which corresponds to 35–50% storage savings on experimental datasets. The proposed approach achieved 15% more storage savings than CRAM and a compression ratio comparable to Samcomp (CRAM and Samcomp are two of the state-of-the-art genome compression algorithms). The software is freely available at https://sourceforge.net/projects/hierachicaldnac/ with a General Public License (GPL) license. Limitation: Our method requires different reference genomes and prolongs the execution time for the additional alignments. Conclusions: The proposed multi-reference-based compression algorithm for aligned reads outperforms existing single-reference-based algorithms. PMID:24368726
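The lossy quality-value mode can be sketched with k-means quantisation as described; the Phred values below are synthetic and k = 8 is an arbitrary illustrative setting (setting k to the number of distinct values recovers the lossless case).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(8)
# Synthetic Phred base-quality values for a batch of reads.
quals = rng.integers(2, 41, size=(500, 100))

# Lossy mode: represent every quality value by its cluster centroid.
k = 8
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(quals.reshape(-1, 1))
quantized = km.cluster_centers_[km.labels_].reshape(quals.shape).round().astype(int)
print("distinct values before/after:",
      np.unique(quals).size, np.unique(quantized).size)
```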
Why continuous simulation? The role of antecedent moisture in design flood estimation
NASA Astrophysics Data System (ADS)
Pathiraja, S.; Westra, S.; Sharma, A.
2012-06-01
Continuous simulation for design flood estimation is increasingly becoming a viable alternative to traditional event-based methods. The advantage of continuous simulation approaches is that the catchment moisture state prior to the flood-producing rainfall event is implicitly incorporated within the modeling framework, provided the model has been calibrated and validated to produce reasonable simulations. This contrasts with event-based models, in which information about the expected sequence of rainfall and evaporation preceding the flood-producing rainfall event, together with catchment storage and infiltration properties, is commonly pooled into a single set of "loss" parameters that require adjustment through calibration. To identify the importance of accounting for antecedent moisture in flood modeling, this paper uses a continuous rainfall-runoff model calibrated to 45 catchments in the Murray-Darling Basin in Australia. Flood peaks derived using the historical daily rainfall record are compared with those derived using resampled daily rainfall, for which the sequencing of wet and dry days preceding the heavy rainfall event is removed. The analysis shows a consistent underestimation of design flood events when antecedent moisture is not properly simulated, which can be as much as 30% when only 1 or 2 days of antecedent rainfall are considered, compared to 5% when this is extended to 60 days of prior rainfall. These results show that, in general, it is necessary to consider both short-term memory in rainfall associated with synoptic-scale dependence and longer-term memory associated with variability at seasonal or longer time scales in order to obtain accurate design flood estimates.
A fast method for optical simulation of flood maps of light-sharing detector modules
Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W.; Peng, Qiyu
2016-01-01
Optical simulation of the detector module level is highly desired for Position Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200–600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials. PMID:27660376
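The geometric core of such specular photon tracking is the mirror-reflection update r = d - 2(d.n)n applied at each polished surface; a minimal sketch under that simplified specular model:

```python
import numpy as np

def reflect(direction, normal):
    """Specular reflection used in fast photon tracking:
    r = d - 2 (d . n) n, with unit-length inputs."""
    return direction - 2.0 * np.dot(direction, normal) * normal

d = np.array([0.6, -0.8, 0.0])     # incoming photon direction (unit)
n = np.array([0.0, 1.0, 0.0])      # crystal face normal (unit)
print(reflect(d, n))               # -> [0.6, 0.8, 0.0]
```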
Calibration of a rainfall-runoff hydrological model and flood simulation using data assimilation
NASA Astrophysics Data System (ADS)
Piacentini, A.; Ricci, S. M.; Thual, O.; Coustau, M.; Marchandise, A.
2010-12-01
Rainfall-runoff models are crucial tools for long-term assessment of flash floods or real-time forecasting. This work focuses on the calibration of a distributed parsimonious event-based rainfall-runoff model using data assimilation. The model combines a SCS-derived runoff model and a Lag and Route routing model for each cell of a regular grid mesh. The SCS-derived runoff model is parametrized by the initial water deficit, the discharge coefficient for the soil reservoir and a lagged discharge coefficient. The Lag and Route routing model is parametrized by the travel velocity and the lag parameter. These parameters are assumed to be constant for a given catchment except for the initial water deficit and the travel velocity, which are event-dependent (land use, soil type and initial moisture conditions). In the present work, a BLUE (Best Linear Unbiased Estimator) filtering technique was used to calibrate the initial water deficit and the travel velocity for each flood event by assimilating the first available discharge measurements at the catchment outlet. The advantages of the BLUE algorithm are its low computational cost and its convenient implementation, especially in the context of the calibration of a reduced number of parameters. The assimilation algorithm was applied to two Mediterranean catchments of different size and dynamics: the Gardon d'Anduze and the Lez. The Lez catchment, with a 114 km2 drainage area, is located upstream of Montpellier. It is a karstic catchment mainly affected by floods in autumn during intense rainstorms, with short lag times and high discharge peaks (up to 480 m3.s-1 in September 2005). The Gardon d'Anduze catchment, mostly granite and schist, with a 545 km2 drainage area, lies over the departements of Lozère and Gard. It is often affected by flash and devastating floods (up to 3000 m3.s-1 in September 2002). The discharge observations at the beginning of the flood event are assimilated so that the BLUE algorithm provides optimal values of the initial water deficit and the travel velocity before the flood peak. These optimal values are used for a new simulation of the event in forecast mode (under the assumption of perfect rainfall). On both catchments, it was shown over a significant number of flood events that the data assimilation procedure improves the flood peak forecast. The improvement is globally more important for the Gardon d'Anduze catchment, where the flood events are stronger. The peak can be forecast up to 36 hours ahead of time by assimilating very few observations (up to 4) during the rise of the water level. For multiple-peak events, the assimilation of the observations from the first peak leads to a significant improvement of the second peak simulation. It was also shown that the flood rise is often faster in reality than it is represented by the model. In this case, and when the flood peak is underestimated in the simulation, the use of the first observations can be misleading for the data assimilation algorithm. The careful estimation of the observation and background error variances enabled the satisfying use of data assimilation in these complex cases, even though it does not allow the model error to be corrected.
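For readers unfamiliar with BLUE, a minimal sketch of a single analysis step is given below. It assumes a linearized observation operator H mapping the two calibrated parameters to simulated outlet discharges; all matrix values are toy numbers, not taken from the study:

    import numpy as np

    def blue_update(x_b, B, y, H, R):
        # x_b: background parameter estimate (e.g. initial water deficit,
        #      travel velocity); B: background error covariance
        # y: first available discharge observations; R: their covariance
        # H: linearized observation operator (m observations x n parameters)
        K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # Kalman-type gain
        return x_b + K @ (y - H @ x_b)                # analysis estimate

    x_b = np.array([50.0, 1.5])                  # hypothetical background
    B = np.diag([100.0, 0.25])
    H = np.array([[-0.4, 20.0], [-0.5, 25.0], [-0.6, 30.0]])
    R = np.eye(3) * 4.0
    y = np.array([12.0, 18.0, 25.0])             # early rising-limb flows
    print(blue_update(x_b, B, y, H, R))

The low cost noted in the abstract is visible here: with only two parameters, the matrix inversion is trivial and the update can be rerun as each new observation arrives.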
Magnitude of flood flows for selected annual exceedance probabilities in Rhode Island through 2010
Zarriello, Phillip J.; Ahearn, Elizabeth A.; Levin, Sara B.
2012-01-01
Heavy persistent rains from late February through March 2010 caused severe widespread flooding in Rhode Island that set or nearly set record flows and water levels at many long-term streamgages in the State. In response, the U.S. Geological Survey, in partnership with the Federal Emergency Management Agency, conducted a study to update estimates of flood magnitudes at streamgages and regional equations for estimating flood flows at ungaged locations. This report provides information needed for flood plain management, transportation infrastructure design, flood insurance studies, and other purposes that can help minimize future flood damages and risks. The magnitudes of floods were determined from the annual peak flows at 43 streamgages in Rhode Island (20 sites), Connecticut (14 sites), and Massachusetts (9 sites) using the standard Bulletin 17B log-Pearson type III method and a modification of this method called the expected moments algorithm (EMA) for 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probability (AEP) floods. Annual-peak flows were analyzed for the period of record through the 2010 water year; however, records were extended at 23 streamgages using the maintenance of variance extension (MOVE) procedure to best represent the longest period possible for determining the generalized skew and flood magnitudes. Generalized least square regression equations were developed from the flood quantiles computed at 41 streamgages (2 streamgages in Rhode Island with reported flood quantiles were not used in the regional regression because of regulation or redundancy) and their respective basin characteristics to estimate magnitude of floods at ungaged sites. Of 55 basin characteristics evaluated as potential explanatory variables, 3 were statistically significant—drainage area, stream density, and basin storage. The pseudo-coefficient of determination (pseudo-R2) indicates these three explanatory variables explain 95 to 96 percent of the variance in the flood magnitudes from 20- to 0.2-percent AEPs. Estimates of uncertainty of the at-site and regression flood magnitudes are provided and were combined with their respective estimated flood quantiles to improve estimates of flood flows at streamgages. This region has a long history of urban development, which is considered to have an important effect on flood flows. This study includes basins that have an impervious area ranging from 0.5 to 37 percent. Although imperviousness provided some explanatory power in the regression, it was not statistically significant at the 95-percent confidence level for any of the AEPs examined. Influence of urbanization on flood flows indicates a complex interaction with other characteristics that confounds a statistical explanation of its effects. Standard methods for calculating magnitude of floods for given AEP are based on the assumption of stationarity, that is, the annual peak flows exhibit no significant trend over time. A subset of 16 streamgages with 70 or more years of unregulated systematic record indicates all but 4 streamgages have a statistically significant positive trend at the 95-percent confidence level; three of these are statistically significant at about the 90-percent confidence level or above. If the trend continues linearly in time, the estimated magnitude of floods for any AEP, on average, will increase by 6, 13, and 21 percent in 10, 20, and 30 years' time, respectively. In 2010, new peaks of record were set at 18 of the 21 active streamgages in Rhode Island. 
The updated flood frequency analysis indicates the peaks at these streamgages ranged from 2- to 0.2-percent AEP. Many streamgages in the State peaked at a 0.5- and 0.2-percent AEP, except for streamgages in the Blackstone River Basin, which peaked from a 4- to 2-percent AEP.
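As an illustration of the underlying frequency analysis, the sketch below fits a log-Pearson type III distribution to annual peaks by plain sample moments and evaluates an AEP quantile with the Wilson-Hilferty frequency-factor approximation. This is a simplified stand-in for the Bulletin 17B/EMA machinery, which additionally handles low outliers, historical peaks and weighted regional skew:

    import numpy as np
    from scipy.stats import norm, skew

    def lp3_quantile(peaks, aep, g=None):
        # Fit log-Pearson III by moments of log10 peaks; g = station skew.
        logq = np.log10(np.asarray(peaks, dtype=float))
        m, s = logq.mean(), logq.std(ddof=1)
        if g is None:
            g = skew(logq, bias=False)
        z = norm.ppf(1.0 - aep)  # standard normal deviate for the AEP
        if abs(g) < 1e-6:
            k = z
        else:  # Wilson-Hilferty approximation to the Pearson III deviate
            k = (2.0 / g) * ((1.0 + g * z / 6.0 - g ** 2 / 36.0) ** 3 - 1.0)
        return 10.0 ** (m + k * s)

    # e.g. lp3_quantile(annual_peaks, aep=0.01) for the 1-percent AEP flood.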
NASA Astrophysics Data System (ADS)
Kireeva, Maria; Sazonov, Alexey; Rets, Ekaterina; Ezerova, Natalia; Frolova, Natalia; Samsonov, Timofey
2017-04-01
Identifying a river's feeding type is a complex, multifactor task. Such partitioning should be based, on the one hand, on the genesis of the feeding water and, on the other hand, on its physical pathway, while also considering the relationship of the feeding type with the corresponding phase of the water regime. Because of these difficulties, there are many variants of separating a flow hydrograph into feeding types. The most common method is extraction of the so-called base component, which in one way or another reflects groundwater feeding of the river. In that case, the separation is most often based on the principle of local minima or on graphical separation of this component; however, neither the origin of the water nor the corresponding phase of the water regime is considered. In this paper, the authors offer a method of complex automated analysis of the genetic components of a river's feeding together with the separation of specific phases of the water regime. The objects of the study are medium and large rivers of European Russia that have a pronounced spring flood, formed by meltwater, and summer-autumn and winter low-water periods periodically interrupted by rain- or thaw-induced floods. The method is based on the genetic separation of the hydrograph proposed in the 1960s by B. I. Kudelin, considered here for large rivers having a hydraulic connection with groundwater horizons during floods. For better detection of flood genesis, the analysis involves reanalysis data on temperature and precipitation. Separation is based on the following fundamental graphic-analytical principles:
• Ground feeding tends to zero during the passage of the flood peak.
• The beginning of a flood is detected when discharge exceeds a critical low-water value.
• Flood periods are determined from exceedance of the critical low-water discharge; they are attributed to thaw in case of above-zero temperatures.
• During thaw and rain floods, ground feeding is determined by interpolating values before and after the flood.
• Floods during the rise and fall of high water are determined using fitted depletion curves.
• The groundwater component of runoff is divided into dynamic and static parts.
The algorithm was formalized as a program in Fortran, with additional modules in R-Studio. The use of two languages allows, on the one hand, speeding up the processing of a large array of daily water discharges and, on the other, facilitating visualization and interpretation of results. The algorithm includes 15 calibration parameters describing the characteristics of each watershed. Verification and calibration of the program were carried out for 20 rivers of European Russia. According to the calculations, there is a significant increase in the groundwater flow component in most of the watersheds and an increase in the role of flooding as a phase of the water regime as a whole. This research was supported by the Russian Foundation for Basic Research (contract No. 16-35-60080).
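For contrast with the genetic method described above, the classic local-minima separation that the authors improve on can be written in a few lines. This is a generic sketch, not the Fortran/R implementation from the paper:

    import numpy as np

    def local_minima_baseflow(q, window=5):
        # Mark days whose discharge is the minimum of a centred window,
        # then interpolate linearly between those local minima.
        q = np.asarray(q, dtype=float)
        half = window // 2
        idx = [i for i in range(len(q))
               if q[i] == q[max(0, i - half):i + half + 1].min()]
        base = np.interp(np.arange(len(q)), idx, q[idx])
        return np.minimum(base, q)  # baseflow cannot exceed total flow

As the abstract notes, such a purely graphical rule ignores both the origin of the water and the phase of the water regime, which is precisely what the genetic algorithm adds.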
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.; Sepehrnoori, K.
1995-08-01
This research consists of the parallel development of a new chemical flooding simulator and the application of our existing UTCHEM simulation code to model surfactant flooding. The new code is based upon a completely new numerical method that combines for the first time higher-order finite-difference methods, flux limiters, and implicit algorithms. Results indicate that this approach has significant advantages in some problems and will likely enable us to simulate much larger and more realistic chemical floods once it is fully developed. Additional improvements have also been made to the UTCHEM code, and it has been applied to the study of stochastic reservoirs with and without horizontal wells to evaluate methods to reduce the cost and risk of surfactant flooding. During the second year of this contract, we have already made significant progress on both of these tasks and are ahead of schedule on both of them.
Direct trust-based security scheme for RREQ flooding attack in mobile ad hoc networks
NASA Astrophysics Data System (ADS)
Kumar, Sunil; Dutta, Kamlesh
2017-06-01
The routing algorithms in MANETs exhibit distributed and cooperative behaviour, which makes them an easy target for denial of service (DoS) attacks. The RREQ flooding attack is a flooding-type DoS attack in the context of the Ad hoc On Demand Distance Vector (AODV) routing protocol, where the attacker broadcasts a massive number of bogus Route Request (RREQ) packets to set up routes to non-existent or existent destinations in the network. This paper presents a direct trust-based security scheme to detect and mitigate the impact of the RREQ flooding attack on the network, in which every node evaluates the trust degree of its neighbours by analysing the frequency of RREQ packets they originate over a short period of time. Taking the node's trust degree as input, the proposed scheme is smoothly extended to suppress surplus and bogus RREQ flooding packets at one-hop neighbours during the route discovery process. The scheme distinguishes itself from existing techniques by not directly blocking the service of a normal node whose RREQ volume increases under unusual but legitimate conditions. The simulation results clearly show the feasibility and effectiveness of the proposed defensive scheme.
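A minimal sketch of the kind of rate-based trust bookkeeping the scheme describes is given below; the window length, RREQ limit and trust threshold are illustrative assumptions, not values from the paper:

    from collections import defaultdict

    class RreqTrustMonitor:
        def __init__(self, rreq_limit=10, window_s=1.0):
            self.rreq_limit = rreq_limit     # tolerated RREQs per window
            self.window_s = window_s
            self.counts = defaultdict(int)   # RREQs seen per neighbour

        def on_rreq(self, neighbour_id):
            self.counts[neighbour_id] += 1

        def trust_degree(self, neighbour_id):
            # 1.0 = fully trusted; decays as the RREQ rate exceeds the limit
            rate = self.counts[neighbour_id] / self.window_s
            return 1.0 if rate <= self.rreq_limit else self.rreq_limit / rate

        def should_forward(self, neighbour_id, threshold=0.5):
            # Suppress surplus RREQs from low-trust neighbours rather than
            # blocking the node outright, as the scheme intends.
            return self.trust_degree(neighbour_id) >= threshold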
Integrating Physical and Topographic Information Into a Fuzzy Scheme to Map Flooded Area by SAR
Pierdicca, Nazzareno; Chini, Marco; Pulvirenti, Luca; Macina, Flavia
2008-01-01
A flood mapping procedure based on fuzzy set theory has been developed. The method is based on the integration of Synthetic Aperture Radar (SAR) measurements with additional data on the inundated area, such as a land cover map and a digital elevation model (DEM). The information on land cover has allowed us to account for both specular reflection, typical of open water, and double bounce backscattering, typical of forested and urban areas. The DEM has been exploited to include simple hydraulic considerations on the dependence of inundation probability on surface characteristics. Contextual information has been taken into account too. The proposed algorithm has been tested on a flood that occurred in Italy in November 1994. A pair of ERS-1 images, collected before and after (three days later) the flood, has been used. The results have been compared with the data provided by a ground survey carried out when the flood reached its maximum extension. Despite the temporal mismatch between the survey and the post-inundation SAR image, the comparison has yielded encouraging results, with 87% of the pixels correctly classified as inundated.
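A toy version of the fuzzy fusion idea, combining a SAR backscatter-decrease membership with a DEM-based elevation membership (the membership shapes and the averaging operator are illustrative choices, not the exact functions of the paper):

    import numpy as np

    def flood_membership(sar_drop_db, elev_above_drain_m,
                         full_drop_db=6.0, elev_scale_m=4.0):
        # Larger backscatter decrease -> more water-like pixel
        mu_sar = np.clip(sar_drop_db / full_drop_db, 0.0, 1.0)
        # Lower-lying pixels are hydraulically more likely to flood
        mu_dem = np.clip(1.0 - elev_above_drain_m / elev_scale_m, 0.0, 1.0)
        return 0.5 * (mu_sar + mu_dem)  # classify as flooded if > 0.5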
Calibrating a Rainfall-Runoff and Routing Model for the Continental United States
NASA Astrophysics Data System (ADS)
Jankowfsky, S.; Li, S.; Assteerawatt, A.; Tillmanns, S.; Hilberts, A.
2014-12-01
Catastrophe risk models are widely used in the insurance industry to estimate the cost of risk. The models consist of hazard models linked to vulnerability and financial loss models. In flood risk models, the hazard model generates inundation maps. In order to develop countrywide inundation maps for different return periods, a rainfall-runoff and routing model is run using stochastic rainfall data. The simulated discharge and runoff are then input to a two-dimensional inundation model, which produces the flood maps. In order to get realistic flood maps, the rainfall-runoff and routing models have to be calibrated with observed discharge data. The rainfall-runoff model applied here is a semi-distributed model based on the Topmodel (Beven and Kirkby, 1979) approach, extended with snowmelt and evapotranspiration models. The routing model is based on the Muskingum-Cunge (Cunge, 1969) approach and includes the simulation of lakes and reservoirs using the linear reservoir approach. Both models were calibrated using the multiobjective NSGA-II (Deb et al., 2002) genetic algorithm with NLDAS forcing data and around 4500 USGS discharge gauges for the period 1979-2013. Additional gauges having no data after 1979 were calibrated using CPC rainfall data. The model performed well in wetter regions and showed the difficulty of simulating areas with sinks, such as karstic or dry areas. Beven, K., Kirkby, M., 1979. A physically based, variable contributing area model of basin hydrology. Hydrol. Sci. Bull. 24 (1), 43-69. Cunge, J.A., 1969. On the subject of a flood propagation computation method (Muskingum method), J. Hydr. Research, 7(2), 205-230. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T., 2002. A fast and elitist multiobjective genetic algorithm: NSGA-II, IEEE Transactions on evolutionary computation, 6(2), 182-197.
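The routing component can be illustrated with the standard Muskingum recursion; in Muskingum-Cunge the parameters K and X are derived from channel geometry rather than fixed, so the values below are placeholders:

    def muskingum_route(inflow, K=2.0, X=0.2, dt=1.0):
        # Route an inflow hydrograph through one reach.
        # K (storage constant) and dt must share the same time unit.
        denom = K * (1.0 - X) + 0.5 * dt
        c0 = (0.5 * dt - K * X) / denom
        c1 = (0.5 * dt + K * X) / denom
        c2 = (K * (1.0 - X) - 0.5 * dt) / denom
        outflow = [inflow[0]]
        for t in range(1, len(inflow)):
            outflow.append(c0 * inflow[t] + c1 * inflow[t - 1]
                           + c2 * outflow[t - 1])
        return outflow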
Front gardens to car parks: changes in garden permeability and effects on flood regulation.
Warhurst, Jennifer R; Parks, Katherine E; McCulloch, Lindsay; Hudson, Malcolm D
2014-07-01
This study addresses the consequences of widespread conversion of permeable front gardens to hard standing car parking surfaces, and the potential consequences in high-risk urban flooding hotspots, in the city of Southampton. The last two decades have seen a trend for domestic front gardens in urban areas to be converted for parking, driven by the lack of space and increased car ownership. Despite media and political attention, the effects of this change are unknown, but increased and more intense rainfall, potentially linked to climate change, could generate negative consequences as runoff from impermeable surfaces increases. Information is limited on garden permeability change, despite the consequences for ecosystem services, especially flood regulation. We focused on eight flooding hotspots identified by the local council as part of a wider urban flooding policy response. Aerial photographs from 1991, 2004 and 2011 were used to estimate changes in surface cover and to analyse permeability change within a digital surface model in a GIS environment. The 1-, 30- and 100-year required attenuation storage volumes were estimated, which are the temporary storage required to reduce the peak flow rate given surface permeability. Within our study areas, impermeable cover in domestic front gardens increased by 22.47% over the 20-year study period (1991-2011) and required attenuation storage volumes increased by 26.23% on average. These increases suggest that a consequence of the conversion of gardens to parking areas will be a potential increase in flooding frequency and severity - a situation which is likely to occur in urban locations worldwide.
Parrett, Charles; Veilleux, Andrea; Stedinger, J.R.; Barth, N.A.; Knifong, Donna L.; Ferris, J.C.
2011-01-01
Improved flood-frequency information is important throughout California in general and in the Sacramento-San Joaquin River Basin in particular, because of an extensive network of flood-control levees and the risk of catastrophic flooding. A key first step in updating flood-frequency information is determining regional skew. A Bayesian generalized least squares (GLS) regression method was used to derive a regional-skew model based on annual peak-discharge data for 158 long-term (30 or more years of record) stations throughout most of California. The desert areas in southeastern California had too few long-term stations to reliably determine regional skew for that hydrologically distinct region; therefore, the desert areas were excluded from the regional skew analysis for California. Of the 158 long-term stations used to determine regional skew, 145 have minimally regulated annual-peak discharges, and 13 stations are dam sites for which unregulated peak discharges were estimated from unregulated daily maximum discharge data furnished by the U.S. Army Corps of Engineers. Station skew was determined by using an expected moments algorithm (EMA) program for fitting the Pearson Type 3 flood-frequency distribution to the logarithms of annual peak-discharge data. The Bayesian GLS regression method previously developed was modified because of the large cross correlations among concurrent recorded peak discharges in California and the use of censored data and historical flood information with the new expected moments algorithm. In particular, to properly account for these cross-correlation problems and develop a suitable regression model and regression diagnostics, a combination of Bayesian weighted least squares and generalized least squares regression was adopted. This new methodology identified a nonlinear function relating regional skew to mean basin elevation. The regional skew values ranged from -0.62 for a mean basin elevation of zero to 0.61 for a mean basin elevation of 11,000 feet. This relation between skew and elevation reflects the interaction of snow with rain, which increases with increased elevation. The equivalent record length for the new regional skew ranges from 52 to 65 years of record, depending upon mean basin elevation. The old regional skew map in Bulletin 17B, published by the Hydrology Subcommittee of the Interagency Advisory Committee on Water Data (1982), reported an equivalent record length of only 17 years. The newly developed regional skew relation for California was used to update flood frequency for the 158 sites used in the regional skew analysis as well as 206 selected sites in the Sacramento-San Joaquin River Basin. For these sites, annual-peak discharges having recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years were determined on the basis of data through water year 2006. The expected moments algorithm was used for determining the magnitude and frequency of floods at gaged sites by using regional skew values and the basic approach outlined in Bulletin 17B.
Priority and construction sites of water storage in a watershed in response to climate change
NASA Astrophysics Data System (ADS)
Lin, Cheng-Yu; Zhang, Wen-Yan; Lin, Chao-Yuan
2014-05-01
Taiwan is located in the East Asian monsoon climate zone. Typhoons and/or convectional rains occur frequently and produce high-intensity storms in the summer season. Once detention facilities are insufficient or soil infiltration deteriorates in a watershed because of land use, surface runoff concentrates easily and threatens the protected areas. It is therefore very important to examine the water storage functionality of a watershed. The purpose of this study is to address flooding in Puzi Creek through a case study of the Yizen Bridge Watershed, in which the SCS curve number (CN) was used as an index to extract the spatial distribution of water storage strength, and the watershed mean CN along the main channel was calculated using an area-weighting method. Hotspot management sites were then derived, and a prioritization method was applied to screen depression sites as a reference for management authorities in detention pond placement. The results show that subzone A has poor topographic and soil conditions, which result in poor infiltration. However, these areas are mostly covered with forest, where artificial water storage facilities are difficult to create. Detention dams are strongly recommended at depression sites in the river channel to decrease discharge velocity and reduce the impact of flood disasters. Subzone B is mainly located on agricultural slope land. Topographic depressions in the farmland are suitable places to construct farm ponds for flood detention and sediment deposition in the rainy season and irrigation in the dry season. Subzone C mainly occupies gentle slopes with a better water storage ability due to low CN values. Farm ponds constructed in the riparian zone to bypass the nearby river channel can create multifunctional wetlands that effectively decrease the peak discharge downstream during storm events. Depression storages are sized from the additional runoff obtained from the CN calculation. The strategies discussed in this study can serve as climate change adaptation references for the relevant authorities.
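The CN bookkeeping used above is easy to make concrete. A short sketch of the area-weighted watershed CN and the standard SCS-CN runoff depth (the subarea numbers are invented for the example):

    def area_weighted_cn(areas, cns):
        # Watershed mean curve number by area weighting
        return sum(a * cn for a, cn in zip(areas, cns)) / sum(areas)

    def scs_runoff_mm(p_mm, cn):
        # SCS-CN direct runoff for rainfall p_mm, with the conventional
        # initial abstraction Ia = 0.2 S
        s = 25400.0 / cn - 254.0          # potential retention (mm)
        ia = 0.2 * s
        if p_mm <= ia:
            return 0.0
        return (p_mm - ia) ** 2 / (p_mm + 0.8 * s)

    cn = area_weighted_cn([2.0, 1.5, 0.5], [85, 70, 60])  # km2, CN pairs
    print(cn, scs_runoff_mm(120.0, cn))  # runoff for a 120 mm storm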
NASA Astrophysics Data System (ADS)
Pattison, Ian; Lane, Stuart; Hardy, Richard; Reaney, Sim
2010-05-01
The theoretical basis for why changes in land management might increase flood risk is well known, but proving it through numerical modelling still remains a challenge. In large catchments, like the River Eden in Cumbria, NW England, one of the reasons for this is that it is unfeasible to test multiple scenarios in all their possible locations. We have developed two linked approaches to refine the number of scenarios and locations using 1) spatial downscaling and 2) participatory decision making, which potentially should increase the likelihood of finding a link between land use and downstream flooding. Firstly, land management practices can have both flood reducing and flood increasing effects, depending on their location. As a result some areas of the catchment are more important in determining downstream flood risk than others, depending on the land use and hydrological connectivity. We apply a downscaling approach to identify which sub-catchments are most important in explaining downstream flooding. This is important because it is in these areas that management options are most likely to have a positive and detectable effect. Secondly, once the dominant sub-catchment has been identified, the land management scenarios that are both feasible and likely to impact flood risk need to be determined. This was done through active stakeholder engagement. The stakeholder group undertook a brainstorming exercise, which suggested about 30 different rural land management scenarios, which were mapped on to a literature-based conceptual framework of hydrological processes. These options were then evaluated based on five criteria: relevance to the catchment, scientific effectiveness, testability, robustness/uncertainty and feasibility of implementation. The suitability of each scenario was discussed and prioritised by the stakeholder group based on scientific needs and expectations and local suitability and feasibility. The next stage of the participatory approach was a mapping workshop, whereby a map of the catchment was laid out and locations where each scenario could feasibly be implemented were drawn on. This was combined with an analysis of historical maps to identify past land covers and a catchment walkover survey to put the modelling work in a real-world context. The land management scenarios were tested using hydrological and hydraulic models. Landscape-scale changes, such as the effects of compaction and afforestation, were tested using a catchment-scale hydrological model, CRUM2D. Channel-scale changes, such as re-meandering and floodplain storage, were tested using the 1D hydraulic model, iSIS, by altering channel cross sections and creating spills between the channel and floodplain. It is expected that the channel modification and floodplain storage scenarios will have the greatest impact on flooding both at the local and catchment scales. The landscape-scale changes are more diffuse and therefore their impact is expected to be less significant. However, early analysis indicates that the spatial location of changes strongly influences their effect on flooding.
Huang, S.; Young, Caitlin; Feng, M.; Heidemann, Hans Karl; Cushing, Matthew; Mushet, D.M.; Liu, S.
2011-01-01
Recent flood events in the Prairie Pothole Region of North America have stimulated interest in modeling water storage capacities of wetlands and their surrounding catchments to facilitate flood mitigation efforts. Accurate estimates of basin storage capacities have been hampered by a lack of high-resolution elevation data. In this paper, we developed a 0.5 m bare-earth model from Light Detection And Ranging (LiDAR) data and, in combination with National Wetlands Inventory data, delineated wetland catchments and their spilling points within a 196 km2 study area. We then calculated the maximum water storage capacity of individual basins and modeled the connectivity among these basins. When compared to field survey results, catchment and spilling point delineations from the LiDAR bare-earth model captured subtle landscape features very well. Of the 11 modeled spilling points, 10 matched field survey spilling points. The comparison between observed and modeled maximum water storage had an R2 of 0.87 with mean absolute error of 5564 m3. Since maximum water storage capacity of basins does not translate into floodwater regulation capability, we further developed a Basin Floodwater Regulation Index. Based upon this index, the absolute and relative water that could be held by wetlands over a landscape could be modeled. This conceptual model of floodwater downstream contribution was demonstrated with water level data from 17 May 2008.
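The storage computation itself reduces to summing water depths below the spilling point over the catchment's cells; a minimal sketch under that reading (cell size matching the 0.5 m bare-earth grid):

    import numpy as np

    def basin_storage_m3(dem_cells_m, spill_elev_m, cell_area_m2=0.25):
        # Volume that a closed basin can hold below its spilling point
        z = np.asarray(dem_cells_m, dtype=float)
        depth = np.clip(spill_elev_m - z, 0.0, None)  # water depth per cell
        return float(depth.sum() * cell_area_m2)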
NASA Astrophysics Data System (ADS)
Wang, Wenrui; Wu, Yaohua; Wu, Yingying
2016-05-01
E-commerce, as an emerging marketing mode, has attracted more and more attention and gradually changed our way of life. However, the existing layout of distribution centers cannot sufficiently fulfill the storage and picking demands of e-commerce. In this paper, a modified miniload automated storage/retrieval system is designed to fit these new characteristics of e-commerce logistics. Meanwhile, a matching problem concerning the improvement of picking efficiency in the new system is studied. The problem is how to reduce the travelling distance of totes between aisles and picking stations. A multi-stage heuristic algorithm is proposed based on a formal statement and model of this problem. The main idea of this algorithm is to minimize, using heuristic strategies based on similarity coefficients, the transport of items that cannot reach their destination picking stations through direct conveyors alone. Experimental results on computer-generated cases show that the average reduction in indirect transport times can reach 14.36% with the multi-stage heuristic algorithm. For cases from a real e-commerce distribution center, the order processing time can be reduced from 11.20 h to 10.06 h with the help of the modified system and the proposed algorithm. In summary, this research proposes a modified system and a multi-stage heuristic algorithm that reduce the travelling distance of totes effectively and improve the overall performance of an e-commerce distribution center.
NASA Astrophysics Data System (ADS)
Wahyudi, Slamet Imam; Adi, Henny Pratiwi; Santoso, Esti; Heikoop, Rick
2017-03-01
Settlement in the Jati District, Kudus Regency, Central Java Province, Indonesia, is growing rapidly. Former paddy fields are turning into new residential, industrial and office areas. Rainwater collects in the small Kencing River, which flows into the larger Wulan River. Under current conditions, during high-intensity rainfall the water elevation of the Wulan River rises above that of the Kencing River, so water cannot drain by gravity and the area is inundated. Reducing the flooding requires a polder drainage system that provides a long channel as water storage and pumps water into the Wulan River. The question is how to obtain optimal values for the water storage volume, the drainage channels and the pump capacity, so that operation and maintenance of the polder system are efficient. The purpose of this study is to develop scenarios for water storage volume and water gate operation and to obtain optimal values for the pumps removing water from the Kencing River to the Wulan River. The research proceeds in several steps. First, a detailed field orientation is carried out, followed by the collection of secondary data, including maps and rainfall records. The maps are processed into the watershed or catchment area, while the rainfall data are processed into runoff discharge. The team then collects primary data by measuring topography to determine the surface area and volume of water storage. The analysis determines the flood discharge, the channel hydraulics, the water storage volume and the corresponding pump capacity. By simulating the storage volume and pump capacity under several scenarios, optimal values can be determined. The results serve as a guideline for the construction, operation and maintenance of the polder drainage system.
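A toy mass balance conveys the storage/pump trade-off being optimized; the thresholds, areas and capacities below are illustrative, not values from the study:

    def simulate_polder(inflows_m3s, storage_area_m2, pump_m3s,
                        pump_on_m=0.8, pump_off_m=0.3, dt_s=3600.0):
        # Long-channel storage filled by runoff, emptied by a pump with
        # simple on/off hysteresis; returns the water level per step.
        level, pumping, levels = 0.0, False, []
        for q_in in inflows_m3s:
            if level >= pump_on_m:
                pumping = True
            elif level <= pump_off_m:
                pumping = False
            q_out = pump_m3s if pumping else 0.0
            level = max(level + (q_in - q_out) * dt_s / storage_area_m2, 0.0)
            levels.append(level)
        return levels

Sweeping storage_area_m2 and pump_m3s over candidate scenarios and checking the peak level against the channel freeboard is one simple way to locate the optimum the abstract refers to.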
An Effective Cache Algorithm for Heterogeneous Storage Systems
Li, Yong; Feng, Dan
2013-01-01
Modern storage environments are commonly composed of heterogeneous storage devices. However, traditional cache algorithms exhibit performance degradation in heterogeneous storage systems because they were not designed to work with diverse performance characteristics. In this paper, we present a new cache algorithm called HCM for heterogeneous storage systems. The HCM algorithm partitions the cache among the disks and adopts an effective scheme to balance the work across the disks. Furthermore, it applies benefit-cost analysis to choose the best allocation of cache blocks to improve performance. Conducting simulations with a variety of traces and a wide range of cache sizes, our experiments show that HCM significantly outperforms the existing state-of-the-art storage-aware cache algorithms.
NASA Astrophysics Data System (ADS)
Zamora-Reyes, D.; Hirschboeck, K. K.; Paretti, N. V.
2012-12-01
Bulletin 17B (B17B) has prevailed for 30 years as the standard manual for determining flood frequency in the United States. Recently proposed updates to B17B include revising the issue of flood heterogeneity and improving flood estimates by using the Expected Moments Algorithm (EMA), which can better address low outliers and accommodate information on historical peaks. Incorporating information on mixed populations, such as flood-causing mechanisms, into flood estimates for regions that have noticeable flood heterogeneity can be statistically challenging when systematic flood records are short. The problem magnifies when the population sample size is reduced by decomposing the record, especially if multiple flood mechanisms are involved. In B17B, the guidelines for dealing with mixed populations focus primarily on how to rule out any need to perform a mixed-population analysis. However, in some regions mixed flood populations are critically important determinants of regional flood frequency variations and should be explored from this perspective. Arizona is an area with a heterogeneous mixture of flood processes due to: warm season convective thunderstorms, cool season synoptic-scale storms, and tropical cyclone-enhanced convective activity occurring in the late summer or early fall. USGS station data throughout Arizona were compiled into a database, and each flood peak (annual and partial duration series) was classified according to its meteorological cause. Using these data, we have explored the role of flood heterogeneity in Arizona flood estimates through composite flood frequency analysis based on mixed flood populations using EMA. First, for selected stations, the three flood-causing populations were separated out from the systematic annual flood series record and analyzed individually. Second, to create composite probability curves, the individual curves for each of the three populations were generated and combined using Crippen's (1978) composite probability equations for sites that have two or more independent flood populations. Finally, the individual probability curves generated for each of the three flood-causing populations were compared with both the site's composite probability curve and the standard B17B curve to explore the influence of heterogeneity, using the 100-year and 200-year flood estimates as a basis of comparison. Results showed that sites located in southern Arizona and along the abrupt elevation transition zone of the Mogollon Rim exhibit a better fit to the systematic data using their composite probability curves than the curves derived from standard B17B analysis. Synoptic storm floods and tropical cyclone-enhanced floods had the greatest influence on 100-year and 200-year flood estimates. This was especially true in southern Arizona, even though summer convective floods are much more frequent and therefore dominate the composite curve. Using the EMA approach also influenced our results because all possible low outliers were censored by the built-in Multiple Grubbs-Beck Test, providing a better fit to the systematic data in the upper probabilities. In conclusion, flood heterogeneity can play an important role in regional flood frequency variations in Arizona, and understanding its influence is important when making projections about future flood variations.
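The compositing step follows directly from independence of the populations: the composite AEP of a discharge is the probability that at least one mechanism exceeds it in a given year. A minimal sketch in the spirit of Crippen's (1978) equations:

    def composite_aep(aeps):
        # AEP from independent flood populations:
        # P = 1 - product(1 - P_i)
        p_none = 1.0
        for p in aeps:
            p_none *= (1.0 - p)
        return 1.0 - p_none

    # e.g. a discharge with 2% AEP from summer convective floods, 1% from
    # synoptic storms and 0.5% from tropical cyclone-enhanced floods:
    print(composite_aep([0.02, 0.01, 0.005]))  # about 0.035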
Khosravi, Khabat; Pham, Binh Thai; Chapi, Kamran; Shirzadi, Ataollah; Shahabi, Himan; Revhaug, Inge; Prakash, Indra; Tien Bui, Dieu
2018-06-15
Floods are one of the most damaging natural hazards, causing huge losses of property, infrastructure and lives. Predicting the locations of flash floods is very difficult due to sudden changes in climatic conditions and man-made factors. However, prior identification of flood-susceptible areas can be done with the help of machine learning techniques for proper, timely management of flood hazards. In this study, we tested four decision-tree-based machine learning models, namely Logistic Model Trees (LMT), Reduced Error Pruning Trees (REPT), Naïve Bayes Trees (NBT), and Alternating Decision Trees (ADT), for flash flood susceptibility mapping at the Haraz Watershed in the northern part of Iran. For this, a spatial database was constructed with 201 present and past flood locations and eleven flood-influencing factors, namely ground slope, altitude, curvature, Stream Power Index (SPI), Topographic Wetness Index (TWI), land use, rainfall, river density, distance from river, lithology, and Normalized Difference Vegetation Index (NDVI). Statistical evaluation measures, the Receiver Operating Characteristic (ROC) curve, and Friedman and Wilcoxon signed-rank tests were used to validate and compare the prediction capability of the models. Results show that the ADT model has the highest prediction capability for flash flood susceptibility assessment, followed by the NBT, the LMT, and the REPT, respectively. These techniques have proven successful in quickly determining flood-susceptible areas.
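A rough scikit-learn analogue of this workflow is sketched below. Since sklearn has no Alternating Decision Tree, AdaBoost over decision stumps stands in for it, and random placeholder data replaces the real 201-location database:

    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(402, 11))   # 11 conditioning factors per location
    y = (X[:, 0] + X[:, 3] + rng.normal(scale=0.5, size=402) > 0).astype(int)

    model = AdaBoostClassifier(n_estimators=100, random_state=0)
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print("ROC AUC: %.3f +/- %.3f" % (auc.mean(), auc.std()))

With real inputs, X holds the eleven factors (slope, altitude, curvature, SPI, TWI, ...) per mapped location and y flags flood versus non-flood points; the fitted model is then applied across the watershed to draw the susceptibility map.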
NASA Astrophysics Data System (ADS)
Prosdocimi, Massimo; Sofia, Giulia; Dalla Fontana, Giancarlo; Tarolli, Paolo
2013-04-01
In a high-density populated country such as Italy, anthropic pressure plays a fundamental role in the alteration and modification of the landscape. Among the most evident anthropic alterations are the urbanization processes that have been occurring since the end of the Second World War. Agricultural activities, housing and other land uses have shifted due to the progressive spreading of urban areas. These modifications affect the hydrologic regime, but municipalities often are not aware of the real impact of land cover changes on such processes and, consequently, an increase of the elements at risk of flooding is generally registered. The main objective of this work is to evaluate the impact of land cover changes in the Veneto region (north-east Italy), from 1954 to 2006, on the minor drainage network system and on its capacity to attenuate direct runoff. The major flood event occurred between October and November 2010. The study area is a typical agrarian landscape and was chosen considering its involvement in the major flood of 2010 and the availability of high-resolution topographic data (LiDAR-derived DTMs) and historical aerial photographs. Aerial photographs dating back to 1954 and 1981, in particular, have been used both to classify the land cover into five categories according to the first level of the CORINE land cover classification and to identify the minor drainage network. A semi-automatic approach based on the high-resolution DTM (Cazorzi et al., 2012) was also used to identify the minor drainage network and estimate its water storage capacity. The results underline how land cover variation over the last 50 years has strongly increased the propensity of the soil to produce direct runoff (increase of the Curve Number value) and has also reduced the extent of the minor network system. As a consequence, the capacity of the agrarian minor network to attenuate and laminate a flood event has decreased as well. These analyses can be considered useful tools for suitable land use planning in flood-prone areas. References: Cazorzi, F., Dalla Fontana, G., De Luca, A., Sofia, G., Tarolli, P. (2012). Drainage network detection and assessment of network storage capacity in agrarian landscape, Hydrological Processes, ISSN: 0885-6087, doi:10.1002/hyp.9224
Magnitude of flood flows for selected annual exceedance probabilities for streams in Massachusetts
Zarriello, Phillip J.
2017-05-11
The U.S. Geological Survey, in cooperation with the Massachusetts Department of Transportation, determined the magnitude of flood flows at selected annual exceedance probabilities (AEPs) at streamgages in Massachusetts and from these data developed equations for estimating flood flows at ungaged locations in the State. Flood magnitudes were determined for the 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent AEPs at 220 streamgages, 125 of which are in Massachusetts and 95 are in the adjacent States of Connecticut, New Hampshire, New York, Rhode Island, and Vermont. AEP flood flows were computed for streamgages using the expected moments algorithm weighted with a recently computed regional skewness coefficient for New England. Regional regression equations were developed to estimate the magnitude of floods for selected AEP flows at ungaged sites from 199 selected streamgages and for 60 potential explanatory basin characteristics. AEP flows for 21 of the 125 streamgages in Massachusetts were not used in the final regional regression analysis, primarily because of regulation or redundancy. The final regression equations used generalized least squares methods to account for streamgage record length and correlation. Drainage area, mean basin elevation, and basin storage explained 86 to 93 percent of the variance in flood magnitude from the 50- to 0.2-percent AEPs, respectively. The estimates of AEP flows at streamgages can be improved by using a weighted estimate that is based on the magnitude of the flood and associated uncertainty from the at-site analysis and the regional regression equations. Weighting procedures for estimating AEP flows at an ungaged site on a gaged stream also are provided that improve estimates of flood flows at the ungaged site when hydrologic characteristics do not abruptly change. Urbanization expressed as the percentage of imperviousness provided some explanatory power in the regional regression; however, it was not statistically significant at the 95-percent confidence level for any of the AEPs examined. The effect of urbanization on flood flows indicates a complex interaction with other basin characteristics. Another complicating factor is the assumption of stationarity, that is, the assumption that annual peak flows exhibit no significant trend over time. The results of the analysis show that stationarity does not prevail at all of the streamgages. About 27 percent of streamgages in Massachusetts and about 42 percent of streamgages in adjacent States with 20 or more years of systematic record used in the study show a significant positive trend at the 95-percent confidence level. The remaining streamgages had both positive and negative trends, but the trends were not statistically significant. Trends were shown to vary over time. In particular, during the past decade (2004–2013), peak flows were persistently above normal, which may give the impression of positive trends. Only continued monitoring will provide the information needed to determine whether recent increases in annual peak flows are a normal oscillation or a true trend. The analysis used 37 years of additional data obtained since the last comprehensive study of flood flows in Massachusetts. In addition, new methods for computing flood flows at streamgages and regionalization improved estimates of flood magnitudes at gaged and ungaged locations and better defined the uncertainty of the estimates of AEP floods.
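The weighting of at-site and regression estimates is conventionally done in log space with inverse-variance weights; a minimal sketch (the numbers in the example are invented):

    import math

    def weighted_aep_flow(q_site, var_site, q_reg, var_reg):
        # Inverse-variance weighting of log10 flood quantiles
        x_site, x_reg = math.log10(q_site), math.log10(q_reg)
        w_site, w_reg = 1.0 / var_site, 1.0 / var_reg
        return 10.0 ** ((w_site * x_site + w_reg * x_reg) / (w_site + w_reg))

    # at-site 1-percent AEP flow of 120 m3/s (log-variance 0.010) combined
    # with a regression estimate of 150 m3/s (log-variance 0.025):
    print(weighted_aep_flow(120.0, 0.010, 150.0, 0.025))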
Partial storage optimization and load control strategy of cloud data centers.
Al Nuaimi, Klaithem; Mohamed, Nader; Al Nuaimi, Mariam; Al-Jaroodi, Jameela
2015-01-01
We present a novel approach to solve cloud storage issues and provide a fast load balancing algorithm. Our approach is based on partitioning and concurrent dual-direction download of the files from multiple cloud nodes. Partitions of the files are saved on the cloud rather than the full files, which provides a good optimization of cloud storage usage. Only partial replication is used in this algorithm to ensure the reliability and availability of the data. Our focus is to improve performance and optimize storage usage by providing Data as a Service (DaaS) on the cloud. This algorithm solves the problem of having to fully replicate large data sets, which uses up a lot of precious space on the cloud nodes. Reducing the space needed will help reduce the cost of providing such space. Moreover, performance is also increased since multiple cloud servers collaborate to provide the data to the cloud clients in a faster manner.
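The dual-direction idea can be sketched as a range planner: one replica serves byte ranges walking forward from the start of the file while another walks backward from the end, until the two cursors meet. Node names and chunk size below are illustrative:

    def dual_direction_ranges(file_size, chunk):
        # Alternate forward (node_a) and backward (node_b) byte ranges
        front, back, ranges, forward = 0, file_size, [], True
        while front < back:
            if forward:
                end = min(front + chunk, back)
                ranges.append(("node_a", front, end))
                front = end
            else:
                start = max(back - chunk, front)
                ranges.append(("node_b", start, back))
                back = start
            forward = not forward
        return ranges

    for node, start, end in dual_direction_ranges(1 << 20, 256 * 1024):
        print(node, start, end)  # a 1 MiB file split across two replicas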
A probabilistic approach to modeling postfire erosion after the 2009 Australian bushfires
P. R. Robichaud; W. J. Elliot; F. B. Pierson; D. E. Hall; C. A. Moffet
2009-01-01
Major concerns after bushfires and wildfires include increased flooding, erosion and debris flows due to loss of the protective forest floor layer, loss of water storage, and creation of water repellent soil conditions. To assist postfire assessment teams in their efforts to evaluate fire effects and make postfire treatment decisions, a web-based Erosion Risk...
A Hydrological Modeling Framework for Flood Risk Assessment for Japan
NASA Astrophysics Data System (ADS)
Ashouri, H.; Chinnayakanahalli, K.; Chowdhary, H.; Sen Gupta, A.
2016-12-01
Flooding has been the most frequent natural disaster, claiming lives and imposing significant economic losses on human societies worldwide. Japan, with annual rainfall of up to approximately 4000 mm, is extremely vulnerable to flooding. The focus of this research is to develop a macroscale hydrologic model for simulating flooding toward an improved understanding and assessment of flood risk across Japan. The framework employs a conceptual hydrological model, known as the Probability Distributed Model (PDM), as well as the Muskingum-Cunge flood routing procedure for simulating streamflow. In addition, a temperature-index model is incorporated to account for snowmelt and its contribution to streamflow. For an efficient calibration of the model, in terms of computational time and convergence of the parameters, a set of a priori parameters is obtained based on the relationships between the model parameters and the physical properties of watersheds. In this regard, we have implemented a particle tracking algorithm and a statistical model that use high-resolution Digital Terrain Models to estimate time-related parameters of the model, such as the time to peak of the unit hydrograph. In addition, global soil moisture and depth data are used to generate an a priori estimate of the maximum soil moisture capacity, an important parameter of the PDM. Once calibrated, the model's performance is examined during Typhoon Nabi, which struck Japan in September 2005 and caused severe flooding throughout the country. The model is also validated for the extreme precipitation event in 2012 that affected Kyushu. In both cases, quantitative measures show that simulated streamflow is in good agreement with gauge-based observations. The model is employed to simulate thousands of possible flood events for the whole of Japan, which forms the basis for a comprehensive flood risk assessment and loss estimation for the flood insurance industry.
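The snowmelt component can be illustrated with a classic degree-day scheme, which matches the temperature-index family named in the abstract; the degree-day factor and threshold are typical textbook values, not the calibrated ones:

    def temperature_index_melt(temps_c, precip_mm, ddf=3.0, t0=0.0):
        # Precipitation accumulates as snow when T <= t0; otherwise melt
        # is ddf * (T - t0) mm/day, capped by the available snowpack.
        swe, liquid = 0.0, []
        for t, p in zip(temps_c, precip_mm):
            if t <= t0:
                swe += p
                liquid.append(0.0)      # no liquid input on snow days
            else:
                melt = min(ddf * (t - t0), swe)
                swe -= melt
                liquid.append(p + melt)  # rain plus melt feeds the PDM
        return liquid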
NASA Astrophysics Data System (ADS)
Cenci, Luca; Boni, Giorgio; Pulvirenti, Luca; Gabellani, Simone; Gardella, Fabio; Squicciarino, Giuseppe; Pierdicca, Nazzareno; Benedetto, Catia
2016-04-01
In a reservoir, water level monitoring is important for emergency management purposes. This information can be used to estimate the degree of filling of the water body, thus helping decision makers in flood control operations. Furthermore, if assimilated into hydrological models and coupled with rainfall forecasts, this information can be used for flood forecasting and early warning. In many cases, the water level is not known (e.g. in data-scarce environments) or not shared by operators. Remote sensing may allow overcoming these limitations, enabling its estimation. The objective of this work is to present the Shoreline to Height (S2H) algorithm, developed to retrieve the height of the water stored in reservoirs from satellite images. To this aim, some auxiliary data are needed: a DEM and the maximum/minimum height that can be reached by the water. In data-scarce environments, this information can be easily obtained on the Internet (e.g. free, worldwide DEMs and design data for artificial reservoirs). S2H was tested with different satellite data, both optical and SAR (Landsat and COSMO-SkyMed-CSK®), in order to assess the impact of different sensors on the final estimates. The study area was the Place-Moulin Lake (Valle d'Aosta-VdA, Italy), where a monitoring network is present that can provide reliable ground truths for validating the algorithm and assessing its accuracy. When the algorithm was developed, the absence of any "official" auxiliary data was assumed. Therefore, two DEMs (SRTM 1 arc-second and ASTER GDEM) were used to evaluate their performances. The maximum/minimum water height values were found on the website of the VdA Region. S2H is based on three steps: i) satellite data preprocessing (Landsat: atmospheric correction; CSK®: geocoding and speckle filtering); ii) water mask generation (using a thresholding and region growing algorithm) and shoreline extraction; iii) retrieval of the shoreline height according to the reference DEMs (adopting a statistical approach). The algorithm was tested for different water heights and the results were compared against ground truths. Findings showed that the combination CSK®-SRTM provided more reliable results. It was also found that the overall quality of the estimates increases as the water height increases, reaching an accuracy of up to a few centimetres. This result is particularly interesting for flood control applications, where it is important to be accurate when the reservoir's degree of filling is high. The potential of S2H for operational hydrology purposes was tested in a real-case simulation, in which the prediction of river discharge downstream of the dam was needed for flood risk management purposes. The water height value retrieved with S2H was assimilated into a semi-distributed, event-based hydrological model (DRiFt) by using a simple direct insertion algorithm. DRiFt is usually run operationally on the reservoir by using ground truths as input data. The result of the data assimilation experiment was compared with the "real", operative run of the model. Findings showed a high agreement between the two simulations, proving the utility and quality of the S2H algorithm. "Project carried out using CSK® Products, © of the Italian Space Agency (ASI), delivered under a license to use by ASI."
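The heart of step iii) is reading DEM elevations along the extracted shoreline and reducing them to a single height. A minimal sketch of that reduction (the 4-neighbour shoreline test and the median are plausible choices, not necessarily the paper's exact statistical approach):

    import numpy as np

    def shoreline_to_height(water_mask, dem):
        # Shoreline = water pixels with at least one land 4-neighbour
        m = water_mask.astype(bool)
        shore = np.zeros_like(m)
        shore[1:-1, 1:-1] = m[1:-1, 1:-1] & (
            ~m[:-2, 1:-1] | ~m[2:, 1:-1] | ~m[1:-1, :-2] | ~m[1:-1, 2:])
        return float(np.median(dem[shore]))  # median damps DEM outliers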
Cooperative Optimal Coordination for Distributed Energy Resources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Tao; Wu, Di; Ren, Wei
In this paper, we consider the optimal coordination problem for distributed energy resources (DERs) including distributed generators and energy storage devices. We propose an algorithm based on the push-sum and gradient method to optimally coordinate storage devices and distributed generators in a distributed manner. In the proposed algorithm, each DER only maintains a set of variables and updates them through information exchange with a few neighbors over a time-varying directed communication network. We show that the proposed distributed algorithm solves the optimal DER coordination problem if the time-varying directed communication network is uniformly jointly strongly connected, which is a mild condition on the connectivity of communication topologies. The proposed distributed algorithm is illustrated and validated by numerical simulations.
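A minimal sketch of the push-sum building block on a directed ring (the consensus part only; the gradient correction of the full algorithm is omitted):

    import numpy as np

    def push_sum_average(values, out_neighbours, iters=200):
        # Each node splits its (x, w) mass equally between itself and its
        # out-neighbours; the ratio x/w converges to the global average
        # on a (uniformly jointly) strongly connected digraph.
        n = len(values)
        x, w = np.array(values, dtype=float), np.ones(n)
        for _ in range(iters):
            x_new, w_new = np.zeros(n), np.zeros(n)
            for i in range(n):
                targets = [i] + list(out_neighbours[i])
                for j in targets:
                    x_new[j] += x[i] / len(targets)
                    w_new[j] += w[i] / len(targets)
            x, w = x_new, w_new
        return x / w

    # Four DERs on a directed ring agree on the average of their values:
    print(push_sum_average([1.0, 2.0, 3.0, 6.0],
                           {0: [1], 1: [2], 2: [3], 3: [0]}))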
NASA Astrophysics Data System (ADS)
Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.
2017-12-01
Floods are among the most hazardous disasters and cause serious damage to people and property around the world. To prevent and mitigate flood damage through early warning systems and river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation over complex topography at high resolution. With increasing demands on flood-inundation modelling, computational burden is now one of the key issues. Improvements to the computational efficiency of full shallow water equations have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet-dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell is flooded, register its surrounding cells. The time for this additional process is kept small by checking only cells at the wet-dry interface, and computation time is reduced by skipping the non-flooded area. This algorithm is easily applied to any type of 2-D flood inundation model. The proposed ADU method is implemented with 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes in a two to ten times shorter time while giving the same result as that without the ADU method.
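The registration rule can be captured in a few lines; this is a generic sketch of the bookkeeping, independent of the underlying 2-D solver:

    def update_active_domain(active, wet, neighbours):
        # ADU step: whenever a registered cell becomes wet, register its
        # neighbours, so the solver only loops over `active` cells.
        newly_registered = set()
        for cell in list(active):
            if cell in wet:
                for nb in neighbours(cell):
                    if nb not in active:
                        active.add(nb)
                        newly_registered.add(nb)
        return newly_registered

    # `active` starts as the cells flagged as potentially flooded at the
    # initial stage (e.g. floodplain cells along the channel); `wet` is
    # refreshed by the flow solver every time step.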
Instances selection algorithm by ensemble margin
NASA Astrophysics Data System (ADS)
Saidi, Meryem; Bechar, Mohammed El Amine; Settouti, Nesma; Chikh, Mohamed Amine
2018-05-01
The main limit of data mining algorithms is their inability to deal with the huge amount of available data in a reasonable processing time. One way to produce fast and accurate results is instance and feature selection. This process eliminates noisy or redundant data in order to reduce storage and computational cost without performance degradation. In this paper, a new instance selection approach called the Ensemble Margin Instance Selection (EMIS) algorithm is proposed. This approach is based on the ensemble margin. To evaluate our approach, we have conducted several experiments on different real-world classification problems from the UCI Machine Learning repository. Pixel-based image segmentation is a field where the storage requirements and computational cost of the applied model become high. To address these limitations, we conduct a study based on the application of EMIS and other instance selection techniques for the segmentation and automatic recognition of white blood cells (WBC; nucleus and cytoplasm) in cytological images.
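The ensemble margin driving the selection can be computed directly from per-class vote counts; a minimal sketch (the selection rule applied on top of these margins is the part specific to EMIS):

    import numpy as np

    def ensemble_margins(vote_counts, true_labels):
        # margin = (votes for true class - best other class) / total votes;
        # small or negative margins flag borderline or noisy instances.
        margins = []
        for votes, y in zip(vote_counts, true_labels):
            votes = np.asarray(votes, dtype=float)
            v_other = np.delete(votes, y).max()
            margins.append((votes[y] - v_other) / votes.sum())
        return np.array(margins)

    # 3 instances, 10 trees, 3 classes (rows = per-class vote counts):
    print(ensemble_margins([[7, 2, 1], [4, 5, 1], [3, 3, 4]], [0, 0, 2]))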
NASA Astrophysics Data System (ADS)
Pearson, Callum; Reaney, Sim; Bracken, Louise; Butler, Lucy
2015-04-01
Flood risk is a growing problem throughout the United Kingdom, and a significant proportion of the population is at risk. Across England and Wales, over 5 million people are believed to be at risk from fluvial, pluvial or coastal flooding (DEFRA, 2013). Increasingly, communities that have not dealt with flooding before have recently experienced significant flood events. The communities of Stockdalewath and Highbridge in the Roe catchment, a tributary of the River Eden in Cumbria, UK, are an excellent example. The River Roe has a normal flow of less than 5 m3 s-1 97 percent of the time; however, there have been two flash floods, of 98.8 m3 s-1 in January 2005 and 86.9 m3 s-1 in May 2013. These two flash flood events inundated numerous properties within the catchment, with the 2013 event prompting the creation of the Roe Catchment Community Water Management Group, which aims to deliver a sustainable approach to managing the flood risk. Because of its distributed rural population, the community fails the cost-benefit analysis for a centrally funded flood risk mitigation scheme. The at-risk community within the Roe catchment therefore has to look for cost-effective, sustainable techniques and interventions to reduce the potential negative impacts of future events; this has resulted in a focus on natural flood risk management. This research investigates the potential to reduce flood risk through natural catchment-based land management techniques and interventions within the Roe catchment, providing a scientific base from which further action can be taken. These interventions include changes to land management and land use, such as soil aeration and targeted afforestation, the creation of runoff attenuation features, and the construction of in-channel features such as debris dams. Natural flood management (NFM) has proven effective at reducing flood risk in smaller catchments, and the potential to transfer these benefits to the Roe catchment (~69 km2) has been assessed. Furthermore, these flood mitigation features have the potential to deliver wider environmental improvements throughout the catchment, and hence the potential for multiple benefits, such as diffuse pollution reduction and habitat creation, is considered. The research explores the impact of NFM techniques, such as flood storage areas or afforestation, with a view to enhancing local-scale habitats. The research combines innovative catchment modelling techniques, both risk-based approaches (SCIMAP Flood) and spatially distributed hydrological simulation modelling (CRUM3), with in-field monitoring and observation of flow pathways and tributary response to rainfall using time-lapse cameras. Additional work with the local community and stakeholders will identify the range and location of potential catchment-based land management techniques and interventions to be assessed; natural flood management implementation requires the participation and cooperation of landowners and the local community to be successful (Howgate and Kenyon, 2009).
NASA Astrophysics Data System (ADS)
Kennedy, J.; Ramirez-Hernandez, J.; Ramirez, J.
2015-12-01
In March and April, 2014, an unprecedented experimental "pulse flow" with a total volume of over 100 million cubic meters (81,000 acre-feet) of water was released from Morelos Dam into the normally dry lower Colorado River below Yuma, Arizona, for the primary purpose of restoring native vegetation and habitat. Significant infiltration and attenuation of the flood peak occurred within the limitrophe reach that forms the US-Mexico border, with total volume reduced to 57 million cubic meters at the southerly international boundary at San Luis Rio Colorado, Sonora, Mexico (32 kilometers downstream). Groundwater levels in piezometers adjacent to the stream channel rose as much as 10 meters, and surface water/groundwater connection was established throughout the reach, despite depths-to-water greater than 15 meters prior to the pulse flow. Based on groundwater levels, a groundwater mound remained in the vicinity of the stream channel for several months but had largely dissipated into the regional groundwater system by fall 2014. Ultimately, a large amount of water was moved from storage in an upstream reservoir (Lake Mead), where it is potentially available to many users but where evaporation losses can be high, to the regional aquifer in the Yuma-Mexicali area, where the water could be available to local users but cannot be precisely quantified as it moves through the groundwater system. During a time of drought, tradeoffs between local vs. upstream storage, and reservoir vs. subsurface storage, will likely be increasingly important considerations in planning future experimental floods on the Colorado River.
NASA Astrophysics Data System (ADS)
Shangguan, Donghui; Ding, Yongjian; Liu, Shiyin; Xie, Zunyi; Pieczonka, Tino; Xu, Junli; Moldobekov, Bolot
2017-10-01
Glacial meltwater and ice calving contribute to the flood volume of glacial lakes such as Lake Merzbacher in the Tian Shan Mountains of central Asia. In this study, we simulated the lake's volume by constructing an empirical relationship between the area of Lake Merzbacher, determined from satellite images, and the lake's water storage, derived from digital elevation models. Results showed that the lake water supply rate before Glacial Lake Outburst Floods (GLOFs) generally agreed well with that during the GLOFs from 2009 to 2012, but not in 2008 and 2015. Furthermore, we found that the combination of glacial meltwater and ice calving is not enough to fully explain the supply rate during the GLOFs in 1996 and 1999, suggesting that other factors affect the supply rate during GLOFs as well. To examine this further, we compared the water supply rates before and during the GLOF events in 1999 and 2008. We inferred that quickly released short-term and intermediate-term water storage in the glacier likely contributed to both flood events in those years. This study highlights the need to improve our understanding of the supply component of outburst floods, as irregularly released stored water may lead to GLOF events of three general types: case I (singular event-triggered englacial water release), case II (glacier melt due to temperature changes), and case III (englacial water release mixed with glacier melt).
Design of redundant array of independent DVD libraries based on iSCSI
NASA Astrophysics Data System (ADS)
Chen, Yupeng; Pan, Longfa
2003-04-01
This paper presents a new approach to realizing a redundant array of independent DVD libraries (RAID-LoIP) by using iSCSI technology and traditional RAID algorithms. Our design achieves high performance for an optical storage system with the following features: large storage capacity, high access rate, random access, long distances between DVD libraries, block I/O storage, and long storage life. Our RAID-LoIP system can be a good solution for broadcast media asset storage.
Exploiting Concurrent Wake-Up Transmissions Using Beat Frequencies.
Kumberg, Timo; Schindelhauer, Christian; Reindl, Leonhard
2017-07-26
Wake-up receivers are a natural choice for wireless sensor networks because of their ultra-low power consumption and their ability to provide communications on demand. A downside of ultra-low-power wake-up receivers is their low sensitivity, caused by the passive demodulation of the carrier signal. In this article, we present a novel communication scheme that exploits purposefully interfering out-of-tune signals of two or more wireless sensor nodes, which produce the wake-up signal as the beat frequency of the superposed carriers. Additionally, we introduce a communication algorithm and a flooding protocol based on this approach. Our experiments show that our approach increases the received signal strength by up to 3 dB, improving communication robustness and reliability. Furthermore, we demonstrate the feasibility of our newly developed protocols by means of an outdoor experiment and an indoor setup consisting of several nodes. The flooding algorithm achieves almost a 100% wake-up rate in less than 20 ms.
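The beat-frequency mechanism itself is easy to reproduce numerically. The sketch below, with carrier frequencies scaled down for illustration, shows that the rectified sum of two detuned carriers carries a dominant low-frequency component at |f1 - f2|.

```python
import numpy as np

fs = 100_000                       # sample rate (Hz), illustrative
t = np.arange(0, 0.1, 1 / fs)
f1, f2 = 10_000, 10_500            # two deliberately out-of-tune carriers
s = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)

# A passive envelope detector roughly rectifies and low-pass filters the
# superposition; the dominant low-frequency component is the 500 Hz beat.
env = np.abs(s)
spectrum = np.abs(np.fft.rfft(env - env.mean()))
freqs = np.fft.rfftfreq(env.size, 1 / fs)
low = (freqs > 0) & (freqs < 5_000)
print(freqs[low][spectrum[low].argmax()])  # ~500.0 Hz = |f1 - f2|
```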
Impact of the Three-Gorges Dam and water transfer project on Changjiang floods
NASA Astrophysics Data System (ADS)
Nakayama, Tadanobu; Shankman, David
2013-01-01
The increasing frequency of severe floods on the middle and lower Changjiang (Yangtze) River during the past few decades can be attributed both to abnormal monsoon rainfall and to landscape changes, including extensive deforestation affecting river sedimentation, and shrinking lakes and levee construction that reduced the areas available for floodwater storage. The Three-Gorges Dam (TGD) and the South-to-North Water Transfer Project (SNWTP) will also affect the frequency and intensity of severe floods in the Poyang Lake region of the middle Changjiang. The process-based National Integrated Catchment-based Eco-hydrology (NICE) model predicts that the TGD will increase flood risk during the early summer monsoon, contrary to the original justifications for building the dam, owing to complex river-lake-groundwater interactions. Several scenarios predict that morphological change will increase flood risk around the lake. This indicates the importance of managing both flood discharge and sediment deposition for the entire basin. Further, the authors assessed the impact of sand mining in the lake after its prohibition on the Changjiang, and showed that an alternative scenario of sand mining in lakes currently disconnected from the mainstream would reduce flood risk to a greater extent than intensive dredging along the junction channel. Because dry biomasses simulated by the model were linearly related to the Time-Integrated Normalized Difference Vegetation Index (TINDVI) estimated from satellite images, its decadal gradient during 1982-1999 showed a spatially heterogeneous distribution and generally decreasing trends beside the lakes, indicating that increases in lake reclamation and the resultant decrease in rice productivity are closely related to the hydrologic changes. This integrated approach could help to minimize flood damage and promote better decisions addressing sustainable development.
NASA Astrophysics Data System (ADS)
Semenova, O.; Restrepo, P. J.
2011-12-01
The Red River of the North basin (USA) is considered to be at high risk of flooding, having experienced serious floods during the last few years. The regional climate can be characterized as cold and, during winter, it exhibits continuous snow cover modified by wind redistribution. High-hazard runoff regularly occurs as a major spring snowmelt event resulting from the relatively rapid release of water from the snowpack onto frozen soils. Although in summer and autumn most rainfall occurs from convective storms over small areas and does not generate dangerous floods, the pre-winter state of the soils may radically influence spring maximum flows. Large amounts of artificial agricultural tile drainage and numerous small post-glacial depressions that redistribute runoff complicate the prediction of high floods. In such conditions, no hydrological model will be successful without proper precipitation input. In this study, the simulation of runoff processes for two watersheds in the basin of the Red River of the North, USA, was undertaken using the Hydrograph model developed at the State Hydrological Institute (St. Petersburg, Russia). The Hydrograph is a robust process-based model in which the processes have a physical basis combined with some strategic conceptual simplifications, giving it the ability to be applied under conditions of low information availability. It accounts for the processes of soil freeze and thaw, snow redistribution, and depression storage impacts. The assessment of the model parameters was conducted based on the characteristics of soil and vegetation cover. While performing the model runs, the parameters of depression storage and the parameters of different types of flow were manually calibrated to reproduce the observed flow. The model provided satisfactory simulation results not only for river runoff but also for variable soil states such as moisture and temperature over the simulation period 2005-2010. For the experimental runs, precipitation from different sources was used as forcing data to the hydrological model: 1) data from ground meteorological stations; 2) Snow Data Assimilation System (SNODAS) products containing several variables: snow water equivalent, snow depth, and solid and liquid precipitation; 3) MAPX precipitation data, which is mean areal precipitation for a watershed calculated using radar- and gauge-based information. The results demonstrated that, under high uncertainty in model parameters, combining precipitation information from different sources (SNODAS precipitation in winter with MAPX precipitation in summer) significantly improves model performance and the predictability of high floods.
An efficient sparse matrix multiplication scheme for the CYBER 205 computer
NASA Technical Reports Server (NTRS)
Lambiotte, Jules J., Jr.
1988-01-01
This paper describes the development of an efficient algorithm for computing the product of a matrix and vector on a CYBER 205 vector computer. The desire to provide software which allows the user to choose between the often conflicting goals of minimizing central processing unit (CPU) time or storage requirements has led to a diagonal-based algorithm in which one of four types of storage is selected for each diagonal. The candidate storage types were chosen to be efficient on the CYBER 205 for diagonals whose nonzero structure is dense, moderately sparse, very sparse and short, or very sparse and long; however, for many densities no storage type is most efficient with respect to both resource requirements, and a trade-off must be made. For each diagonal, an initialization subroutine estimates the CPU time and storage required for each storage type, based on results from previously performed numerical experimentation. These requirements are adjusted by user-provided weights that reflect the relative importance the user places on the two resources. The adjusted resource requirements are then compared to select the most efficient storage and computational scheme.
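The selection step can be sketched as a weighted cost comparison. In the snippet below the per-scheme CPU and storage estimates are illustrative placeholders for the paper's empirically derived estimates; only the structure of the decision is meant to match.

```python
def pick_storage(density, length, w_cpu=1.0, w_mem=1.0):
    """Pick a storage scheme for one diagonal by weighted resource cost."""
    nnz = density * length
    # (estimated CPU time, estimated storage) per scheme -- placeholder numbers
    candidates = {
        "dense":              (1.0 * length, 1.0 * length),
        "moderately_sparse":  (1.6 * nnz,    2.0 * nnz),
        "very_sparse_short":  (2.5 * nnz,    2.0 * nnz + 8),
        "very_sparse_long":   (2.2 * nnz,    2.0 * nnz + 32),
    }
    cost = lambda k: w_cpu * candidates[k][0] + w_mem * candidates[k][1]
    return min(candidates, key=cost)

print(pick_storage(density=0.95, length=10_000))             # nearly dense diagonal
print(pick_storage(density=0.02, length=10_000, w_mem=3.0))  # storage-weighted choice
```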
Hydrologic control of nitrogen removal, storage, and export in a mountain stream
Hall, R.O.; Baker, M.A.; Arp, C.D.; Kocha, B.J.
2009-01-01
Nutrient cycling and export in streams and rivers should vary with flow regime, yet most studies of stream nutrient transformation do not include hydrologic variability. We used a stable isotope tracer of nitrogen (15N) to measure nitrate (NO3) uptake, storage, and export in a mountain stream, Spring Creek, Idaho, U.S.A. We conducted two tracer tests of 2-week duration during snowmelt and baseflow. Dissolved and particulate forms of 15N were monitored over three seasons to test the hypothesis that stream N cycling would be dominated by export during floods, and storage during low flow. Floods exported more N than during baseflow conditions; however, snowmelt floods had higher than expected demand for NO3 because of hyporheic exchange. Residence times of benthic N during both tracer tests were longer than 100 d for ephemeral pools such as benthic algae and wood biofilms. Residence times were much longer in fine detritus, insects, and the particulate N from the hyporheic zone, showing that assimilation and hydrologic storage can be important mechanisms for retaining particulate N. Of the tracer N stored in the stream, the primary form of export was via seston during periods of high flows, produced by summer rainstorms or spring snowmelt the following year. Spring Creek is not necessarily a conduit for nutrients during high flow; hydrologic exchange between the stream and its valley represents an important storage mechanism.
NASA Astrophysics Data System (ADS)
Rastogi, Richa; Londhe, Ashutosh; Srivastava, Abhishek; Sirasala, Kirannmayi M.; Khonde, Kiran
2017-03-01
In this article, a new scalable 3D Kirchhoff depth migration algorithm is presented on a state-of-the-art multicore CPU-based cluster. Parallelization of 3D Kirchhoff depth migration is challenging due to its high demands on compute time, memory, storage and I/O, along with the need for their effective management. The most resource-intensive modules of the algorithm are traveltime calculation and migration summation, which exhibit an inherent trade-off between compute time and the other resources. The parallelization strategy of the algorithm largely depends on the storage of calculated traveltimes and the mechanism for feeding them to the migration process. The presented work is an extension of our previous work, in which a 3D Kirchhoff depth migration application for a multicore CPU-based parallel system was developed. Recently, we have improved the parallel performance of this application by re-designing the parallelization approach. The new algorithm can efficiently migrate both prestack and poststack 3D data. It exhibits flexibility for migrating a large number of traces within the available node memory and with minimal requirements on storage, I/O and inter-node communication. The resulting application is tested using 3D Overthrust data on PARAM Yuva II, a Xeon E5-2670 based multicore CPU cluster with 16 cores/node and 64 GB shared memory. Parallel performance of the algorithm is studied using different numerical experiments, and the scalability results show striking improvement over the previous version. An impressive 49.05X speedup with 76.64% efficiency is achieved for 3D prestack data, and a 32.00X speedup with 50.00% efficiency for 3D poststack data, using 64 nodes. The results also demonstrate the effectiveness and robustness of the improved algorithm, with high scalability and efficiency on a multicore CPU cluster.
Remote collection and analysis of witness reports on flash floods
NASA Astrophysics Data System (ADS)
Gourley, Jonathan; Erlingis, Jessica; Smith, Travis; Ortega, Kiel; Hong, Yang
2010-05-01
Typically, flash floods are studied ex post facto in response to a major impact event. A complement to field investigations is developing a detailed database of flash flood events, including minor events and null reports (i.e., where heavy rain occurred but there was no flash flooding), based on public survey questions conducted in near-real time. The Severe Hazards Analysis and Verification Experiment (SHAVE) has been in operation at the National Severe Storms Laboratory (NSSL) in Norman, OK, USA during the summers since 2006. The experiment employs undergraduate students to analyse real-time products from weather radars, target specific regions within the conterminous US, and poll public residences and businesses regarding the occurrence and severity of hail, wind, tornadoes, and now flash floods. In addition to providing a rich learning experience for students, SHAVE has been successful in creating high-resolution datasets of severe hazards used for algorithm and model verification. This talk describes the criteria used to initiate the flash flood survey, the specific questions asked and information entered into the database, and then provides an analysis of results for flash flood data collected during the summer of 2008. It is envisioned that the specific details provided by the SHAVE flash flood observation database will complement databases collected by operational agencies and thus lead to better tools to predict the likelihood of flash floods and ultimately reduce their impacts on society.
Scaling the flood regime with the soil hydraulic properties of the catchment
NASA Astrophysics Data System (ADS)
Peña Rojas, Luis Eduardo; Francés García, Félix; Barrios Peña, Miguel
2015-04-01
The spatial distribution of land cover and soil type affects the hydraulic properties of soils, facilitating or retarding the infiltration rate and the response of a catchment during flood events. This research analyzes: 1) the effect of land cover in different time periods as a source of nonstationarity in annual maximum flood records; 2) the scalability of the relationship between the soil hydraulic properties of the catchment (initial abstractions, upper-soil capillary storage, and vertical and horizontal hydraulic conductivity) and the flood regime. The study was conducted in the Combeima River basin in Colombia, South America, and the changes in land use registered in 1991, 2000, 2002 and 2007 were modelled using distributed hydrological modelling and nonparametric tests. The results showed that changes in land use affect the hydraulic properties of the soil and influence the magnitude of flood peaks. A new finding is that this behavior is scalable with the soil hydraulic properties of the catchment: flood moments show simple scaling behavior, and peak flows increase at lower values of capillary soil storage, whereas at higher values the peaks decrease. Finally, the Generalized Extreme Value distribution was applied, and scalable behavior was found in the parameters of the probability distribution function. The results allowed us to establish a relationship between soil hydraulic properties and the behavior of the flood regime in the studied basin.
Assessing the impact of climate and land use changes on extreme floods in a large tropical catchment
NASA Astrophysics Data System (ADS)
Jothityangkoon, Chatchai; Hirunteeyakul, Chow; Boonrawd, Kowit; Sivapalan, Murugesu
2013-05-01
In the wake of the recent catastrophic floods in Thailand, there is considerable concern about the safety of large dams designed and built some 50 years ago. In this paper a distributed rainfall-runoff model appropriate for extreme flood conditions is used to generate revised estimates of the Probable Maximum Flood (PMF) for the Upper Ping River catchment (area 26,386 km2) in northern Thailand, upstream of the large Bhumipol Dam. The model has two components: a continuous water balance model based on a configuration of parameters estimated from climate, soil and vegetation data, and a distributed flood routing model based on non-linear storage-discharge relationships of the river network under extreme flood conditions. The model is implemented under several alternative scenarios regarding the Probable Maximum Precipitation (PMP) estimates and is also used to estimate the potential effects of both climate change and land use and land cover changes on the extreme floods. These new estimates are compared against estimates using other hydrological models, including the application of the original prediction methods under current conditions. Model simulations and sensitivity analyses indicate that a reasonable Probable Maximum Flood (PMF) at the dam site is 6311 m3/s, which is only slightly higher than the original design flood of 6000 m3/s. As part of an uncertainty assessment, the estimated PMF is sensitive to the design method, input PMP, land use changes and the floodplain inundation effect. An increase in PMP depth of 5% can cause a 7.5% increase in the PMF. Deforestation of 10%, 20%, and 30% can result in PMF increases of 3.1%, 6.2%, and 9.2%, respectively. The modest increase of the estimated PMF (to just 6311 m3/s) in spite of these changes is due to factoring in the hydraulic effects of trees and buildings on the floodplain as the situation changes from normal to extreme floods, when over-bank flows may be the dominant flooding process, leading to a substantial reduction in the PMF estimates.
3D GIS for Flood Modelling in River Valleys
NASA Astrophysics Data System (ADS)
Tymkow, P.; Karpina, M.; Borkowski, A.
2016-06-01
The objective of this study is the implementation of a system architecture for collecting and analysing data, as well as visualizing results, for hydrodynamic modelling of flood flows in river valleys using remote sensing methods, three-dimensional geometry of spatial objects, and GPU multithread processing. The proposed solution includes: a spatial data acquisition segment; data processing and transformation; mathematical modelling of flow phenomena; and results visualization. The data acquisition segment was based on aerial laser scanning supplemented by images in the visible range. Vector data creation was based on automatic and semiautomatic algorithms for DTM and 3D spatial feature modelling. Algorithms for modelling building and vegetation geometry were proposed or adopted from the literature. The implementation of the framework was designed as modular software using open specifications and partially reusing open-source projects. The database structure for gathering and sharing vector data, including flood modelling results, was created using PostgreSQL. For the internal structure of feature classes of spatial objects in the database, the CityGML standard was used. For the hydrodynamic modelling, a two-dimensional form of the Navier-Stokes equations was implemented. Visualization of geospatial data and flow model results was transferred to the client-side application, giving independence from the server hardware platform. A real-world case in Poland, part of the Widawa River valley near the city of Wroclaw, was selected to demonstrate the applicability of the proposed system.
NASA Astrophysics Data System (ADS)
Sedlar, F.; Turpin, E.; Kerkez, B.
2014-12-01
As megacities around the world continue to develop at breakneck speed, future development, investment, and social wellbeing are threatened by a number of environmental and social factors. Chief among these is frequent, persistent, and unpredictable urban flooding. Jakarta, Indonesia, with a population of 28 million, is a prime example of a city plagued by such flooding. Although Jakarta has ample hydraulic infrastructure already in place, with more being constructed, the increasing severity of the flooding it experiences stems not from a lack of hydraulic infrastructure but rather from the failure of existing infrastructure. As was demonstrated during the most recent floods in Jakarta, infrastructure failure is often the result of excessive amounts of trash in the flood canals. This trash clogs pumps and reduces the overall system capacity. Despite this critical weakness of flood control in Jakarta, no data exist on the overall amount of trash in the flood canals, much less on how it varies temporally and spatially. The recent availability of low-cost photography provides a means to obtain such data. Time-lapse photography post-processed with computer vision algorithms yields a low-cost, remote, and automatic solution for measuring trash fluxes. When combined with the measurement of key hydrological parameters, a thorough understanding of the relationship between trash fluxes and the hydrology of massive urban areas becomes possible. This work examines algorithm development, the quantification of trash parameters, and hydrological measurements, followed by data assimilation into existing hydraulic and hydrological models of Jakarta. The insights afforded by such an approach allow for more efficient operation of hydraulic infrastructure, knowledge of when and where critical levels of trash originate, and the opportunity for community outreach, which is ultimately needed to reduce the trash in the flood canals of Jakarta and megacities around the world.
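One plausible building block for such a pipeline is background subtraction over the time-lapse frames. The sketch below uses OpenCV to flag moving debris and track the flagged fraction per frame; the video file name, the lack of ROI handling, and the parameters are hypothetical, not the authors' implementation.

```python
import cv2

# Hypothetical time-lapse video of a flood canal viewed by a fixed camera.
cap = cv2.VideoCapture("canal_timelapse.mp4")
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))

coverage = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                         # foreground: floating debris
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # suppress ripple noise
    coverage.append((mask > 0).mean())                     # flagged fraction of the view

cap.release()
print(f"mean debris coverage over {len(coverage)} frames:",
      sum(coverage) / max(len(coverage), 1))
```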
Techniques for estimating flood hydrographs for ungaged urban watersheds
Stricker, V.A.; Sauer, V.B.
1984-01-01
The Clark Method, modified slightly, was used to develop a synthetic, dimensionless hydrograph that can be used to estimate flood hydrographs for ungaged urban watersheds. Application of the technique results in a typical (average) flood hydrograph for a given peak discharge. The input necessary to apply the technique is an estimate of basin lagtime and the peak discharge for the recurrence interval of interest. Equations for this purpose were obtained from a recent nationwide study on flood frequency in urban watersheds. A regression equation was developed that relates flood volumes to drainage area size, basin lagtime, and peak discharge. This equation is useful where storage of floodwater may be a part of flood-prevention design. (USGS)
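Applying such a dimensionless hydrograph amounts to rescaling its ordinates by the estimated peak and lagtime. The sketch below illustrates this; the ordinates and input estimates are made-up placeholders, not the published USGS values.

```python
import numpy as np

# Illustrative dimensionless hydrograph: time and discharge expressed as
# ratios of basin lagtime and peak discharge (placeholder ordinates).
t_ratio = np.array([0.0, 0.25, 0.50, 0.75, 1.0, 1.25, 1.5, 2.0, 2.5, 3.0])
q_ratio = np.array([0.0, 0.12, 0.43, 0.83, 1.0, 0.84, 0.60, 0.27, 0.10, 0.0])

lagtime_hr = 2.4      # basin lagtime from a regional lagtime equation (assumed)
q_peak_cfs = 1800.0   # e.g. 25-yr peak from regional flood-frequency equations

t_hr = t_ratio * lagtime_hr
q_cfs = q_ratio * q_peak_cfs

# Flood volume as the area under the scaled hydrograph (trapezoidal rule).
volume_cf = np.sum(0.5 * (q_cfs[1:] + q_cfs[:-1]) * np.diff(t_hr) * 3600.0)
print(f"peak {q_cfs.max():.0f} cfs, volume {volume_cf:.3g} cubic feet")
```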
NASA Astrophysics Data System (ADS)
Wei, Qingyang; Ma, Tianyu; Xu, Tianpeng; Zeng, Ming; Gu, Yu; Dai, Tiantian; Liu, Yaqiang
2018-01-01
Modern positron emission tomography (PET) detectors are made from pixelated scintillation crystal arrays and read out by Anger logic. The interaction position of the gamma ray should be assigned to a crystal using a crystal position map or look-up table. Crystal identification is a critical procedure for pixelated PET systems. In this paper, we propose a novel crystal identification method for a dual-layer-offset LYSO based animal PET system via Lu-176 background radiation and the mean shift algorithm. Single photon event data of the Lu-176 background radiation are acquired in list mode for 3 h to generate a single photon flood map (SPFM). Coincidence events are obtained from the same data using time information to generate a coincidence flood map (CFM). The CFM is used to identify the peaks of the inner layer using the mean shift algorithm. The response of the inner layer is removed from the SPFM by subtracting the CFM. Then, the peaks of the outer layer are also identified using the mean shift algorithm. The automatically identified peaks are manually inspected with a graphical user interface program. Finally, a crystal position map is generated using a distance criterion based on these peaks. The proposed method is verified on the animal PET system with 48 detector blocks on a laptop with an Intel i7-5500U processor. The total runtime for whole-system peak identification is 67.9 s. Results show that the automatic crystal identification has 99.98% and 99.09% accuracy for the peaks of the inner and outer layers of the whole system, respectively. In conclusion, the proposed method is suitable for the dual-layer-offset lutetium-based PET system to perform crystal identification instead of external radiation sources.
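The peak-finding step maps naturally onto off-the-shelf mean-shift clustering. The sketch below runs scikit-learn's MeanShift on synthetic flood-map events standing in for the single-photon and coincidence data; the array geometry, event spread, and bandwidth are illustrative.

```python
import numpy as np
from sklearn.cluster import MeanShift

# Synthetic flood map: events scattered around an 8 x 8 grid of crystal peaks.
rng = np.random.default_rng(0)
grid = np.array([[x, y] for x in range(8) for y in range(8)], dtype=float) * 4.0
events = np.concatenate([p + rng.normal(0.0, 0.6, size=(100, 2)) for p in grid])

ms = MeanShift(bandwidth=1.5, bin_seeding=True).fit(events)
peaks = ms.cluster_centers_         # one center per identified crystal peak
print(len(peaks), "peaks found")    # ideally 64 for this toy array
```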
NASA Astrophysics Data System (ADS)
Wilkinson, M.; Quinn, P. F.; Jonczyk, J.
2010-12-01
The increased risk from flooding continues to concern governments around the world, and flood protection is becoming more of a challenge. In the UK, climate change projections indicate more extremes within the weather systems. In addition, there is increased demand for using land in urban areas beside channels. These developments put pressure on our flood defences, and new solutions to managing flood risk are needed. There is currently support within the England and Wales Environment Agency for sustainable flood management solutions such as storage ponds, wetlands, beaver dams and willow riparian features (referred to here as Runoff Attenuation Features, or RAFs). However, the effectiveness of RAFs is not known at the catchment scale, since they have only really been trialled at the plot scale. These types of mitigation measure can offer benefits to water quality and create ecological habitats. The village of Belford, situated in the Belford Burn catchment (6 km2), northern England, has suffered numerous flood events. In addition, the catchment suffers from water quality issues within the channel, and high sediment loads are having an impact on the ecology of the nearby estuary. The local Environment Agency Flood Levy team wished to deliver an alternative catchment-based solution to the problem. With funding from the Northumbria Regional Flood Defence Committee, the Environment Agency North East Local Levy team and Newcastle University have created a partnership to address the flood problem, trialling soft-engineered RAFs at the catchment scale. The partnership project, "Belford proactive flood solutions", is testing novel techniques for reducing flood risk in small sub-catchments for the Environment Agency. The project provides the information needed to understand whether the multi-functional mitigation measures work at the sub-catchment scale. Data suggest that the mitigation measures present have delayed the overall travel time of the flood peak in the catchment by 33%. The current maximum flood storage capacity of all the features stands at around 15,000 m3. The evidence also suggests that a dam-like in-stream mitigation measure can significantly reduce sediment load. Other benefits of some mitigation features include a large increase in the population of water voles over the past two years. The scheme also acts as a demonstration site where interested stakeholders can learn about this approach to flood risk management and see the multipurpose benefits. As the project has progressed and lessons have been learnt, it has been possible to develop a runoff management toolkit for implementing these mitigation measures in other catchments of similar size. Already, the local Environment Agency has utilised the tools and recently applied similar mitigation measures to other catchments. Ongoing modelling exercises in the project are using the data to explore the up-scaling of the features to larger catchments.
NASA Astrophysics Data System (ADS)
Mueller, Erich R.; Grams, Paul E.; Hazel, Joseph E.; Schmidt, John C.
2018-01-01
Sandbars are iconic features of the Colorado River in the Grand Canyon, Arizona, U.S.A. Following completion of Glen Canyon Dam in 1963, sediment deficit conditions caused erosion of eddy sandbars throughout much of the 360 km study reach downstream from the dam. Controlled floods in 1996, 2004, and 2008 demonstrated that sand on the channel bed could be redistributed to higher elevations, and that floods timed to follow tributary sediment inputs would increase suspended sand concentrations during floods. Since 2012, a new management protocol has resulted in four controlled floods timed to follow large inputs of sand from a major tributary. Monitoring of 44 downstream eddy sandbars, initiated in 1990, shows that each controlled flood deposited significant amounts of sand and increased the size of subaerial sandbars. However, the magnitude of sandbar deposition varied from eddy to eddy, even over relatively short distances where main-stem suspended sediment concentrations were similar. Here, we characterize spatial and temporal trends in sandbar volume and site-scale (i.e., individual eddy) sediment storage as a function of flow, channel, and vegetation characteristics that reflect the reach-scale (i.e., kilometer-scale) hydraulic environment. We grouped the long-term monitoring sites based on geomorphic setting and used a principal component analysis (PCA) to correlate differences in sandbar behavior to changes in reach-scale geomorphic metrics. Sites in narrow reaches are less-vegetated, stage changes markedly with discharge, sandbars tend to remain dynamic, and sand storage change dominantly occurs in the eddy compared to the main channel. In wider reaches, where stage-change during floods may be half that of narrow sites, sandbars are more likely to be stabilized by vegetation, and floods tend to aggrade the vegetated sandbar surfaces. In these locations, deposition during controlled floods is more akin to floodplain sedimentation, and the elevation of sandbar surfaces increases with successive floods. Because many sandbars are intermediate to the end members described above, high-elevation bar surfaces stabilized by vegetation often have a more dynamic unvegetated sandbar on the channel-ward margin that aggrades and erodes in response to controlled flood cycles. Ultimately, controlled floods have been effective at increasing averaged sandbar volumes, and, while bar deposition during floods decreases through time where vegetation has stabilized sandbars, future controlled floods are likely to continue to result in deposition in a majority of the river corridor.
[Supplementary figure captions: Fig. 2, relation between the total-site and high-elevation discharge-volume relation slopes for all sites where both relations are available (n = 33); Fig. 3, change in sandbar volume since 1990 for Marble versus Grand Canyon sites, with solid vertical gray lines indicating controlled floods and dashed vertical gray lines indicating other high test flows in 1997 and 2000. Photographs by U.S. Geological Survey, 2008-2015.]
Streamflow model of Wisconsin River for estimating flood frequency and volume
Krug, William R.; House, Leo B.
1980-01-01
The 100-year flood peak at Wisconsin Dells, computed from the simulated, regulated streamflow data for the period 1915-76, is 82,000 cubic feet per second, including the effects of all the reservoirs in the river system as they are currently operated. It also includes the effects of Lakes Du Bay, Petenwell, and Castle Rock, which are significant for spring floods but insignificant for summer or fall floods because they are normally maintained nearly full in the summer and fall and have very little storage available for floodwaters. (USGS)
Techniques for estimating flood-peak discharges from urban basins in Missouri
Becker, L.D.
1986-01-01
Techniques are defined for estimating the magnitude and frequency of future flood-peak discharges of rainfall-induced runoff from small urban basins in Missouri. These techniques were developed from an initial analysis of flood records of 96 gaged sites in Missouri and adjacent states. Final regression equations are based on a balanced, representative sampling of 37 gaged sites in Missouri. This sample included 9 statewide urban study sites, 18 urban sites in St. Louis County, and 10 predominantly rural sites statewide. Short-term records were extended on the basis of long-term climatic records and use of a rainfall-runoff model. Linear least-squares regression analyses were used with log-transformed variables to relate flood magnitudes of selected recurrence intervals (dependent variables) to selected drainage basin indexes (independent variables). For gaged urban study sites within the State, the flood peak estimates are from the frequency curves defined from the synthesized long-term discharge records. Flood frequency estimates are made for ungaged sites by using regression equations that require determination of the drainage basin size and either the percentage of impervious area or a basin development factor. Alternative sets of equations are given for the 2-, 5-, 10-, 25-, 50-, and 100-yr recurrence interval floods. The average standard errors of estimate range from about 33% for the 2-yr flood to 26% for the 100-yr flood. The techniques for estimation are applicable to flood flows that are not significantly affected by storage caused by manmade activities. The flood-peak discharge estimating equations are considered applicable for sites on basins draining approximately 0.25 to 40 sq mi. (Author's abstract)
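The estimation form described here, linear least squares on log-transformed variables, is straightforward to sketch. The data below are synthetic stand-ins for the 37 gaged sites, and the recovered coefficients are illustrative, not the published Missouri equations.

```python
import numpy as np

rng = np.random.default_rng(1)
area = rng.uniform(0.25, 40.0, 37)      # drainage area, sq mi
imperv = rng.uniform(5.0, 60.0, 37)     # impervious area, percent
# Synthetic "observed" 100-yr peaks following a power law with scatter.
q100 = 230.0 * area**0.70 * imperv**0.35 * rng.lognormal(0.0, 0.2, 37)

# Log-transforming turns Q = a * A^b * IA^c into a linear regression.
X = np.column_stack([np.ones(37), np.log(area), np.log(imperv)])
coef, *_ = np.linalg.lstsq(X, np.log(q100), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"Q100 ~= {a:.0f} * A^{b:.2f} * IA^{c:.2f}")
```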
Beaver Mediated Water Table Dynamics in Mountain Peatlands
NASA Astrophysics Data System (ADS)
Karran, D. J.; Westbrook, C.; Bedard-Haughn, A.
2016-12-01
Water table dynamics play an important role in the ecological and biogeochemical processes that regulate carbon and water storage in peatlands. Beaver are common in these habitats and the dams they build have been shown to raise water tables in other environments. However, the impact of beaver dams in peatlands, where water tables rest close to the surface, has yet to be determined. We monitored a network of 50 shallow wells in a Canadian Rocky Mountain peatland for 6 years. During this period, a beaver colony was maintaining a number of beaver ponds for four years until a flood event removed the colony from the area and breached some of the dams. Two more years of data were collected after the flood event to assess whether the dams enhanced groundwater storage. Beaver dams raised water tables just as they do in other environments. Furthermore, water tables within 100 meters of beaver dams were more stable than those further away and water table stability overall was greater before the flood event. Our results suggest the presence/absence of beaver in peatlands has implications for groundwater water storage and overall system function.
Natural phenomena evaluations of the K-25 site UF{sub 6} cylinder storage yards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fricke, K.E.
1996-09-15
The K-25 Site UF{sub 6} cylinder storage yards are used for the temporary storage of UF{sub 6} normal assay cylinders and long-term storage of other UF{sub 6} cylinders. The K-25 Site UF{sub 6} cylinder storage yards consist of six on-site areas: K-1066-B, K-1066-E, K-1066-F, K-1066-J, K-1066-K and K-1066-L. There are no permanent structures erected on the cylinder yards, except for five portable buildings. The operating contractor for the K-25 Site is preparing a Safety Analysis Report (SAR) to examine the safety related aspects of the K-25 Site UF{sub 6} cylinder storage yards. The SAR preparation encompasses many tasks terminating in consequence analysis for the release of gaseous and liquid UF{sub 6}, one of which is the evaluation of natural phenomena threats, such as earthquakes, floods, and winds. In support of the SAR, the six active cylinder storage yards were evaluated for vulnerabilities to natural phenomena, earthquakes, high winds and tornados, tornado-generated missiles, floods (local and regional), and lightning. This report summarizes those studies. 30 refs.
Connectivity and storage functions of channel fens and flat bogs in northern basins
NASA Astrophysics Data System (ADS)
Quinton, W. L.; Hayashi, M.; Pietroniro, A.
2003-12-01
The hydrological response of low relief, wetland-dominated zones of discontinuous permafrost is poorly understood. This poses a major obstacle to the development of a physically meaningful meso-scale hydrological model for the Mackenzie basin, one of the world's largest northern basins. The present study examines the runoff response of five representative study basins (Scotty Creek, and the Jean-Marie, Birch, Blackstone and Martin Rivers) in the lower Liard River valley as a function of their major biophysical characteristics. High-resolution (4 m × 4 m) IKONOS satellite imagery was used in combination with aerial and ground verification surveys to classify the land cover, and to delineate the wetland area connected to the drainage system. Analysis of the annual hydrographs of each basin for the 4 year period 1997 to 2000 demonstrated that runoff was positively correlated with the drainage density, basin slope, and the percentage of the basin covered by channel fens, and was negatively correlated with the percentage of the basin covered by flat bogs. The detailed analysis of the water-level response to summer rainstorms at several nodes along the main drainage network in the Scotty Creek basin showed that the storm water was slowly routed through channel fens with an average flood-wave velocity of 0.23 km h-1. The flood-wave velocity appears to be controlled by channel slope and hydraulic roughness in a manner consistent with the Manning formula, suggesting that a roughness-based routing algorithm might be useful in large-scale hydrological models.
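For reference, the Manning relation invoked here is v = (1/n) R^(2/3) S^(1/2) in SI units. A quick check with plausible (assumed) fen-channel values lands in the same range as the observed flood-wave velocity:

```python
def manning_velocity(n, hydraulic_radius_m, slope):
    """Manning's formula for mean flow velocity (SI units)."""
    return (1.0 / n) * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Assumed values for a rough, low-gradient channel fen (illustrative only).
v_ms = manning_velocity(n=0.30, hydraulic_radius_m=0.4, slope=0.002)
print(f"{v_ms * 3.6:.2f} km/h")  # ~0.3 km/h, the order of the observed 0.23 km/h
```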
Interleaved Observation Execution and Rescheduling on Earth Observing Systems
NASA Technical Reports Server (NTRS)
Khatib, Lina; Frank, Jeremy; Smith, David; Morris, Robert; Dungan, Jennifer
2003-01-01
Observation scheduling for Earth orbiting satellites solves the following problem: given a set of requests for images of the Earth, a set of instruments for acquiring those images distributed on a collection of orbiting satellites, and a set of temporal and resource constraints, generate a set of assignments of instruments and viewing times to those requests that satisfy those constraints. Observation scheduling is often construed as a constrained optimization problem with the objective of maximizing the overall utility of the science data acquired. The utility of an image is typically based on the intrinsic importance of acquiring it (for example, its importance in meeting a mission or science campaign objective) as well as the expected value of the data given current viewing conditions (for example, if the image is occluded by clouds, its value is usually diminished). Currently, science observation scheduling for Earth Observing Systems is done on the ground, for periods covering a day or more. Schedules are uplinked to the satellites and are executed rigorously. An alternative to this scenario is to make some of the decisions about which images are to be acquired on-board. The principal argument for this capability is that the desirability of making an observation can change dynamically, because of changes in meteorological conditions (e.g. cloud cover), unforeseen events such as fires, floods, or volcanic eruptions, or unexpected changes in satellite or ground station capability. Furthermore, since satellites can only communicate with the ground between 5% and 10% of the time, it may be infeasible to make the desired changes to the schedule on the ground and uplink the revisions in time for the on-board system to execute them. Examples of scenarios that motivate an on-board capability for revising schedules include the following. First, if a desired visual scene is completely obscured by clouds, then there is little point in taking it. In this case, satellite resources such as power and storage space can be better utilized taking another, higher-quality image. Second, if an unexpected but important event occurs (such as a fire, flood, or volcanic eruption), there may be good reason to take images of it instead of expending satellite resources on some of the lower priority scheduled observations. Finally, if there is an unexpected loss of capability, it may be impossible to carry out the schedule of planned observations. For example, if a ground station goes down temporarily, a satellite may not be able to free up enough storage space to continue with the remaining schedule of observations. This paper describes an approach for interleaving the execution of observation schedules with dynamic schedule revision based on changes to the expected utility of the acquired images. We describe the problem in detail, formulate an algorithm for interleaving schedule revision and execution, and discuss refinements to the algorithm based on the need for search efficiency. We summarize with a brief discussion of the tests performed on the system.
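A minimal sketch of the on-board revision idea is a greedy re-ranking of pending observations by expected utility per unit of remaining storage. All fields, names, and numbers below are hypothetical; the paper's actual algorithm handles temporal and resource constraints well beyond storage.

```python
from dataclasses import dataclass

@dataclass
class Request:
    name: str
    priority: float          # intrinsic science value (hypothetical scale)
    cloud_free_prob: float   # current viewing-condition estimate
    storage_gb: float

def revise(schedule, storage_left_gb):
    """Greedily keep the observations with the best expected utility per GB."""
    ranked = sorted(schedule,
                    key=lambda r: r.priority * r.cloud_free_prob / r.storage_gb,
                    reverse=True)
    kept = []
    for r in ranked:
        if r.storage_gb <= storage_left_gb:
            kept.append(r)
            storage_left_gb -= r.storage_gb
    return kept

pending = [Request("coastal_scene", 5.0, 0.2, 4.0),  # now mostly cloud-covered
           Request("flood_event", 9.0, 0.9, 6.0),    # new high-value target
           Request("forest_scene", 4.0, 0.8, 3.0)]
print([r.name for r in revise(pending, storage_left_gb=10.0)])
```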
NASA Astrophysics Data System (ADS)
Thomas, Nicholas W.; Arenas Amado, Antonio; Schilling, Keith E.; Weber, Larry J.
2016-10-01
This research systematically analyzed the influence of antecedent soil wetness and rainfall depth on peak flows in a 45 km2 watershed. Peak flows increased with increasing antecedent wetness and rainfall depth, with the highest peak flows occurring under intense precipitation on wet soils. Flood mitigation structures were included and investigated under full and empty initial storage conditions. Peak flows were reduced at the outlet of the watershed by 3-17%. The highest peak flow reductions occurred in scenarios with dry soil, empty project storage, and low rainfall depths. These analyses showed that with increased rainfall depth, antecedent moisture conditions became increasingly less impactful. Scaling invariance of peak discharges was shown to hold within this basin and was fit through ordinary least squares regression for each design scenario. Scale-invariance relationships were extrapolated beyond the outlet of the analyzed basin to the point of intersection of the with- and without-structure scenarios. In each scenario, the extrapolated peak discharge benefits diminished at a drainage area of approximately 100 km2, which translates to roughly 2 km downstream of the Beaver Creek watershed outlet. This work provides an example of the internal watershed benefits of structural flood mitigation efforts and of the impact they may exert outside the basin. Additionally, the influence of $1.8 million in flood reduction tools was not sufficient to routinely address downstream flood concerns, shedding light on the additional investment required to alter peak flows in large basins.
Design of Energy Storage Management System Based on FPGA in Micro-Grid
NASA Astrophysics Data System (ADS)
Liang, Yafeng; Wang, Yanping; Han, Dexiao
2018-01-01
The energy storage system is the core of maintaining stable operation of a smart micro-grid. Addressing existing problems of energy storage management systems in the micro-grid, such as low fault tolerance and a tendency to cause fluctuations in the micro-grid, a new intelligent battery management system based on a field-programmable gate array (FPGA) is proposed, taking advantage of the FPGA to combine the battery management system with the intelligent micro-grid control strategy. Finally, because inaccurate initialization of weights and thresholds during neural-network estimation of the battery state of charge leads to large errors in the prediction results, a genetic algorithm is proposed to optimize the neural network, and an experimental simulation is carried out. The experimental results show that the algorithm has high precision and provides a guarantee for the stable operation of the micro-grid.
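The GA-over-weights idea can be sketched compactly. Below, a genetic algorithm searches the weight vector of a tiny network mapping (voltage, current, temperature) to state of charge, in place of random initialization; the data, network size, and GA settings are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic battery samples: (voltage V, current A, temperature C) -> SOC.
X = rng.uniform([3.0, -2.0, 10.0], [4.2, 2.0, 40.0], size=(200, 3))
soc = (X[:, 0] - 3.0) / 1.2        # toy ground truth driven by voltage

def forward(w, X):
    """Tiny 3-8-1 network; w packs all weights and biases (41 values)."""
    W1, b1 = w[:24].reshape(3, 8), w[24:32]
    W2, b2 = w[32:40].reshape(8, 1), w[40]
    return (np.tanh(X @ W1 + b1) @ W2).ravel() + b2

def fitness(w):
    return -np.mean((forward(w, X) - soc) ** 2)   # negative MSE

pop = rng.normal(0.0, 0.5, size=(40, 41))
for gen in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-20:]]       # truncation selection
    children = []
    for _ in range(20):
        a, b = parents[rng.integers(20, size=2)]
        mask = rng.random(41) < 0.5               # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(0.0, 0.05, 41))
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
print(f"best MSE: {-fitness(best):.5f}")
```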
NASA Astrophysics Data System (ADS)
Nayar, Priya; Singh, Bhim; Mishra, Sukumar
2017-08-01
An artificial intelligence based control algorithm is used to solve power quality problems in a standalone system comprising a diesel engine driven synchronous generator with an automatic voltage regulator and governor. A voltage source converter integrated with a battery energy storage system is employed to mitigate the power quality problems. An adaptive neural network based signed-regressor control algorithm is used to estimate the fundamental component of the load currents for control of the standalone system, with load leveling as an integral feature. The developed model of the system performs accurately under varying load conditions and provides good dynamic response to step changes in load. Real-time performance is achieved using MATLAB with the Simulink/SimPowerSystems toolboxes, and the results adhere to the IEEE-519 standard for power quality enhancement.
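The signed-regressor family of adaptive estimators has a particularly simple weight update, w(k+1) = w(k) + mu * e(k) * sign(u(k)). A minimal single-weight Python sketch with an illustrative distorted load current is shown below; the paper's controller wraps such estimators in neural-network adaptation with three-phase templates.

```python
import numpy as np

fs, f = 5000, 50                       # sample rate and fundamental (Hz)
t = np.arange(0, 0.5, 1 / fs)
u = np.sin(2 * np.pi * f * t)          # in-phase unit template (regressor)
# Illustrative load current: 10 A fundamental plus a distortion component
# chosen orthogonal to sign(u) so this single-weight estimate stays unbiased.
i_load = 10.0 * u + 3.0 * np.sin(2 * 2 * np.pi * f * t)

w, mu = 0.0, 0.01
for k in range(t.size):
    e = i_load[k] - w * u[k]           # estimation error
    w += mu * e * np.sign(u[k])        # signed-regressor update
print(round(w, 2))                     # ~10, the fundamental in-phase amplitude
```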
Study on store-space assignment based on logistic AGV in e-commerce goods to person picking pattern
NASA Astrophysics Data System (ADS)
Xu, Lijuan; Zhu, Jie
2017-10-01
This paper studies store-space assignment based on logistic AGVs in the e-commerce goods-to-person picking pattern. A store-space assignment model that minimizes picking cost is established, and a store-space assignment algorithm is designed following cluster analysis based on similarity coefficients. Through an example analysis, the picking cost of the designed store-space assignment algorithm is compared with allocation by item number and with storage according to ABC classification, verifying the effectiveness of the designed store-space assignment algorithm.
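The clustering-then-assignment structure can be sketched with standard tools. Below, SKUs are clustered on an order-co-occurrence similarity coefficient and the clusters are placed into slots pre-sorted by distance from the picking station; the orders, slot layout, and cluster count are illustrative, not the paper's model.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

orders = [{0, 1, 2}, {0, 1}, {2, 3}, {3, 4}, {0, 2}, {1, 2}, {3, 4, 5}]
n_sku = 6

# Similarity coefficient: how often two SKUs appear in the same order.
co = np.zeros((n_sku, n_sku))
for order in orders:
    for i in order:
        for j in order:
            if i != j:
                co[i, j] += 1
sim = co / co.max()

# Average-linkage clustering on (1 - similarity) as the distance.
Z = linkage(1.0 - sim[np.triu_indices(n_sku, k=1)], method="average")
labels = fcluster(Z, t=3, criterion="maxclust")

# Busiest cluster gets the slots nearest the pick station; items in the
# same cluster stay adjacent so correlated SKUs travel together on the AGV.
demand = np.array([sum(i in o for o in orders) for i in range(n_sku)])
cluster_order = sorted(set(labels), key=lambda c: -demand[labels == c].sum())

assignment, slot = {}, 0
for c in cluster_order:
    for sku in np.flatnonzero(labels == c):
        assignment[int(sku)] = slot
        slot += 1
print(assignment)   # SKU -> slot index (0 = nearest to the pick station)
```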
Floods of November-December 1950 in the Central Valley basin, California
Paulsen, C.G.
1953-01-01
The flood of November-December 1950 in the Central Valley basin was the greatest in most parts of the basin since the turn of the century and probably was exceeded in the lower San Joaquin River basin only by the historic flood of 1862. In respect to monetary loss, the 1950 flood was the most disastrous in the history of the basin. Loss of life was remarkably small when one considers the extensive damage and destruction to homes and other property, which is estimated at 33 million dollars. Outstanding features of the flood were its unprecedented occurrence so early in the winter flood season, its magnitude in respect to both peak and volume in most major tributaries, and the occurrence of a succession of near-peak flows within a period of three weeks. The flood was caused by a series of storms during the period November 16 to December 8, which brought exceptionally warm, moisture-laden air inland against the Sierra Nevada range and caused intense rainfall, instead of snowfall, at unusually high altitudes. Basin-wide totals of rainfall during the period ranged from 30 inches over the Yuba and American River basins to 13 inches over the upper Sacramento and Feather River basins. Based on continuous records of discharge on major tributaries for periods ranging from 22 to 55 years and averaging about 43 years, the 1950 flood peaks were the greatest of record on the American, Cosumnes, Mokelumne, Stanislaus, Tuolumne, Merced, Chowchilla, Fresno, lower San Joaquin, Kings, Kaweah, Tule, and Kern Rivers. Second highest peak of record occurred during the flood of March 1928 on the Yuba, American and Mokelumne Rivers; the flood of March 1940 on the Cosumnes River; the flood of January 1911 on the Stanislaus and Tuolumne Rivers; the flood of December 1937 on the Merced, Kings, and Kaweah Rivers; the flood of March 1938 on the Chowchilla, Fresno, and lower San Joaquin Rivers; and the flood of March 1943 on the Tule and Kern Rivers. Peak discharges for 1950 did not exceed previous maxima on the Bear, Yuba, Feather, and upper Sacramento Rivers, nor on west side tributaries of the lower Sacramento River, the Calaveras River, and the upper San Joaquin River (above Friant Reservoir). Notable high rates of discharge were 354 cfs per square mile from 39.5 square miles in the North Fork of the Middle Fork Tule River, 225 cfs per square mile from 198 square miles in the Rubicon River, 115 cfs per square mile from 999 square miles in the North Fork of the American River, and 93.7 cfs per square mile from 1,921 square miles in the American River at Fair Oaks. This report presents a general description of the 1950 flood, details and estimates of the damage incurred, records of stage and discharge for the period of the flood at 171 stream-gaging stations, records of storage in 14 reservoirs, a summary of peak discharges with comparative data for previous floods at 252 measurement points, and tables showing crest stages along the main stem and major tributary channels of the Sacramento and San Joaquin Rivers. The report also includes a discussion of meteorologic and hydrologic conditions associated with the flood, examples of the flood regulation afforded by storage reservoirs, a brief study of runoff characteristics, and a summary and comparison with previous floods in the Central Valley basin.
NASA Astrophysics Data System (ADS)
Craciunescu, V.; Flueraru, C.; Stancalie, G.
2009-04-01
Floods are the major disaster affecting many countries in the world year after year. From Romania's perspective, floods are among the most hazardous natural disasters in terms of human suffering and economic losses. Major floods that occurred in 2005, 2006 and 2008, the worst in more than 40 years, affected large regions of Romania: in Timis county (April 2005), over 1 300 homes were damaged or destroyed, 3 800 people were evacuated and about 30 000 hectares of agricultural land were flooded; in five counties situated in eastern Romania (July 2005), 11 000 homes were inundated, 8 600 people were evacuated, 20 people were killed, 53 000 ha of farmland were flooded and 379 bridges were damaged or destroyed; in 12 counties along the Danube (April 2006), 3 077 homes were affected (1 049 completely destroyed), 16 000 people were evacuated, five people were killed and 144 000 hectares of land were flooded; in six counties in the north-east of Romania (July 2008), 3 985 houses were affected (over 300 totally destroyed), 15 834 people were evacuated and 35 084 hectares of agricultural land were inundated. Flood management evolves and changes as more knowledge and technology become available to the environmental community. Satellite imagery can be very effective for flood management, both in the detailed mapping required for the production of hazard assessment maps and as input to various types of hydrological models, as well as in monitoring land use/cover changes over the years to quantify prominent changes in land use/cover in general and the extent of impervious area in particular. At the same time, the wealth of old cartographic documents is an important cultural and scientific heritage. By carefully studying such documents, a modern manager can better understand the way territory was managed in the past and the implications of that management for today's flood reality. Good-quality photo cameras and flat-bed and large-size scanners were used to convert the analogue old cartographic materials into digital files. Special, highly compressed file formats were used to reduce the raster database size without affecting document quality. Digitisation and online distribution of these documents, via an online system, provided new ways to access and interact with our patrimony, and new tangible arguments for flood decision makers. The research included the development of key components and modules providing characterisation (based on metadata), virtual storage, discovery and access services, including intuitive query and browsing mechanisms and exploiting the potential of the semantic web and advanced storage technologies. For all the mentioned flood events, various processing techniques (classification, geo-referencing, filtering, and photo-interpretation) were used to combine the optical and radar images in order to delineate the flooded areas. The resulting flood masks were integrated in a GIS environment with the old cartographic database and with digital layers that represent the current geographic reality.
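As one hedged illustration of the delineation step, open water appears dark in calibrated SAR backscatter, so a simple threshold gives a first-pass flood mask that can then be refined against the optical classification. The array and threshold below are stand-ins, not the project's actual processing chain.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma0_db = rng.normal(-12.0, 4.0, size=(500, 500))  # stand-in calibrated SAR scene

WATER_THRESHOLD_DB = -18.0     # assumed ceiling for calm open-water backscatter
flood_mask = sigma0_db < WATER_THRESHOLD_DB
print(f"flooded fraction: {flood_mask.mean():.2%}")
```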
A method of measuring increase in soil depth and water-storage capacity due to forest management
George R. Trimble, Jr.
1952-01-01
Conservationists, engineers, and others who deal with water problems have become more and more concerned in recent years with increasing the storage of water in the ground. Their concern has centered around problems of flood control and storage of water for later use by plants or animals, including man.
Cloud Computing and Its Applications in GIS
NASA Astrophysics Data System (ADS)
Kang, Cao
2011-12-01
Cloud computing is a novel computing paradigm that offers highly scalable and highly available distributed computing services. The objectives of this research are to: 1. analyze and understand cloud computing and its potential for GIS; 2. discover the feasibilities of migrating truly spatial GIS algorithms to distributed computing infrastructures; 3. explore a solution to host and serve large volumes of raster GIS data efficiently and speedily. These objectives thus form the basis for three professional articles. The first article is entitled "Cloud Computing and Its Applications in GIS". This paper introduces the concept, structure, and features of cloud computing. Features of cloud computing such as scalability, parallelization, and high availability make it a very capable computing paradigm. Unlike High Performance Computing (HPC), cloud computing uses inexpensive commodity computers. The uniform administration systems in cloud computing make it easier to use than GRID computing. Potential advantages of cloud-based GIS systems, such as a lower barrier to entry, are consequently presented. Three cloud-based GIS system architectures are proposed: public cloud-based GIS systems, private cloud-based GIS systems and hybrid cloud-based GIS systems. Public cloud-based GIS systems provide the lowest entry barriers for users among these three architectures, but their advantages are offset by data security and privacy related issues. Private cloud-based GIS systems provide the best data protection, though they have the highest entry barriers. Hybrid cloud-based GIS systems provide a compromise between these extremes. The second article is entitled "A cloud computing algorithm for the calculation of Euclidian distance for raster GIS". Euclidean distance is a truly spatial GIS algorithm. Classical algorithms such as the pushbroom and growth ring techniques require computational propagation through the entire raster image, which makes them incompatible with the distributed nature of cloud computing. This paper presents a parallel Euclidean distance algorithm that works seamlessly with the distributed nature of cloud computing infrastructures. The mechanism of this algorithm is to subdivide a raster image into sub-images and wrap them with a one-pixel-deep edge layer of individually computed distance information. Each sub-image is then processed by a separate node, after which the resulting sub-images are reassembled into the final output. It is shown that while any rectangular sub-image shape can be used, those approximating squares are computationally optimal. This study also serves as a demonstration of this subdivide-and-layer-wrap strategy, which would enable the migration of many truly spatial GIS algorithms to cloud computing infrastructures. However, this research also indicates that certain spatial GIS algorithms such as cost distance cannot be migrated by adopting this mechanism, which presents significant challenges for the development of cloud-based GIS systems. The third article is entitled "A Distributed Storage Schema for Cloud Computing based Raster GIS Systems". This paper proposes a NoSQL Database Management System (NDDBMS) based raster GIS data storage schema. NDDBMS has good scalability and is able to use distributed commodity computers, which makes it superior to Relational Database Management Systems (RDBMS) in a cloud computing environment. In order to provide optimized data service performance, the proposed storage schema analyzes the nature of commonly used raster GIS data sets.
It discriminates two categories of commonly used data sets and then designs corresponding data storage models for each. As a result, the proposed storage schema is capable of hosting and serving enormous volumes of raster GIS data speedily and efficiently on cloud computing infrastructures. In addition, the schema takes advantage of the data-compression characteristics of quadtrees, thus promoting efficient data storage. Through this assessment of cloud computing technology, the exploration of the challenges and solutions to the migration of GIS algorithms to cloud computing infrastructures, and the examination of strategies for serving large amounts of GIS data in a cloud computing infrastructure, this dissertation lends support to the feasibility of building a cloud-based GIS system. However, there are still challenges that need to be addressed before a full-scale functional cloud-based GIS system can be successfully implemented. (Abstract shortened by UMI.)
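The subdivide-and-layer-wrap mechanism lends itself to a compact illustration. The sketch below is a simplification rather than the dissertation's exact scheme: it pads each tile of a source raster with a halo of neighbouring pixels, computes a distance transform per tile (standing in for the per-node cloud task), and crops the results back together. The answer is exact only when the nearest source pixel lies within the tile-plus-halo window, which is precisely the gap the one-pixel layer of precomputed distance information is designed to close; the tile and halo sizes are illustrative.

```python
# Simplified tile-wise Euclidean distance sketch (not the exact
# layer-wrap algorithm): pad, transform per tile, crop, reassemble.
import numpy as np
from scipy.ndimage import distance_transform_edt

def tiled_edt(sources, tile=256, halo=32):
    """sources: boolean raster, True where distance == 0."""
    h, w = sources.shape
    out = np.empty(sources.shape, dtype=np.float64)
    for i0 in range(0, h, tile):
        for j0 in range(0, w, tile):
            i1, j1 = min(i0 + tile, h), min(j0 + tile, w)
            # Pad the tile with a halo so distances can "see" nearby sources;
            # exact only if the nearest source lies inside the padded window.
            pi0, pj0 = max(i0 - halo, 0), max(j0 - halo, 0)
            pi1, pj1 = min(i1 + halo, h), min(j1 + halo, w)
            window = sources[pi0:pi1, pj0:pj1]
            # distance_transform_edt measures distance to the nearest zero,
            # so invert the source mask.
            d = distance_transform_edt(~window)
            out[i0:i1, j0:j1] = d[i0 - pi0:i1 - pi0, j0 - pj0:j1 - pj0]
    return out

rng = np.random.default_rng(0)
raster = rng.random((1024, 1024)) < 0.001   # sparse source pixels
dist = tiled_edt(raster)
```

Because each padded tile is processed independently, the inner loop maps directly onto separate worker nodes, which is the property that makes the strategy cloud-friendly.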
Dynamics of Extreme Floods in Southeast and South Brazil
NASA Astrophysics Data System (ADS)
Ribeiro Lima, C. H.; Lall, U.
2015-12-01
Many extreme floods result from a causal chain, where exceptional rain and floods in water basins of different sizes are related to large-scale, anomalous and persistent patterns in atmospheric and oceanic circulation. Organized moisture plumes from oceanic sources are often implicated. One could use an Eulerian-Lagrangian climate model to test a causal chain hypothesis, but the parameterization and testing of such a model covering convection and transport continues to be a challenge. Consequently, empirical, data-based studies can be useful to establish the need to formally model such events using this approach. Here we consider two flood-prone regions in Southeast and South Brazil as case studies. A hypothesis of the causal chain of extreme floods in these regions is investigated by means of observed streamflow and reanalysis data and some machine learning tools. The signatures of the organization of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the integrated moisture flux and its divergence field and storm track data, so that a better understanding of the relations between flood magnitude and duration, the strength of moisture convergence, and the role of regional moisture recycling or teleconnected moisture is established. Persistent patterns and anomalies in the sea surface temperature (SST) field in the Pacific and Atlantic oceans that may be associated with disturbances in the atmospheric circulation and with the flood dynamics are investigated through composite analysis. Finally, machine learning algorithms for nonlinear dimension reduction are employed to visualize and understand some of the spatio-temporal patterns of the dominant climate variables in a reduced dimensional space. Prospects for prediction are discussed.
Li, Fawen; Wang, Liping; Zhao, Yong
2017-08-01
Soil organic carbon (SOC) plays an important role in the global carbon cycle. The aim of this study was to evaluate the response of SOC to land use change and its influence on land use planning in the Haihe basin, and to provide the planned land use pattern for basin flood risk assessment. Firstly, the areas of different land use types in 1980, 2008, and the planning year (2020) were counted using the area statistics function of ArcGIS. Then, the transfer matrixes of land use were produced by spatial overlay analysis. Lastly, based on the land use maps, soil type map and soil profile database, the SOC storage of different land use types in the three periods was calculated. The results showed that the patterns of land use changed considerably from 1980 to 2008; among the changes, 19,835 km2 of grassland was converted to forestland, the largest single landscape conversion. This land use conversion brought changes in SOC storage: the total carbon source was 88.83 Tg and the total carbon sink was 85.49 Tg, so the Haihe basin acted as a carbon source from 1980 to 2008. From 2008 to 2020, the changes in forestland and grassland are the largest in the Haihe basin, which causes the SOC pool to change from a carbon source to a carbon sink. SOC storage will increase from 2420.5 Tg in 2008 to 2495.5 Tg in 2020. This trend is conducive to reducing atmospheric carbon concentrations. Therefore, the land use planning in the Haihe basin is reasonable and can provide the underlying surface condition for flood risk assessment.
NASA Astrophysics Data System (ADS)
Nardi, F.; Grimaldi, S.; Petroselli, A.
2012-12-01
Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic model optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series, which are routed along the channel using a two-dimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during hydrologic extreme events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model obviates the need for synthetic design hyetograph and hydrograph estimation, which constitutes the main source of subjective analysis and uncertainty in standard flood-mapping methods. Selected case studies show the results and performance of the proposed procedure with respect to standard event-based approaches.
Remote collection and analysis of witness reports on flash floods
NASA Astrophysics Data System (ADS)
Gourley, J. J.; Erlingis, J. M.; Smith, T. M.; Ortega, K. L.; Hong, Y.
2010-11-01
Typically, flash floods are studied ex post facto in response to a major impact event. A complement to field investigations is developing a detailed database of flash flood events, including minor events and null reports (i.e., where heavy rain occurred but there was no flash flooding), based on public survey questions conducted in near-real time. The Severe Hazards Analysis and Verification Experiment (SHAVE) has been in operation at the National Severe Storms Laboratory (NSSL) in Norman, OK, USA during the summers since 2006. The experiment employs undergraduate students to analyse real-time products from weather radars, target specific regions within the conterminous US, and poll residences and businesses regarding the occurrence and severity of hail, wind, tornadoes, and now flash floods. In addition to providing a rich learning experience for students, SHAVE has also been successful in creating high-resolution datasets of severe hazards used for algorithm and model verification. This paper describes the criteria used to initiate the flash flood survey and the specific questions asked and information entered into the database, and then provides an analysis of flash flood data collected during the summer of 2008. It is envisioned that the specific details provided by the SHAVE flash flood observation database will complement databases collected by operational agencies (i.e., US National Weather Service Storm Data reports) and thus lead to better tools to predict the likelihood of flash floods and ultimately reduce their impacts on society.
NASA Astrophysics Data System (ADS)
Stark, J.; Smolders, S.; Meire, P.; Temmerman, S.
2017-11-01
Marsh restoration projects are nowadays being implemented as ecosystem-based strategies to reduce flood risks and to restore intertidal habitat along estuaries. Changes in estuarine tidal hydrodynamics are expected along with such intertidal area changes. A validated hydrodynamic model of the Scheldt Estuary is used to gain fundamental insights into the role of intertidal area characteristics in tidal hydrodynamics, and tidal asymmetry in particular, through several geomorphological scenarios in which intertidal area elevation and location along the estuary are varied. Model results indicate that the location of intertidal areas and their storage volume relative to the local tidal prism determine the intensity and the reach along the estuary over which tidal hydrodynamics are affected. Our model results also suggest that intertidal storage areas that are located within the main estuarine channel system, and hence are part of the flow-carrying portion of the estuary, may affect tidal hydrodynamics differently than intertidal areas that are side-basins of the main estuarine channel and hence contribute little to the flow-carrying cross-section of the estuary. If tidal flats contribute to the channel cross-section and exert frictional effects on tidal propagation, the elevation of the intertidal flats influences the magnitude and direction of tidal asymmetry along estuarine channels. Ebb-dominance is most strongly enhanced if tidal flats are around mean sea level or slightly above. Conversely, flood-dominance is enhanced if the tidal flats are situated low in the tidal frame. For intertidal storage areas at specific locations beside the main channel, flood-dominance in the estuary channel peaks in the vicinity of those areas and generally reduces upstream and downstream compared to a reference scenario. Finally, the model results indicate an along-estuary varying impact on the tidal prism as a result of adding intertidal storage at a specific location. In addition to the known effects of tidal prism decrease upstream and tidal prism increase downstream of additional storage areas, our model results indicate a reduction in tidal prism far downstream of intertidal storage areas as a result of a decreasing tidal range. This study may assist estuarine managers in assessing the impact of marsh restoration and managed shoreline realignment projects, as well as in the morphological management of estuaries through dredging and disposal of sediment on intertidal areas.
Real-time flood forecasts & risk assessment using a possibility-theory based fuzzy neural network
NASA Astrophysics Data System (ADS)
Khan, U. T.
2016-12-01
Globally, floods are one of the most devastating natural disasters, and improved flood forecasting methods are essential for better flood protection in urban areas. Given the availability of high-resolution real-time datasets for flood variables (e.g. streamflow and precipitation) in many urban areas, data-driven models have been used effectively to predict peak flow rates in rivers; however, the selection of input parameters for these types of models is often subjective. Additionally, the inherent uncertainty associated with data-driven models, along with errors in extreme event observations, means that uncertainty quantification is essential. Addressing these concerns will enable improved flood forecasting methods and provide more accurate flood risk assessments. In this research, a new type of data-driven model, a quasi-real-time updating fuzzy neural network, is developed to predict peak flow rates in urban riverine watersheds. A possibility-to-probability transformation is first used to convert observed data into fuzzy numbers. A possibility theory based training regime is then used to construct the fuzzy parameters and the outputs. A new entropy-based optimisation criterion is used to train the network. Two existing methods to select the optimum input parameters are modified to account for fuzzy number inputs and compared. These methods are the Entropy-Wavelet-based Artificial Neural Network (EWANN) and Combined Neural Pathway Strength Analysis (CNPSA). Finally, an automated algorithm designed to select the optimum structure of the neural network is implemented. The overall effect of these components is to replace the traditional ad hoc network configuration methods with ones based on objective criteria. Ten years of data from the Bow River in Calgary, Canada (including two major floods in 2005 and 2013) are used to calibrate and test the network. The EWANN method selected lagged peak flow as a candidate input, whereas the CNPSA method selected lagged precipitation and lagged mean daily flow as candidate inputs. Model performance metrics show that the CNPSA method had higher performance (with an efficiency of 0.76). Model output was used to assess the risk of extreme peak flows for a given day using an inverse possibility-to-probability transformation.
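As a concrete, hedged illustration of moving between probabilistic and possibilistic representations of observations, the sketch below implements a Dubois-Prade style probability-to-possibility mapping for a binned observation record. The paper's actual transformation (and the inverse used in the risk assessment) may differ in detail; the histogram is invented.

```python
# One plausible probability -> possibility mapping for binned data;
# a stand-in for the paper's transformation, not a reproduction of it.
import numpy as np

def prob_to_poss(p):
    """For p sorted descending, poss_i = sum_{j >= i} p_j, so the most
    probable bin always receives possibility 1."""
    p = np.asarray(p, dtype=float)
    order = np.argsort(p)[::-1]                    # most probable first
    poss = np.empty_like(p)
    poss[order] = np.cumsum(p[order][::-1])[::-1]  # tail sums
    return poss

# Example: histogram of observed peak flows binned into 5 classes.
p = np.array([0.10, 0.40, 0.30, 0.15, 0.05])
print(prob_to_poss(p))   # -> [0.15, 1.0, 0.6, 0.3, 0.05]
```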
Towards a global flood detection system using social media
NASA Astrophysics Data System (ADS)
de Bruijn, Jens; de Moel, Hans; Jongman, Brenden; Aerts, Jeroen
2017-04-01
It is widely recognized that an early warning is critical in improving international disaster response. Analysis of social media in real time can provide valuable information about an event or help to detect unexpected events. For successful and reliable detection systems that work globally, it is important that sufficient data are available and that the algorithm works both in data-rich and data-poor environments. In this study, both a new geotagging system and a multi-level event detection system for flood hazards were developed using Twitter data. Geotagging algorithms that regard one tweet as a single document are well studied. However, no algorithms exist that combine several sequential tweets mentioning keywords regarding a specific event type. Within the time frame of an event, multiple users use event-related keywords that refer to the same place name. This notion allows us to treat several sequential tweets posted in the last 24 hours as one document. For all these tweets, we collect a series of spatial indicators given in the tweet metadata and extract additional topological indicators from the text. Using these indicators, we can reduce ambiguity and thus better estimate which locations are tweeted about. Using these localized tweets, Bayesian change-point analysis is used to find significant increases in tweets mentioning countries, provinces or towns. In data-poor environments, detection of events at the country level is possible, while in other, data-rich, environments detection at the city level is achieved. Additionally, at the city level we analyse the spatial dependence of mentioned places: if multiple places within a limited spatial extent are mentioned, detection confidence increases. We run the algorithm on two years of Twitter data with flood-related keywords in 13 major languages and validate against a flood event database. We find that the geotagging algorithm yields significantly more data than previously developed algorithms and successfully deals with ambiguous place names. In addition, we show that our detection system can both quickly and reliably detect floods, even in countries where data are scarce, while achieving high detail in countries where more data are available.
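The change-point step can be illustrated with a simple stand-in. The sketch below scores a single change point in a daily tweet-count series under Poisson likelihoods and flags the split with the strongest evidence of an increase; the study itself uses Bayesian change-point analysis, so treat this maximum-likelihood variant as an assumption-laden analogue, with invented counts.

```python
# Single change-point detector for daily counts: compare a two-rate
# Poisson model against a constant-rate model and keep the split with
# the largest log-likelihood ratio, restricted to rate increases.
import numpy as np
from scipy.stats import poisson

def detect_increase(counts, min_seg=3):
    counts = np.asarray(counts)
    n = len(counts)
    ll0 = poisson.logpmf(counts, counts.mean()).sum()   # no-change model
    best_t, best_llr = None, -np.inf
    for t in range(min_seg, n - min_seg):
        lam1, lam2 = counts[:t].mean(), counts[t:].mean()
        if lam2 <= lam1:                                # only flag increases
            continue
        ll = (poisson.logpmf(counts[:t], lam1).sum()
              + poisson.logpmf(counts[t:], lam2).sum())
        if ll - ll0 > best_llr:
            best_t, best_llr = t, ll - ll0
    return best_t, best_llr        # change-point index and evidence

counts = [2, 1, 3, 2, 2, 1, 14, 18, 22, 17]   # tweets/day naming a town
print(detect_increase(counts))                # -> (6, large positive LLR)
```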
NASA Astrophysics Data System (ADS)
Sutanudjaja, Edwin; van Beek, Rens; Winsemius, Hessel; Ward, Philip; Bierkens, Marc
2017-04-01
The Aqueduct Global Flood Analyzer, launched in 2015, is an open-access and free-of-charge web-based interactive platform which assesses and visualises current and future projections of river flood impacts across the globe. One of the key components in the Analyzer is a set of river flood inundation hazard maps derived from global hydrological model simulations with PCR-GLOBWB. For the current version of the Analyzer, accessible at http://floods.wri.org/#/, the early-generation PCR-GLOBWB 1.0 was used, simulated at 30 arc-minute (about 50 km at the equator) resolution. In this presentation, we will show the new version of these hazard maps. This new version is based on the latest version, PCR-GLOBWB 2.0 (https://github.com/UU-Hydro/PCR-GLOBWB_model, Sutanudjaja et al., 2016, doi:10.5281/zenodo.60764), simulated at 5 arc-minute (about 10 km at the equator) resolution. The model simulates daily hydrological and water resource fluxes and storages, including the overbank volume that ends up on the floodplain (if flooding occurs). The simulation was performed for the present-day situation (from 1960) and future climate projections (until 2099) using the climate forcing created in the ISI-MIP project. From the simulated flood inundation volume time series, we then extract annual maxima for each cell and fit these maxima to a Gumbel extreme value distribution. This allows us to derive flood volume maps for any hazard magnitude (ranging from 2-year to 1000-year flood events) and for any time period (e.g. 1960-1999, 2010-2049, 2030-2069, and 2060-2099). The derived flood volumes (at 5 arc-minute resolution) are then spread over a high-resolution terrain model using an updated GLOFRIS downscaling module (Winsemius et al., 2013, doi:10.5194/hess-17-1871-2013). The updated version performs the volume spreading sequentially from more upstream basins to downstream basins, enabling a better inclusion of smaller streams, and takes into account the spreading of water over diverging deltaic regions. This results in a set of high-resolution hazard maps of flood inundation depth at 30 arc-second (about 1 km at the equator) resolution. Together with many other updates and new features, the resulting flood hazard maps will be used in the next generation of the Aqueduct Global Flood Analyzer.
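The per-cell extreme value step translates almost directly into code. A minimal sketch, with invented annual maxima standing in for one cell's simulated flood volume series:

```python
# Fit annual maxima to a Gumbel distribution and read off return
# levels; gumbel_r.isf(1/T) is the level exceeded on average once
# every T years. Values are illustrative, not model output.
import numpy as np
from scipy.stats import gumbel_r

annual_maxima = np.array([1.2, 0.8, 2.6, 1.9, 0.5, 3.1, 1.4,
                          2.2, 0.9, 1.7])          # one value per year
loc, scale = gumbel_r.fit(annual_maxima)

for T in (2, 10, 100, 1000):                        # return periods, years
    level = gumbel_r.isf(1.0 / T, loc=loc, scale=scale)
    print(f"{T:>4}-year flood volume: {level:.2f}")
```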
NASA Astrophysics Data System (ADS)
Perrou, Theodora; Papastergios, Asterios; Parcharidis, Issaak; Chini, Marco
2017-10-01
Flood disasters are among the most severe in the world, and it is necessary to monitor and evaluate them in order to mitigate their consequences. As floods do not recognize borders, transboundary flood risk management is imperative in shared river basins. Disaster management is highly dependent on early information and requires data from the whole river basin. Based on the hypothesis that flood events over the same area with the same magnitude have an almost identical evolution, it is crucial to develop a repository database of historical flood events. In the case of extended transboundary river basins, such a tool could constitute an operational warning system for the downstream area. The utility of SAR images for flood mapping was demonstrated by previous studies, but the SAR systems then in orbit were not characterized by high operational capacity. The Copernicus system fills this gap in operational service for risk management, especially during the emergency phase. Operational capabilities have been significantly improved by the newly available satellite constellations, such as the Sentinel-1A/B mission, which is able to provide systematic acquisitions with very high temporal resolution and wide swath coverage. The present study deals with the monitoring of a transboundary flood event in the Evros basin. The objective of the study is to create the "migration story" of the flooded areas on the basis of their evolution in time for the event that occurred from October 2014 until May 2015. Flood hazard maps will be created using SAR-based semi-automatic algorithms, and then, through the synthesis of the related maps in a GIS, a spatiotemporal thematic map of the event will be produced. The thematic map, combined with the TanDEM-X DEM at 12 m/pixel spatial resolution, will define the non-affected areas, which is very useful information for the emergency planning and emergency response phases. The Sentinels meet the main requirements of an effective and suitable operational tool for transboundary flood risk management.
Feedbacks between Reservoir Operation and Floodplain Development
NASA Astrophysics Data System (ADS)
Wallington, K.; Cai, X.
2017-12-01
The increased connectedness of socioeconomic and natural systems warrants studying them jointly as Coupled Natural-Human Systems (CNHS) (Liu et al., 2007). One such CNHS given significant attention in recent years has been the coupled sociological-hydrological system of floodplains. Di Baldassarre et al. (2015) developed a model coupling floodplain development and levee heightening, a flood control measure, which demonstrated the "levee effect" and "adaptation effect" seen in observations. Here, we adapt the concepts discussed by Di Baldassarre et al. (2015) and apply them to floodplains in which the primary flood control measure is reservoir storage, rather than levee construction, to study the role of feedbacks between reservoir operation and floodplain development. Specifically, we investigate the feedback between floodplain development and the optimal management of trade-offs between flood water conservation and flood control. By coupling a socio-economic model based on that of Di Baldassarre et al. (2015) with a reservoir optimization model based on that discussed in Ding et al. (2017), we show that reservoir operation rules can co-evolve with floodplain development. Furthermore, we intend to demonstrate that the model results are consistent with real-world data for reservoir operating curves and floodplain development. This model will help explain why some reservoirs are currently operated for purposes for which they were not originally intended, and thus inform reservoir design and construction.
SU-F-T-261: Reconstruction of Initial Photon Fluence Based On EPID Images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seliger, T; Engenhart-Cabillic, R; Czarnecki, D
2016-06-15
Purpose: To verify an algorithm for reconstructing relative initial photon fluence for clinical use. Clinical EPID and CT images were acquired to reconstruct an external photon radiation treatment field. The reconstructed initial photon fluence could be used to verify the treatment or to calculate the dose applied to the patient. Methods: The acquired EPID images were corrected for scatter caused by the patient and the EPID with an iterative reconstruction algorithm. The transmitted photon fluence behind the patient was calculated subsequently. Based on the transmitted fluence, the initial photon fluence was calculated using a back-projection algorithm which takes the patient geometry and its energy-dependent linear attenuation into account. This attenuation was obtained from the acquired cone-beam CT or the planning CT by calculating a water-equivalent radiological thickness for each irradiation direction. To verify the algorithm, an inhomogeneous phantom consisting of three inhomogeneities was irradiated by a static 6 MV photon field and compared to a reference flood-field image. Results: The mean deviation between the reconstructed relative photon fluence for the inhomogeneous phantom and the flood-field EPID image was 3%, rising to 7% for off-axis fluence. This was probably caused by the clinical EPID calibration used, which flattens the inhomogeneous fluence profile of the beam. Conclusion: In this clinical experiment the algorithm achieved good results in the center of the field, while it showed larger deviations in the lateral fluence. These could be reduced by optimizing the EPID calibration, considering the off-axis differential energy response. In further work, this and other aspects of the EPID, e.g. field-size dependency, CT and dose calibration, have to be studied to achieve a clinically acceptable accuracy of 2%.
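The core back-projection relation can be sketched in a few lines. Assuming a single effective attenuation coefficient in place of the algorithm's energy-dependent attenuation, the relative initial fluence follows from the scatter-corrected transmitted fluence and the water-equivalent radiological thickness along each ray; the coefficient below is a rough 6 MV ballpark, not a calibrated value.

```python
# Toy back-projection: phi_0 = phi_t * exp(mu_eff * t_wed), per pixel.
# MU_EFF_WATER is an assumed effective value for a 6 MV beam; the real
# algorithm uses energy-dependent linear attenuation from the CT.
import numpy as np

MU_EFF_WATER = 0.049   # 1/cm (assumption for illustration)

def initial_fluence(transmitted, radiological_thickness_cm):
    """Back-project transmitted fluence through the patient model."""
    return transmitted * np.exp(MU_EFF_WATER * radiological_thickness_cm)

phi_t = np.full((5, 5), 0.42)     # scatter-corrected EPID signal
t_wed = np.full((5, 5), 18.0)     # water-equivalent depth per ray, cm
phi_0 = initial_fluence(phi_t, t_wed)
print(phi_0[0, 0])
```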
Dynamics of coupled human-landscape systems
NASA Astrophysics Data System (ADS)
Werner, B. T.; McNamara, D. E.
2007-11-01
A preliminary dynamical analysis of landscapes and humans as hierarchical complex systems suggests that strong coupling between the two that spreads to become regionally or globally pervasive should be focused at multi-year to decadal time scales. At these scales, landscape dynamics is dominated by water, sediment and biological routing mediated by fluvial, oceanic, atmospheric processes and human dynamics is dominated by simplifying, profit-maximizing market forces and political action based on projection of economic effect. Also at these scales, landscapes impact humans through patterns of natural disasters and trends such as sea level rise; humans impact landscapes by the effect of economic activity and changes meant to mitigate natural disasters and longer term trends. Based on this analysis, human-landscape coupled systems can be modeled using heterogeneous agents employing prediction models to determine actions to represent the nonlinear behavior of economic and political systems and rule-based routing algorithms to represent landscape processes. A cellular model for the development of New Orleans illustrates this approach, with routing algorithms for river and hurricane-storm surge determining flood extent, five markets (home, labor, hotel, tourism and port services) connecting seven types of economic agents (home buyers/laborers, home developers, hotel owners/ employers, hotel developers, tourists, port services developer and port services owners/employers), building of levees or a river spillway by political agents and damage to homes, hotels or port services within cells determined by the passage or depth of flood waters. The model reproduces historical aspects of New Orleans economic development and levee construction and the filtering of frequent small-scale floods at the expense of large disasters.
Alternating flood and drought hazards in the Drava Plain, Hungary
NASA Astrophysics Data System (ADS)
Lóczy, Dénes; Dezsö, József; Gyenizse, Péter; Ortmann-Ajkai, Adrienne
2016-04-01
Our research project covers the assessment of archive data and the monitoring of present-day water availability in the floodplain of the Hungarian Drava River. Historically, flood hazard has been prevalent in the area. Recently, however, flood and drought hazards occur with equal frequency. Potential floodwater storage is defined from analyses of soil conditions (grain size, porosity, water conductivity, etc.) and GIS-based volumetric estimations of storage capacities in oxbows (including communication with groundwater). With the remarkable rate of river channel incision (2.4 m per century) and predictable climate change trends (increased annual mean temperature and decreased summer precipitation), growing frequency and intensification of drought hazard are expected. For the assessment of drought hazard, the impacts of hydrometeorological events, groundwater table dynamics and capillary rise are modelled, and the water demands of natural vegetation and agricultural crops are studied. The project is closely linked to the ongoing Old Drava Programme, a comprehensive government project, which envisions floodplain rehabilitation through major transformations in water governance and land use of the region, and has numerous implications for regional development. The authors are grateful for financial support from the Hungarian National Scientific Research Fund (OTKA, contract nos. K 104552 and K 108755) as well as from the Visegrad Fund (31210058). The contribution is dedicated to the 650th anniversary of the foundation of the University of Pécs, Hungary.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Sepehrnoori, K.
1994-08-01
This research consists of the parallel development of a new chemical flooding simulator and the application of the existing UTCHEM simulation code to model surfactant flooding. The new code is based upon a completely new numerical method that combines for the first time higher-order finite difference methods, flux limiters, and implicit algorithms. Early results indicate that this approach has significant advantages in some problems and will likely enable simulation of much larger and more realistic chemical floods once it is fully developed. Additional improvements have also been made to the UTCHEM code, and it has been applied for the first time to the study of stochastic reservoirs with and without horizontal wells to evaluate methods to reduce the cost and risk of surfactant flooding. During the first year of this contract, significant progress has been made on both of these tasks. The authors have found that there are indeed significant differences between the performance predictions based upon the traditional layered reservoir description and the more realistic and flexible descriptions using geostatistics. The preliminary studies of surfactant flooding using horizontal wells show that although they have significant potential to greatly reduce project life and thus improve the economics of the process, their use requires accurate reservoir descriptions and simulations to be effective. Much more needs to be done to fully understand and optimize their use and develop reliable design criteria.
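As a toy analogue of the numerical ingredients named above (higher-order differencing stabilised by a flux limiter), the following sketch advects a sharp concentration front in 1D with a minmod-limited upwind scheme. It is not the new simulator's method, just a minimal demonstration of how a limiter suppresses spurious oscillations at a flood front.

```python
# 1D linear advection, minmod flux-limited upwind scheme, u > 0,
# periodic boundaries. TVD: the limited correction sharpens the front
# without introducing over/undershoots.
import numpy as np

def advect_minmod(c, u, dx, dt, steps):
    nu = u * dt / dx                          # Courant number, keep <= 1
    for _ in range(steps):
        cm, cp = np.roll(c, 1), np.roll(c, -1)        # c[i-1], c[i+1]
        denom = cp - c
        safe = np.where(np.abs(denom) < 1e-12, 1e-12, denom)
        r = (c - cm) / safe                   # local smoothness ratio
        phi = np.maximum(0.0, np.minimum(1.0, r))     # minmod limiter
        # Flux at the right face of cell i: upwind + limited correction.
        f = u * (c + 0.5 * (1.0 - nu) * phi * denom)
        c = c - (dt / dx) * (f - np.roll(f, 1))
    return c

x = np.linspace(0.0, 1.0, 200, endpoint=False)
front = (x < 0.2).astype(float)               # injected-chemical front
dx = x[1] - x[0]
result = advect_minmod(front, u=1.0, dx=dx, dt=0.4 * dx, steps=250)
```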
Co-Optimization of CO 2-EOR and Storage Processes in Mature Oil Reservoirs
Ampomah, William; Balch, Robert S.; Grigg, Reid B.; ...
2016-08-02
This article presents an optimization methodology for CO 2 enhanced oil recovery in partially depleted reservoirs. A field-scale compositional reservoir flow model was developed for assessing the performance history of an active CO 2 flood and for optimizing both oil production and CO 2 storage in the Farnsworth Unit (FWU), Ochiltree County, Texas. A geological framework model constructed from geophysical, geological, and engineering data acquired from the FWU was the basis for all reservoir simulations and the optimization method. An equation of state was calibrated with laboratory fluid analyses and subsequently used to predict the thermodynamic minimum miscible pressure (MMP). Initial history calibrations of primary, secondary and tertiary recovery were conducted as the basis for the study. After a good match was achieved, an optimization approach consisting of a proxy or surrogate model was constructed with a polynomial response surface method (PRSM). The PRSM utilized an objective function that maximized both oil recovery and CO 2 storage. Experimental design was used to link uncertain parameters to the objective function. Control variables considered in this study included: water alternating gas cycle and ratio, production rates and bottom-hole pressure of injectors and producers. Other key parameters considered in the modeling process were CO 2 purchase, gas recycle and addition of infill wells and/or patterns. The PRSM proxy model was ‘trained’ or calibrated with a series of training simulations. This involved an iterative process until the surrogate model reached a specific validation criterion. A sensitivity analysis was first conducted to ascertain which of these control variables to retain in the surrogate model. A genetic algorithm with a mixed-integer capability optimization approach was employed to determine the optimum developmental strategy to maximize both oil recovery and CO 2 storage. The proxy model reduced the computational cost significantly. The validation criteria of the reduced order model ensured accuracy in the dynamic modeling results. The prediction outcome suggested robustness and reliability of the genetic algorithm for optimizing both oil recovery and CO 2 storage. The reservoir modeling approach used in this study illustrates an improved approach to optimizing oil production and CO 2 storage within partially depleted oil reservoirs such as FWU. Lastly, this study may serve as a benchmark for potential CO 2–EOR projects in the Anadarko basin and/or geologically similar basins throughout the world.
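The proxy-plus-GA workflow can be caricatured in a few dozen lines. In the sketch below, a quadratic response surface is fit to a handful of synthetic "training simulations", and a simple real-valued genetic algorithm searches the proxy for control settings maximizing a weighted oil-recovery/CO2-storage objective. Every function, weight, and bound is invented for illustration; the actual study uses a compositional simulator and a mixed-integer GA.

```python
# Response-surface proxy + genetic algorithm, toy version.
import numpy as np

rng = np.random.default_rng(1)

def features(X):
    """Quadratic design matrix in two scaled controls, e.g. WAG ratio
    (x1) and producer bottom-hole pressure (x2), both in [0, 1]."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

def simulator(X):
    """Stand-in for the flow model: a smooth weighted objective with an
    interior optimum (0.5 * oil recovery + 0.5 * CO2 storage)."""
    x1, x2 = X[:, 0], X[:, 1]
    oil = 1.0 - (x1 - 0.6) ** 2 - 0.5 * (x2 - 0.4) ** 2
    co2 = 1.0 - (x1 - 0.3) ** 2 - (x2 - 0.7) ** 2
    return 0.5 * oil + 0.5 * co2

# "Training simulations" -> least-squares fit of the proxy.
X_train = rng.uniform(0.0, 1.0, (30, 2))
coef, *_ = np.linalg.lstsq(features(X_train), simulator(X_train), rcond=None)
proxy = lambda X: features(X) @ coef

# Minimal GA on the cheap proxy: tournament selection, blend
# crossover, Gaussian mutation, variables clipped to bounds.
pop = rng.uniform(0.0, 1.0, (40, 2))
for _ in range(60):
    fit = proxy(pop)
    pairs = rng.integers(0, len(pop), (len(pop), 2))
    winners = np.where(fit[pairs[:, 0]] > fit[pairs[:, 1]],
                       pairs[:, 0], pairs[:, 1])
    parents = pop[winners]
    mates = parents[rng.permutation(len(parents))]
    alpha = rng.uniform(0.0, 1.0, (len(pop), 1))
    pop = np.clip(alpha * parents + (1 - alpha) * mates
                  + rng.normal(0.0, 0.05, pop.shape), 0.0, 1.0)

print("proxy-optimal controls:", pop[np.argmax(proxy(pop))])
```

The point of the proxy is visible in the loop: the GA evaluates thousands of candidates against a polynomial instead of a reservoir simulation, which is where the reported computational savings come from.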
Dynamic Flood Vulnerability Mapping with Google Earth Engine
NASA Astrophysics Data System (ADS)
Tellman, B.; Kuhn, C.; Max, S. A.; Sullivan, J.
2015-12-01
Satellites capture the rate and character of environmental change from local to global levels, yet integrating these changes into flood exposure models can be cost- or time-prohibitive. We explore an approach to global flood modeling by leveraging satellite data with computing power in Google Earth Engine to dynamically map flood hazards. Our research harnesses satellite imagery in two main ways: first to generate a globally consistent flood inundation layer, and second to dynamically model flood vulnerability. Accurate and relevant hazard maps rely on high-quality observation data. Advances in publicly available spatial, spectral, and radar data, together with cloud computing, allow us to improve existing efforts to develop a comprehensive flood extent database to support model training and calibration. This talk will demonstrate the classification results of algorithms developed in Earth Engine designed to detect flood events by combining observations from MODIS, Landsat 8, and Sentinel-1. Our method to derive flood footprints increases the number, resolution, and precision of spatial observations for flood events both in the US, as recorded in the NCDC (National Climatic Data Center) storm events database, and globally, as recorded in the Colorado Flood Observatory database. This improved dataset can then be used to train machine learning models that relate spatio-temporal flood observations to satellite-derived spatio-temporal predictor variables such as precipitation, antecedent soil moisture, and impervious surface. This modeling approach allows us to rapidly update models with each new flood observation, providing near-real-time vulnerability maps. We will share the water detection algorithms used with each satellite and discuss flood detection results with examples from Bihar, India and the state of New York. We will also demonstrate how these flood observations are used to train machine learning models and estimate flood exposure. The final stage of our comprehensive approach to flood vulnerability couples inundation extent with social data to determine which flood-exposed communities have the greatest propensity for loss. Specifically, we link model outputs to census-derived social vulnerability estimates (Indian and US, respectively) to predict how many people are at risk.
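For flavour, here is a hedged sketch of what a single-scene water-detection step looks like in the Earth Engine Python API: threshold a water index on a Landsat 8 scene and export the mask. The dataset ID, band names, region, and zero threshold are assumptions chosen to illustrate the pattern, not the talk's calibrated algorithms.

```python
# Minimal Earth Engine water-masking sketch (assumed IDs/thresholds).
import ee
ee.Initialize()

region = ee.Geometry.Rectangle([85.0, 25.0, 86.0, 26.0])   # roughly Bihar
scene = (ee.ImageCollection('LANDSAT/LC08/C02/T1_TOA')
         .filterBounds(region)
         .filterDate('2015-08-01', '2015-08-31')
         .sort('CLOUD_COVER')
         .first())

# MNDWI = (green - swir1) / (green + swir1); open water is strongly positive.
mndwi = scene.normalizedDifference(['B3', 'B6']).rename('mndwi')
water = mndwi.gt(0.0).selfMask()

task = ee.batch.Export.image.toDrive(image=water, description='flood_mask',
                                     region=region, scale=30)
task.start()
```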
Swift delineation of flood-prone areas over large European regions
NASA Astrophysics Data System (ADS)
Tavares da Costa, Ricardo; Castellarin, Attilio; Manfreda, Salvatore; Samela, Caterina; Domeneghetti, Alessio; Mazzoli, Paolo; Luzzi, Valerio; Bagli, Stefano
2017-04-01
According to the European Environment Agency (EEA Report No 1/2016), a significant share of the European population is estimated to be living on or near a floodplain, with Italy having the highest population density in flood-prone areas among the countries analysed. This tendency, tied to event frequency and magnitude (e.g. the 24/11/2016 floods in Italy) and to the fact that river floods may occur at large scales and at a transboundary level, where data are often sparse, presents a challenge in flood-risk management. The availability of consistent flood hazard and risk maps during the prevention, preparedness, response and recovery phases is a valuable and important step forward in improving the effectiveness, efficiency and robustness of evidence-based decision making. The present work aims at testing and discussing the usefulness of pattern recognition techniques based on geomorphologic indices (Manfreda et al., J. Hydrol. Eng., 2011; Degiorgis et al., J. Hydrol., 2012; Samela et al., J. Hydrol. Eng., 2015) for the simplified mapping of river flood-prone areas at large scales. The techniques are applied to 25 m Digital Elevation Models (DEMs) of the Danube, Po and Severn river watersheds, obtained from the Copernicus data and information funded by the European Union (EU-DEM layers). Results are compared to the pan-European flood hazard maps derived by Alfieri et al. (Hydrol. Proc., 2013) using a set of distributed hydrological models (LISFLOOD, van der Knijff et al., Int. J. Geogr. Inf. Sci., 2010, employed within the European Flood Awareness System, www.efas.eu) and hydraulic models (LISFLOOD-FP, Bates and De Roo, J. Hydrol., 2000). Our study presents different calibration and cross-validation exercises of the DEM-based mapping algorithms to assess to what extent, and with what accuracy, they can be reproduced over different regions of Europe. This work is being developed under the System-Risk project (www.system-risk.eu), which received funding from the European Union's Framework Programme for Research and Innovation Horizon 2020 under the Marie Skłodowska-Curie Grant Agreement No. 676027. Keywords: flood hazard, data-scarce regions, large-scale studies, pattern recognition, linear binary classifiers, basin geomorphology, DEM.
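The linear binary classifier idea reduces to choosing a threshold on a DEM-derived index. A minimal sketch with a synthetic index and reference map, scoring candidate thresholds by true-positive minus false-positive rate (one common calibration objective; the project's exact objective may differ):

```python
# Calibrate the threshold tau of a geomorphic-index classifier against
# a reference flood map. Index and truth are synthetic stand-ins for
# an EU-DEM-derived index and a LISFLOOD-based hazard map.
import numpy as np

rng = np.random.default_rng(2)
index = rng.normal(0.0, 1.0, 100_000)                    # index per cell
truth = index + rng.normal(0.0, 0.8, index.shape) > 0.5  # reference map

def calibrate(index, truth, taus):
    best_tau, best_score = None, -np.inf
    for tau in taus:
        pred = index > tau
        tpr = (pred & truth).sum() / truth.sum()
        fpr = (pred & ~truth).sum() / (~truth).sum()
        score = tpr - fpr                       # Youden-style objective
        if score > best_score:
            best_tau, best_score = tau, score
    return best_tau, best_score

tau, score = calibrate(index, truth, np.linspace(-2, 2, 81))
print(f"calibrated threshold: {tau:.2f} (TPR - FPR = {score:.2f})")
```

Cross-validation in the study then amounts to calibrating tau on one region and re-scoring it on another, which is what the transferability exercises measure.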
NASA Astrophysics Data System (ADS)
Li, J.
2017-12-01
Large-watershed flood simulation and forecasting is an important and challenging application of distributed hydrological models; the challenges include the effect of the model's spatial resolution and the model's performance and accuracy. To investigate the resolution effect, the distributed hydrological model Liuxihe was built at resolutions of 1000m*1000m, 600m*600m, 500m*500m, 400m*400m, and 200m*200m, with the aim of finding the best resolution for large-watershed flood simulation and forecasting. This study sets up a physically based distributed hydrological model for flood forecasting of the Liujiang River basin in south China. The terrain data, i.e., the digital elevation model (DEM), soil type and land use type, were downloaded freely from the web. The model parameters are optimized using an improved Particle Swarm Optimization (PSO) algorithm, and parameter optimization reduces the uncertainty that exists in physically derived model parameters. Model resolutions from 200m*200m to 1000m*1000m are evaluated for modeling Liujiang River basin floods with the Liuxihe model. The best spatial resolution for flood simulation and forecasting is 200m*200m, and as the resolution coarsens, model performance and accuracy worsen. At the 1000m*1000m resolution the flood simulation and forecasting results are the worst, and the river channel network derived at this resolution differs from the actual one. To keep the model at an acceptable performance, a minimum model spatial resolution is needed. The suggested threshold resolution for modeling Liujiang River basin floods is a 500m*500m grid cell, but a 200m*200m grid cell is recommended in this study to keep the model at its best performance.
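A compact sketch of PSO-style parameter calibration, with a toy two-parameter model standing in for the distributed Liuxihe model and the Nash-Sutcliffe efficiency as the objective (the paper uses an improved PSO variant; this is the plain algorithm):

```python
# Plain particle swarm optimization of model parameters against an
# observed hydrograph, scored by Nash-Sutcliffe efficiency (NSE).
import numpy as np

rng = np.random.default_rng(3)
obs = np.sin(np.linspace(0, 6, 120)) ** 2 * 80 + 20   # "observed" flows

def model(theta):
    """Toy 2-parameter stand-in for the distributed model."""
    a, b = theta
    t = np.linspace(0, 6, 120)
    return np.sin(t * a) ** 2 * b + 20

def nse(sim, obs):
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

n, dim = 20, 2
pos = rng.uniform([0.1, 10], [3, 150], (n, dim))      # particle positions
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([nse(model(p), obs) for p in pos])
for _ in range(100):
    g = pbest[np.argmax(pbest_f)]                     # global best
    r1, r2 = rng.random((2, n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = pos + vel
    f = np.array([nse(model(p), obs) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]

print("best parameters:", pbest[np.argmax(pbest_f)], "NSE:", pbest_f.max())
```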
Razavi Termeh, Seyed Vahid; Kornejady, Aiding; Pourghasemi, Hamid Reza; Keesstra, Saskia
2018-02-15
Floods are among the most destructive natural disasters, causing great financial and human losses every year. Producing susceptibility maps for flood management is therefore necessary in order to reduce these harmful effects. The aim of the present study is to map flood hazard over the Jahrom Township in Fars Province using combinations of adaptive neuro-fuzzy inference systems (ANFIS) with different metaheuristic algorithms, namely ant colony optimization (ACO), the genetic algorithm (GA), and particle swarm optimization (PSO), and to compare their accuracy. A total of 53 flood locations were identified, 35 of which were randomly selected to model flood susceptibility, with the remaining 16 used to validate the models. Learning vector quantization (LVQ), one of the supervised neural network methods, was employed to estimate factor importance. Nine flood conditioning factors, namely slope degree, plan curvature, altitude, topographic wetness index (TWI), stream power index (SPI), distance from river, land use/land cover, rainfall, and lithology, were selected and the corresponding maps were prepared in ArcGIS. The frequency ratio (FR) model was used to assign weights to each class within each controlling factor; the weights were then transferred into MATLAB for further analyses and for combination with the metaheuristic models. The ANFIS-PSO ensemble was found to be the most practical model in terms of producing a highly focused flood susceptibility map, with a smaller spatial extent of the highly susceptible classes. The chi-square results attest to the same: ANFIS-PSO had the highest spatial differentiation among flood susceptibility classes over the study area. The area under the curve (AUC) obtained from the ROC curve indicated accuracies of 91.4%, 91.8%, 92.6% and 94.5% for the FR, ANFIS-ACO, ANFIS-GA, and ANFIS-PSO models, respectively. The ANFIS-PSO ensemble was therefore identified as the premier model in the study area. Furthermore, the LVQ results revealed that slope degree, rainfall, and altitude were the most effective factors. According to the premier model, a total area of 44.74% was recognized as highly susceptible to flooding. The results of this study can be used as a platform for better land use planning in order to manage the zones highly susceptible to flooding and reduce the anticipated losses.
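The frequency ratio weighting step is simple enough to show directly: for each class of a conditioning factor, FR is the share of flood cells falling in that class divided by the share of all cells in that class, with values above 1 marking flood-prone classes. The data below are invented.

```python
# Frequency ratio per factor class:
#   FR = (% of flood cells in class) / (% of all cells in class)
import numpy as np
import pandas as pd

cells = pd.DataFrame({
    'slope_class': np.random.default_rng(4).integers(1, 5, 10_000),
})
# Synthetic truth: floods concentrate in the flattest slope class.
cells['flood'] = (cells['slope_class'] == 1) & \
                 (np.random.default_rng(5).random(len(cells)) < 0.4)

by_class = cells.groupby('slope_class')['flood'].agg(['sum', 'count'])
fr = (by_class['sum'] / by_class['sum'].sum()) / \
     (by_class['count'] / by_class['count'].sum())
print(fr)   # FR weight per slope class; class 1 comes out >> 1
```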
ERIC Educational Resources Information Center
Johnson, Betty D.
1996-01-01
Describes how the library staff at Stetson University in central Florida coped with flooding. Offers recommendations on storage; carpeting; fans and humidifiers; emergency phone contacts; labeled keys; protective coverings for books; safe storage for "disaster" materials and equipment; removal of less valuable materials; sandbags;…
On the Hydraulics of Stream Flow Routing with Bank Storage
Bank storage is a process in which volumes of water are temporarily retained by alluvial stream banks during flood events, and gradually released to partially sustain baseflow. This process has important hydrologic and ecological implications. In this paper, analytical solutions a...
Modeling of n-hexadecane and water sorption in wood
Ganna Baglayeva; Gautham Krishnamoorthy; Charles R. Frihart; Wayne S. Seamus; Jane O’Dell; Evguenii Kozliak
2016-01-01
Contamination of wooden framing structures with semivolatile organic chemicals is a common occurrence resulting from chemical spills, such as impregnation with fuel oil hydrocarbons during floods. Little information is available on the penetration of fuel oil hydrocarbons into wood under ambient conditions. To imitate flood and storage scenarios, the...
USDA-ARS?s Scientific Manuscript database
Forty-five flood control reservoirs, authorized in the United States Flood Control Act of 1936, were installed between 1969 and 1982 in the Little Washita River Experimental Watershed (LWREW), located in central Oklahoma. Over time, these reservoirs have lost water storage capacity due to sedimentat...
Tropical stormwater floods: a sustainable solution
NASA Astrophysics Data System (ADS)
Molinie, Jack; Bade, Francois; Nagau, Jimmy; Nuiro, Paul
2017-04-01
Stormwater management is one of the most difficult problems of urban and suburban areas. The urban runoff volume, related to rain intensity and surface properties, can lead to floods, and urban flooding creates considerable infrastructure, economic and human damage. In tropical countries, a burgeoning human population coupled with unplanned urbanization has altered the natural drainage. Consequently, typical intense rains of around 100 mm/h produce frequent street flooding. In our case, we study the management of intense tropical rain using a network of individual rain storage tanks. The study area is an economic and industrial zone on a coastal plain, with seventy per cent impermeable surfaces (roads, parking lots, building roofs) and thirty per cent wetland (mangrove). Our solution is to decouple road and parking-lot runoff from roof runoff. We propose sustainable individual water storage with real-time dynamic management, which permits control over when roof water arrives in the stormwater culvert. During the remaining time, the stored rainwater can be used for domestic activities in place of drinking water.
Regionalization by fuzzy expert system based approach optimized by genetic algorithm
NASA Astrophysics Data System (ADS)
Chavoshi, Sattar; Azmin Sulaiman, Wan Nor; Saghafian, Bahram; Bin Sulaiman, Md. Nasir; Manaf, Latifah Abd
2013-04-01
In recent years, soft computing methods have been increasingly used to model complex hydrologic processes. These methods can simulate real-life processes without prior knowledge of the exact relationships between their components. The principal aim of this paper is to perform hydrological regionalization based on soft computing concepts in the southern strip of the Caspian Sea basin, north of Iran. The basin, with an area of 42,400 sq. km, has been affected by severe floods in recent years that caused loss of life and damage to property. Although some 61 hydrometric stations and 31 weather stations with 44 years of observed data (1961-2005) operate in the study area, previous flood studies in this region have been hampered by insufficient and/or unreliable observed rainfall-runoff records. In order to investigate the homogeneity (H) of catchments and overcome incompatibilities that may occur on the boundaries of cluster groups, a fuzzy expert system (FES) approach is used which incorporates physical and climatic characteristics as well as flood seasonality and geographic location. A genetic algorithm (GA) was employed to adjust the parameters of the FES and optimize the system. To achieve this objective, a MATLAB code was developed which takes a heterogeneity criterion of less than 1 (H < 1) as the acceptance criterion. The adopted approach was found superior to conventional hydrologic regionalization methods in the region because it employs a greater number of homogeneity parameters and produces lower values of the heterogeneity criterion.
A new service offered by rural environment to the city: stormwater reception.
NASA Astrophysics Data System (ADS)
Chiaradia, Enrico Antonio; Weber, Enrico; Masseroni, Daniele; Battista Bischetti, Gian; Gandolfi, Claudio
2017-04-01
Stormwater is the main cause of urban floods in many urbanized areas. Historically, stormwater management practices have focused on building infrastructures that achieve runoff attenuation through the storage of water volumes in large detention basins. However, this approach has proven insufficient to resolve the problem and is difficult to implement in areas with a dense urban fabric. Nowadays, around the world, water managers are increasingly embracing "soft path" approaches that aim to manage the excess urban runoff through Green Infrastructures, where detention capacity is provided by the retention properties of soil and vegetation. Along the lines of these new sustainable stormwater management practices, the aim of this study is to promote a further paradigm shift with respect to traditional practices, i.e. to investigate the possibility of using the already existing green infrastructure of peri-urban rural areas as a reception element for the surplus of urban runoff. Many territories in Northern Italy, for example, are characterized by a high density of irrigation canals and agricultural fields that, in some cases, are isolated or pent-up inside urbanized areas. Both of these elements may provide storage volumes for accumulating stormwater from urban areas. In this work, we implemented a holistic framework, based on the Self-Organizing Map (SOM) technique, with the objective of producing a spatial map of the stormwater reception level that can be provided by the rural environment. We processed the physiographic characteristics of irrigation canals and agricultural fields through the SOM algorithm, obtaining as output a series of cluster groups with the same level of receptivity. This procedure was applied to an area of 1933 km2 around the city of Milan, and a map at 250x250m resolution was obtained with three different levels of stormwater reception capacity. About 50% of the rural environment has a good level of reception, and only 30% and 20% of rural areas have, respectively, a moderate and a scarce level of reception. From these results we can conclude that the rural environment could become a valuable structural alternative to traditional stormwater control methods, giving the rural environment a new role in urban flood protection.
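A hedged sketch of the SOM clustering step follows, using the third-party MiniSom package (an implementation choice of ours; the paper does not name its software) to map invented canal and field descriptors onto three nodes that play the role of the three receptivity levels.

```python
# Cluster per-cell physiographic descriptors with a 3-node SOM; each
# node then acts as one stormwater-reception class. Descriptors and
# sizes are invented for illustration.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(6)
# Descriptors per 250 m cell: canal density, canal capacity,
# field area, soil storage (all scaled to [0, 1] here).
X = rng.random((5000, 4))

som = MiniSom(3, 1, input_len=4, sigma=0.8, learning_rate=0.5,
              random_seed=6)          # 3 nodes -> 3 receptivity levels
som.train_random(X, 2000)

labels = np.array([som.winner(x)[0] for x in X])   # node index per cell
for k in range(3):
    print(f"cluster {k}: {np.mean(labels == k):.0%} of cells")
```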
NASA Astrophysics Data System (ADS)
Athaudage, Chandranath R. N.; Bradley, Alan B.; Lech, Margaret
2003-12-01
A dynamic programming-based optimization strategy for a temporal decomposition (TD) model of speech and its application to low-rate speech coding in storage and broadcasting is presented. In previous work with the spectral stability-based event localizing (SBEL) TD algorithm, the event localization was performed based on a spectral stability criterion. Although this approach gave reasonably good results, there was no assurance on the optimality of the event locations. In the present work, we have optimized the event localizing task using a dynamic programming-based optimization strategy. Simulation results show that an improved TD model accuracy can be achieved. A methodology of incorporating the optimized TD algorithm within the standard MELP speech coder for the efficient compression of speech spectral information is also presented. The performance evaluation results revealed that the proposed speech coding scheme achieves 50%-60% compression of speech spectral information with negligible degradation in the decoded speech quality.
Modeling multi-source flooding disaster and developing simulation framework in Delta
NASA Astrophysics Data System (ADS)
Liu, Y.; Cui, X.; Zhang, W.
2016-12-01
Most delta regions of the world are densely populated and economically advanced. However, owing to the impact of multi-source flooding (upstream floods, rainstorm waterlogging, and storm surge floods), delta regions are very vulnerable, and multi-source flooding disasters in these areas have received great attention from researchers. The Pearl River Delta urban agglomeration in south China is selected as the research area. Based on analysis of natural and environmental characteristics data of the delta urban agglomeration (remote sensing data, land use data, topographic maps, etc.) and hydrological monitoring data, and on research into the uneven distribution and temporal pattern of regional rainfall, the relationship between the underlying surface and runoff parameters, and the effect of flood storage patterns, we use an automatic or semi-automatic method for dividing spatial units that reflect the runoff characteristics of the urban agglomeration, and we develop a Multi-model Ensemble System for the changing environment, including an urban hydrologic model, a parallel 1D&2D hydrodynamic model, a storm surge forecast model and other professional models. The system will provide capabilities such as real-time setting of a variety of boundary conditions, fast real-time calculation, dynamic presentation of results, and powerful statistical analysis. The models can be optimized and improved through a variety of verification methods. This work was supported by the National Natural Science Foundation of China (41471427) and the Special Basic Research Key Fund for Central Public Scientific Research Institutes.
Genetic algorithm enhanced by machine learning in dynamic aperture optimization
NASA Astrophysics Data System (ADS)
Li, Yongjun; Cheng, Weixing; Yu, Li Hua; Rainer, Robert
2018-05-01
With the aid of machine learning techniques, the genetic algorithm has been enhanced and applied to the multi-objective optimization problem presented by the dynamic aperture of the National Synchrotron Light Source II (NSLS-II) Storage Ring. During the evolution processes employed by the genetic algorithm, the population is classified into different clusters in the search space. The clusters with top average fitness are given "elite" status. Intervention on the population is implemented by repopulating some potentially competitive candidates based on the experience learned from the accumulated data. These candidates replace randomly selected candidates among the original data pool. The average fitness of the population is therefore improved while diversity is not lost. Maintaining diversity ensures that the optimization is global rather than local. The quality of the population increases and produces more competitive descendants, accelerating the evolution process significantly. The optimal candidates, once their distribution is identified, appear to be located on isolated islands within the search space. Some of these optimal candidates have been experimentally confirmed at the NSLS-II storage ring. The machine learning techniques that exploit the genetic algorithm can also be used in other population-based optimization problems such as the particle swarm algorithm.
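A rough sketch of the cluster-based intervention described above: cluster the population in the search space, rank clusters by mean fitness, and overwrite a few randomly chosen candidates with jittered copies drawn from the elite clusters. The function name, the use of a naive k-means, and the jitter magnitude are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def repopulate_from_elite_clusters(population, fitness, n_clusters=5,
                                   elite_frac=0.2, rng=None):
    """Replace random candidates with perturbed copies from clusters
    whose average fitness is highest (higher fitness is better)."""
    rng = np.random.default_rng(rng)
    n, dim = population.shape

    # Naive k-means in the search space.
    centers = population[rng.choice(n, n_clusters, replace=False)]
    for _ in range(20):
        labels = np.argmin(
            ((population[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centers[k] = population[labels == k].mean(axis=0)

    # Rank clusters by mean fitness; mark the top fraction as "elite".
    mean_fit = np.array([fitness[labels == k].mean() if np.any(labels == k)
                         else -np.inf for k in range(n_clusters)])
    elite = np.argsort(mean_fit)[-max(1, int(elite_frac * n_clusters)):]

    # Repopulate: overwrite random candidates with jittered elite copies.
    donors = population[np.isin(labels, elite)]
    n_new = max(1, n // 10)
    victims = rng.choice(n, n_new, replace=False)
    picks = donors[rng.choice(len(donors), n_new)]
    population[victims] = picks + 0.01 * rng.standard_normal((n_new, dim))
    return population
```

The jitter keeps the injected candidates from being exact duplicates, which is one simple way to preserve the diversity the abstract emphasizes.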
Drexler, Judith Z.; Krauss, Ken W.; Sasser, M. Craig; Fuller, Christopher C.; Swarzenski, Christopher M.; Powell, Amber; Swanson, Kathleen M.; Orlando, James L.
2013-01-01
Carbon storage was compared between impounded and naturally tidal freshwater marshes along the Lower Waccamaw River in South Carolina, USA. Soil cores were collected in (1) naturally tidal, (2) moist soil (impounded, seasonally drained since ~1970), and (3) deeply flooded “treatments” (impounded, flooded to ~90 cm since ~2002). Cores were analyzed for % organic carbon, % total carbon, bulk density, and 210Pb and 137Cs for dating purposes. Carbon sequestration rates ranged from 25 to 200 g C m−2 yr−1 (moist soil), 80–435 g C m−2 yr−1 (naturally tidal), and 100–250 g C m−2 yr−1 (deeply flooded). The moist soil and naturally tidal treatments were compared over a period of 40 years. The naturally tidal treatment had significantly higher carbon storage (mean = 219 g C m−2 yr−1 vs. mean = 91 g C m−2 yr−1) and four times the vertical accretion rate (mean = 0.84 cm yr−1 vs. mean = 0.21 cm yr−1) of the moist soil treatment. The results strongly suggest that the long drainage period in moist soil management limits carbon storage over time. Managers across the National Wildlife Refuge system have an opportunity to increase carbon storage by minimizing drainage in impoundments as much as practicable.
Exploiting Concurrent Wake-Up Transmissions Using Beat Frequencies
2017-01-01
Wake-up receivers are the natural choice for wireless sensor networks because of their ultra-low power consumption and their ability to provide communications on demand. A downside of ultra-low power wake-up receivers is their low sensitivity caused by the passive demodulation of the carrier signal. In this article, we present a novel communication scheme by exploiting purposefully-interfering out-of-tune signals of two or more wireless sensor nodes, which produce the wake-up signal as the beat frequency of superposed carriers. Additionally, we introduce a communication algorithm and a flooding protocol based on this approach. Our experiments show that our approach increases the received signal strength up to 3 dB, improving communication robustness and reliability. Furthermore, we demonstrate the feasibility of our newly-developed protocols by means of an outdoor experiment and an indoor setup consisting of several nodes. The flooding algorithm achieves almost a 100% wake-up rate in less than 20 ms. PMID:28933749
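The underlying physics is easy to reproduce numerically: two slightly detuned carriers superpose into a signal whose envelope oscillates at the beat frequency |f1 - f2|, which is what a passive envelope detector in a wake-up receiver sees. The carrier frequencies and sample rate below are illustrative assumptions, not the paper's experimental values.

```python
import numpy as np

fs = 4_000_000                      # sample rate (Hz), above Nyquist for both carriers
t = np.arange(0, 0.02, 1 / fs)      # 20 ms of signal
f1, f2 = 868_000, 868_500           # two deliberately out-of-tune carriers (Hz)

superposed = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
envelope = np.abs(superposed)       # crude stand-in for a passive envelope detector

# The envelope oscillates at the beat frequency |f1 - f2| = 500 Hz,
# so 10 beat cycles fit in the 20 ms window above.
```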
Design of Restoration Method Based on Compressed Sensing and TwIST Algorithm
NASA Astrophysics Data System (ADS)
Zhang, Fei; Piao, Yan
2018-04-01
In order to effectively improve the subjective and objective quality of degraded images at low sampling rates, while saving storage space and reducing computational complexity, this paper proposes a joint restoration algorithm combining compressed sensing with two-step iterative shrinkage/thresholding (TwIST). The algorithm applies the TwIST algorithm, originally used in image restoration, to compressed sensing theory. A small amount of sparse high-frequency information is obtained in the frequency domain, and the TwIST algorithm based on compressed sensing theory is used to accurately reconstruct the high-frequency image. The experimental results show that the proposed algorithm achieves better subjective visual effects and objective quality while accurately restoring degraded images.
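For orientation, a minimal TwIST-style iteration is sketched below: each new iterate blends the previous two estimates with a soft-thresholded gradient step on the data-fidelity term. The sensing matrix A, the regularization weight lam, and the step parameters alpha and beta are illustrative assumptions; the published algorithm chooses alpha and beta from the spectrum of A, which this sketch does not do.

```python
import numpy as np

def soft(x, lam):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def twist(y, A, lam=0.05, alpha=1.8, beta=1.0, n_iter=200):
    """Two-step iterative shrinkage/thresholding for y = A x with x
    sparse: each iterate combines the previous two estimates with a
    thresholded gradient step, which converges faster than plain IST
    when A is ill-conditioned."""
    x_prev = A.T @ y                                   # rough initial estimate
    x = soft(x_prev + A.T @ (y - A @ x_prev), lam)
    for _ in range(n_iter):
        grad_step = soft(x + A.T @ (y - A @ x), lam)
        x, x_prev = (1 - alpha) * x_prev + (alpha - beta) * x + beta * grad_step, x
    return x
```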
A new concept to study the effect of climate change on different flood types
NASA Astrophysics Data System (ADS)
Nissen, Katrin; Nied, Manuela; Pardowitz, Tobias; Ulbrich, Uwe; Merz, Bruno
2014-05-01
Flooding is triggered by the interaction of various processes. Especially important are the hydrological conditions prior to the event (e.g. soil saturation, snow cover) and the meteorological conditions during flood development (e.g. rainfall, temperature). Depending on these (pre-)conditions, different flood types may develop, such as long-rain floods, short-rain floods, flash floods, snowmelt floods and rain-on-snow floods. A new concept taking these factors into account is introduced and applied to flooding in the Elbe River basin. During the period September 1957 to August 2002, 82 flood events are identified and classified according to their flood type. The hydrological and meteorological conditions on each day of the analysis period are determined. For the hydrological conditions, a soil moisture pattern classification is carried out. Soil moisture is simulated with a rainfall-runoff model driven by atmospheric observations. Days of similar soil moisture patterns are identified by a principal component analysis and a subsequent cluster analysis on the leading principal components. The meteorological conditions are identified by applying a cluster analysis to the geopotential height, temperature and humidity fields of the ERA40 reanalysis data set using the SANDRA cluster algorithm. We are able to identify specific combinations of hydrological pre-conditions and meteorological conditions which favour different flood types. Based on these results it is possible to analyse the effect of climate change on different flood types. As an example we show first results obtained using an ensemble of climate scenario simulations of the ECHAM5/MPIOM model, taking only the changes in the meteorological conditions into account. According to the simulations, the frequency of the meteorological patterns favouring long-rain, short-rain and flash floods will not change significantly under future climate conditions. A significant increase is, however, predicted for the amount of precipitation associated with many of the relevant meteorological patterns. The increase varies between 12 and 67% depending on the weather pattern.
Enhancement of Beaconless Location-Based Routing with Signal Strength Assistance for Ad-Hoc Networks
NASA Astrophysics Data System (ADS)
Chen, Guowei; Itoh, Kenichi; Sato, Takuro
Routing in ad-hoc networks is unreliable due to the mobility of the nodes. Location-based routing protocols, unlike other protocols which rely on flooding, excel in network scalability. Furthermore, newer location-based routing protocols such as BLR [1], IGF [2], and CBF [3] have been proposed that do not require beacons in the MAC layer, which further improves scalability. Such beaconless routing protocols can work efficiently in dense network areas. However, their algorithms cannot avoid routing into sparse areas. In this article, historical signal strength is added as a factor to the BLR algorithm, which avoids routing into sparse areas and consequently improves global routing efficiency.
Estimating flood hydrographs and volumes for Alabama streams
Olin, D.A.; Atkins, J.B.
1988-01-01
The hydraulic design of highway drainage structures involves an evaluation of the effect of the proposed highway structures on lives, property, and stream stability. Flood hydrographs and associated flood volumes are useful tools in evaluating these effects. For design purposes, the Alabama Highway Department needs information on flood hydrographs and volumes associated with flood peaks of specific recurrence intervals (design floods) at proposed or existing bridge crossings. This report will provide the engineer with a method to estimate flood hydrographs, volumes, and lagtimes for rural and urban streams in Alabama with drainage areas less than 500 sq mi. Existing computer programs and methods to estimate flood hydrographs and volumes for ungaged streams have been developed in Georgia. These computer programs and methods were applied to streams in Alabama. The report gives detailed instructions on how to estimate flood hydrographs for ungaged rural or urban streams in Alabama with drainage areas less than 500 sq mi, without significant in-channel storage or regulations. (USGS)
Long-lasting floods buffer the thermal regime of the Pampas
NASA Astrophysics Data System (ADS)
Houspanossian, Javier; Kuppel, Sylvain; Nosetto, Marcelo; Di Bella, Carlos; Oricchio, Patricio; Barrucand, Mariana; Rusticucci, Matilde; Jobbágy, Esteban
2018-01-01
The presence of large water masses influences the thermal regime of nearby land, shaping the local climate of areas bordering the ocean or large continental lakes. Large surface water bodies have an ephemeral nature in the vast sedimentary plains of the Pampas (Argentina), where non-flooded periods alternate with flooding cycles covering up to one third of the landscape for several months. Based on temperature records from 17 sites located 1 to 700 km away from the Atlantic coast and MODIS land surface temperature data, we explore the effects of floods on diurnal and seasonal thermal ranges as well as temperature extremes. In non-flooded periods, there is a linear increase of mean diurnal thermal range (DTR) from the coast towards the interior of the region (DTR increasing from 10 to 16 K, 0.79 K/100 km, r2 = 0.81). This relationship weakens during flood episodes, when the DTR of flood-prone inland locations shows a decline of 2 to 4 K, depending on surface water coverage in the surrounding area. DTR even approaches typical coastal values 500 km away from the ocean in the most flooded location that we studied during the three flooding cycles recorded in the study period. Frost-free periods, a key driver of the phenology of both natural and cultivated ecosystems, are extended by up to 55 days during floods, most likely as a result of enhanced ground heat storage across the landscape (a 2.7-fold change in day-night heat transfer) combined with other effects on the surface energy balance such as greater night evaporation rates. The reduced thermal range and longer frost-free periods affect plant growth and development and may offer an opportunity for longer crop growing periods, which may not only contribute to partially compensating for regional production losses caused by floods, but also open avenues for flood mitigation through higher plant evapotranspirative water losses.
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.
2016-12-01
Advances in computational resources and modeling techniques are opening the path to effectively integrating existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area, including the New Jersey Transit's main storage and maintenance facility. The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.
NASA Astrophysics Data System (ADS)
McGrath, H.; Stefanakis, E.; Nastev, M.
2016-06-01
Conventional knowledge of the flood hazard alone (extent and frequency) is not sufficient for informed decision-making. The public safety community needs tools and guidance to adequately undertake flood hazard risk assessment in order to estimate respective damages and social and economic losses. While many complex computer models have been developed for flood risk assessment, they require highly trained personnel to prepare the necessary input (hazard, inventory of the built environment, and vulnerabilities) and analyze model outputs. As such, tools which utilize open-source software or are built within popular desktop software programs are appealing alternatives. The recently developed Rapid Risk Evaluation (ER2) application runs scenario-based loss assessment analyses in a Microsoft Excel spreadsheet. User input is limited to a handful of intuitive drop-down menus used to describe the building type, age, occupancy and the expected water level. In anticipation of local depth-damage curves and other needed vulnerability parameters, those from the U.S. FEMA's Hazus-Flood software have been imported and are temporarily accessed in conjunction with user input to display exposure and estimated economic losses related to the structure and contents of the building. Building types and occupancies representative of those most exposed to flooding in Fredericton (New Brunswick) were introduced and test flood scenarios were run. The algorithm was successfully validated against results from the Hazus-Flood model for the same building types and flood depths.
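The core of such a spreadsheet loss model is a lookup on a depth-damage curve; a minimal sketch follows. The curve values, building value, and function name are hypothetical placeholders, not Hazus-Flood data.

```python
import numpy as np

# Hypothetical depth-damage curve for one building class: flood depth (m)
# versus fraction of structure value damaged. A real application would read
# these pairs from Hazus-Flood or locally derived vulnerability tables.
DEPTHS = np.array([0.0, 0.3, 0.6, 1.0, 1.5, 2.0, 3.0])
DAMAGE = np.array([0.0, 0.08, 0.18, 0.32, 0.45, 0.55, 0.70])

def estimated_loss(water_depth_m, structure_value):
    """Linear interpolation on the depth-damage curve, clamped at the ends."""
    frac = np.interp(water_depth_m, DEPTHS, DAMAGE)
    return frac * structure_value

print(estimated_loss(0.8, 250_000))   # ~$62,500 for a 0.8 m flood depth
```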
DNA-COMPACT: DNA COMpression Based on a Pattern-Aware Contextual Modeling Technique
Li, Pinghao; Wang, Shuang; Kim, Jihoon; Xiong, Hongkai; Ohno-Machado, Lucila; Jiang, Xiaoqian
2013-01-01
Genome data are becoming increasingly important for modern medicine. As the rate of increase in DNA sequencing outstrips the rate of increase in disk storage capacity, the storage and transfer of large genome data sets are becoming important concerns for biomedical researchers. We propose a two-pass lossless genome compression algorithm, which highlights the synthesis of complementary contextual models, to improve compression performance. The proposed framework can handle genome compression with and without reference sequences, and demonstrated performance advantages over the best existing algorithms. The method for reference-free compression led to bit rates of 1.720 and 1.838 bits per base for bacteria and yeast, which were approximately 3.7% and 2.6% better than the state-of-the-art algorithms. Regarding performance with reference, we tested on the first Korean personal genome sequence data set, and our proposed method demonstrated a 189-fold compression rate, reducing the raw file size from 2986.8 MB to 15.8 MB at a decompression cost comparable to existing algorithms. DNA-COMPACT is freely available at https://sourceforge.net/projects/dnacompact/ for research purposes. PMID:24282536
NASA Astrophysics Data System (ADS)
Blum, M.
2001-12-01
Mixed bedrock-alluvial valleys are the conveyor belts for sediment delivery to passive continental margins. Mapping, stratigraphic and sedimentologic investigations, and development of geochronological frameworks for large midlatitude rivers of this type, in Western Europe and the Texas Coastal Plain, provide for evaluation of fluvial responses to climate change over the last glacial-interglacial period, and the foundations for future quantitative evaluation of long profile evolution, changes through time in flood magnitude, and changes in storage and flux of sediments. This paper focuses on two issues. First, glacial vs. interglacial period fluvial systems are fundamentally different in terms of channel geometry, depositional style, and patterns of sediment storage. Glacial-period systems were dominated by coarse-grained channel belts (braided channels in Europe, large-wavelength meandering in Texas), and lacked fine-grained flood-plain deposits, whereas Holocene units, especially those of late Holocene age, contain appreciable thicknesses of flood-plain facies. Hence, extreme overbank flooding was not significant during the long glacial period, most flood events were contained within bankfull channel perimeters, and fine sediments were bypassed through the system to marine basins. By contrast, extreme overbank floods have been increasingly important during the relatively short Holocene, and a significant volume of fine sediment is sequestered in flood-plain settings. Second, glacial vs. interglacial systems exhibit different amplitudes and frequencies of fluvial adjustment to climate change. High-amplitude but low-frequency adjustments characterized the long glacial period, with 2-3 extended periods of lateral migration and sediment storage punctuated by episodes of valley incision. Low-amplitude but high-frequency adjustments have been more typical of the short Holocene, when there has been little net valley incision or net changes in sediment storage, but frequent changes in the magnitude and frequency of floods and periods of overbank flooding. This high-frequency signal is absent in landforms and deposits from the glacial period. Glacial vs. interglacial contrasts in process and stratigraphic results are the rule in most large unglaciated fluvial systems. 70-80 percent or more of any 100 kyr glacial-interglacial cycle is characterized by significant ice volume, cooler temperatures, mid-shelf or lower sea-level positions, and cooler-smaller ocean basins. A glacial-period process regime is therefore the norm, and an interglacial regime like that of the late Holocene is relatively unique and non-representative. Large unglaciated midlatitude fluvial systems may be in long-term equilibrium with a glacial-period environment, with long profiles graded to glacial-period sea-level positions, so fluvial systems respond to major changes in climate, discharge regimes, and sediment loads, but they appear to have been relatively insensitive to higher-frequency changes. Short interglacials like the Holocene are, by comparison, periods of abnormally high sea levels and relatively low-amplitude climate changes, but fluvial systems appear to exhibit a greatly increased sensitivity to subtle changes in discharge regimes that produce frequent periods of disequilibrium.
LFQC: a lossless compression algorithm for FASTQ files
Nicolae, Marius; Pathak, Sudipta; Rajasekaran, Sanguthevar
2015-01-01
Motivation: Next Generation Sequencing (NGS) technologies have revolutionized genomic research by reducing the cost of whole genome sequencing. One of the biggest challenges posed by modern sequencing technology is the economic storage of NGS data. Storing raw data is infeasible because of its enormous size and high redundancy. In this article, we address the problem of storage and transmission of large FASTQ files using innovative compression techniques. Results: We introduce a new lossless non-reference-based FASTQ compression algorithm named Lossless FASTQ Compressor (LFQC). We have compared our algorithm with other state-of-the-art big data compression algorithms, namely gzip, bzip2, fastqz (Bonfield and Mahoney, 2013), fqzcomp (Bonfield and Mahoney, 2013), Quip (Jones et al., 2012), and DSRC2 (Roguski and Deorowicz, 2014). This comparison reveals that our algorithm achieves better compression ratios on LS454 and SOLiD datasets. Availability and implementation: The implementations are freely available for non-commercial purposes. They can be downloaded from http://engr.uconn.edu/rajasek/lfqc-v1.1.zip. Contact: rajasek@engr.uconn.edu PMID:26093148
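The common structural idea behind specialised FASTQ compressors is to split the file into its identifier, base, and quality streams and compress each separately, since the streams have very different statistics. The sketch below illustrates this with standard-library codecs; the zlib/bz2 pairing is an illustrative stand-in, not LFQC's actual (stronger) compressors.

```python
import bz2
import zlib

def compress_fastq(path):
    """Split a FASTQ file into identifier, base, and quality streams
    and compress each stream separately."""
    ids, seqs, quals = [], [], []
    with open(path) as fh:
        for i, line in enumerate(fh):
            field = i % 4           # FASTQ records are 4 lines long
            if field == 0:
                ids.append(line)    # @identifier line
            elif field == 1:
                seqs.append(line)   # base calls
            elif field == 3:
                quals.append(line)  # quality scores ('+' separator dropped)
    return (zlib.compress("".join(ids).encode()),
            bz2.compress("".join(seqs).encode()),
            bz2.compress("".join(quals).encode()))
```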
Probabilistic flood extent estimates from social media flood observations
NASA Astrophysics Data System (ADS)
Brouwer, Tom; Eilander, Dirk; van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen
2017-05-01
The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from Twitter messages that mention locations of flooding. A deterministic flood map created for the December 2015 flood in the city of York (UK) showed good performance (F2 = 0.69; a statistic ranging from 0 to 1, with 1 expressing a perfect fit with validation data). The probabilistic flood maps we created showed that, in the York case study, the uncertainty in flood extent was mainly induced by errors in the precise locations of flood observations as derived from Twitter data. Errors in the terrain elevation data or in the parameters of the applied algorithm contributed less to flood extent uncertainty. Although these maps tended to overestimate the actual probability of flooding, they gave a reasonable representation of flood extent uncertainty in the area. This study illustrates that inherently uncertain data from social media can be used to derive information about flooding.
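One way to picture a Monte Carlo treatment of positional error is sketched below: each run jitters the observation locations, takes the DEM elevation at the jittered cell as a local water level, floods lower-lying cells, and the binary maps are averaged into a per-cell probability. This is a deliberately crude stand-in for the published method; every name and parameter is an assumption.

```python
import numpy as np

def probabilistic_flood_map(dem, obs_rc, loc_err_sd, n_mc=500, rng=None):
    """Average n_mc binary flood maps produced under randomly perturbed
    observation locations (row/col pairs in obs_rc, error sd in cells)."""
    rng = np.random.default_rng(rng)
    rows, cols = dem.shape
    counts = np.zeros_like(dem, dtype=float)
    for _ in range(n_mc):
        flooded = np.zeros_like(dem, dtype=bool)
        for r, c in obs_rc:
            rj = int(np.clip(round(r + rng.normal(0, loc_err_sd)), 0, rows - 1))
            cj = int(np.clip(round(c + rng.normal(0, loc_err_sd)), 0, cols - 1))
            flooded |= dem <= dem[rj, cj]    # simple level-set flooding
        counts += flooded
    return counts / n_mc
```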
NASA Astrophysics Data System (ADS)
Jackson, Bethanna; Trodahl, Martha; Maxwell, Deborah; Easton, Stuart
2016-04-01
This talk discusses recent progress in adapting the Land Utilisation and Capability Indicator (LUCI) framework to take account of the impact of detailed farm management on greenhouse gas emissions and on water, sediment and nutrient delivery to waterways. LUCI is a land management decision support framework which examines the impact of current and potential interventions on a variety of outcomes, including flood mitigation, water supply, greenhouse gas emissions, biodiversity, erosion, sediment and nutrient delivery to waterways, and agricultural production. The potential of the landscape to provide benefits is a function of both the biophysical properties of individual landscape elements and their configuration. Both are respected in LUCI where possible. For example, the hydrology, sediment and chemical routing algorithms are based on physical principles of hillslope flow, taking information on the storage and permeability capacity of elements within the landscape from soil and land use data and honoring physical thresholds, mass and energy balance constraints. LUCI discretizes hydrological response units within the landscape according to similarity of their hydraulic properties and preserves spatially explicit topographical routing. Implications of keeping the "status quo" or potential scenarios of land management change can then be evaluated under different meteorological or climatic events (e.g. flood return periods, rainfall events, droughts), cascading water through the hydrological response units using a "fill and spill" approach (a minimal sketch follows this abstract). These and other component algorithms are designed to be fast-running while maintaining physical consistency and fine spatial detail. This allows LUCI to operate from subfield scale to catchment, or even national, scale simultaneously. It analyses and communicates the spatial pattern of individual provision and tradeoffs/synergies between desired outcomes at detailed resolutions, and provides suggestions on where management change could be most efficiently targeted to meet water quality targets while maintaining production. Historically, LUCI has inferred land management from nationally available land cover categorisations, and so lacked the capacity to discriminate between differences in more detailed management (tillage information, type of irrigation system, stocking numbers and type, etc.). However, a collaboration with a farmer cooperative has recently commenced. LUCI is being further developed to take in a range of more detailed management information, which can be entered directly into LUCI or easily integrated via existing farm management files. Example output using a variety of management scenarios and ongoing "validation" of LUCI's performance at the farm scale will be presented using New Zealand crop, beef and dairy farms as case studies.
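To make the "fill and spill" idea concrete, here is a minimal sketch: response units ordered downslope each store water up to their capacity and spill the excess to the next. The function and the numbers are illustrative assumptions, not LUCI code.

```python
def fill_and_spill(inflow, capacities):
    """Cascade a water volume through a chain of hydrological response
    units ordered downslope: each unit stores water up to its capacity
    and spills the excess onward. Returns per-unit storage and the
    runoff leaving the chain."""
    stored = []
    water = inflow
    for cap in capacities:
        kept = min(water, cap)
        stored.append(kept)
        water -= kept
    return stored, water

# 40 mm of effective rainfall moving through three units.
print(fill_and_spill(40, [10, 25, 15]))   # -> ([10, 25, 5], 0)
```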
NASA Astrophysics Data System (ADS)
Li, Q.; Wang, Y. L.; Li, H. C.; Zhang, M.; Li, C. Z.; Chen, X.
2017-12-01
Rainfall thresholds play an important role in flash flood warning. A simple method, using the Rational Equation to calculate the rainfall threshold, is proposed in this study. The critical rainfall equation was deduced from the Rational Equation. On the basis of the Manning equation and the results of the Chinese Flash Flood Survey and Evaluation (CFFSE) Project, the critical flow was obtained and the net rainfall was calculated. Three components of rainfall loss, i.e. depression storage, vegetation interception, and soil infiltration, were considered. The critical rainfall is the sum of the net rainfall and the rainfall losses. The rainfall threshold was then estimated from the critical rainfall after accounting for watershed soil moisture. To demonstrate the method, the Zuojiao watershed in Yunnan Province was chosen as the study area. The results showed that the rainfall thresholds calculated by the Rational Equation method approximated the rainfall thresholds obtained from CFFSE and were in accordance with the observed rainfall during flash flood events. The calculated results are therefore reasonable and the method is effective. This study provides a quick and convenient way to calculate rainfall thresholds for flash flood warning for grass-roots staff and offers technical support for estimating rainfall thresholds.
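A hedged sketch of the inversion step: the Rational Equation Q = C*i*A/3.6 (Q in m3/s, i in mm/h, A in km2) is solved for the net rainfall intensity that just produces the critical flow, and the losses are added back to give the threshold. The function name, the numbers, and the single lumped loss term are illustrative assumptions; the paper treats depression storage, interception, and infiltration separately.

```python
def rainfall_threshold(q_critical, runoff_coeff, area_km2,
                       duration_h, losses_mm):
    """Invert the Rational Equation to get the net rainfall that just
    produces the critical flow, then add back the estimated losses
    (depression storage, interception, infiltration)."""
    i_net = 3.6 * q_critical / (runoff_coeff * area_km2)   # mm/h
    net_rainfall = i_net * duration_h                      # mm
    return net_rainfall + losses_mm

# Illustrative numbers only: a 12 m3/s critical flow on a 25 km2 basin.
print(rainfall_threshold(q_critical=12, runoff_coeff=0.5, area_km2=25,
                         duration_h=1, losses_mm=8))       # -> ~11.5 mm
```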
Analysis of flood hazard under consideration of dike breaches
NASA Astrophysics Data System (ADS)
Vorogushyn, S.; Apel, H.; Lindenschmidt, K.-E.; Merz, B.
2009-04-01
The study focuses on the development and application of a new modelling system which allows a comprehensive flood hazard assessment along diked river reaches under consideration of dike failures. The proposed Inundation Hazard Assessment Model (IHAM) represents a hybrid probabilistic-deterministic model. It comprises three models interactively coupled at runtime. These are: (1) 1D unsteady hydrodynamic model of river channel and floodplain flow between dikes, (2) probabilistic dike breach model which determines possible dike breach locations, breach widths and breach outflow discharges, and (3) 2D raster-based diffusion wave storage cell model of the hinterland areas behind the dikes. Due to the unsteady nature of the 1D and 2D coupled models, the dependence between hydraulic load at various locations along the reach is explicitly considered. The probabilistic dike breach model describes dike failures due to three failure mechanisms: overtopping, piping and slope instability caused by the seepage flow through the dike core (micro-instability). Dike failures for each mechanism are simulated based on fragility functions. The probability of breach is conditioned by the uncertainty in geometrical and geotechnical dike parameters. The 2D storage cell model driven by the breach outflow boundary conditions computes an extended spectrum of flood intensity indicators such as water depth, flow velocity, impulse, inundation duration and rate of water rise. IHAM is embedded in a Monte Carlo simulation in order to account for the natural variability of the flood generation processes reflected in the form of input hydrographs and for the randomness of dike failures given by breach locations, times and widths. The scenario calculations for the developed synthetic input hydrographs for the main river and tributary were carried out for floods with return periods of T = 100; 200; 500; 1000 a. Based on the modelling results, probabilistic dike hazard maps could be generated that indicate the failure probability of each discretised dike section for every scenario magnitude. Besides the binary inundation patterns that indicate the probability of raster cells being inundated, IHAM generates probabilistic flood hazard maps. These maps display spatial patterns of the considered flood intensity indicators and their associated return periods. The probabilistic nature of IHAM allows for the generation of percentile flood hazard maps that indicate the median and uncertainty bounds of the flood intensity indicators. The uncertainty results from the natural variability of the flow hydrographs and randomness of dike breach processes. The same uncertainty sources determine the uncertainty in the flow hydrographs along the study reach. The simulations showed that the dike breach stochasticity has an increasing impact on hydrograph uncertainty in downstream direction. Whereas in the upstream part of the reach the hydrograph uncertainty is mainly determined by the variability of the flood wave form, the dike failures strongly shape the uncertainty boundaries in the downstream part of the reach. Finally, scenarios of polder deployment for the extreme floods with T = 200; 500; 1000 a were simulated with IHAM. The results indicate a rather weak reduction of the mean and median flow hydrographs in the river channel.
However, the capping of the flow peaks resulted in a considerable reduction of overtopping failures downstream of the polder, with a simultaneous slight increase in piping and slope micro-instability frequencies, explained by a more prolonged average impoundment. The developed IHAM simulation system represents a new scientific tool for studying fluvial inundation dynamics under extreme conditions, incorporating the effects of technical flood protection measures. With its major outputs in the form of novel probabilistic inundation and dike hazard maps, the IHAM system has high practical value for decision support in flood management.
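As a toy illustration of the fragility-function idea used in the breach model, the sketch below samples uncertain crest heights and averages a breach indicator over Monte Carlo draws. The logistic fragility shape, its parameters, and all numbers are assumptions for illustration, not IHAM's calibrated functions.

```python
import numpy as np

rng = np.random.default_rng(42)

def overtopping_fragility(water_level, crest_height, spread=0.15):
    """Illustrative fragility curve: breach probability rises smoothly
    as the water level approaches and exceeds the dike crest."""
    return 1.0 / (1.0 + np.exp(-(water_level - crest_height) / spread))

def breach_probability(crest_mean, crest_sd, water_level, n_mc=10_000):
    """Monte Carlo over uncertain dike geometry: sample crest heights,
    evaluate the fragility curve, and average the breach indicator."""
    crests = rng.normal(crest_mean, crest_sd, n_mc)
    p_breach = overtopping_fragility(water_level, crests)
    breached = rng.random(n_mc) < p_breach
    return breached.mean()

print(breach_probability(crest_mean=5.0, crest_sd=0.2, water_level=4.9))
```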
DNABIT Compress - Genome compression algorithm.
Rajarajeswari, Pothuraju; Apparao, Allam
2011-01-22
Data compression is concerned with how information is organized in data. Efficient storage means removal of redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences based on a novel scheme that assigns binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences for larger genomes. Significantly better compression results show that "DNABIT Compress" is the best among the existing compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new DNABIT Compress algorithm significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (unique bit codes) to fragments of DNA sequence (exact repeats, reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, whereas the existing best methods could not achieve a ratio below 1.72 bits/base.
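The fixed-width baseline behind such bit-assignment schemes is easy to sketch: four bases fit in two bits each, so four bases pack into one byte (2 bits/base). The snippet below shows only this baseline; DNABIT Compress itself goes further, assigning variable-length codes to repeated fragments, which this sketch does not attempt.

```python
# 2-bit packing of DNA bases: the naive floor that repeat-aware
# schemes such as DNABIT Compress improve upon.
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASE = {v: k for k, v in CODE.items()}

def pack(seq: str) -> bytes:
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        chunk = seq[i:i + 4]
        for b in chunk:
            byte = (byte << 2) | CODE[b]
        out.append(byte << 2 * (4 - len(chunk)))   # left-align a short last byte
    return bytes(out)

def unpack(data: bytes, length: int) -> str:
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(BASE[(byte >> shift) & 0b11])
    return "".join(bases[:length])

assert unpack(pack("GATTACA"), 7) == "GATTACA"
```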
Cooperative Management of a Lithium-Ion Battery Energy Storage Network: A Distributed MPC Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Huazhen; Wu, Di; Yang, Tao
2016-12-12
This paper presents a study of cooperative power supply and storage for a network of Lithium-ion energy storage systems (LiBESSs). We propose to develop a distributed model predictive control (MPC) approach for two reasons. First, able to account for the practical constraints of a LiBESS, the MPC can enable a constraint-aware operation. Second, a distributed management can cope with a complex network that integrates a large number of LiBESSs over a complex communication topology. With this motivation, we then build a fully distributed MPC algorithm from an optimization perspective, which is based on an extension of the alternating direction method of multipliers (ADMM). A simulation example is provided to demonstrate the effectiveness of the proposed algorithm.
A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags
NASA Astrophysics Data System (ADS)
Meng, S.; Xie, X.
2015-12-01
In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties in input data, model parameters, model structures and output observations. Data assimilation is a useful methodology to reduce uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimate of the initial soil moisture condition improves forecasting performance, and accounting for the time delay of runoff routing is another important factor. Moreover, observations of hydrological variables (including ground observations and satellite observations) are becoming easily available, so the reliability of short-term flood forecasting can be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step assimilates up-layer soil moisture observations to update the model state and generated runoff based on the ensemble Kalman filter (EnKF) method, and the second step assimilates discharge observations to update the model state and runoff within a fixed time window based on the ensemble Kalman smoother (EnKS) method. The smoothing technique is adopted to account for the runoff routing lag. Assimilating the soil moisture and discharge observations in this way is expected to improve flood forecasting. To assess the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. Thus, this new data assimilation framework holds great potential for operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
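For reference, a minimal stochastic EnKF analysis step is sketched below: it perturbs the observation for each ensemble member and nudges every state vector using the ensemble-estimated Kalman gain. The linear observation operator H and Gaussian observation error are simplifying assumptions; the paper's coupled EnKF/EnKS setup with a smoothing window is more involved.

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_err_sd, rng=None):
    """Stochastic EnKF analysis step.

    ensemble : (n_members, n_state) array of model states
    obs      : (n_obs,) observation vector (e.g. surface soil moisture)
    H        : (n_obs, n_state) linear observation operator
    """
    rng = np.random.default_rng(rng)
    n, _ = ensemble.shape
    X = ensemble - ensemble.mean(axis=0)            # ensemble anomalies
    P = X.T @ X / (n - 1)                           # sample covariance
    R = np.eye(len(obs)) * obs_err_sd**2            # observation error cov.
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    perturbed = obs + rng.normal(0, obs_err_sd, (n, len(obs)))
    return ensemble + (perturbed - ensemble @ H.T) @ K.T
```

An EnKS extends this by letting the innovation at the current time also correct states earlier in the window, which is what accounts for the routing lag.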
Wetland storage to reduce flood damages in the Red River
Steven Shultz
2000-01-01
The restoration of previously drained wetlands to store water was not found to be an economically feasible strategy to reduce flood related damages in two sub-watersheds of the Red River Valley (the Maple River Watershed in North Dakota, and the Wild Rice Watershed of Minnesota). Restoring wetlands, while providing full ecological services, was less feasible, even...
NASA Astrophysics Data System (ADS)
Tien Bui, Dieu; Pradhan, Biswajeet; Nampak, Haleh; Bui, Quang-Thanh; Tran, Quynh-An; Nguyen, Quoc-Phi
2016-09-01
This paper proposes a new artificial intelligence approach based on a neural fuzzy inference system and metaheuristic optimization for flood susceptibility modeling, namely MONF. In the new approach, the neural fuzzy inference system was used to create an initial flood susceptibility model, and the model was then optimized using two metaheuristic algorithms, Evolutionary Genetic and Particle Swarm Optimization. A high-frequency tropical cyclone area of the Tuong Duong district in Central Vietnam was used as a case study. First, a GIS database for the study area was constructed. The database, which includes 76 historical flood inundated areas and ten flood influencing factors, was used to develop and validate the proposed model. Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the Receiver Operating Characteristic (ROC) curve, and the area under the ROC curve (AUC) were used to assess the model performance and its prediction capability. Experimental results showed that the proposed model performs well on both the training (RMSE = 0.306, MAE = 0.094, AUC = 0.962) and validation datasets (RMSE = 0.362, MAE = 0.130, AUC = 0.911). The usability of the proposed model was evaluated by comparison with state-of-the-art benchmark soft computing techniques such as J48 Decision Tree, Random Forest, Multi-layer Perceptron Neural Network, Support Vector Machine, and Adaptive Neuro Fuzzy Inference System. The results show that the proposed MONF model outperforms the above benchmark models; we conclude that the MONF model is a new alternative tool that should be used in flood susceptibility mapping. The results of this study are useful for planners and decision makers for the sustainable management of flood-prone areas.
XIAO, Xiangming; DONG, Jinwei; QIN, Yuanwei; WANG, Zongming
2016-01-01
Information on paddy rice distribution is essential for food production and methane emission calculation. Phenology-based algorithms have been utilized in the mapping of paddy rice fields by identifying the unique flooding and seedling transplanting phases using multi-temporal moderate resolution (500 m to 1 km) images. In this study, we developed simple algorithms to identify paddy rice at a fine resolution at the regional scale using multi-temporal Landsat imagery. Sixteen Landsat images from 2010–2012 were used to generate the 30 m paddy rice map in the Sanjiang Plain, northeast China, one of the major paddy rice cultivation regions in China. Three vegetation indices, Normalized Difference Vegetation Index (NDVI), Enhanced Vegetation Index (EVI), and Land Surface Water Index (LSWI), were used to identify rice fields during the flooding/transplanting and ripening phases. The user and producer accuracies of paddy rice on the resultant Landsat-based paddy rice map were 90% and 94%, respectively. The Landsat-based paddy rice map was an improvement over the paddy rice layer in the National Land Cover Dataset, which was generated through visual interpretation and digitization of fine-resolution images. The agricultural census data substantially underreported paddy rice area, raising serious concern about its use for studies on food security. PMID:27695637
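A sketch of the flooding/transplanting test used by this family of phenology-based algorithms is given below: a pixel is flagged when its water index approaches or exceeds its greenness indices. The threshold value, the exact comparison, and the function name are illustrative assumptions that would need tuning per sensor and region.

```python
import numpy as np

def flooding_signal(ndvi, evi, lswi, t=0.05):
    """Flag flooding/transplanting candidates: pixels where LSWI,
    relaxed by a small threshold t, reaches the greenness indices."""
    return (lswi + t >= evi) | (lswi + t >= ndvi)

# Illustrative pixel values during transplanting: low greenness, high wetness.
print(flooding_signal(ndvi=np.array([0.30]), evi=np.array([0.18]),
                      lswi=np.array([0.20])))   # -> [ True]
```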
An innovative thinking-based intelligent information fusion algorithm.
Lu, Huimin; Hu, Liang; Liu, Gang; Zhou, Jin
2013-01-01
This study proposes an intelligent algorithm that can realize information fusion, with reference to research achievements in brain cognitive theory and innovative computation. The algorithm treats knowledge as its core and information fusion as a knowledge-based innovative thinking process. The five key parts of the algorithm, namely information sense and perception, memory storage, divergent thinking, convergent thinking, and an evaluation system, are simulated and modeled. The algorithm fully develops the innovative-thinking capabilities of knowledge in information fusion and is an attempt to convert the abstract concepts of brain cognitive science into specific, operable research routes and strategies. Furthermore, the influence of each parameter of the algorithm on its performance is analyzed and compared with that of classical intelligent algorithms through tests. Test results suggest that the proposed algorithm can obtain the optimum problem solution with fewer objective evaluations, improve optimization effectiveness, and achieve the effective fusion of information.
DOT National Transportation Integrated Search
0000-01-01
In the Access Restoration Project Task 1.2 Report 1, the algorithms for detecting roadway debris piles and flooded areas were described in detail. Those algorithms take CRS data as input and automatically detect the roadway obstructions. Although the ...
Fang, Fang; Ni, Bing-Jie; Yu, Han-Qing
2009-06-01
In this study, weighted non-linear least-squares analysis and accelerating genetic algorithm are integrated to estimate the kinetic parameters of substrate consumption and storage product formation of activated sludge. A storage product formation equation is developed and used to construct the objective function for the determination of its production kinetics. The weighted least-squares analysis is employed to calculate the differences in the storage product concentration between the model predictions and the experimental data as the sum of squared weighted errors. The kinetic parameters for the substrate consumption and the storage product formation are estimated to be the maximum heterotrophic growth rate of 0.121/h, the yield coefficient of 0.44 mg CODX/mg CODS (COD, chemical oxygen demand) and the substrate half saturation constant of 16.9 mg/L, respectively, by minimizing the objective function using a real-coding-based accelerating genetic algorithm. Also, the fraction of substrate electrons diverted to the storage product formation is estimated to be 0.43 mg CODSTO/mg CODS. The validity of our approach is confirmed by the results of independent tests and the kinetic parameter values reported in literature, suggesting that this approach could be useful to evaluate the product formation kinetics of mixed cultures like activated sludge. More importantly, as this integrated approach could estimate the kinetic parameters rapidly and accurately, it could be applied to other biological processes.
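The estimation recipe above, a weighted sum-of-squares objective minimized by an evolutionary search, can be sketched compactly. Below, scipy's differential evolution stands in for the paper's real-coding-based accelerating genetic algorithm, and the first-order storage-formation model, data, and weights are synthetic illustrations rather than the published kinetics.

```python
import numpy as np
from scipy.optimize import differential_evolution

t_obs = np.linspace(0, 8, 20)                 # h, sampling times
x_obs = 30 * (1 - np.exp(-0.3 * t_obs))       # mg COD/L, synthetic "data"
w = 1.0 / np.maximum(x_obs, 1.0)              # weights: de-emphasize large values

def sse(params):
    """Sum of squared weighted errors between a simple first-order
    storage-formation model and the observations."""
    x_max, k = params
    pred = x_max * (1 - np.exp(-k * t_obs))
    return np.sum((w * (pred - x_obs)) ** 2)

result = differential_evolution(sse, bounds=[(1, 100), (0.01, 2)], seed=1)
print(result.x)   # recovers ~[30, 0.3] for this synthetic data set
```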
NASA Astrophysics Data System (ADS)
Araujo, L.; Silva, F. P. D.; Moreira, D. M.; Vásquez P, I. L.; Justi da Silva, M. G. A.; Fernandes, N.; Rotunno Filho, O. C.
2017-12-01
Flash floods are characterized by a rapid rise in water levels, high flow rates and large amounts of debris. Several factors are relevant to the occurrence of these phenomena, including high precipitation rates, terrain slope, soil saturation degree, vegetation cover, and soil type, among others. In general, the greater the precipitation intensity, the more likely a significant increase in flow rate becomes. Particularly on steep and rocky terrain or in heavily urbanized areas, relatively small rain rates can trigger a flash flood event. In addition, high rain rates over short time intervals can temporarily saturate the surface soil layer, effectively sealing it and favoring greater runoff because rainwater no longer infiltrates the soil. Thus, although precipitation is considered the most important factor for flooding, the interaction between rainfall and the soil can sometimes be of greater importance. In this context, this work investigates the dynamic storage of water associated with flash flood events that occurred between 2013 and 2014 in the Quitandinha river watershed, a tributary of the Piabanha river, by means of water balance analyses applied to three watersheds of varying size (9.25 km², 260 km² and 429 km²) over the rainy season at different time steps (hourly and daily), using remotely sensed and observational precipitation data. The work is driven by the hypothesis of a hydrologically active bedrock layer, as the watershed is located in a humid region with a weathered (fractured) rock layer just below a shallow soil layer in the higher part of the basin, where steep slopes prevail. The results showed a delay in the variation of the dynamic storage relative to rainfall peaks and water levels. This behavior indicates that the surface soil layer, which is not very thick in the region, becomes rapidly saturated during rainfall events. Subsequently, the water infiltrates into the rocky layer, and the water storage in the fractured bedrock assumes a significant role due to its corresponding release to streams as storm flow.
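The underlying bookkeeping is the simple water balance dS/dt = P - Q - ET; a sketch of the cumulative dynamic-storage computation is below. The variable names and numbers are illustrative, not the study's data.

```python
import numpy as np

def dynamic_storage(precip_mm, discharge_mm, et_mm, s0=0.0):
    """Cumulative water balance dS/dt = P - Q - ET over the record,
    all terms expressed as depths (mm per time step) over the watershed
    area. A lag of dS behind rainfall peaks is the signature discussed
    in the abstract."""
    ds = np.asarray(precip_mm) - np.asarray(discharge_mm) - np.asarray(et_mm)
    return s0 + np.cumsum(ds)

# Hourly example: a storm followed by a slow release from storage.
p = np.array([0, 12, 25, 6, 0, 0, 0, 0], float)
q = np.array([0.2, 0.5, 3, 6, 5, 3, 2, 1], float)
et = np.full(8, 0.1)
print(dynamic_storage(p, q, et))
```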
49 CFR 1220.2 - Protection and storage of records.
Code of Federal Regulations, 2010 CFR
2010-10-01
... § 1220.2 Protection and storage of records. (a) The company shall protect records subject to this part from fires, floods, and other hazards, and safeguard the records from unnecessary exposure to deterioration from excessive humidity, dryness, or lack of ventilation. (b) The company shall notify the Board...
NASA Astrophysics Data System (ADS)
Vourlitis, G. L.; Dalmagro, H. J.; Arruda, P. H. Z. D.; Lathuilliere, M. J.; Borges Pinto, O.; Couto, E. G.; Nogueira, J. D. S.; Johnson, M. S.
2016-12-01
Wetlands have a great potential for carbon (C) storage because frequent waterlogging can inhibit microbial respiration. However, waterlogging can also promote methane (CH4) production, which reduces ecosystem C sequestration. Unfortunately, the C storage dynamics of seasonally flooded (hyperseasonal) tropical forests are poorly understood even though the large C stocks, warm temperature, and prolonged flooding have the potential to cause high rates of CO2 storage and CH4 emission. Thus, the aim of this study was to provide a continuous ecosystem-level quantification of CO2 and CH4 fluxes and carbon balance for a hyperseasonal forest in the Brazilian Pantanal using eddy covariance. Trace gas fluxes were measured using an eddy covariance system installed on a 28 m tall tower. The study area was chosen because it represents approximately 12% of the total area of the Pantanal, which consists of seasonal floodplains with an annual flood pulse that results from an intense rainy season (October to April) followed by an intense dry season (May to September). The measurements were performed over two flood cycles and an intervening drought period between the years 2014 and 2015. In 2015 the study area was flooded for 190 days, which was 22 days longer than in 2014. Mean (± SD) rates of CH4 flux during the 2014 and 2015 flooded periods were 0.091 ± 0.04 µmol m-2 s-1 and 0.118 ± 0.04 µmol m-2 s-1, respectively, and almost zero (0.001 ± 0.0001 µmol m-2 s-1) during the 2015 dry season. In contrast, mean CO2 flux rates during the flooded period were -1.58 and -1.50 µmol m-2 s-1 for 2014 and 2015, respectively, indicating net ecosystem CO2 uptake, while during the dry season the forest was a net source of CO2 to the atmosphere, averaging 0.73 µmol m-2 s-1. The total wet season carbon balance (CO2 + CH4) was virtually identical in 2014 and 2015 (ca. -255 gC m-2) even though the 2015 flood period was longer; however, the ecosystem lost 139 gC m-2 during the dry period of 2015. These data indicate that hyperseasonal forests of the Pantanal, and presumably other seasonally flooded tropical forests, are potentially large sources of CH4 but, overall, large C sinks.
Ishii, Audrey L.; Soong, David T.; Sharpe, Jennifer B.
2010-01-01
Illinois StreamStats (ILSS) is a Web-based application for computing selected basin characteristics and flood-peak quantiles based on the most recently (2010) published (Soong and others, 2004) regional flood-frequency equations at any rural stream location in Illinois. Limited streamflow statistics including general statistics, flow durations, and base flows also are available for U.S. Geological Survey (USGS) streamflow-gaging stations. ILSS can be accessed on the Web at http://streamstats.usgs.gov/ by selecting the State Applications hyperlink and choosing Illinois from the pull-down menu. ILSS was implemented for Illinois by obtaining and projecting ancillary geographic information system (GIS) coverages; populating the StreamStats database with streamflow-gaging station data; hydroprocessing the 30-meter digital elevation model (DEM) for Illinois to conform to streams represented in the National Hydrographic Dataset 1:100,000 stream coverage; and customizing the Web-based Extensible Markup Language (XML) programs for computing basin characteristics for Illinois. The basin characteristics computed by ILSS then were compared to the basin characteristics used in the published study, and adjustments were applied to the XML algorithms for slope and basin length. Testing of ILSS was accomplished by comparing flood quantiles computed by ILSS at an approximately random sample of 170 streamflow-gaging stations with the published flood-quantile estimates. Differences between the log-transformed flood quantiles were not statistically significant at the 95-percent confidence level for the State as a whole, nor by the regions determined by each equation, except for region 1, in the northwest corner of the State. In region 1, the average difference in flood-quantile estimates ranged from 3.76 percent for the 2-year flood quantile to 4.27 percent for the 500-year flood quantile. The total number of stations in region 1 was small (21) and the mean difference is not large (less than one-tenth of the average prediction error for the regression-equation estimates). The sensitivity of the flood-quantile estimates to differences in the computed basin characteristics is determined and presented in tables. A test of usage consistency was conducted by having at least 7 new users compute flood-quantile estimates at 27 locations. The average maximum deviation of the estimate from the mode value at each site was 1.31 percent after four mislocated sites were removed. A comparison of manual 100-year flood-quantile computations with ILSS at 34 sites indicated no statistically significant difference. ILSS appears to be an accurate, reliable, and effective tool for flood-quantile estimates.
The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce
NASA Astrophysics Data System (ADS)
Chen, Xi; Zhou, Liqing
2015-12-01
With the development of satellite remote sensing technology and the growth of remote sensing image data, traditional remote sensing image segmentation techniques cannot meet the processing and storage requirements of massive imagery. This article brings cloud computing and parallel computing technology into the remote sensing image segmentation process, building a cheap and efficient computer cluster that implements a parallel MeanShift segmentation algorithm based on the MapReduce model. This not only ensures the quality of remote sensing image segmentation but also improves segmentation speed, better meeting real-time requirements. The parallel MeanShift segmentation algorithm based on MapReduce thus has both practical significance and realizable value.
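To indicate what each parallel task computes, a single-point mean-shift iteration is sketched below; in a MapReduce layout one would expect map tasks to run this loop over their share of pixels and the reduce stage to merge points converging to the same mode, though the paper's exact job decomposition is not given here. The Gaussian kernel, bandwidth, and data are illustrative assumptions.

```python
import numpy as np

def mean_shift_point(x, data, bandwidth=1.0, n_iter=30, tol=1e-5):
    """Shift one feature vector to the local density maximum using a
    Gaussian kernel over the data set."""
    for _ in range(n_iter):
        w = np.exp(-np.sum((data - x) ** 2, axis=1) / (2 * bandwidth ** 2))
        x_new = (w[:, None] * data).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x

# Toy data: two blobs; a point started near the second blob climbs to its mode.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(4, 0.3, (50, 2))])
print(mean_shift_point(np.array([3.0, 3.0]), data))   # -> near [4, 4]
```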
Popescu, Dan; Ichim, Loretta; Stoican, Florin
2017-02-23
Floods are the natural disasters that cause the most economic damage at the global level. Therefore, flood monitoring and damage estimation are very important for the population, authorities and insurance companies. The paper proposes an original solution to this problem, based on a hybrid network and complex image processing. As a first novelty, a multilevel system with two components, terrestrial and aerial, was proposed and designed by the authors as support for image acquisition over a delimited region. The terrestrial component contains a Ground Control Station, acting as a remote coordinator, which communicates via the internet with several Ground Data Terminals, a fixed-node network for data acquisition and communication. The aerial component contains mobile nodes: fixed-wing UAVs. In order to evaluate flood damage, two tasks must be accomplished by the network: area coverage and image processing. The second novelty of the paper consists of texture analysis in a deep neural network, taking into account new criteria for feature selection and patch classification. Color and spatial information extracted from the chromatic co-occurrence matrix and the mass fractal dimension were used as well. Finally, the experimental results from a real mission demonstrate the validity of the proposed methodologies and the performance of the algorithms.
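For context, co-occurrence texture descriptors of the kind used for patch classification can be sketched as below. This computes the standard gray-level statistics rather than the paper's chromatic variant, and it assumes a recent scikit-image (where the functions are named graycomatrix/graycoprops); the distance and angle choices are illustrative.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(patch, levels=16):
    """Gray-level co-occurrence statistics for one image patch,
    usable as inputs to a flooded/non-flooded patch classifier."""
    q = (patch.astype(float) / patch.max() * (levels - 1)).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=levels, symmetric=True, normed=True)
    return {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}

patch = np.random.default_rng(0).integers(0, 255, (64, 64)).astype(np.uint8)
print(texture_features(patch))
```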
Global Assessment of Exploitable Surface Reservoir Storage under Climate Change
NASA Astrophysics Data System (ADS)
Liu, L.; Parkinson, S.; Gidden, M.; Byers, E.; Satoh, Y.; Riahi, K.
2016-12-01
Surface water reservoirs provide us with reliable water supply, hydropower generation, flood control, and recreation services. Reliable reservoirs can be robust measures for water security and can help smooth out challenging seasonal variability of river flows. Yet, reservoirs also cause flow fragmentation in rivers and can lead to flooding of upstream areas, thereby displacing existing land-uses and ecosystems. The anticipated population growth, land use and climate change in many regions globally suggest a critical need to assess the potential for appropriate reservoir capacity that can balance rising demands with long-term water security. In this research, we assessed exploitable reservoir potential under climate change and human development constraints by deriving storage-yield relationships for 235 river basins globally. The storage-yield relationships map the amount of storage capacity required to meet a given water demand based on a 30-year inflow sequence. Runoff data are simulated with an ensemble of Global Hydrological Models (GHMs) for each of five bias-corrected general circulation models (GCMs) under four climate change pathways. These data are used to define future 30-year inflows in each river basin for time periods between 2010 and 2080. The calculated capacity is then combined with geographical information on environmental and human development exclusion zones to further limit the storage capacity expansion potential in each basin. We investigated the reliability of reservoir potentials across different climate change scenarios and Shared Socioeconomic Pathways (SSPs) to identify river basins where reservoir expansion will be particularly challenging. Preliminary results suggest large disparities in reservoir potential across basins: some basins have already approached their exploitable reserves, while others display abundant potential. Exclusion zones have a significant impact on the amount of actually exploitable storage and firm yields worldwide: 30% of reservoir potential would be unavailable because of land occupied by environmental and human development exclusions. Results from this study will help decision makers understand the reliability of infrastructure systems particularly sensitive to future water availability.
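Storage-yield relationships of this kind are classically derived with the sequent-peak algorithm. The following is a minimal sketch under stated assumptions (constant demand, synthetic monthly inflows), not the study's GHM-driven workflow.

```python
import numpy as np

def required_storage(inflows, yield_demand):
    """Sequent-peak estimate of the reservoir capacity needed to
    supply a constant `yield_demand` from the `inflows` sequence:
    track the running supply deficit and return its worst value."""
    deficit, worst = 0.0, 0.0
    for q in inflows:
        deficit = max(0.0, deficit + yield_demand - q)
        worst = max(worst, deficit)
    return worst

# Storage-yield curve: capacity required at increasing demand levels.
rng = np.random.default_rng(1)
monthly_inflow = rng.gamma(2.0, 50.0, size=360)   # 30 years of monthly flows
for frac in (0.5, 0.7, 0.9):
    demand = frac * monthly_inflow.mean()
    print(frac, required_storage(monthly_inflow, demand))
```

Sweeping the demand fraction traces out the storage-yield curve for a basin; repeating this across GCM/GHM inflow ensembles is what lets the study compare capacity requirements across climate scenarios.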
A global flash flood forecasting system
NASA Astrophysics Data System (ADS)
Baugh, Calum; Pappenberger, Florian; Wetterhall, Fredrik; Hewson, Tim; Zsoter, Ervin
2016-04-01
The sudden and devastating nature of flash flood events means it is imperative to provide early warnings such as those derived from Numerical Weather Prediction (NWP) forecasts. Currently such systems exist on basin, national and continental scales in Europe, North America and Australia but rely on high resolution NWP forecasts or rainfall-radar nowcasting, neither of which has global coverage. To produce global flash flood forecasts this work investigates the possibility of using forecasts from a global NWP system. In particular we: (i) discuss how global NWP can be used for flash flood forecasting and discuss strengths and weaknesses; (ii) demonstrate how a robust evaluation can be performed given the rarity of the event; (iii) highlight the challenges and opportunities in communicating flash flood uncertainty to decision makers; and (iv) explore future developments which would significantly improve global flash flood forecasting. The proposed forecast system uses ensemble surface runoff forecasts from the ECMWF H-TESSEL land surface scheme. A flash flood index is generated using the ERIC (Enhanced Runoff Index based on Climatology) methodology [Raynaud et al., 2014]. This global methodology is applied to a series of flash floods across southern Europe. Results from the system are compared against warnings produced using the higher resolution COSMO-LEPS limited area model. The global system is evaluated by comparing forecasted warning locations against a flash flood database of media reports created in partnership with floodlist.com. To deal with the lack of objectivity in media reports we carefully assess the suitability of different skill scores and apply spatial uncertainty thresholds to the observations. To communicate the uncertainties of the flash flood system output we experiment with a dynamic region-growing algorithm. This automatically clusters regions of similar return period exceedance probabilities, thus presenting the at-risk areas at a spatial resolution appropriate to the NWP system. We then demonstrate how these warning areas could eventually complement existing global systems such as the Global Flood Awareness System (GloFAS), to give warnings of flash floods. This work demonstrates the possibility of creating a global flash flood forecasting system based on forecasts from existing global NWP systems. Future developments, in post-processing for example, will need to address an under-prediction bias, for extreme point rainfall, that is innate to current-generation global models.
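The region-growing step can be illustrated with a minimal sketch that clusters 4-connected grid cells whose exceedance probability passes a warning threshold; the connectivity and threshold choices here are assumptions for illustration, not the operational implementation.

```python
import numpy as np
from collections import deque

def grow_regions(prob, threshold):
    """Cluster 4-connected grid cells with exceedance probability
    >= threshold into numbered warning regions (0 = no warning)."""
    labels = np.zeros(prob.shape, dtype=int)
    current = 0
    for i in range(prob.shape[0]):
        for j in range(prob.shape[1]):
            if prob[i, j] >= threshold and labels[i, j] == 0:
                current += 1                      # seed a new region
                queue = deque([(i, j)])
                labels[i, j] = current
                while queue:                      # breadth-first growth
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        v, u = y + dy, x + dx
                        if (0 <= v < prob.shape[0] and 0 <= u < prob.shape[1]
                                and prob[v, u] >= threshold
                                and labels[v, u] == 0):
                            labels[v, u] = current
                            queue.append((v, u))
    return labels

prob = np.random.default_rng(2).random((50, 50))
print(np.unique(grow_regions(prob, 0.9)))   # region ids found
```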
Simulation of 1998-Big Flood in Changjiang River Catchment, China
NASA Astrophysics Data System (ADS)
Nakayama, T.; Watanabe, M.
2006-05-01
Almost every year, China is affected by severe flooding, which causes considerable economic loss and serious damage to towns and farms. Big floods are mainly concentrated in the middle and lower reaches of the "seven big rivers", which include the Changjiang (Yangtze) River, the Yellow (Huanghe) River, and the Huaihe River. The Changjiang River is the fourth largest contributor of water resources to the oceans after the Amazon, Zaire, and Orinoco Rivers. In addition to abnormal weather, previous research has identified artificial effects as main causes of big flood disasters in the Changjiang River catchment: (i) extreme deforestation and soil erosion in the upper reaches, (ii) shrinking of lake water volumes and their reduced connection with the Changjiang River due to reclamation of the lakes that retard water in the middle reaches, and (iii) restriction of channel capacity following levee construction. Because there is an urgent need to quantify these relations at the spatial scale of the whole catchment in order to keep flood damage as small as possible, it is very important to evaluate the complicated water/heat dynamics of the Changjiang River catchment by using process-based models. The present research focuses on simulating the water/heat dynamics of the 1998 big flood, which had a 60-year recurrence interval, in the Changjiang River catchment. We compared the flood period of 1998 with the normal period of 1987-1988. We expanded the NIES Integrated Catchment-based Eco-hydrology (NICE) model (Nakayama and Watanabe, 2004; Nakayama et al., 2006) for application to broader catchments in order to evaluate large-scale flooding in the Changjiang River (NICE-FLD). We simulated the water/heat dynamics in the entire catchment (3,000 km wide by 1,000 km long) at a resolution of 10 km by using NICE-FLD. The model reproduced the river discharge, soil moisture, evapotranspiration, and groundwater level very well. Furthermore, we evaluated the role of flood storage capacity in the lakes and farms in relation to the water/heat budgets, and simulated the changes in water/heat dynamics caused by human activity in order to support decision-making on sustainable development in the catchment.
Shi, Ce; Qian, Jianping; Han, Shuai; Fan, Beilei; Yang, Xinting; Wu, Xiaoming
2018-03-15
The study assessed the feasibility of developing a machine vision system based on pupil and gill color changes in tilapia for simultaneous prediction of total volatile basic nitrogen (TVB-N), thiobarbituric acid (TBA) and total viable counts (TVC) during storage at 4°C. The pupils and gills were chosen, and color space conversion among the RGB, HSI and L*a*b* color spaces was performed automatically by an image processing algorithm. Multiple regression models were established by correlating pupil and gill color parameters with TVB-N, TVC and TBA (R² = 0.989-0.999). However, assessment of freshness based on gill color is destructive and time-consuming because the gill cover must be removed before images are captured. Finally, visualization maps of spoilage based on pupil color were achieved using image algorithms. The results show that assessment of tilapia pupil color parameters using machine vision can be used as a low-cost, on-line method for predicting freshness during storage at 4°C.
Modeling Lake Storage Dynamics to support Arctic Boreal Vulnerability Experiment (ABoVE)
NASA Astrophysics Data System (ADS)
Vimal, S.; Lettenmaier, D. P.; Smith, L. C.; Smith, S.; Bowling, L. C.; Pavelsky, T.
2017-12-01
The Arctic and Boreal Zone (ABZ) of Canada and Alaska includes vast areas of permafrost, lakes, and wetlands. Permafrost thawing in this area is expected to increase due to the projected rise of temperature caused by climate change. Over the long term, this may reduce overall surface water area, but in the near term, the opposite is being observed, with rising paludification (lake/wetland expansion). One element of NASA's ABoVE field experiment is observations of lake and wetland extent and water surface elevation (WSE) using NASA's AirSWOT airborne interferometric radar, accompanied by a high-resolution camera. One use of the WSE retrievals will be to constrain model estimates of lake storage dynamics. Here, we compare predictions using the lake dynamics algorithm within the Variable Infiltration Capacity (VIC) land surface scheme. The VIC lake algorithm includes representation of sub-grid topography, where the depth and area of seasonally flooded areas are modeled as a function of topographic wetness index, basin area, and slope. The topography data are from a new global digital elevation model, MERIT-DEM. We initially set up VIC at sites with varying permafrost conditions (i.e., no permafrost, discontinuous, continuous) in Saskatoon and Yellowknife, Canada, and Toolik Lake, Alaska. We constrained the uncalibrated model with the WSE at the time of the first ABoVE flight, and quantified the model's ability to predict WSE and ΔWSE during the time of the second flight. Finally, we evaluated the sensitivity of the VIC-lakes model and compared the three permafrost conditions. Our results quantify the sensitivity of surface water to permafrost state across the target sites. Furthermore, our evaluation of the lake modeling framework contributes to the modeling and mapping framework for lake and reservoir storage change evaluation globally as part of the SWOT mission, planned for launch in 2021.
Dean, David; Topping, David; Schmidt, John C.; Griffiths, Ronald; Sabol, Thomas
2016-01-01
The Rio Grande in the Big Bend region of Texas, USA, and Chihuahua and Coahuila, Mexico, undergoes rapid geomorphic changes as a result of its large sediment supply and variable hydrology; thus, it is a useful natural laboratory to investigate the relative importance of flow strength and sediment supply in controlling alluvial channel change. We analyzed a suite of sediment transport and geomorphic data to determine the cumulative influence of different flood types on changing channel form. In this study, physically based analyses suggest that channel change in the Rio Grande is controlled by both changes in flow strength and sediment supply over different spatial and temporal scales. Channel narrowing is primarily caused by substantial deposition of sediment supplied to the Rio Grande during tributary-sourced flash floods. Tributary floods have large suspended-sediment concentrations, occur for short durations, and attenuate rapidly downstream in the Rio Grande, depositing much of their sediment in downstream reaches. Long-duration floods on the mainstem have the capacity to enlarge the Rio Grande, and these floods, released from upstream dams, can either erode or deposit sediment in the Rio Grande depending upon the antecedent in-channel sediment supply and the magnitude and duration of the flood. Geomorphic and sediment transport analyses show that the locations and rates of sand erosion and deposition during long-duration floods are most strongly controlled by spatial changes in flow strength, largely through changes in channel slope. However, spatial differences in the in-channel sediment supply regulate sediment evacuation or accumulation over time in long reaches (greater than a kilometer).
NASA Astrophysics Data System (ADS)
Nakatsugawa, M.; Kobayashi, Y.; Okazaki, R.; Taniguchi, Y.
2017-12-01
This research aims to improve the accuracy of water level prediction calculations for more effective river management. In August 2016, Hokkaido was visited by four typhoons, whose heavy rainfall caused severe flooding. In the Tokoro river basin of Eastern Hokkaido, the water level (WL) at the Kamikawazoe gauging station, in the lower reaches, exceeded the design high-water level, and the water rose to the highest level on record. To predict such flood conditions and mitigate disaster damage, it is necessary to improve the accuracy of prediction as well as to prolong the lead time (LT) available for disaster mitigation measures such as flood-fighting activities and evacuation by residents. The river water level around the peak stage must be predicted earlier and more accurately. Previous research dealing with WL prediction proposed a method in which the WL at the lower reaches is estimated from its correlation with the WL at the upper reaches (hereinafter: "the water level correlation method"). Additionally, a runoff model-based method has generally been used, in which the discharge is estimated by giving rainfall prediction data to a runoff model, such as a storage function model, and the WL is then estimated from that discharge by using a water level-discharge rating curve (H-Q curve). In this research, an attempt was made to predict WL by applying the Random Forest (RF) method, a machine learning method that can estimate the contribution of explanatory variables. Furthermore, from the practical point of view, we investigated WL prediction based on a multiple correlation (MC) method using the explanatory variables with high contributions in the RF method, and we examined the proper selection of explanatory variables and the extension of LT. The following results were found: 1) Based on the RF method, tuned by learning from previous floods, the WL for the abnormal flood of August 2016 was properly predicted with a lead time of 6 h. 2) Based on the contributions of the explanatory variables, factors were selected for the MC method. In this way, plausible prediction results were obtained.
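A minimal sketch of the RF approach, assuming lagged rainfall and upstream water levels as explanatory variables and a 6-hour lead time; the feature set, lags, and synthetic data are illustrative, not the study's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def make_features(rain, wl_up, wl_down, lead=6, lags=(0, 1, 3, 6)):
    """Build lagged explanatory variables to predict the downstream
    water level `lead` hours ahead."""
    rows, target = [], []
    for t in range(max(lags), len(rain) - lead):
        rows.append([rain[t - l] for l in lags]
                    + [wl_up[t - l] for l in lags]
                    + [wl_down[t]])
        target.append(wl_down[t + lead])
    return np.array(rows), np.array(target)

# Synthetic hourly series standing in for gauge and rainfall records.
rng = np.random.default_rng(3)
rain = rng.gamma(0.5, 2.0, 2000)
wl_up = np.convolve(rain, np.ones(12) / 12, mode="same")
wl_down = np.convolve(rain, np.ones(24) / 24, mode="same")

X, y = make_features(rain, wl_up, wl_down)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(rf.feature_importances_)  # contribution of each explanatory variable
```

The `feature_importances_` vector plays the role the abstract describes: it ranks the explanatory variables so that the most influential ones can be carried over into the simpler multiple-correlation model.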
Construction of Polarimetric Radar-Based Reference Rain Maps for the Iowa Flood Studies Campaign
NASA Technical Reports Server (NTRS)
Petersen, Walter; Wolff, David; Krajewski, Witek; Gatlin, Patrick
2015-01-01
The Global Precipitation Measurement (GPM) Mission Iowa Flood Studies (IFloodS) campaign was conducted in central and northeastern Iowa during the months of April-June, 2013. Specific science objectives for IFloodS included quantification of uncertainties in satellite and ground-based estimates of precipitation, 4-D characterization of precipitation physical processes and associated parameters (e.g., size distributions, water contents, types, structure etc.), assessment of the impact of precipitation estimation uncertainty and physical processes on hydrologic predictive skill, and refinement of field observations and data analysis approaches as they pertain to future GPM integrated hydrologic validation and related field studies. In addition to field campaign archival of raw and processed satellite data (including precipitation products), key ground-based platforms such as the NASA NPOL S-band and D3R Ka/Ku-band dual-polarimetric radars, University of Iowa X-band dual-polarimetric radars, a large network of paired rain gauge platforms, and a large network of 2D Video and Parsivel disdrometers were deployed. In something of a canonical approach, the radar (NPOL in particular), gauge and disdrometer observational assets were deployed to create a consistent high-quality distributed (time and space sampling) radar-based ground "reference" rainfall dataset, with known uncertainties, that could be used for assessing the satellite-based precipitation products at a range of space/time scales. Subsequently, the impact of uncertainties in the satellite products could be evaluated relative to the ground-benchmark in coupled weather, land-surface and distributed hydrologic modeling frameworks as related to flood prediction. Relative to establishing the ground-based "benchmark", numerous avenues were pursued in the making and verification of IFloodS "reference" dual-polarimetric radar-based rain maps, and this study documents the process and results as they pertain specifically to efforts using the NPOL radar dataset. The initial portions of the "process" involved dual-polarimetric quality control procedures which employed standard phase and correlation-based approaches to removal of clutter and non-meteorological echo. Calculation of a scale-adaptive KDP was accomplished using the method of Wang and Chandrasekar (2009; J. Atmos. Oceanic Tech.). A dual-polarimetric blockage algorithm based on Lang et al. (2009; J. Atmos. Oceanic Tech.) was then implemented to correct radar reflectivity and differential reflectivity at low elevation angles. Next, hydrometeor identification algorithms were run to identify liquid and ice hydrometeors. After the quality control and data preparation steps were completed several different dual-polarimetric rain estimation algorithms were employed to estimate rainfall rates using rainfall scans collected approximately every two to three minutes throughout the campaign. These algorithms included a polarimetrically-tuned Z-R algorithm that adjusts for drop oscillations (via Bringi et al., 2004, J. Atmos. Oceanic Tech.), and several different hybrid polarimetric variable approaches, including one that made use of parameters tuned to IFloodS 2D Video Disdrometer measurements. Finally, a hybrid scan algorithm was designed to merge the rain rate estimates from multiple low level elevation angle scans (where blockages could not be appropriately corrected) in order to create individual low-level rain maps. 
Individual rain maps at each time step were subsequently accumulated over multiple time scales for comparison to gauge network data. The comparison results and overall error character depended strongly on rain event type, polarimetric estimator applied, and range from the radar. We will present the outcome of these comparisons and their impact on constructing composited "reference" rainfall maps at select time and space scales.
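For orientation, the hybrid estimation logic can be sketched with generic textbook relations: a Z-R power law where KDP is small and an R(KDP) power law where KDP is reliable. The coefficients below are common illustrative values, not the IFloodS disdrometer-tuned ones.

```python
import numpy as np

def hybrid_rain_rate(z_dbz, kdp, kdp_min=0.3, z_cap=53.0):
    """Blend two standard estimator forms: R(Z) for light rain and
    R(KDP) where KDP exceeds a reliability threshold. Coefficients
    are illustrative (Z = 300 R^1.4 and an S-band R(KDP) power law)."""
    z_lin = 10.0 ** (np.minimum(z_dbz, z_cap) / 10.0)   # cap hail contamination
    r_z = (z_lin / 300.0) ** (1.0 / 1.4)                # invert Z = 300 R^1.4
    r_kdp = 44.0 * np.sign(kdp) * np.abs(kdp) ** 0.822  # R(KDP) power law
    return np.where(kdp >= kdp_min, r_kdp, r_z)

z = np.array([25.0, 40.0, 50.0])      # reflectivity, dBZ
kdp = np.array([0.05, 0.6, 1.8])      # specific differential phase, deg/km
print(hybrid_rain_rate(z, kdp))       # mm/h per gate
```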
Parallel processing optimization strategy based on MapReduce model in cloud storage environment
NASA Astrophysics Data System (ADS)
Cui, Jianming; Liu, Jiayi; Li, Qiuyan
2017-05-01
Currently, a large number of documents in the cloud storage process are packaged only after all packets have been received. In this stored procedure from the local transmitter to the server, packing and unpacking consume a lot of time, and transmission efficiency is low. A new parallel processing algorithm is proposed to optimize the transmission mode: following the MapReduce model, MPI technology is used to execute the Mapper and Reducer mechanisms in parallel. Simulation experiments on a Hadoop cloud computing platform show that this algorithm not only accelerates the file transfer rate but also shortens the waiting time of the Reducer mechanism. It breaks through the constraints of traditional sequential transmission and reduces storage coupling to improve transmission efficiency.
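A minimal stand-in for the proposed pipeline, using Python's multiprocessing in place of MPI: mappers compress file chunks in parallel so transfer can begin before the whole file is packaged, and a reducer reassembles them in order. All names and sizes are illustrative assumptions.

```python
import zlib
from multiprocessing import Pool

def mapper(args):
    """Compress one file chunk independently, so transmission can
    start as soon as the first chunk is ready instead of waiting
    for the whole file to be packaged."""
    index, chunk = args
    return index, zlib.compress(chunk)

def reducer(results):
    """Reassemble the compressed chunks in order on the receiving side."""
    return b"".join(zlib.decompress(c) for _, c in sorted(results))

if __name__ == "__main__":
    data = b"example payload " * 100000
    chunks = [(i, data[i:i + 65536]) for i in range(0, len(data), 65536)]
    with Pool(4) as pool:              # parallel "Mapper" stage
        packed = pool.map(mapper, chunks)
    assert reducer(packed) == data     # "Reducer" restores the file
```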
Gas flow calculation method of a ramjet engine
NASA Astrophysics Data System (ADS)
Kostyushin, Kirill; Kagenov, Anuar; Eremin, Ivan; Zhiltsov, Konstantin; Shuvarikov, Vladimir
2017-11-01
The present study describes a methodology for calculating the gas dynamics equations in a ramjet engine. The algorithm is based on Godunov's scheme. To realize the calculation algorithm, a data storage system is proposed that does not depend on mesh topology and allows computational meshes with an arbitrary number of cell faces. An algorithm for building a block-structured grid is given. The calculation algorithm is implemented in the software package "FlashFlow". The software package is verified on calculations of simple air intake configurations and scramjet models.
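Godunov's scheme is easiest to see in one dimension, where the Riemann problem at each cell face for linear advection is solved exactly by upwinding. A minimal sketch under that assumption (not the FlashFlow implementation):

```python
import numpy as np

def godunov_advection(u, a, dx, dt, steps):
    """First-order Godunov update for u_t + a u_x = 0 on a periodic
    grid (a > 0). The exact face Riemann solution is the upwind
    value, so the flux through each cell's right face is a*u."""
    assert a > 0 and a * dt / dx <= 1.0, "CFL condition"
    for _ in range(steps):
        flux = a * u                                  # face fluxes (upwind)
        u = u - dt / dx * (flux - np.roll(flux, 1))   # conservative update
    return u

x = np.linspace(0.0, 1.0, 200, endpoint=False)
u0 = np.exp(-200.0 * (x - 0.3) ** 2)   # Gaussian pulse
u1 = godunov_advection(u0.copy(), a=1.0, dx=x[1] - x[0], dt=0.004, steps=100)
```

The same finite-volume update generalizes to the Euler equations by replacing the upwind flux with an (approximate) Riemann-solver flux at each face, which is where a face-based, topology-independent storage scheme like the one described pays off.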
NASA Astrophysics Data System (ADS)
Pavelic, Paul; Srisuk, Kriengsak; Saraphirom, Phayom; Nadee, Suwanchai; Pholkern, Kewaree; Chusanathas, Sumrit; Munyou, Sitisak; Tangsutthinon, Theerasak; Intarasut, Teerawash; Smakhtin, Vladimir
2012-11-01
Thailand's naturally high seasonal endowment of water resources brings with it the regularly experienced problems associated with floods during the wet season and droughts during the dry season. Downstream-focused engineering solutions that address flooding are vital, but do not necessarily capture the potential for basin-scale improvements to water security, food production and livelihood enhancement. Managed aquifer recharge, typically applied to annual harvesting of wet season flows in dry climates, can also be applied to capture, store and recover episodic extreme flood events in humid environments. In the Chao Phraya River Basin it is estimated that surplus flows recorded downstream above a critical threshold could be harvested and recharged within the shallow alluvial aquifers in a distributed manner upstream of flood prone areas without significantly impacting existing large and medium storages or the Gulf and deltaic ecosystems. Capturing peak flows approximately one year in four by dedicating around 200 km2 of land to groundwater recharge would reduce the magnitude of flooding and socio-economic impacts and generate around USD 250 M/year in export earnings for smallholder rainfed farmers through dry season cash cropping, without unduly compromising the demands of existing water users. It is proposed that farmers in upstream riparian zones be co-opted as flood harvesters and thus contribute to improved floodwater management through simple water management technologies that enable agricultural lands to be put to higher productive use. Local-scale site suitability and technical performance assessments, along with revised governance structures, would be required. It is expected that such an approach would also be applicable to other coastal-discharging basins in Thailand and potentially throughout the Asia region.
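The threshold-harvesting idea can be sketched numerically: the divertible volume is the flow above the critical flood threshold, capped by the capacity of the recharge infrastructure. The threshold, capacity, and flow series below are illustrative assumptions, not the basin's calibrated values.

```python
import numpy as np

def harvestable_volume(daily_flow, threshold, diversion_capacity):
    """Volume divertible to managed aquifer recharge: flow above the
    critical flood threshold, limited by diversion capacity.
    `daily_flow` is in m3/s; the result is in m3."""
    surplus = np.clip(daily_flow - threshold, 0.0, diversion_capacity)
    return surplus.sum() * 86400.0   # m3/s-days -> m3

rng = np.random.default_rng(4)
flow = rng.lognormal(mean=5.0, sigma=1.0, size=365)   # synthetic m3/s series
print(harvestable_volume(flow,
                         threshold=np.percentile(flow, 95),
                         diversion_capacity=200.0))
```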
Research on crude oil storage and transportation based on optimization algorithm
NASA Astrophysics Data System (ADS)
Yuan, Xuhua
2018-04-01
At present, optimization theory and methods are widely used in the optimal scheduling and operation of complex production systems. In this work, the theoretical results are implemented in software on the C++Builder 6 development platform. A simulation and intelligent decision system for crude oil storage and transportation inventory scheduling is designed. The system includes modules for project management, data management, graphics processing, and simulation of oil depot operation schemes, and it can optimize the scheduling scheme of a crude oil storage and transportation system. A multi-point temperature measuring system for monitoring the temperature field of a floating roof oil storage tank is also developed. The results show that by optimizing operating parameters such as tank operating mode and temperature, the total transportation scheduling costs of the storage and transportation system can be reduced by 9.1%. This method can therefore support safe and stable operation of crude oil storage and transportation systems.
Estimating magnitude and frequency of floods using the PeakFQ 7.0 program
Veilleux, Andrea G.; Cohn, Timothy A.; Flynn, Kathleen M.; Mason, Jr., Robert R.; Hummel, Paul R.
2014-01-01
Flood-frequency analysis provides information about the magnitude and frequency of flood discharges based on records of annual maximum instantaneous peak discharges collected at streamgages. The information is essential for defining flood-hazard areas, for managing floodplains, and for designing bridges, culverts, dams, levees, and other flood-control structures. Bulletin 17B (B17B) of the Interagency Advisory Committee on Water Data (IACWD, 1982) codifies the standard methodology for conducting flood-frequency studies in the United States. B17B specifies that annual peak-flow data are to be fit to a log-Pearson Type III distribution. Specific methods are also prescribed for improving skew estimates using regional skew information, tests for high and low outliers, adjustments for low outliers and zero flows, and procedures for incorporating historical flood information. The authors of B17B identified various needs for methodological improvement and recommended additional study. In response to these needs, the Advisory Committee on Water Information (ACWI, successor to IACWD; http://acwi.gov/) Subcommittee on Hydrology (SOH), Hydrologic Frequency Analysis Work Group (HFAWG), has recommended modest changes to B17B. These changes include adoption of a generalized method-of-moments estimator denoted the Expected Moments Algorithm (EMA) (Cohn and others, 1997) and a generalized version of the Grubbs-Beck test for low outliers (Cohn and others, 2013). The SOH requested that the USGS implement these changes in a user-friendly, publicly accessible program.
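The core of the B17B procedure, fitting a log-Pearson Type III distribution by the method of moments on the log-transformed peaks, can be sketched as follows. This is a bare-bones illustration: it omits the EMA estimator, regional skew weighting, and low-outlier handling that PeakFQ adds on top.

```python
import numpy as np
from scipy import stats

def lp3_quantiles(peaks, probs=(0.5, 0.1, 0.01)):
    """Fit log-Pearson Type III to annual peaks by moments in log
    space and return flood quantiles keyed by return period."""
    logq = np.log10(peaks)
    skew = stats.skew(logq, bias=False)
    dist = stats.pearson3(skew, loc=logq.mean(), scale=logq.std(ddof=1))
    # Exceedance probability p corresponds to the (1 - p) quantile.
    return {1 / p: 10 ** dist.ppf(1 - p) for p in probs}

rng = np.random.default_rng(5)
peaks = rng.lognormal(mean=6.0, sigma=0.5, size=60)   # synthetic peak record
print(lp3_quantiles(peaks))   # {2.0: ..., 10.0: ..., 100.0: ...}
```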
The influence of antecedent conditions on flood risk in sub-Saharan Africa
NASA Astrophysics Data System (ADS)
Bischiniotis, Konstantinos; van den Hurk, Bart; Coughlan de Perez, Erin; Jongman, Brenden; Veldkamp, Ted; Aerts, Jeroen
2017-04-01
Traditionally, flood risk management has focused on long-term flood protection measures. However, many countries are often not able to afford hard infrastructure that provides sufficient safety levels due to the high investment costs. As a consequence, they rely more on post-disaster response and timely warning systems. Most early warning systems have predominantly focused on precipitation as the main predictive factor, usually with lead times of hours or days. However, other variables could also play a role. For instance, anomalous positive water storage, soil saturation and evapotranspiration are physical factors that may influence the length of the flood build-up period. This period can vary from a few days to several months before the event, and it is particularly important in flood risk management since longer flood warning lead times during this period could result in better flood preparation actions. This study addresses how the antecedent conditions of historical reported flood events over the period 1980 to 2010 in sub-Saharan Africa relate to flood generation. The seasonal-scale conditions are reflected in the Standardized Precipitation Evapotranspiration Index (SPEI), which is calculated using monthly precipitation and temperature data and accounts for the wetness/dryness of an area. Antecedent conditions are separated into a) a short-term 'weather-scale' period (0-7 days) and b) a 'seasonal-scale' period (up to 6 months) before the flood event, in such a way that they do not overlap. Total 7-day precipitation, based on daily meteorological data, was used to evaluate the short-term weather-scale conditions. Using a pair of coordinates derived from the NatCatSERVICE database on global flood losses, each flood event is positioned on a 0.5° x 0.5° grid cell. The antecedent SPEI conditions of the two periods and their joint influence on flood generation are compared to the same-period conditions of the other years of the dataset. First results revealed that many floods were preceded by high SPEI for several months before the flooding event, showing that the area was saturated with a long lead time. Those that were not preceded by high SPEI had very extreme short-term precipitation that caused the flood event. Furthermore, the importance of seasonal-scale conditions is quantified, which in turn might help humanitarian organizations and decision-makers extend the period of preventive flood risk management planning.
Wong, Ling Ai; Shareef, Hussain; Mohamed, Azah; Ibrahim, Ahmad Asrul
2014-01-01
This paper presents the application of enhanced opposition-based firefly algorithm in obtaining the optimal battery energy storage systems (BESS) sizing in photovoltaic generation integrated radial distribution network in order to mitigate the voltage rise problem. Initially, the performance of the original firefly algorithm is enhanced by utilizing the opposition-based learning and introducing inertia weight. After evaluating the performance of the enhanced opposition-based firefly algorithm (EOFA) with fifteen benchmark functions, it is then adopted to determine the optimal size for BESS. Two optimization processes are conducted where the first optimization aims to obtain the optimal battery output power on hourly basis and the second optimization aims to obtain the optimal BESS capacity by considering the state of charge constraint of BESS. The effectiveness of the proposed method is validated by applying the algorithm to the 69-bus distribution system and by comparing the performance of EOFA with conventional firefly algorithm and gravitational search algorithm. Results show that EOFA has the best performance comparatively in terms of mitigating the voltage rise problem. PMID:25054184
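The opposition-based learning step that distinguishes EOFA from the plain firefly algorithm can be sketched on its own: for each candidate, the point mirrored across the search bounds is also evaluated, and the better half of the combined set survives. The benchmark function and bounds are illustrative.

```python
import numpy as np

def opposition(population, lower, upper):
    """Opposition-based learning: the opposite of x is lower + upper - x."""
    return lower + upper - population

def obl_step(population, fitness, lower, upper):
    """Evaluate candidates and their opposites, keep the better half
    (minimisation)."""
    both = np.vstack([population, opposition(population, lower, upper)])
    scores = np.apply_along_axis(fitness, 1, both)
    best = np.argsort(scores)[: len(population)]
    return both[best]

rng = np.random.default_rng(6)
lb, ub = -5.0, 5.0
pop = rng.uniform(lb, ub, size=(20, 4))
sphere = lambda x: float(np.sum(x ** 2))   # one of the benchmark functions
pop = obl_step(pop, sphere, lb, ub)
```

In the full EOFA, this opposition step seeds and refreshes the firefly population, while inertia weight damps the random component of the firefly moves as iterations progress.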
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colón, Yamil J.; Gómez-Gualdrón, Diego A.; Snurr, Randall Q.
Metal-organic frameworks (MOFs) are promising materials for a range of energy and environmental applications. Here we describe in detail a computational algorithm and code to generate MOFs based on edge-transitive topological nets for subsequent evaluation via molecular simulation. This algorithm has been previously used by us to construct and evaluate 13,512 MOFs of 41 different topologies for cryo-adsorbed hydrogen storage. Grand canonical Monte Carlo simulations are used here to evaluate the 13,512 structures for the storage of gaseous fuels such as hydrogen and methane and nondistillative separation of xenon/krypton mixtures at various operating conditions. MOF performance for both gaseous fuel storage and xenon/krypton separation is influenced by topology. Simulation data suggest that gaseous fuel storage performance is topology-dependent due to MOF properties such as void fraction and surface area combining differently in different topologies, whereas xenon/krypton separation performance is topology-dependent due to how topology constrains the pore size distribution.
Hydrograph simulation models of the Hillsborough and Alafia Rivers, Florida: a preliminary report
Turner, James F.
1972-01-01
Mathematical (digital) models that simulate flood hydrographs from rainfall records have been developed for the following gaging stations in the Hillsborough and Alafia River basins of west-central Florida: Hillsborough River near Tampa, Alafia River at Lithia, and North Prong Alafia River near Keysville. These models, which were developed from historical streamflow and rainfall records, are based on rainfall-runoff and unit-hydrograph procedures involving an arbitrary separation of the flood hydrograph. The models assume the flood hydrograph to be composed of only two flow components: direct (storm) runoff and base flow. Expressions describing these two flow components are derived from streamflow and rainfall records and are combined analytically to form algorithms (models), which are programmed for processing on a digital computing system. Most Hillsborough and Alafia River flood discharges can be simulated with expected relative errors less than or equal to 30 percent, and flood peaks can be simulated with average relative errors less than 15 percent. Because of the inadequate rainfall network used in obtaining input data for the North Prong Alafia River model, simulated peaks are frequently in error by more than 40 percent, particularly for storms having highly variable areal rainfall distribution. Simulation errors are the result of rainfall sample errors and, to a lesser extent, model inadequacy. Data errors associated with the determination of mean basin precipitation are the result of the small number and poor areal distribution of rainfall stations available for use in the study. Model inadequacy, however, is attributed to the basic underlying theory, particularly the rainfall-runoff relation. These models broaden and enhance existing water-management capabilities within these basins by allowing the establishment and implementation of programs providing for continued development in these areas. Specifically, the models serve not only as a basis for forecasting floods, but also for simulating hydrologic information needed in flood-plain mapping and in delineating and evaluating alternative flood control and abatement plans.
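A minimal sketch of the two-component structure: direct runoff comes from a unit-hydrograph convolution of rainfall excess, and base flow is added on top. The unit hydrograph ordinates and rainfall values below are hypothetical.

```python
import numpy as np

def simulate_hydrograph(excess_rain, unit_hydrograph, base_flow):
    """Two-component flood hydrograph: direct (storm) runoff from a
    unit-hydrograph convolution of rainfall excess, plus base flow."""
    direct = np.convolve(excess_rain, unit_hydrograph)[: len(excess_rain)]
    return direct + base_flow

uh = np.array([0.1, 0.3, 0.3, 0.2, 0.1])   # hypothetical 5-hour unit hydrograph
rain_excess = np.array([0, 5, 12, 3, 0, 0, 0, 0], dtype=float)  # mm/h
print(simulate_hydrograph(rain_excess, uh, base_flow=2.0))
```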
NASA Astrophysics Data System (ADS)
Tellman, B.; Schwarz, B.
2014-12-01
This talk describes the development of a web application to predict and communicate vulnerability to floods given publicly available data, disaster science, and geotech cloud capabilities. The proof of concept in the Google Earth Engine API, with initial testing on case studies in New York and Uttarakhand, India, demonstrates the potential of highly parallelized cloud computing to model socio-ecological disaster vulnerability at high spatial and temporal resolution and in near real time. Cloud computing facilitates statistical modeling with variables derived from large public social and ecological data sets, including census data, nighttime lights (NTL), and WorldPop to derive social parameters, together with elevation, satellite imagery, rainfall, and observed flood data from the Dartmouth Flood Observatory to derive biophysical parameters. While more traditional, physically based hydrological models that rely on flow algorithms and numerical methods are currently unavailable in parallelized computing platforms like Google Earth Engine, there is high potential to explore "data-driven" modeling that trades physics for statistics in a parallelized environment. A data-driven approach to flood modeling with geographically weighted logistic regression has been initially tested on Hurricane Irene in southeastern New York. Comparison of model results with observed flood data reveals a 97% accuracy of the model in predicting flooded pixels. Testing on multiple storms is required to further validate this initially promising approach. A statistical social-ecological flood model that could produce rapid vulnerability assessments, predicting who might require immediate evacuation and where, could serve as an early warning. This type of early warning system would be especially relevant in data-poor places lacking the computing power, high-resolution data such as LiDAR and stream gauges, or hydrologic expertise to run physically based models in real time. As the data-driven model presented relies on globally available data, the only real-time data input required would be typical data from a weather service, e.g. precipitation or coarse-resolution flood prediction. However, model uncertainty will vary locally depending upon the resolution and frequency of observed flood and socio-economic damage impact data.
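One way to realize geographically weighted logistic regression is to fit a local model at each prediction site, weighting training pixels by a Gaussian kernel of distance to that site. A minimal sketch with synthetic stand-ins for the census/NTL/terrain predictors; the bandwidth and features are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def gw_logistic_fit(coords, X, y, site_xy, bandwidth):
    """Fit one local logistic model for a prediction site, weighting
    training pixels by a Gaussian kernel of distance to the site:
    a minimal stand-in for geographically weighted regression."""
    d = np.linalg.norm(coords - site_xy, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)
    return LogisticRegression().fit(X, y, sample_weight=w)

rng = np.random.default_rng(7)
coords = rng.uniform(0, 10, (500, 2))          # pixel locations
X = rng.normal(size=(500, 3))                  # elevation, rain, NTL proxies
y = (X[:, 0] + rng.normal(size=500) < 0).astype(int)   # 1 = flooded

local = gw_logistic_fit(coords, X, y, site_xy=np.array([5.0, 5.0]),
                        bandwidth=2.0)
print(local.coef_)   # coefficients vary from site to site
```

Because each site's fit is independent of the others, this formulation parallelizes naturally, which is exactly the property that makes it suited to a platform like Earth Engine where flow-routing schemes are not.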
Characterisation of flooding in Alexandria in October 2015 and suggested mitigating measures
NASA Astrophysics Data System (ADS)
Bhattacharya, Biswa; Zevenbergen, Chris; Wahaab, R. A.; Elbarki, W. A. I.; Busker, T.; Salinas Rodriguez, C. N. A.
2017-04-01
In October 2015 Alexandria (Egypt) experienced exceptional flooding. The flooding was caused by heavy rainfall in a short period of time in a city which normally does not receive a large amount of rainfall. The heavy rainfall produced a tremendous volume of runoff, which the city's drainage system was unable to drain off to the Mediterranean Sea. Seven people died in the flood, and there was huge direct and indirect damage. The city does not have a flood forecasting system. An analysis with rainfall forecasts from the European Centre for Medium-Range Weather Forecasts (ECMWF) showed that the extreme rainfall could have been forecast about a week in advance. Naturally, had a flood forecasting model been in place, the flooding could have been predicted well in advance. Alexandria, along with several other Arab cities, is not prepared at all for natural hazards. Preparedness actions leading to improved adaptation and resilience are not in place. The situation is being further exacerbated by rapid urbanisation and climate change. The local authorities estimate that about 30,000 new buildings have been (illegally) constructed during the last five years at a location near the main pumping station (Max Point). This issue may have a very serious adverse effect on hydrology and requires further study to estimate the additional runoff from the newly urbanised areas. The World Bank has listed Alexandria as one of five coastal cities that may face very significant risk of coastal flooding due to climate change. Setting up a flood forecasting model, along with evidence-based research on the drainage system's capacity, is seen as an immediate action that can significantly improve the city's preparedness for flooding. Furthermore, the region has a number of large lakes that could potentially be used to store extra water as a flood mitigation measure. Two water bodies, namely the Maryot Lake and the Airport Lake, are identified from which water can be pumped out in advance to keep storage available in case of flooding. Keywords: Alexandria, flood, Egypt, rainfall, forecasting.
Cloud Optimized Image Format and Compression
NASA Astrophysics Data System (ADS)
Becker, P.; Plesea, L.; Maurer, T.
2015-04-01
Cloud-based image storage and processing requires a re-evaluation of formats and processing methods. For the true value of the massive volumes of earth observation data to be realized, the image data needs to be accessible from the cloud. Traditional file formats such as TIF and NITF were developed in the heyday of the desktop and assumed fast, low-latency file access. Other formats such as JPEG2000 provide streaming protocols for pixel data, but still require a server to have file access. These assumptions no longer hold in cloud-based elastic storage and computation environments. This paper provides details of a newly evolving image storage format (MRF) and compression that is optimized for cloud environments. Although the cost of storage continues to fall for large data volumes, there is still significant value in compression. For imagery data to be used in analysis and to exploit the extended dynamic range of the new sensors, lossless or controlled lossy compression is of high value. Compression decreases the data volume stored and reduces the data transferred, but the reduced data size must be balanced against the CPU required to decompress. The paper also outlines a new compression algorithm (LERC) for imagery and elevation data that optimizes this balance. Advantages of the compression include a simple-to-implement algorithm that enables it to be efficiently accessed using JavaScript. Combining this new cloud-based image storage format and compression will help resolve some of the challenges of big image data on the internet.
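The controlled-lossy idea can be illustrated with error-bounded quantization: values are snapped to a grid whose step is twice the allowed error, and the small integers that remain compress very well. This is a sketch of the general principle only, not the LERC codec itself.

```python
import numpy as np

def quantize_block(block, max_error):
    """Quantize so reconstruction error never exceeds max_error; the
    resulting small integers are highly compressible."""
    step = 2.0 * max_error
    base = float(block.min())
    q = np.round((block.astype(np.float64) - base) / step).astype(np.uint32)
    return base, step, q

def dequantize_block(base, step, q):
    """Reconstruct values to within max_error of the originals."""
    return base + q * step

data = np.random.default_rng(8).normal(100.0, 5.0, (256, 256)).astype(np.float32)
base, step, q = quantize_block(data, max_error=0.01)
err = np.abs(dequantize_block(base, step, q) - data).max()
assert err <= 0.01 + 1e-9
```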
Gellis, Allen C.; Noe, Gregory B.; Clune, John W.; Myers, Michael K.; Hupp, Cliff R.; Schenk, Edward R.; Schwarz, Gregory E.
2015-01-01
Management implications of this study indicate that both agriculture and streambanks are important sources of sediment in Linganore Creek, where the delivery of agricultural sediment was 4 percent and the delivery of streambank sediment was 44 percent. Fourth-order streambanks, on average, had the highest rates of bank erosion. Combining the sediment fingerprinting and sediment budget results indicates that 96 percent of the eroded fine-grained sediment from agriculture went into storage. Flood plains and ponds are effective storage sites for sediment in the Linganore Creek watershed. Flood plains stored 8 percent of all eroded sediment, with 4th and 5th order flood plains, on average, storing the most sediment. Small ponds in the Linganore Creek watershed, which drained 16 percent of the total watershed area, stored 15 percent of all eroded sediment. Channel beds were relatively stable, with the greatest erosion generally occurring in 4th and 5th order streams.
Storage in California’s reservoirs and snowpack in this time of drought
Dettinger, Michael; Anderson, Michael L.
2015-01-01
The San Francisco Bay and Sacramento–San Joaquin Delta (Delta) are the recipients of inflows from a watershed that spans much of California and that has ties to nearly the entire state. Historically, California has buffered its water supplies and flood risks both within—and beyond—the Delta’s catchment by developing many reservoirs, large and small, high and low. Most of these reservoirs carry water from wet winter seasons—when water demands are low and flood risks are high—to dry, warm seasons (and years) when demands are high and little precipitation falls. Many reservoirs are also used to catch and delay (or spread in time) flood flows that otherwise might cause damage to communities and floodplains. This essay describes the status of surface-water and snowpack storage conditions in California in spring 2015, providing context for better understanding where the state’s water stores stand as we enter summer 2015.
Supervised classification of aerial imagery and multi-source data fusion for flood assessment
NASA Astrophysics Data System (ADS)
Sava, E.; Harding, L.; Cervone, G.
2015-12-01
Floods are among the most devastating natural hazards, and the ability to produce an accurate and timely flood assessment before, during, and after an event is critical for their mitigation and response. Remote sensing technologies have become the de facto approach for observing the Earth and its environment. However, satellite remote sensing data are not always available. For these reasons, it is crucial to develop new techniques in order to produce flood assessments during and after an event. Recent advancements in techniques for fusing remote sensing with near-real-time heterogeneous datasets have allowed emergency responders to more efficiently extract increasingly precise and relevant knowledge from the available information. This research presents a fusion technique using satellite remote sensing imagery coupled with non-authoritative data such as Civil Air Patrol (CAP) imagery and tweets. A new computational methodology is proposed based on machine learning algorithms to automatically identify water pixels in CAP imagery. Specifically, wavelet transformations are paired with multiple classifiers, run in parallel, to build models discriminating water and non-water regions. The learned classification models are first tested against a set of control cases, and then used to automatically classify each image separately. A measure of uncertainty is computed for each pixel in an image, proportional to the number of models classifying the pixel as water. Geo-tagged tweets are continuously harvested, stored in a MongoDB database, and queried in real time. They are fused with the CAP classified data, and with satellite remote sensing derived flood extent results, to produce comprehensive flood assessment maps. The final maps are then compared with FEMA-generated flood extents to assess their accuracy. The proposed methodology is applied to two test cases, relative to the 2013 floods in Boulder, CO, and the 2015 floods in Texas.
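The per-pixel uncertainty measure, the fraction of parallel models that call a pixel water, can be sketched with a small classifier ensemble. The features and labels below are synthetic stand-ins for the wavelet-derived CAP inputs.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

# Synthetic pixel features (stand-ins for wavelet statistics) and labels.
rng = np.random.default_rng(9)
X = rng.normal(size=(1000, 6))
y = (X[:, 0] - X[:, 1] > 0).astype(int)        # 1 = water

models = [RandomForestClassifier(n_estimators=50, random_state=0),
          LogisticRegression(max_iter=1000),
          LinearSVC()]
for m in models:
    m.fit(X[:800], y[:800])                    # train on control cases

votes = np.array([m.predict(X[800:]) for m in models])
uncertainty = votes.mean(axis=0)   # fraction of models voting "water" per pixel
```

Pixels where `uncertainty` sits near 0 or 1 are confidently classified; intermediate values flag regions where the fused tweet and satellite evidence matters most.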
DNABIT Compress – Genome compression algorithm
Rajarajeswari, Pothuraju; Apparao, Allam
2011-01-01
Data compression is concerned with how information is organized in data. Efficient storage means removal of redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences based on a novel scheme of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences for larger genomes. Comparative results show that the "DNABIT Compress" algorithm outperforms the other compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new DNABIT Compress algorithm significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (a unique bit code) to fragments of the DNA sequence (exact repeats, reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, where the existing best methods could not achieve a ratio below 1.72 bits/base. PMID:21383923
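The fixed-bit-code idea, two bits per base, can be sketched as follows. This shows only the base-packing layer; DNABIT Compress adds repeat-fragment codes on top of it.

```python
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASE = "ACGT"

def pack(seq):
    """Pack 4 bases per byte (2 bits each): 2 bits/base versus
    8 bits/base for plain ASCII."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        b = 0
        for ch in seq[i:i + 4]:
            b = (b << 2) | CODE[ch]
        b <<= 2 * (4 - len(seq[i:i + 4]))   # pad the final partial byte
        out.append(b)
    return bytes(out), len(seq)

def unpack(packed, n):
    """Recover the original sequence of length n."""
    seq = []
    for b in packed:
        for shift in (6, 4, 2, 0):
            seq.append(BASE[(b >> shift) & 0b11])
    return "".join(seq[:n])

packed, n = pack("ACGTACGTTT")
assert unpack(packed, n) == "ACGTACGTTT"
```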
NASA Astrophysics Data System (ADS)
Unland, N. P.; Cartwright, I.; Cendón, D. I.; Chisari, R.
2014-02-01
The residence time of groundwater within 50 m of the Tambo River, South East Australia, has been estimated through the combined use of 3H and 14C. Groundwater residence times increase towards the Tambo River, which implies a gaining river system rather than increasing bank storage with proximity to the Tambo River. Major ion concentrations and δ2H and δ18O values of bank water also indicate that bank infiltration does not significantly impact groundwater chemistry under baseflow and post-flood conditions, suggesting that the gaining nature of the river may be driving the return of bank storage water back into the Tambo River within days of peak flood conditions. The covariance between 3H and 14C indicates leakage and mixing between old (~17,200 yr) groundwater from a semi-confined aquifer and younger groundwater (<100 yr) near the river, where confining layers are less prevalent. The presence of this semi-confined aquifer has also been used to help explain the absence of bank storage, as rapid pressure propagation into the semi-confined aquifer during flooding will minimise bank infiltration. This study illustrates the complex nature of river-groundwater interactions and the potential pitfalls of assuming simple or idealised conditions when conducting hydrogeological studies.
Hierarchical trie packet classification algorithm based on expectation-maximization clustering.
Bi, Xia-An; Zhao, Junxia
2017-01-01
With the growth of computer network bandwidth, packet classification algorithms that can handle large-scale rule sets are urgently needed. Among existing approaches, packet classification algorithms based on the hierarchical trie have become an important research branch because of their wide practical use. Although the hierarchical trie saves large amounts of storage space, it has several shortcomings, such as backtracking and empty nodes. This paper proposes a new packet classification algorithm, the Hierarchical Trie Algorithm Based on Expectation-Maximization Clustering (HTEMC). Firstly, the paper uses a formalization method to address the packet classification problem by mapping the rules and data packets into a two-dimensional space. Secondly, it uses the expectation-maximization algorithm to cluster the rules based on their aggregate characteristics, thereby forming diversified clusters. Thirdly, it builds a hierarchical trie based on the results of the expectation-maximization clustering. Finally, the paper conducts both simulation experiments and real-environment experiments to compare the performance of the algorithm with other typical algorithms, and analyzes the results. The hierarchical trie structure in the algorithm not only adopts trie path compression to eliminate backtracking, but also solves the problem of inefficient trie updates, which greatly improves the performance of the algorithm.
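The underlying trie mechanics can be illustrated with a one-dimensional binary prefix trie and a longest-prefix lookup that avoids backtracking by remembering the best rule seen on the way down. HTEMC's two-dimensional, EM-clustered hierarchical structure is built from tries of this kind; the sketch below is generic, not the paper's implementation.

```python
class TrieNode:
    __slots__ = ("children", "rule")
    def __init__(self):
        self.children = [None, None]
        self.rule = None

def insert(root, prefix_bits, rule):
    """Insert a rule under a bit-string prefix such as '1011'."""
    node = root
    for b in prefix_bits:
        i = int(b)
        if node.children[i] is None:
            node.children[i] = TrieNode()
        node = node.children[i]
    node.rule = rule

def classify(root, key_bits):
    """Longest-prefix match without backtracking: remember the last
    rule seen while walking down, return it when the path ends."""
    node, best = root, None
    for b in key_bits:
        node = node.children[int(b)]
        if node is None:
            break
        if node.rule is not None:
            best = node.rule
    return best

root = TrieNode()
insert(root, "10", "rule-A")
insert(root, "1011", "rule-B")
print(classify(root, "10110110"))   # -> rule-B (longest matching prefix)
```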
Techniques for shuttle trajectory optimization
NASA Technical Reports Server (NTRS)
Edge, E. R.; Shieh, C. J.; Powers, W. F.
1973-01-01
The application of recently developed function-space Davidon-type techniques to the shuttle ascent trajectory optimization problem is discussed along with an investigation of the recently developed PRAXIS algorithm for parameter optimization. At the outset of this analysis, the major deficiency of the function-space algorithms was their potential storage problems. Since most previous analyses of the methods were with relatively low-dimension problems, no storage problems were encountered. However, in shuttle trajectory optimization, storage is a problem, and this problem was handled efficiently. Topics discussed include: the shuttle ascent model and the development of the particular optimization equations; the function-space algorithms; the operation of the algorithm and typical simulations; variable final-time problem considerations; and a modification of Powell's algorithm.
Warehouse multipoint temperature and humidity monitoring system design based on Kingview
NASA Astrophysics Data System (ADS)
Ou, Yanghui; Wang, Xifu; Liu, Jingyun
2017-04-01
Storage is a key link in modern logistics, and warehouse environment monitoring is an important part of storage safety management. Meeting the storage requirements of different materials and guaranteeing their quality to the greatest extent is of great significance. In warehouse environment monitoring, the most important parameters are air temperature and relative humidity. This paper presents a design for a warehouse multipoint temperature and humidity monitoring system based on KingView, which realizes real-time acquisition, monitoring and storage of multipoint temperature and humidity data in a warehouse by using temperature and humidity sensors. Taking a bulk grain warehouse as an example, and based on the data collected in real-time monitoring, the system gives corresponding expert advice, combined with the corresponding algorithm, providing theoretical guidance for controlling the temperature and humidity in the grain warehouse.
NASA Astrophysics Data System (ADS)
Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng
2016-09-01
This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple-metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII-based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple-metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII-based sampling approach in comparison to LHS: (1) it performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter. (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII-based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy for the ɛ-NSGAII parameter sets. (3) The parameter posterior distributions from ɛ-NSGAII-based sampling are concentrated in appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly. (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), the average relative band-width (RB) and the average deviation amplitude (D). Flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII-based sampling. This study provides a new sampling approach for improving multiple-metrics uncertainty analysis under the GLUE framework, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
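The epsilon-dominance test at the heart of the ɛ-NSGAII archive can be sketched in a few lines: objective vectors are snapped to epsilon-sized boxes before the dominance comparison, which is what gives the algorithm its evenly spread solutions. The epsilon vector below is illustrative.

```python
import numpy as np

def eps_box(objectives, eps):
    """Index of the epsilon-box a solution falls in (minimisation)."""
    return np.floor(np.asarray(objectives, dtype=float) / eps).astype(int)

def eps_dominates(a, b, eps):
    """a epsilon-dominates b if a's box is no worse in every objective
    and strictly better in at least one."""
    ba, bb = eps_box(a, eps), eps_box(b, eps)
    return bool(np.all(ba <= bb) and np.any(ba < bb))

eps = np.array([0.05, 0.05])
print(eps_dominates([0.12, 0.31], [0.19, 0.42], eps))   # True
```

Only one archive member survives per occupied box, so the archive stays compact while still covering the Pareto front, which is why the behavioral parameter sets it yields are both diverse and efficient to generate.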
The GOES-R Product Generation Architecture - Post CDR Update
NASA Astrophysics Data System (ADS)
Dittberner, G.; Kalluri, S.; Weiner, A.
2012-12-01
The GOES-R system will substantially improve the accuracy of information available to users by providing data from significantly enhanced instruments, which will generate an increased number and diversity of products with higher resolution, and much shorter relook times. Considerably greater compute and memory resources are necessary to achieve the necessary latency and availability for these products. Over time, new and updated algorithms are expected to be added and old ones removed as science advances and new products are developed. The GOES-R GS architecture is being planned to maintain functionality so that when such changes are implemented, operational product generation will continue without interruption. The primary parts of the PG infrastructure are the Service Based Architecture (SBA) and the Data Fabric (DF). SBA is the middleware that encapsulates and manages science algorithms that generate products. It is divided into three parts, the Executive, which manages and configures the algorithm as a service, the Dispatcher, which provides data to the algorithm, and the Strategy, which determines when the algorithm can execute with the available data. SBA is a distributed architecture, with services connected to each other over a compute grid and is highly scalable. This plug-and-play architecture allows algorithms to be added, removed, or updated without affecting any other services or software currently running and producing data. Algorithms require product data from other algorithms, so a scalable and reliable messaging is necessary. The SBA uses the DF to provide this data communication layer between algorithms. The DF provides an abstract interface over a distributed and persistent multi-layered storage system (e.g., memory based caching above disk-based storage) and an event management system that allows event-driven algorithm services to know when instrument data are available and where they reside. Together, the SBA and the DF provide a flexible, high performance architecture that can meet the needs of product processing now and as they grow in the future.
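A toy sketch of the Executive/Strategy interplay described above: the Strategy decides when the wrapped algorithm has all its inputs, based on data-availability events from the Data Fabric, and the Executive then runs it. All class and data names are illustrative assumptions, not the actual GOES-R GS interfaces.

```python
class Strategy:
    """Decide when an algorithm can execute with the available data."""
    def __init__(self, required):
        self.required = set(required)
        self.available = {}

    def on_data_event(self, name, ref):
        self.available[name] = ref               # event from the Data Fabric
        return self.required <= self.available.keys()

class Executive:
    """Manage the wrapped science algorithm as a service, running it
    once the Strategy reports that all inputs have arrived."""
    def __init__(self, algorithm, strategy):
        self.algorithm, self.strategy = algorithm, strategy

    def notify(self, name, ref):
        if self.strategy.on_data_event(name, ref):
            return self.algorithm(**self.strategy.available)
        return None

# Hypothetical product algorithm needing two upstream inputs.
cloud_mask = lambda radiances, geolocation: f"mask({radiances},{geolocation})"
svc = Executive(cloud_mask, Strategy(["radiances", "geolocation"]))
svc.notify("radiances", "L1b-granule-042")       # not ready yet
print(svc.notify("geolocation", "geo-042"))      # all inputs present: runs
```

Because each service reacts only to data events, services can be added or swapped without touching the others, which is the plug-and-play property the architecture is designed around.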
The GOES-R Product Generation Architecture
NASA Astrophysics Data System (ADS)
Dittberner, G. J.; Kalluri, S.; Hansen, D.; Weiner, A.; Tarpley, A.; Marley, S.
2011-12-01
The GOES-R system will substantially improve users' ability to succeed in their work by providing data from significantly enhanced instruments, with higher resolution, much shorter relook times, and an increased number and diversity of products. The Product Generation architecture is designed to provide the computer and memory resources necessary to achieve the required latency and availability for these products. Over time, new and updated algorithms are expected to be added and old ones removed as science advances and new products are developed. The GOES-R GS architecture is being planned to maintain functionality so that when such changes are implemented, operational product generation will continue without interruption. The primary parts of the PG infrastructure are the Service Based Architecture (SBA) and the Data Fabric (DF). SBA is the middleware that encapsulates and manages science algorithms that generate products. It is divided into three parts: the Executive, which manages and configures the algorithm as a service; the Dispatcher, which provides data to the algorithm; and the Strategy, which determines when the algorithm can execute with the available data. SBA is a distributed architecture, with services connected to each other over a compute grid, and is highly scalable. This plug-and-play architecture allows algorithms to be added, removed, or updated without affecting any other services or software currently running and producing data. Algorithms require product data from other algorithms, so a scalable and reliable messaging layer is necessary. The SBA uses the DF to provide this data communication layer between algorithms. The DF provides an abstract interface over a distributed and persistent multi-layered storage system (e.g., memory-based caching above disk-based storage) and an event management system that allows event-driven algorithm services to know when instrument data are available and where they reside. Together, the SBA and the DF provide a flexible, high-performance architecture that can meet the needs of product processing now and as they grow in the future.
GOES-R GS Product Generation Infrastructure Operations
NASA Astrophysics Data System (ADS)
Blanton, M.; Gundy, J.
2012-12-01
The GOES-R Ground System (GS) will produce a much larger set of products with higher data density than previous GOES systems. This requires considerably greater compute and memory resources to achieve the necessary latency and availability for these products. Over time, new algorithms could be added and existing ones removed or updated, but the GOES-R GS cannot go down during this time. To meet these GOES-R GS processing needs, the Harris Corporation will implement a Product Generation (PG) infrastructure that is scalable, extensible, modular and reliable. The primary part of the PG infrastructure is the Service Based Architecture (SBA), which includes the Distributed Data Fabric (DDF). The SBA is the middleware that encapsulates and manages science algorithms that generate products. The SBA is divided into three parts: the Executive, which manages and configures the algorithm as a service; the Dispatcher, which provides data to the algorithm; and the Strategy, which determines when the algorithm can execute with the available data. The SBA is a distributed architecture, with services connected to each other over a compute grid, and is highly scalable. This plug-and-play architecture allows algorithms to be added, removed, or updated without affecting any other services or software currently running and producing data. Algorithms require product data from other algorithms, so a scalable and reliable messaging layer is necessary. The SBA uses the DDF to provide this data communication layer between algorithms. The DDF provides an abstract interface over a distributed and persistent multi-layered storage system (memory-based caching above disk-based storage) and an event system that allows algorithm services to know when data are available and to get the data they need to begin processing when they need it. Together, the SBA and the DDF provide a flexible, high-performance architecture that can meet the needs of product processing now and as they grow in the future.
Effect of soil and cover conditions on soil-water relationships
George R. Trimble, Jr.; Charles E. Hale; H. Spencer Potter
1951-01-01
People who make flood-control surveys for the U.S. Department of Agriculture are concerned with the physical condition of the soils in the watersheds. The condition of the soil determines how fast water moves into and through the soil, and how much water is held in storage. The condition of the soil has a great influence on stream flow, erosion, floods and water supply...
A Two-Dimensional Linear Bicharacteristic FDTD Method
NASA Technical Reports Server (NTRS)
Beggs, John H.
2002-01-01
The linear bicharacteristic scheme (LBS) was originally developed to improve unsteady solutions in computational acoustics and aeroacoustics. The LBS has previously been extended to treat lossy materials for one-dimensional problems. It is a classical leapfrog algorithm, but is combined with upwind bias in the spatial derivatives. This approach preserves the time-reversibility of the leapfrog algorithm, which results in no dissipation, and it permits more flexibility through the ability to adopt a characteristic-based method. The use of characteristic variables allows the LBS to include the Perfectly Matched Layer boundary condition with no added storage or complexity. The LBS offers a central storage approach with lower dispersion than the Yee algorithm, and it generalizes much more easily to nonuniform grids. It has previously been applied to one-dimensional free-space electromagnetic propagation and scattering problems. This paper extends the LBS to the two-dimensional case. Results are presented for point source radiation problems, and the FDTD algorithm is chosen as a convenient reference for comparison.
Tidal Turbine Array Optimization Based on the Discrete Particle Swarm Algorithm
NASA Astrophysics Data System (ADS)
Wu, Guo-wei; Wu, He; Wang, Xiao-yong; Zhou, Qing-wei; Liu, Xiao-man
2018-06-01
Unreasonable layout schemes for tidal current turbines waste resource and worsen the ratio of cost to power output; to address this, the particle swarm optimization algorithm is introduced and improved in this paper. To solve the problem of optimally arraying tidal turbines, a discrete particle swarm optimization (DPSO) algorithm is formulated by re-defining the updating strategies for the particles' velocity and position. This paper analyzes the micrositing optimization problem for tidal current turbines by adjusting each turbine's position, where the maximum total electric power is obtained at the maximum speed in the flood tide and ebb tide. Firstly, the best number of installed turbines is determined by maximizing the output energy in the given tidal farm using the Farm/Flux and empirical methods. Secondly, considering the wake effect, the reasonable distance between turbines, and the factors influencing tidal velocities in the tidal farm, the Jensen wake model and an elliptic distribution model are selected to calculate the turbines' total generating capacity at the maximum speed in the flood tide and ebb tide. Finally, the total generating capacity, taken as the objective function, is evaluated in the final simulation, so that the DPSO can guide the individuals to the feasible area and optimal positions. The results show that the optimization algorithm, which yielded 6.19% more resource output than the empirical method, can be regarded as a good tool for the engineering design of tidal energy demonstration projects.
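The abstract does not give the redefined update rules; the classic binary PSO of Kennedy and Eberhart (1997) is one standard discretization and conveys the flavor, with each bit marking whether a grid cell holds a turbine.

```python
import numpy as np

rng = np.random.default_rng(0)

def binary_pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, vmax=4.0):
    """One update of binary PSO (Kennedy & Eberhart, 1997), shown here as a
    generic stand-in for the paper's DPSO.  x: (n_particles, n_cells) 0/1
    layout, 1 = turbine placed in that grid cell."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    v = np.clip(v, -vmax, vmax)            # keep the sigmoid away from saturation
    prob = 1.0 / (1.0 + np.exp(-v))        # sigmoid maps velocity to P(bit = 1)
    x = (rng.random(x.shape) < prob).astype(int)
    return x, v
```

In a micrositing loop, the objective evaluated for pbest/gbest would be the total generating capacity from the wake and velocity-distribution models.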
NASA Astrophysics Data System (ADS)
Wardah, T.; Abu Bakar, S. H.; Bardossy, A.; Maznorizan, M.
2008-07-01
Frequent flash-floods causing immense devastation in the Klang River Basin of Malaysia necessitate an improvement in the real-time forecasting systems being used. The use of meteorological satellite images in estimating rainfall has become an attractive option for improving the performance of flood forecasting-and-warning systems. In this study, a rainfall estimation algorithm using the infrared (IR) information from the Geostationary Meteorological Satellite-5 (GMS-5) is developed for potential input to a flood forecasting system. Data from the records of GMS-5 IR images have been retrieved for selected convective cells to be trained with the radar rain rate in a back-propagation neural network. The data selected as inputs to the neural network are five parameters having a significant correlation with the radar rain rate: namely, the cloud-top brightness-temperature of the pixel of interest, the mean and the standard deviation of the temperatures of the surrounding five by five pixels, the rate of temperature change, and the Sobel operator that indicates the temperature gradient. In addition, three numerical weather prediction (NWP) products, namely the precipitable water content, relative humidity, and vertical wind, are also included as inputs. The algorithm is applied to areal rainfall estimation in the upper Klang River Basin and compared with another technique that uses power-law regression between the cloud-top brightness-temperature and radar rain rate. Results from both techniques are validated against previously recorded Thiessen areal-averaged rainfall values, with correlation coefficient values of 0.77 and 0.91 for the power-law regression and the artificial neural network (ANN) technique, respectively. An extra lead time of around 2 h is gained when the satellite-based ANN rainfall estimation is coupled with a rainfall-runoff model to forecast a flash-flood event in the upper Klang River Basin.
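The five image-derived predictors listed above are straightforward to compute; a minimal sketch (array names are illustrative, assuming SciPy is available) might look like this:

```python
import numpy as np
from scipy import ndimage

def ir_features(tb, tb_prev, dt_minutes):
    """Per-pixel predictors from a cloud-top brightness-temperature image,
    following the five inputs listed above."""
    mean5 = ndimage.uniform_filter(tb, size=5)              # 5x5 window mean
    mean5_sq = ndimage.uniform_filter(tb**2, size=5)
    std5 = np.sqrt(np.maximum(mean5_sq - mean5**2, 0.0))    # 5x5 window std dev
    dtemp = (tb - tb_prev) / dt_minutes                     # rate of temperature change
    gx = ndimage.sobel(tb, axis=1)                          # Sobel gradient components
    gy = ndimage.sobel(tb, axis=0)
    grad = np.hypot(gx, gy)                                 # temperature gradient magnitude
    return np.stack([tb, mean5, std5, dtemp, grad], axis=-1)
```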
NASA Astrophysics Data System (ADS)
Zhang, Hao; Chen, Minghua; Parekh, Abhay; Ramchandran, Kannan
2011-09-01
We design a distributed multi-channel P2P Video-on-Demand (VoD) system using "plug-and-play" helpers. Helpers are heterogeneous "micro-servers" with limited storage, bandwidth and number of users they can serve simultaneously. Our proposed system has the following salient features: (1) it jointly optimizes over helper-user connection topology, video storage distribution and transmission bandwidth allocation; (2) it minimizes server load, and is adaptable to varying supply and demand patterns across multiple video channels irrespective of video popularity; and (3) it is fully distributed and requires little or no maintenance overhead. The combinatorial nature of the problem and the system's demand for distributed algorithms make the problem uniquely challenging. By utilizing Lagrangian decomposition and Markov chain approximation based arguments, we address this challenge by designing two distributed algorithms running in tandem: a primal-dual storage and bandwidth allocation algorithm and a "soft-worst-neighbor-choking" topology-building algorithm. Our scheme provably converges to a near-optimal solution, and is easy to implement in practice. Packet-level simulation results show that the proposed scheme achieves minimum server load under highly heterogeneous combinations of supply and demand patterns, and is robust to system dynamics of user/helper churn, user/helper asynchrony, and random delays in the network.
The Impact of Corps Flood Control Reservoirs in the June 2008 Upper Mississippi Flood
NASA Astrophysics Data System (ADS)
Charley, W. J.; Stiman, J. A.
2008-12-01
The US Army Corps of Engineers is responsible for a multitude of flood control projects on the Mississippi River and its tributaries, including levees that protect land from flooding, and dams to help regulate river flows. The first six months of 2008 were the wettest on record in the upper Mississippi Basin. During the first 2 weeks of June, rainfall over the Midwest ranged from 6 to as much as 16 inches, overwhelming the flood protection system and causing massive flooding and damage. Most severely impacted were the states of Iowa, Illinois, Indiana, Missouri, and Wisconsin. In Iowa, flooding occurred on almost every river in the state. On the Iowa River, record flooding occurred from Marshalltown, Iowa, downstream to its confluence with the Mississippi River. At several locations, flooding exceeded the 500-year event. The flooding affected agriculture, transportation, and infrastructure, including homes, businesses, levees, and other water-control structures. It has been estimated that there was at least 7 billion dollars in damages. While the flooding in Iowa was extraordinary, Corps of Engineers flood control reservoirs helped limit damage and prevent loss of life, even though some reservoirs were filled beyond their design capacity. Coralville Reservoir on the Iowa River, for example, filled to 135% of its design flood storage capacity, with a stage a record five feet above the crest of the spillway. In spite of this, the maximum reservoir release was limited to 39,500 cfs, while a peak inflow of 57,000 cfs was observed. CWMS, the Corps Water Management System, is used to help regulate Corps reservoirs, as well as track and evaluate flooding and flooding potential. CWMS is a comprehensive data acquisition and hydrologic modeling system for short-term decision support of water control operations in real time. It encompasses data collection, validation and transformation, data storage, visualization, real-time model simulation for decision-making support, and data dissemination. The system uses precipitation and flow data, collected in real time, along with forecasted flow from the National Weather Service to model and optimize reservoir operations and forecast downstream flows and stages, providing communities accurate and timely information to aid their flood-fighting. This involves integrating several simulation modeling programs, including HEC-HMS to forecast flows, HEC-ResSim to model reservoir operations, and HEC-RAS to compute forecasted stage hydrographs. An inundation boundary and depth map of water in the flood plain can be calculated from the HEC-RAS results using ArcInfo. By varying future precipitation and releases, engineers can evaluate different "What if?" scenarios. The effectiveness of this tool and of Corps reservoirs is examined.
NASA Astrophysics Data System (ADS)
van der Zwan, Rene
2013-04-01
The Rijnland water system is situated in the western part of the Netherlands and is a low-lying area of which 90% is below sea level. The area covers 1,100 square kilometres, where 1.3 million people live, work, travel and enjoy leisure. The District Water Control Board of Rijnland is responsible for flood defence, water quantity and quality management. This includes design and maintenance of flood defence structures, control of regulating structures for adequate water level management, and waste water treatment. For water quantity management Rijnland uses, besides an online monitoring network for collecting water level and precipitation data, a real-time control decision support system. This decision support system consists of deterministic hydro-meteorological forecasts with a 24-hr forecast horizon, coupled with a control module that provides optimal operation schedules for the storage basin pumping stations. The uncertainty of the rainfall forecast is not propagated into the hydrological prediction. At this moment 65% of the pumping capacity of the storage basin pumping stations can be automatically controlled by the decision control system. Within 5 years, after renovation of two other pumping stations, the total capacity of 200 m3/s will be automatically controlled. In critical conditions there is a need for both a longer forecast horizon and a probabilistic forecast. Therefore ensemble precipitation forecasts of the ECMWF are already consulted off-line during dry spells, and Rijnland is running a pilot operational system providing 10-day water level ensemble forecasts. The use of EPS during dry spells and the findings of the pilot will be presented. Challenges and next steps towards on-line implementation of ensemble forecasts for risk-based operational management of the Rijnland water system will be discussed. An important element in that discussion is the question: will policy and decision makers, operators and citizens accept this anticipatory water management, including temporarily lower storage basin levels and a reduction in extra investments for infrastructural measures?
Storage assignment optimization in a multi-tier shuttle warehousing system
NASA Astrophysics Data System (ADS)
Wang, Yanyan; Mou, Shandong; Wu, Yaohua
2016-03-01
The current mathematical models for the storage assignment problem are generally established based on the traveling salesman problem (TSP), which has been widely applied in the conventional automated storage and retrieval system (AS/RS). However, the previous mathematical models for conventional AS/RS do not match multi-tier shuttle warehousing systems (MSWS) because the characteristics of parallel retrieval in multiple tiers and progressive vertical movement destroy the foundation of the TSP. In this study, a two-stage open queuing network model, in which shuttles and a lift are regarded as servers at different stages, is proposed to analyze system performance in terms of shuttle waiting period (SWP) and lift idle period (LIP) during the transaction cycle time. A mean arrival time difference matrix for pairwise stock keeping units (SKUs) is presented to determine the mean waiting time and queue length to optimize the storage assignment problem on the basis of SKU correlation. The decomposition method is applied to analyze the interactions among outbound task time, SWP, and LIP. An ant colony clustering algorithm is designed to determine storage partitions by clustering items. In addition, goods are assigned for storage according to the rearranging permutation and combination of storage partitions in a 2D plane; this combination is derived based on the analysis results of the queuing network model and on three basic principles. The storage assignment method and the entire optimization algorithm, as applied in an MSWS, are verified through a practical engineering project in the tobacco industry. The application results show that the total SWP and LIP can be reduced effectively to improve the utilization rates of all devices and to increase the throughput of the distribution center.
Langland, Michael J.; Hainly, Robert A.
1997-01-01
The Susquehanna River drains about 27,510 square miles in New York, Pennsylvania, and Maryland, contributes nearly 50 percent of the freshwater discharge to the Chesapeake Bay, and contributes nearly 66 percent of the annual nitrogen load, 40 percent of the phosphorus load, and 25 percent of the suspended-sediment load from non-tidal parts of the Bay during a year of average streamflow. A reservoir system formed by three hydroelectric dams on the lower Susquehanna River is currently trapping a major part of the phosphorus and suspended-sediment loads from the basin and, to a lesser extent, the nitrogen loads. In the summer of 1996, the U.S. Geological Survey collected bathymetric data along 64 cross sections and 40 bottom-sediment samples along 14 selected cross sections in the lower Susquehanna River reservoir system to determine the remaining sediment-storage capacity, refine the current estimate of when the system may reach sediment-storage capacity, document changes in the reservoir system after the January 1996 flood, and determine the remaining nutrient mass in Conowingo Reservoir. Results from the 1996 survey indicate that an estimated total of 14,800,000 tons of sediment was scoured from the reservoir system from 1993 (date of the previous bathymetric survey) through 1996. This includes the net sediment change of 4,700,000 tons based on the volume change in the reservoir system computed from the 1993 and 1996 surveys, the 6,900,000 tons of sediment deposited from 1993 through 1996, and the 3,200,000 tons of sediment transported into the reservoir system during the January 1996 flood. The January 1996 flood, which exceeded a 100-year recurrence interval, scoured about the same amount of sediment that normally would be deposited in the reservoir system during a 4- to 6-year period. Concentrations of total nitrogen in bottom sediments in the Conowingo Reservoir ranged from 1,500 to 6,900 mg/kg (milligrams per kilogram); 75 percent of the concentrations were between 3,000 and 5,000 mg/kg. About 96 percent of the total nitrogen consisted of organic nitrogen. Concentrations of total phosphorus in bottom sediments ranged from 286 to 1,390 mg/kg. About 84 percent of the total phosphorus consisted of inorganic phosphorus. The ratio of concentrations of plant-available phosphorus to concentrations of total phosphorus ranged from 0.6 to 3.5 percent; ratios generally decreased in a downstream direction. About 29,000 acre-feet, or 42,000,000 tons, of sediment can be deposited before Conowingo Reservoir reaches sediment-storage capacity. Assuming the average annual sediment-deposition rate remains unchanged and no scour occurs due to floods, the reservoir system could reach sediment-storage capacity in about 17 years. The reservoir system currently is trapping about 2 percent of the nitrogen, 45 percent of the phosphorus, and 70 percent of the suspended sediment transported by the river to the upper Chesapeake Bay. Once the reservoir reaches sediment-storage capacity, an estimated 250-percent increase in the current annual load of suspended sediment, a 2-percent increase in the current annual load of total nitrogen, and a 70-percent increase in the current annual load of total phosphorus from the Susquehanna River to Chesapeake Bay can be expected.
If the goal of a 40-percent reduction in controllable phosphorus load from the Susquehanna River Basin is met before the reservoirs reach sediment-storage capacity, the 40-percent reduction goal will probably be exceeded when the reservoir system reaches sediment-storage capacity.
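Restated as a simple mass balance (symbol names ours), the scour estimate above is S_scour = ΔV_net + D_deposited + T_flood = 4.7 + 6.9 + 3.2 = 14.8 million tons: the net storage loss measured between the two surveys, plus the material deposited over 1993-96 and the flood inflow, both of which also had to be removed by scour.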
Recession-based hydrological models for estimating low flows in ungauged catchments in the Himalayas
NASA Astrophysics Data System (ADS)
Rees, H. G.; Holmes, M. G. R.; Young, A. R.; Kansakar, S. R.
The Himalayan region of Nepal and northern India experiences hydrological extremes, from monsoonal floods during July to September, when most of the annual precipitation falls, to periods of very low flows during the dry season (December to February). While the monsoon floods cause acute disasters such as loss of human life and property, mudslides and infrastructure damage, the lack of water during the dry season has a chronic impact on the lives of local people. The management of water resources in the region is hampered by relatively sparse hydrometeorological networks, and consequently many resource assessments are required in catchments where no measurements exist. A hydrological model for estimating dry-season flows in ungauged catchments, based on recession curve behaviour, has been developed to address this problem. Observed flows were fitted to a second-order storage model to enable average annual recession behaviour to be examined. Regionalised models were developed, using a calibration set of 26 catchments, to predict three recession curve parameters: the storage constant, the initial recession flow, and the start date of the recession. Relationships were identified between the storage constant and catchment area; the initial recession flow and elevation (acting as a surrogate for rainfall); and the start date of the recession and geographic location. An independent set of 13 catchments was used to evaluate the robustness of the models. The regional models predicted the average volume of water in an annual recession period (1st of October to the 1st of February) with an average error of 8%, while mid-January flows were predicted to within ±50% for 79% of the catchments in the data set.
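The paper's exact second-order formulation is not reproduced here; one common second-order recession, dQ/dt = -Q²/k, integrates to Q(t) = Q0/(1 + Q0·t/k), and a sketch under that assumption shows how the three regionalised parameters would be used:

```python
import numpy as np

def recession_flow(q0, k, t_days):
    """Second-order recession dQ/dt = -Q**2 / k, integrated analytically:
    Q(t) = Q0 / (1 + Q0 * t / k).  q0 is the flow at the recession start
    date and k the storage constant; in the study these would come from
    the regionalised relationships (area, elevation, location)."""
    t = np.asarray(t_days, dtype=float)
    return q0 / (1.0 + q0 * t / k)

# Volume over the 1 October - 1 February recession window (about 123 days):
t = np.arange(0, 124)
q = recession_flow(q0=20.0, k=600.0, t_days=t)   # illustrative units: m3/s
volume_m3 = q.sum() * 86400.0                    # daily flows times seconds per day
```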
NASA Astrophysics Data System (ADS)
Jin, Minglei; Jin, Weiqi; Li, Yiyang; Li, Shuo
2015-08-01
In this paper, we propose a novel scene-based non-uniformity correction algorithm for infrared image processing: a temporal high-pass non-uniformity correction algorithm based on grayscale mapping (THP and GM). The main sources of non-uniformity are: (1) detector fabrication inaccuracies; (2) non-linearity and variations in the read-out electronics; and (3) optical path effects. The non-uniformity is reduced by non-uniformity correction (NUC) algorithms, which are commonly divided into calibration-based (CBNUC) and scene-based (SBNUC) algorithms. Because non-uniformity drifts temporally, CBNUC algorithms must be repeated by inserting a uniform radiation source into the view, which SBNUC algorithms do not need, so SBNUC algorithms have become an essential part of infrared imaging systems. The poor robustness of SBNUC algorithms often leads to two defects: artifacts and over-correction. Meanwhile, due to their complicated calculation processes and large storage consumption, hardware implementation of SBNUC algorithms is difficult, especially on a Field Programmable Gate Array (FPGA) platform. The THP and GM algorithm proposed in this paper can eliminate the non-uniformity without causing these defects. Its hardware implementation, based only on an FPGA, has two advantages: (1) low resource consumption, and (2) small hardware delay of less than 20 lines. It can be transplanted to a variety of infrared detectors equipped with an FPGA image processing module, and it can reduce both stripe non-uniformity and ripple non-uniformity.
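A classic temporal high-pass NUC (the "THP" half only; the grayscale-mapping refinement is not attempted here) subtracts a per-pixel recursive low-pass estimate from each frame, as in this minimal sketch:

```python
import numpy as np

class TemporalHighPassNUC:
    """Classic temporal high-pass NUC: each pixel's slowly varying component
    (fixed-pattern offset plus drift) is tracked by a recursive low-pass
    filter and subtracted, leaving the temporal high-pass scene content."""
    def __init__(self, shape, time_constant=64.0):
        self.lowpass = np.zeros(shape)
        self.m = time_constant          # larger m = slower, more stable estimate

    def correct(self, frame):
        self.lowpass += (frame - self.lowpass) / self.m   # per-pixel IIR low-pass
        return frame - self.lowpass                       # high-pass = corrected frame
```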
Error-proofing test system of industrial components based on image processing
NASA Astrophysics Data System (ADS)
Huang, Ying; Huang, Tao
2018-05-01
Due to the improvement of modern industrial standards and accuracy requirements, conventional manual testing fails to satisfy the test standards of enterprises, so digital image processing techniques should be utilized to gather and analyze information on the surfaces of industrial components in order to achieve the purpose of testing. To test the installation parts of an automotive engine, this paper employs a camera to capture images of the components. After these images are preprocessed (including denoising), an image processing algorithm relying on the flood fill algorithm is used to test the installation of the components. The results show that this system has very high test accuracy.
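For reference, the flood fill at the core of such a test can be implemented iteratively in a few lines; this is a generic 4-connected version, not the paper's code:

```python
from collections import deque

def flood_fill(mask, seed):
    """Iterative 4-connected flood fill: return the set of pixels in the
    connected region containing `seed`.  `mask[y][x]` is True for pixels
    belonging to the component of interest (e.g. a thresholded part surface)."""
    h, w = len(mask), len(mask[0])
    region, queue = set(), deque([seed])
    while queue:
        y, x = queue.popleft()
        if (y, x) in region or not (0 <= y < h and 0 <= x < w) or not mask[y][x]:
            continue
        region.add((y, x))
        queue.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    return region
```

Region properties (area, centroid, bounding box) computed from the filled region can then be compared against the expected installation geometry.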
ATES/heat pump simulations performed with ATESSS code
NASA Astrophysics Data System (ADS)
Vail, L. W.
1989-01-01
Modifications to the Aquifer Thermal Energy Storage System Simulator (ATESSS) allow simulation of aquifer thermal energy storage (ATES)/heat pump systems. The heat pump algorithm requires a coefficient of performance (COP) relationship of the form COP = COP_base + α(T_ref − T_base). Initial applications of the modified ATESSS code to synthetic building load data for two sizes of buildings in two U.S. cities showed insignificant performance advantage of a series ATES heat pump system over a conventional groundwater heat pump system. The addition of algorithms for a cooling tower and solar array improved performance slightly. Small values of α in the COP relationship are the principal reason for the limited improvement in system performance. Future studies at Pacific Northwest Laboratory (PNL) are planned to investigate methods to increase system performance using alternative system configurations and operations scenarios.
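The COP relationship is a one-liner in code; the parameter values below are illustrative, not taken from the report:

```python
def heat_pump_cop(t_ref, cop_base=3.0, alpha=0.05, t_base=10.0):
    """Linear COP model of the form COP = COP_base + alpha * (T_ref - T_base).
    cop_base, alpha and t_base here are placeholder values; the report's
    point is that small alpha limits the benefit of warmer source water."""
    return cop_base + alpha * (t_ref - t_base)
```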
Light-weight reference-based compression of FASTQ data.
Zhang, Yongpeng; Li, Linsen; Yang, Yanli; Yang, Xiao; He, Shan; Zhu, Zexuan
2015-06-09
The exponential growth of next generation sequencing (NGS) data has posed big challenges to data storage, management and archiving. Data compression is one of the effective solutions, where reference-based compression strategies can typically achieve superior compression ratios compared to those not relying on any reference. This paper presents a lossless light-weight reference-based compression algorithm, namely LW-FQZip, to compress FASTQ data. The three components of any given input, i.e., metadata, short reads and quality score strings, are first parsed into three data streams in which redundant information is identified and eliminated independently. In particular, well-designed incremental and run-length-limited encoding schemes are utilized to compress the metadata and quality score streams, respectively. To handle the short reads, LW-FQZip uses a novel light-weight mapping model to rapidly map them against external reference sequence(s) and produce concise alignment results for storage. The three processed data streams are then packed together with some general-purpose compression algorithms like LZMA. LW-FQZip was evaluated on eight real-world NGS data sets and achieved compression ratios in the range of 0.111-0.201. This is comparable or superior to other state-of-the-art lossless NGS data compression algorithms. LW-FQZip is a program that enables efficient lossless FASTQ data compression, and it contributes to the state-of-the-art applications for NGS data storage and transmission. LW-FQZip is freely available online at: http://csse.szu.edu.cn/staff/zhuzx/LWFQZip.
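LW-FQZip's exact run-length-limited scheme is not specified in the abstract; a plain run-length encoder with capped run lengths conveys the idea for quality strings:

```python
def rle_encode(qualities, max_run=255):
    """Run-length encode a FASTQ quality string as (char, count) pairs,
    capping counts at max_run so each run fits in one byte (the
    'run-length-limited' idea; LW-FQZip's actual scheme may differ)."""
    out = []
    for ch in qualities:
        if out and out[-1][0] == ch and out[-1][1] < max_run:
            out[-1][1] += 1
        else:
            out.append([ch, 1])
    return [(c, n) for c, n in out]

print(rle_encode("IIIIIHHH###"))   # [('I', 5), ('H', 3), ('#', 3)]
```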
FPGA based charge acquisition algorithm for soft x-ray diagnostics system
NASA Astrophysics Data System (ADS)
Wojenski, A.; Kasprowicz, G.; Pozniak, K. T.; Zabolotny, W.; Byszuk, A.; Juszczyk, B.; Kolasinski, P.; Krawczyk, R. D.; Zienkiewicz, P.; Chernyshova, M.; Czarski, T.
2015-09-01
Soft X-ray (SXR) measurement systems working in tokamaks or with laser-generated plasma can expect high photon fluxes. It is therefore necessary to focus on data processing algorithms to achieve the best possible efficiency in terms of processed photon events per second. This paper describes a recently designed algorithm and data flow for the implementation of charge data acquisition in an FPGA. The algorithms are currently at the implementation stage for the soft X-ray diagnostics system. In addition to the charge processing algorithm, the paper also gives a general firmware overview and describes data storage methods and other key components of the measurement system. The simulation section presents the algorithm's performance and the expected maximum photon rate.
Leveraging social system networks in ubiquitous high-data-rate health systems.
Massey, Tammara; Marfia, Gustavo; Stoelting, Adam; Tomasi, Riccardo; Spirito, Maurizio A; Sarrafzadeh, Majid; Pau, Giovanni
2011-05-01
Social system networks with high data rates and limited storage will discard data if the system cannot connect and upload the data to a central server. We address the challenge of limited storage capacity in mobile health systems during network partitions with a heuristic that achieves efficiency in storage capacity by modifying the granularity of the medical data during long intercontact periods. Patterns in the connectivity, reception rate, distance, and location are extracted from the social system network and leveraged in the global algorithm and online heuristic. In the global algorithm, the stochastic nature of the data is modeled with maximum likelihood estimation based on the distribution of the reception rates. In the online heuristic, the correlation between system position and the reception rate is combined with patterns in human mobility to estimate the intracontact and intercontact time. The online heuristic performs well with a low data loss of 2.1%-6.1%.
Code of Federal Regulations, 2013 CFR
2013-07-01
... dunes, severe wind or soil erosion, frequent flooding, avalanches and areas of unstable geology...-handling, preparation, extraction or storage facilities, and other capital-intensive activities. Costs of...
Code of Federal Regulations, 2012 CFR
2012-07-01
... dunes, severe wind or soil erosion, frequent flooding, avalanches and areas of unstable geology...-handling, preparation, extraction or storage facilities, and other capital-intensive activities. Costs of...
Code of Federal Regulations, 2014 CFR
2014-07-01
... dunes, severe wind or soil erosion, frequent flooding, avalanches and areas of unstable geology...-handling, preparation, extraction or storage facilities, and other capital-intensive activities. Costs of...
Lenz, Bernard N.; Saad, David A.; Fitzpatrick, Faith A.
2003-01-01
The effects of land cover on flooding and base-flow characteristics of Whittlesey Creek, Bayfield County, Wis., were examined in a study that involved ground-water-flow and rainfall-runoff modeling. Field data were collected during 1999-2001 for synoptic base flow, streambed head and temperature, precipitation, continuous streamflow and stream stage, and other physical characteristics. Well logs provided data for potentiometric-surface altitudes and stratigraphic descriptions. Geologic, soil, hydrography, altitude, and historical land-cover data were compiled into a geographic information system and used in two ground-water-flow models (GFLOW and MODFLOW) and a rainfall-runoff model (SWAT). A deep ground-water system intersects Whittlesey Creek near the confluence with the North Fork, producing a steady base flow of 17-18 cubic feet per second. Upstream from the confluence, the creek has little or no base flow; flow is from surface runoff and a small amount of perched ground water. Most of the base flow to Whittlesey Creek originates as recharge through the permeable sands in the center of the Bayfield Peninsula to the northwest of the surface-water-contributing basin. Based on simulations, model-wide changes in recharge caused a proportional change in simulated base flow for Whittlesey Creek. Changing the simulated amount of recharge by 25 to 50 percent in only the ground-water-contributing area results in relatively small changes in base flow to Whittlesey Creek (about 2-11 percent). Simulated changes in land cover within the Whittlesey Creek surface-water-contributing basin would have minimal effects on base flow and average annual runoff, but flood peaks (based on daily mean flows on peak-flow days) could be affected. Based on the simulations, changing the basin land cover to a reforested condition results in a reduction in flood peaks of about 12 to 14 percent for up to a 100-yr flood. Changing the basin land cover to 25 percent urban land or returning basin land cover to the intensive row-crop agriculture of the 1920s results in flood peaks increasing by as much as 18 percent. The SWAT model is limited to a daily time step, which is adequate for describing the surface-water/ground-water interaction and percentage changes. It may not, however, be adequate in describing peak flow because the instantaneous peak flow in Whittlesey Creek during a flood can be more than twice the magnitude of the daily mean flow during that same flood. In addition, the storage and infiltration capacities of wetlands in the basin are not fully understood and need further study.
Towards an Efficient Flooding Scheme Exploiting 2-Hop Backward Information in MANETs
NASA Astrophysics Data System (ADS)
Le, Trong Duc; Choo, Hyunseung
Flooding is an indispensable operation for providing control or routing functionalities to mobile ad hoc networks (MANETs). Previously, many flooding schemes have been studied with the intention of curtailing the problems of severe redundancies, contention, and collisions in traditional implementations. A recent approach with relatively high efficiency is 1HI by Liu et al., which uses only 1-hop neighbor information. The scheme achieves local optimality in terms of the number of retransmission nodes with time complexity Θ(n log n), where n is the number of neighbors of a node; however, this method tends to make many redundant transmissions. In this paper, we present a novel flooding algorithm, 2HBI (2-hop backward information), that efficiently reduces the number of retransmission nodes and solves the broadcast storm problem in ad hoc networks using our proposed concept, "2-hop backward information." The most significant feature of the proposed algorithm is that it does not require any extra communication overhead other than the exchange of 1-hop HELLO messages but maintains high deliverability. Comprehensive computer simulations show that the proposed scheme significantly reduces redundant transmissions relative to 1HI and to pure flooding, by up to 38% and 91%, respectively; accordingly it alleviates contention and collisions in networks.
Modelling Inland Flood Events for Hazard Maps in Taiwan
NASA Astrophysics Data System (ADS)
Ghosh, S.; Nzerem, K.; Sassi, M.; Hilberts, A.; Assteerawatt, A.; Tillmanns, S.; Mathur, P.; Mitas, C.; Rafique, F.
2015-12-01
Taiwan experiences significant inland flooding, driven by torrential rainfall from plum rain storms and typhoons during summer and fall. Over the last 13 to 16 years of record, such floods damaged about 3,000 buildings annually, with losses of US$0.41 billion (Water Resources Agency). This long, narrow island nation with mostly hilly/mountainous topography is located in the tropical-subtropical zone, with an annual average typhoon-hit frequency of 3-4 (Central Weather Bureau) and annual average precipitation of 2502 mm (WRA), 2.5 times the world average. Spatial and temporal distributions of countrywide precipitation are uneven, with very high local extreme rainfall intensities. Annual average precipitation is 3000-5000 mm in the mountainous regions, 78% of which falls in May-October, and the 1-hour to 3-day maximum rainfalls are about 85 to 93% of the world records (WRA). Rivers in Taiwan are short, with small upstream areas and high watershed runoff coefficients. These rivers have the steepest slopes, the shortest response times with rapid flows, and the largest peak flows as well as specific flood peak discharges (WRA) in the world. RMS has recently developed a countrywide inland flood model for Taiwan, producing hazard return period maps at 1 arcsec grid resolution. These can be the basis for evaluating and managing flood risk, its economic impacts, and insured flood losses. The model is initiated with sub-daily historical meteorological forcings and calibrated to daily discharge observations at about 50 river gauges over the period 2003-2013. Simulations of hydrologic processes, via rainfall-runoff and routing models, are subsequently performed based on a 10,000-year set of stochastic forcings. The rainfall-runoff model is a physically based, continuous, semi-distributed model for catchment hydrology. The 1-D wave propagation hydraulic model considers catchment runoff in routing and describes large-scale transport processes along the river; it also accounts for reservoir storage. Major historical flood events have been successfully simulated, along with spatial patterns of flows. Stochastic discharge statistics are also in good agreement with observed statistics from the Hydrological Year Books of Taiwan over all recorded years.
Impact of Prairie Cover on Hydraulic Conductivity and Storm Water Runoff
NASA Astrophysics Data System (ADS)
Herkes, D. M. G.; Gori, A.; Juan, A.
2017-12-01
Houston has long struggled to find effective solutions to its historic flooding problems. Conventional strategies have revolved around constructing hard infrastructure such as levees or regional detention ponds to reduce flood impacts. However, there has been a recent shift toward exploring nature-based solutions for reducing flood impacts. This is due to the cost of structural mechanisms, as well as their failure to adequately protect areas from flooding during the latest flood events. One alternative could be utilizing the natural water retention abilities of native Texas prairies. This study examines the effect of Texas prairie areas in increasing soil infiltration capacities, thereby increasing floodwater storage and reducing surface runoff. For this purpose, an infiltration study of 15 sites was conducted on lands owned by the Katy Prairie Conservancy within the Cypress Creek watershed. Located in northwest Houston, this area has been heavily impacted by recent flood events. Each sampling site was selected to represent a particular land cover or vegetation type, ranging from developed open space to native prairies. Field test results are then compared to literature values of soil infiltration capacity in order to determine the infiltration benefit of each vegetation type. Test results show that certain vegetation, especially prairies, significantly increases the infiltration capacity of the underlying soil. For example, the hydraulic conductivity of prairie on sandy loam soil is approximately an order of magnitude higher than that of the soil itself. Finally, a physics-based hydrologic model is utilized to evaluate the flood reduction potential of native Texas prairie. This model represents Cypress Creek watershed in gridded cell format, and allows varying hydraulic and infiltration parameters at each cell. Design storms are run to obtain flow hydrographs for selected watch points in the study area. Two scenarios are simulated and compared: 1) infiltration capacity from soil only, and 2) the augmented infiltration capacity of soil due to vegetation. Modeled results show a notable decrease in both total runoff volume and peak flows under the augmented infiltration scenario. This decrease demonstrates the benefit of native Texas prairie land in reducing flood risks.
Cloud Based Metalearning System for Predictive Modeling of Biomedical Data
Vukićević, Milan
2014-01-01
Rapid growth and storage of biomedical data have enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand with open-source big data technologies for the analysis of biomedical data. PMID:24892101
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-30
... Reservoir due to AVC and Excess Capacity Master Contract operations and potential contributions to flooding... Southeastern for storage of non-Fry-Ark Project water in Pueblo Reservoir, a feature of the Fry-Ark Project... storage in Pueblo Reservoir for entities within its boundaries in the Upper Arkansas basin, Lower Arkansas...
Variable parameter McCarthy-Muskingum routing method considering lateral flow
NASA Astrophysics Data System (ADS)
Yadav, Basant; Perumal, Muthiah; Bardossy, Andras
2015-04-01
The fully mass-conservative variable parameter McCarthy-Muskingum (VPMM) method, recently proposed by Perumal and Price (2013) for routing floods in channels and rivers without considering lateral flow, is extended herein to account for a uniformly distributed lateral flow contribution along the reach. The proposed procedure is applied to study flood wave movement in a 24.2 km river stretch between the Rottweil and Oberndorf gauging stations of the Neckar River in Germany, where significant lateral flow is contributed by intermediate catchment rainfall during flood wave movement. The geometrical elements of the cross-sections of the considered routing stretch are estimated, without considering lateral flow, using the Robust Parameter Estimation (ROPE) algorithm, which allows arriving at the best-performing set of bed width and side slope for a trapezoidal section. The performance of the VPMM method is evaluated using the Nash-Sutcliffe model efficiency criterion as the objective function to be maximized by the ROPE algorithm. Twenty-seven flood events in the calibration set are used to identify the relationship between total rainfall and total losses, as well as to optimize the geometric characteristics of the prismatic channel (width and side slope of the trapezoidal section). Based on this analysis, a relationship between total rainfall and total loss of the intermediate catchment is obtained and then used to estimate the lateral flow in the reach. Assuming the lateral flow hydrograph has the same form as the inflow hydrograph, and using the total intervening catchment runoff estimated from the relationship, the uniformly distributed lateral flow rate qL at any instant of time is estimated for use in the VPMM routing method. All 27 flood events are simulated using this routing approach considering lateral flow along the reach, and many of the simulations reproduce the observed hydrographs very closely. The proposed approach of accounting for lateral flow in the VPMM method is independently verified by routing the flood hydrographs of 6 flood events that were not used in establishing the total rainfall vs. total loss relationship for the intervening catchment of the studied river reach. Close reproduction of the outflow hydrographs of these independent events using the proposed VPMM method accounting for lateral flow demonstrates the practical utility of the method.
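For readers unfamiliar with Muskingum routing, the sketch below shows the textbook fixed-parameter scheme with lateral inflow simply folded into the upstream hydrograph; the VPMM method additionally recomputes its parameters at every time step from the flow itself, which this sketch does not attempt:

```python
def muskingum_route(inflow, lateral, k_hr, x, dt_hr, q0):
    """Fixed-parameter Muskingum routing with a uniformly distributed
    lateral inflow added to the upstream hydrograph.  k_hr: storage
    constant, x: weighting factor (0-0.5), q0: initial outflow."""
    denom = k_hr * (1 - x) + 0.5 * dt_hr
    c1 = (0.5 * dt_hr - k_hr * x) / denom      # weight on current total inflow
    c2 = (0.5 * dt_hr + k_hr * x) / denom      # weight on previous total inflow
    c3 = (k_hr * (1 - x) - 0.5 * dt_hr) / denom  # weight on previous outflow
    total_in = [i + q for i, q in zip(inflow, lateral)]
    out = [q0]
    for t in range(1, len(total_in)):
        out.append(c1 * total_in[t] + c2 * total_in[t - 1] + c3 * out[-1])
    return out
```

The three coefficients sum to one, which is what makes the scheme mass-conservative for a constant-parameter reach.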
NASA Astrophysics Data System (ADS)
Ganguly, S.; Kumar, U.; Nemani, R. R.; Kalia, S.; Michaelis, A.
2016-12-01
In this work, we use a Fully Constrained Least Squares Subpixel Learning Algorithm to unmix global WELD (Web Enabled Landsat Data) to obtain fractions or abundances of substrate (S), vegetation (V) and dark object (D) classes. Because of the sheer scale of the data and compute needs, we leveraged the NASA Earth Exchange (NEX) high-performance computing architecture to optimize and scale our algorithm for large-scale processing. Subsequently, the S-V-D abundance maps were characterized into 4 classes, namely forest, farmland, water and urban areas (with NPP-VIIRS, the National Polar-orbiting Partnership Visible Infrared Imaging Radiometer Suite, nighttime lights data) over California, USA, using a Random Forest classifier. Validation of these land cover maps against NLCD (National Land Cover Database) 2011 products and NAFD (North American Forest Dynamics) static forest cover maps showed that an overall classification accuracy of over 91% was achieved, a 6% improvement of unmixing-based classification relative to per-pixel classification. As such, abundance maps continue to offer a useful alternative to classification maps derived from high-spatial-resolution data for forest inventory analysis, multi-class mapping for eco-climatic models and applications, fast multi-temporal trend analysis, and societal and policy-relevant applications at the watershed scale.
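A common way to implement FCLS is to solve a non-negative least-squares problem with the sum-to-one constraint enforced by a heavily weighted row of ones; the sketch below uses that standard device (whether the paper's implementation does the same is an assumption):

```python
import numpy as np
from scipy.optimize import nnls

def fcls_unmix(endmembers, pixel, delta=1e3):
    """Fully constrained least-squares unmixing: min ||E a - x||^2 subject to
    a >= 0 and sum(a) = 1.  Endmember spectra are the columns of E; the
    sum-to-one constraint is imposed via an augmented, heavily weighted row."""
    e_aug = np.vstack([endmembers, delta * np.ones(endmembers.shape[1])])
    x_aug = np.append(pixel, delta)
    abundances, _ = nnls(e_aug, x_aug)
    return abundances

# Toy example: 3 endmembers (substrate, vegetation, dark) over 4 bands.
E = np.array([[0.4, 0.1, 0.02],
              [0.5, 0.3, 0.02],
              [0.5, 0.2, 0.01],
              [0.6, 0.5, 0.01]])
x = 0.6 * E[:, 0] + 0.3 * E[:, 1] + 0.1 * E[:, 2]
print(fcls_unmix(E, x))   # approximately [0.6, 0.3, 0.1]
```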
An unstructured grid, three-dimensional model based on the shallow water equations
Casulli, V.; Walters, R.A.
2000-01-01
A semi-implicit finite difference model based on the three-dimensional shallow water equations is modified to use unstructured grids. There are obvious advantages in using unstructured grids in problems with a complicated geometry. In this development, the concept of unstructured orthogonal grids is introduced and applied to this model. The governing differential equations are discretized by means of a semi-implicit algorithm that is robust, stable and very efficient. The resulting model is relatively simple, conserves mass, can fit complicated boundaries and yet is sufficiently flexible to permit local mesh refinements in areas of interest. Moreover, the simulation of the flooding and drying is included in a natural and straightforward manner. These features are illustrated by a test case for studies of convergence rates and by examples of flooding on a river plain and flow in a shallow estuary. Copyright © 2000 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Ballesteros-Cánovas, Juan Antonio; Stoffel, Markus; Trappmann, Daniel; Shekhar, Mayank; Bhattacharyya, Amalava
2016-04-01
Floods are a common natural hazard in the Western Indian Himalayas. They usually occur when humid monsoon air is lifted along the Himalayan relief, creating intense orographic rainfall and runoff, a process often enhanced by simultaneous snowmelt. Monsoon floods are considered a major threat in the region and frequently affect inhabited valleys, disturbing the status quo of communities and stressing the future welfare and condition of their economic development. Given the assumption that ongoing and future climatic changes may impact monsoon patterns and extreme precipitation, the implementation of adaptation policies in this region is critically needed in order to improve the local resilience of Himalayan communities. However, their successful implementation is highly dependent on system knowledge and hence on reliable baseline data of past disasters. In this communication, we demonstrate how newly gained knowledge of past flood incidents may improve flood hazard and risk assessments. Based on growth-ring analysis of trees growing in the floodplains and other, more classical paleo-hydrology techniques, we reconstruct the regional flood activity for the last decades. This information is then included as non-systematic data in the regional flood frequency analysis using Bayesian Markov Chain Monte Carlo algorithms, so as to analyse the impact of the additional data on flood hazard assessments. Moreover, through a detailed analysis of three flood risk hotspots, we demonstrate how the newly gained knowledge of past flood disasters derived from indirect proxies can explain failures in the implementation of disaster risk management (DRM). Our methodology allowed the identification of thirty-four unrecorded flood events at the study sites located in the upper reaches since the early 20th century, and thus the completion of the existing flood history of the region based on flow measurements in the lower part of the catchment. We observe that 56% of the floods occurred simultaneously in more than two catchments, and that in 15% of the cases more than four catchments were affected. By contrast, 44% of event years were related to one specific catchment, corroborating the assumption that both large-scale atmospheric conditions and specific weather and/or geomorphic conditions may operate as triggers of floods in Kullu district. The inclusion of peak discharge data related to these ungauged extreme flood events in the regional flood frequency analysis showed that flood hazard had been systematically underestimated. Our results highlight the potential causes of three paradigmatic flood disaster incidents in Kullu district, suggesting that the lack of knowledge of past flood disasters can play an important role in DRM at three actor levels, i.e., civil engineers, local authorities and inhabitants. These observations show that reliable DRM implementation is constrained by the lack of data needed to characterize the flood process, and they therefore underscore the value of the palaeohydrological approach used in this study.
The Optimization of Automatically Generated Compilers.
1987-01-01
... than their procedural counterparts, and are also easier to analyze for storage optimizations; (2) AGs can be algorithmically checked to be non-circular ... providing algorithms to move the storage for many attributes from the structure tree into global stacks and variables ... creating AEs which build ...
De Vleeschauwer, K; Weustenraad, J; Nolf, C; Wolfs, V; De Meulder, B; Shannon, K; Willems, P
2014-01-01
Urbanization and climate change trends put strong pressures on urban water systems. Temporal variations in rainfall, runoff and water availability increase, and need to be compensated for by innovative adaptation strategies. One of these is stormwater retention and infiltration in open and/or green spaces in the city (blue-green water integration). This study evaluated the efficiency of three adaptation strategies for the city of Turnhout in Belgium, namely source control as a result of blue-green water integration, retention basins located downstream of the stormwater sewers, and end-of-pipe solutions based on river flood control reservoirs. The efficiency of these options is quantified by the reduction in sewer and river flood frequencies and volumes, and in sewer overflow volumes. This is done by means of long-term simulations (100-year rainfall simulations) using an integrated conceptual sewer-river model calibrated to full hydrodynamic sewer and river models. Results show that combining open, green zones in the city with stormwater retention and infiltration for only 1% of the total city runoff area would lead to a 30 to 50% reduction in sewer flood volumes for return periods in the range of 10-100 years. This is due to the additional surface storage and infiltration and the consequent reduction in urban runoff. However, the impact of this source control option on downstream river floods is limited. Stormwater retention downstream of the sewer system gives a strong reduction in peak discharges to the receiving river. However, due to the difference in response time between the sewer and river systems, this does not lead to a strong reduction in river flood frequency. The paper shows the importance of improving the interface between urban design and water management, and between sewer and river flood management.
Motor Control and Regulation for a Flywheel Energy Storage System
NASA Technical Reports Server (NTRS)
Kenny, Barbara; Lyons, Valerie
2003-01-01
This talk will focus on the motor control algorithms used to regulate the flywheel system at the NASA Glenn Research Center. First, a discussion of the inner-loop torque control technique will be given. It is based on the principle of field orientation and is implemented without a position or speed sensor (sensorless control). Then the outer-loop charge and discharge algorithm will be presented. This algorithm controls the acceleration of the flywheel during charging and the deceleration while discharging. The algorithm also allows the flywheel system to regulate the DC bus voltage during the discharge cycle.
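A plausible shape for the outer loop, under our own assumptions about signs and gains rather than the Glenn implementation, is a mode switch between acceleration tracking (charge) and PI regulation of the DC-bus voltage via braking torque (discharge):

```python
def flywheel_torque_command(mode, bus_voltage, bus_setpoint, accel_cmd,
                            inertia, integrator, kp=0.05, ki=0.5, dt=1e-3):
    """Outer-loop torque command feeding a field-oriented inner torque loop
    (a hypothetical sketch).  Charge: torque follows a commanded angular
    acceleration (T = J * alpha).  Discharge: a PI loop on DC-bus voltage
    commands braking torque, so the stored kinetic energy holds the bus at
    its setpoint as the wheel spins down."""
    if mode == "charge":
        return inertia * accel_cmd, integrator
    error = bus_setpoint - bus_voltage          # bus sags when the load draws power
    integrator += ki * error * dt
    return -(kp * error + integrator), integrator   # negative = braking torque
```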
NASA Astrophysics Data System (ADS)
Lissak, Candide; Fort, Monique; Arnaud-Fassetta, Gilles; Mathieu, Alexandre; Malet, Jean-Philippe; Carlier, Benoit; Betard, François; Cossart, Etienne; Madelin, Malika; Viel, Vincent; Charney, Bérengère; Bletterie, Xavier
2014-05-01
The Guil River catchment (Queyras, Southern French Alps) is prone to hydro-geomorphic hazards related to catastrophic floods, with an amplification of their impacts due to strong hillslope-channel connectivity, as in 1957 (> R.I. 100 yr) and, more recently, in 2000 (R.I. 30 yr). In both cases, the rainfall intensity, aggravated by pre-existing saturated soils, explained the immediate response of the fluvial system and the subsequent destabilisation of slopes. This resulted in serious damage to infrastructure and buildings in the valley bottom, mostly along some specific reaches and confluences with debris-flow-prone tributaries. After each event, new protective structures are built. One of the purposes of this study, undertaken in the framework of the SAMCO (ANR) project, was to understand the hydro-geomorphological functioning of this upper Alpine catchment in a context of hazard mitigation and sustainable management of sediment yield, transfer and deposition. To determine the main sediment storages that could be mobilised during the next major hydro-meteorological events, the first step of our study consists of the identification and characterisation of areas that play a role in sediment transfer. From environmental characteristics (channel geometry, vegetation cover…) and anthropogenic factors (hydraulic infrastructure, urban development…), a semi-automatic method provides a typology of contributing areas with sediment storages sensitive to erosion, or areas that will be prone to deposition of sediments during the next flooding event. The second step of the study focuses on the sediment storages, with their characterisation and connectivity to the trunk channel. Taking into account the entire catchment, including the torrential system, this phase analyses the sedimentary transfers, from the identification and classification of sediment storages to the evaluation of the degree of connectivity with the main or secondary channels. The proposed methodology is based on data directly derived from GIS analysis using interpretation of aerial photographs, a regional-scale Digital Elevation Model (DEM), a high-resolution DEM derived from airborne LiDAR, and field survey. The data thus obtained can be used in the final geomorphological map. Future investigations will quantify the contribution of each sub-catchment to the global sediment budget of the Guil catchment. For a better assessment of sediment fluxes and sediment delivery into the main channel network, tracers (pit-tags) and diachronic Terrestrial Laser Scanning will be deployed in selected sub-catchments in order to measure erosion rates and contributions to the sediment yield in the valley bottoms during floods, avalanches and seasonal rainfall events.
NASA Astrophysics Data System (ADS)
Martinez-Gutierrez, Genaro
Baja California Sur (Mexico), like mainland Mexico, is affected by tropical cyclones originating in the eastern North Pacific. Historical records show that Baja has been damaged by intense summer storms. An arid to semiarid climate characterizes the study area, where precipitation mainly occurs during the summer and winter seasons. Natural and anthropogenic changes have impacted the landscape of southern Baja. The present research documents the effects of tropical storms over the southern region of Baja California for a period of approximately twenty-six years. The goal of the research is to demonstrate how remote sensing can be used to detect the important effects of tropical storms by (a) evaluating change detection algorithms and (b) delineating changes to the landscape, including coastal modification, fluvial erosion and deposition, vegetation change, and river avulsion. Digital image processing methods applied to temporal Landsat remotely sensed data from the North America Landscape Characterization (NALC) archive, Thematic Mapper (TM), and Enhanced Thematic Mapper (ETM) images were used to document the landscape change. Two image processing methods were tested: image differencing (ID) and principal component analysis (PCA). Landscape changes identified with the NALC archive and TM images showed that the major changes included a rapid change of land use in the towns of San Jose del Cabo and Cabo San Lucas between 1973 and 1986. The features detected using the algorithms included flood deposits within the channels of active streams, erosion banks, and new channels caused by channel avulsion. Despite the 19-year period covered by the NALC data and the approximately 10-year intervals between acquisition dates, changed features could still be identified in the images. The TM images showed that flooding from Hurricane Isis (1998) produced large new deposits within the stream channels. This research has shown that remote sensing based change detection can delineate the effects of flooding on the landscape at scales down to the nominal resolution of the sensor. These findings indicate that many other applications for change detection are both viable and important, including disaster response, flood hazard planning, geomorphic studies, and water supply management in deserts.
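For readers unfamiliar with the ID method, a minimal sketch of image differencing follows, assuming two co-registered bands and a threshold of k standard deviations; the arrays and the threshold value are illustrative, not the study's calibration.

```python
import numpy as np

def image_difference_change(band_t1, band_t2, k=2.0):
    """Flag changed pixels where the band difference deviates more than
    k standard deviations from the mean difference (a common ID rule)."""
    diff = band_t2.astype(float) - band_t1.astype(float)
    mu, sigma = diff.mean(), diff.std()
    return np.abs(diff - mu) > k * sigma

# toy example with synthetic 'images'
rng = np.random.default_rng(0)
t1 = rng.normal(100, 5, (50, 50))
t2 = t1.copy()
t2[10:20, 10:20] += 40          # simulate a new flood deposit
mask = image_difference_change(t1, t2)
print(mask.sum(), "changed pixels")
```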
Deciphering flood frequency curves from a coupled human-nature system perspective
NASA Astrophysics Data System (ADS)
Li, H. Y.; Abeshu, G. W.; Wang, W.; Ye, S.; Guo, J.; Bloeschl, G.; Leung, L. R.
2017-12-01
Most previous studies and applications in deriving or applying flood frequency curves (FFCs) are underpinned by the stationarity assumption. To examine the theoretical robustness of this basic assumption, we analyzed the observed FFCs at hundreds of catchments in the contiguous United States along gradients of climate conditions and human influences. The shape of FFCs is described using three similarity indices: mean annual flood (MAF), coefficient of variation (CV), and a seasonality index defined using circular statistics. The characteristics of catchments are quantified with a small number of dimensionless indices, including in particular: (1) the climatic aridity index, AI, a measure of the competition between energy and water availability; and (2) a reservoir impact index, defined as the total upstream reservoir storage capacity normalized by the annual streamflow volume. The linkages between these two sets of indices are then explored based on a combination of mathematical derivations from the Budyko formula, simple but physically based reservoir operation models, and other auxiliary data. It is found that the shape of FFCs shifts along the gradient from arid to humid climates, and from periods of weak human influence to periods of strong influence. The seasonality of floods is found to be largely controlled by the synchronization between the seasonal cycles of precipitation and solar radiation in pristine catchments, but also by the reservoir regulation capacity in managed catchments. Our findings may help improve flood-risk assessment and mitigation in both natural and regulated river systems across various climate gradients.
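A minimal sketch of how such similarity indices can be computed from annual maxima follows; the circular-statistics seasonality measure shown (mean resultant length and mean date) is one common choice, and the sample values are invented.

```python
import numpy as np

def ffc_similarity_indices(peaks, peak_doy):
    """Compute MAF, CV and a circular seasonality index from annual maximum
    floods and their days of occurrence (day-of-year)."""
    maf = peaks.mean()
    cv = peaks.std(ddof=1) / maf
    theta = 2 * np.pi * peak_doy / 365.25        # map dates onto the unit circle
    x, y = np.cos(theta).mean(), np.sin(theta).mean()
    r = np.hypot(x, y)                           # 0 = uniform, 1 = same date every year
    mean_date = (np.arctan2(y, x) % (2 * np.pi)) * 365.25 / (2 * np.pi)
    return maf, cv, r, mean_date

peaks = np.array([120., 340., 95., 410., 180., 260.])
doy = np.array([152, 160, 170, 148, 158, 165])   # early-summer floods
print(ffc_similarity_indices(peaks, doy))
```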
Bassett Creek Watershed, Hennepin County, Minnesota. Feasibility Report for Control. Appendixes.
1976-03-01
maintenance of the creek corridor. The local interests objected to any plan that would impair the aesthetics of the creek. The needs of the watershed with... OPEN CHANNEL CORRIDOR TO THE MISSISSIPPI RIVER (Alternate 5-E) .... D-26; COMBINATIONS OF NONSTRUCTURAL AND STRUCTURAL ALTERNATIVES... AND DEEP TUNNEL (Alternate 6-D) .... D-30; FLOOD STORAGE AND FLOOD PROOFING WITH AN OPEN SPACE-- OPEN CHANNEL CORRIDOR TO THE
Sacramento Metropolitan Area, California
1992-02-01
restriction would apply to virtually all of West Sacramento. Future conditions in the bypass areas are expected to remain essentially the same. During...frequency, the stage-frequency curve in the study area essentially becomes flat because of the large storage volume behind upstream levee breaches. This curve...and 400-year flood plains are also essentially the same (15 to 16 feet) because of the following: 1) the flood volume for each event is sufficient to
Distributed Economic Dispatch in Microgrids Based on Cooperative Reinforcement Learning.
Liu, Weirong; Zhuang, Peng; Liang, Hao; Peng, Jun; Huang, Zhiwu
2018-06-01
Microgrids incorporating distributed generation (DG) units and energy storage (ES) devices are expected to play increasingly important roles in future power systems. Yet, achieving efficient distributed economic dispatch in microgrids is a challenging issue due to the randomness and nonlinear characteristics of DG units and loads. This paper proposes a cooperative reinforcement learning algorithm for distributed economic dispatch in microgrids. Utilizing the learning algorithm avoids the difficulty of stochastic modeling and high computational complexity. In the cooperative reinforcement learning algorithm, function approximation is leveraged to deal with large and continuous state spaces, and a diffusion strategy is incorporated to coordinate the actions of DG units and ES devices. Based on the proposed algorithm, each node in the microgrid only needs to communicate with its local neighbors, without relying on any centralized controller. Algorithm convergence is analyzed, and simulations based on real-world meteorological and load data are conducted to validate the performance of the proposed algorithm.
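The following toy sketch illustrates the flavor of diffusion-coordinated learning, assuming linear value-function approximation, a ring communication topology, and an invented local cost signal; it is not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n_agents, n_features = 4, 3
# ring topology: each agent averages with its two neighbours (diffusion step)
A = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    A[i, i] = A[i, (i - 1) % n_agents] = A[i, (i + 1) % n_agents] = 1 / 3

w = rng.normal(size=(n_agents, n_features))   # per-agent value weights
alpha, gamma = 0.05, 0.9

def phi(state):                               # hypothetical feature map
    return np.array([1.0, state, state ** 2])

for step in range(2000):
    for i in range(n_agents):
        s = rng.uniform(0, 1)                 # local state (e.g. net demand)
        s_next = np.clip(s + rng.normal(0, 0.1), 0, 1)
        r = -abs(s - 0.5)                     # toy dispatch cost signal
        td = r + gamma * phi(s_next) @ w[i] - phi(s) @ w[i]
        w[i] += alpha * td * phi(s)           # local TD(0) adaptation step
    w = A @ w                                 # diffusion: combine with neighbours
print(w)
```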
England, John F.; Salas, José D.; Jarrett, Robert D.
2003-01-01
The expected moments algorithm (EMA) [Cohn et al., 1997] and the Bulletin 17B [Interagency Committee on Water Data, 1982] historical weighting procedure (B17H) for the log Pearson type III distribution are compared by Monte Carlo computer simulation for cases in which historical and/or paleoflood data are available. The relative performance of the estimators was explored for three cases: fixed‐threshold exceedances, a fixed number of large floods, and floods generated from a different parent distribution. EMA can effectively incorporate four types of historical and paleoflood data: floods where the discharge is explicitly known, unknown discharges below a single threshold, floods with unknown discharge that exceed some level, and floods with discharges described in a range. The B17H estimator can utilize only the first two types of historical information. Including historical/paleoflood data in the simulation experiments significantly improved the quantile estimates in terms of mean square error and bias relative to using gage data alone. EMA performed significantly better than B17H in nearly all cases considered. B17H performed as well as EMA for estimating X100 in some limited fixed‐threshold exceedance cases. EMA performed comparatively much better in other fixed‐threshold situations, for the single large flood case, and in cases when estimating extreme floods equal to or greater than X500. B17H did not fully utilize historical information when the historical period exceeded 200 years. Robustness studies using GEV‐simulated data confirmed that EMA performed better than B17H. Overall, EMA is preferred to B17H when historical and paleoflood data are available for flood frequency analysis.
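For orientation, here is a minimal sketch of a systematic-record-only log Pearson type III quantile estimate using sample moments of the log flows; it involves no historical weighting, so it is neither EMA nor B17H, and the peak values are invented.

```python
import numpy as np
from scipy.stats import pearson3, skew

def lp3_quantile(peaks, p):
    """Estimate the p-quantile flood (e.g. p=0.99 for X100) by fitting a
    log-Pearson type III distribution with sample moments of log10 flows."""
    x = np.log10(peaks)
    g = skew(x, bias=False)
    q = pearson3.ppf(p, g, loc=x.mean(), scale=x.std(ddof=1))
    return 10 ** q

peaks = np.array([310., 520., 150., 890., 260., 430., 700., 190., 360., 1200.])
print(f"X100 ~ {lp3_quantile(peaks, 0.99):.0f}")
```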
Spieker, Andrew Maute
1970-01-01
Water management can be an integral part of urban comprehensive planning in a large metropolitan area. Water both imposes constraints on land use and offers opportunities for coordinated land and water management. Salt Creek basin in Cook and Du Page Counties of the Chicago metropolitan area is typical of rapidly developing suburban areas and has been selected to illustrate some of these constraints and opportunities and to suggest the effects of alternative solutions. The present study concentrates on the related problems of ground-water recharge, water quality, management of flood plains, and flood-control measures. Salt Creek basin has a drainage area of 150 square miles. It is in flat to gently rolling terrain, underlain by glacial drift as much as 200 feet thick which covers a dolomite aquifer. In 1964, the population of the basin was about 400,000, and 40 percent of the land was in urban development. The population is expected to number 550,000 to 650,000 by 1990, and most of the land will be taken by urban development. Salt Creek is a sluggish stream, typical of small drainage channels in the headwaters area of northeastern Illinois. Low flows of 15 to 25 cubic feet per second in the lower part of the basin consist largely of sewage effluent. Nearly all the public water supplies in the basin depend on ground water. Of the total pumpage of 27.5 million gallons per day, 17.5 million gallons per day is pumped from the deep (Cambrian-Ordovician) aquifers and 10 million gallons per day is pumped from the shallow (Silurian dolomite and glacial drift) aquifers. The potential yield of the shallow aquifers, particularly glacial drift in the northern part of the basin, far exceeds present use. The largest concentration of pumpage from the shallow aquifers is in the Hinsdale-La Grange area. Salt Creek serves as an important source of recharge to these supplies, particularly just east of Hinsdale. The entire reach of Salt Creek south and east of Elmhurst can be regarded as an area of potential recharge to the shallow aquifers. Preservation of the effectiveness of these potential recharge areas should be considered in land-use planning. Salt Creek is polluted in times of both low and high flow. Most communities in the basin in Du Page County discharge their treated sewage into the creek, whereas those in Cook County transfer their sewage to plants of the Metropolitan Sanitary District outside the basin. During periods of high runoff, combined storm runoff and overflow from sanitary sewers enter the creek. Such polluted water detracts from the stream's esthetic and recreational potential and poses a threat to ground-water supplies owing to induced recharge of polluted water to shallow aquifers. Alternative approaches to the pollution problem include improvement of the degree of sewage treatment, detention and treatment of storm runoff, dilution of sewage through flow augmentation, or transfer of sewage from the basin to a central treatment plant. To result in an enhanced environment, the streambed would have to be cleansed of accumulated sludge deposits. The overbank flooding in Salt Creek basin every 2 to 3 years presents problems because of encroachments and developments on the flood plains. Flood plains in an urban area can be managed by identifying them, by recognizing that either their natural storage capacity or equivalent artificial capacity is needed to accommodate floods, and by planning land use accordingly.
Examples of effective floodplain management include (1) preservation of greenbelts or regional parks along stream courses, (2) use of flood plains for recreation, parking lots, or other low-intensity uses, (3) use of flood-proofed commercial buildings, and (4) provision for compensatory storage to replace natural storage capacity. Results of poor flood-plain management include uncontrolled residential development and encroachment by fill into natural storage areas where no compensatory storage has been provided.
Edge-Based Efficient Search over Encrypted Data Mobile Cloud Storage.
Guo, Yeting; Liu, Fang; Cai, Zhiping; Xiao, Nong; Zhao, Ziming
2018-04-13
Smart sensor-equipped mobile devices sense, collect, and process data generated by the edge network to achieve intelligent control, but such mobile devices usually have limited storage and computing resources. Mobile cloud storage provides a promising solution owing to its rich storage resources, great accessibility, and low cost. But it also brings a risk of information leakage. The encryption of sensitive data is the basic step to resist the risk. However, deploying a high complexity encryption and decryption algorithm on mobile devices will greatly increase the burden of terminal operation and the difficulty to implement the necessary privacy protection algorithm. In this paper, we propose ENSURE (EfficieNt and SecURE), an efficient and secure encrypted search architecture over mobile cloud storage. ENSURE is inspired by edge computing. It allows mobile devices to offload the computation intensive task onto the edge server to achieve a high efficiency. Besides, to protect data security, it reduces the information acquisition of untrusted cloud by hiding the relevance between query keyword and search results from the cloud. Experiments on a real data set show that ENSURE reduces the computation time by 15% to 49% and saves the energy consumption by 38% to 69% per query.
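A bare-bones sketch of keyword trapdoors over an outsourced index conveys the basic idea of encrypted search, though it omits ENSURE's edge offloading and its hiding of the keyword-result relevance; the key and documents below are invented.

```python
import hmac
import hashlib

def trapdoor(key: bytes, keyword: str) -> bytes:
    """Deterministic keyword tag; the server never sees the plaintext word."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).digest()

def build_index(key, docs):
    index = {}
    for doc_id, words in docs.items():
        for w in words:
            index.setdefault(trapdoor(key, w), set()).add(doc_id)
    return index                          # outsourced to the cloud/edge

def search(index, key, query):
    return index.get(trapdoor(key, query), set())

key = b"client-secret-key-0000000000000"
docs = {"d1": {"flood", "sensor"}, "d2": {"sensor", "storage"}}
idx = build_index(key, docs)
print(search(idx, key, "sensor"))         # {'d1', 'd2'}
```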
A Hybrid Shared-Memory Parallel Max-Tree Algorithm for Extreme Dynamic-Range Images.
Moschini, Ugo; Meijster, Arnold; Wilkinson, Michael H F
2018-03-01
Max-trees, or component trees, are graph structures that represent the connected components of an image in a hierarchical way. Nowadays, many application fields rely on images with high dynamic range or floating-point values. Efficient sequential algorithms exist to build trees and compute attributes for images of any bit depth. However, we show that the current parallel algorithms perform poorly already with integers at bit depths higher than 16 bits per pixel. We propose a parallel method combining the two worlds of flooding and merging max-tree algorithms. First, a pilot max-tree of a quantized version of the image is built in parallel using a flooding method. Later, this structure is used in a parallel leaf-to-root approach to compute the final max-tree efficiently and to drive the merging of the sub-trees computed by the threads. We present an analysis of the performance both on simulated and actual 2D images and 3D volumes. Execution times improve on those of the fastest sequential algorithm, and speed-up scales up to 64 threads.
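For reference, a compact sequential union-find max-tree in the merging style (a baseline of the kind the paper improves upon, not the proposed parallel algorithm) might look as follows, assuming 4-connectivity.

```python
import numpy as np

def max_tree(img):
    """Sequential union-find max-tree, processing pixels from brightest to
    darkest; returns a canonical parent array over flat pixel indices."""
    h, w = img.shape
    f = img.ravel()
    order = np.argsort(f, kind="stable")[::-1]          # decreasing grey level
    parent = np.full(f.size, -1, dtype=np.int64)
    zpar = np.full(f.size, -1, dtype=np.int64)          # union-find forest

    def find(x):
        while zpar[x] != x:
            zpar[x] = zpar[zpar[x]]                     # path halving
            x = zpar[x]
        return x

    for p in order:
        parent[p] = zpar[p] = p
        y, x = divmod(p, w)
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and zpar[ny * w + nx] != -1:
                r = find(ny * w + nx)
                if r != p:
                    parent[r] = p                       # attach brighter component
                    zpar[r] = p
    for p in order[::-1]:                               # canonicalization pass
        q = parent[p]
        if f[parent[q]] == f[q]:
            parent[p] = parent[q]
    return parent

img = np.array([[1, 1, 3], [1, 5, 3], [1, 1, 1]])
print(max_tree(img).reshape(img.shape))
```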
NASA Astrophysics Data System (ADS)
Delaney, C.; Mendoza, J.; Whitin, B.; Hartman, R. K.
2017-12-01
Ensemble Forecast Operations (EFO) is a risk-based approach to reservoir flood operations that incorporates ensemble streamflow predictions (ESPs) made by NOAA's California-Nevada River Forecast Center (CNRFC). With the EFO approach, each member of an ESP is individually modeled to forecast system conditions and calculate the risk of reaching critical operational thresholds. Reservoir release decisions are computed which seek to keep forecasted risk within established risk tolerance levels. A water management model was developed for Lake Mendocino, a 111,000 acre-foot reservoir located near Ukiah, California, to evaluate the viability of the EFO alternative to improve water supply reliability without increasing downstream flood risk. Lake Mendocino is a dual-use reservoir, owned and operated for flood control by the United States Army Corps of Engineers and operated for water supply by the Sonoma County Water Agency. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has suffered from water supply reliability issues since 2007. The EFO alternative was simulated using a 26-year (1985-2010) ESP hindcast generated by the CNRFC, which approximates flow forecasts for 61 ensemble members over a 15-day horizon. Model simulation results of the EFO alternative demonstrate a 36% increase in median end-of-water-year (September 30) storage levels over existing operations. Additionally, model results show no increase in the occurrence of flows above flood stage at points downstream of Lake Mendocino. This investigation demonstrates that the EFO alternative may be a viable approach for managing Lake Mendocino for multiple purposes (water supply, flood mitigation, ecosystems) and warrants further investigation through additional modeling and analysis.
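A schematic of the risk-based release logic follows, assuming an invented ensemble, a simple storage balance, and a 10% risk tolerance; the real EFO procedure is considerably richer.

```python
import numpy as np

def efo_release(storage, inflow_ensemble, flood_pool, risk_tolerance, max_release):
    """Choose the smallest release such that the forecasted probability of
    encroaching on the flood pool stays within the risk tolerance."""
    for release in np.linspace(0.0, max_release, 50):
        end_storage = (storage + inflow_ensemble.sum(axis=1)
                       - release * inflow_ensemble.shape[1])
        risk = np.mean(end_storage > flood_pool)   # fraction of members exceeding
        if risk <= risk_tolerance:
            return release, risk
    return max_release, risk

rng = np.random.default_rng(2)
ensemble = rng.gamma(2.0, 300.0, size=(61, 15))    # 61 members, 15-day horizon [af/day]
print(efo_release(storage=90_000, inflow_ensemble=ensemble,
                  flood_pool=111_000, risk_tolerance=0.10, max_release=4_000))
```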
Regional interdisciplinary paleoflood approach to assess extreme flood potential
Jarrett, Robert D.; Tomlinson, Edward M.
2000-01-01
In the past decade, there has been growing interest among dam safety officials in incorporating risk-based analysis for design-flood hydrology. Extreme or rare floods, with probabilities in the range of about 10⁻³ to 10⁻⁷ chance of occurrence per year, are of continuing interest to the hydrologic and engineering communities for purposes of planning and design of structures such as dams [National Research Council, 1988]. The National Research Council stresses that as much information as possible about floods needs to be used for evaluation of the risk and consequences of any decision. A regional interdisciplinary paleoflood approach was developed to assist dam safety officials and floodplain managers in their assessments of the risk of large floods. The interdisciplinary components included documenting maximum paleofloods and a regional analysis of contemporary extreme rainfall and flood data to complement a site-specific probable maximum precipitation study [Tomlinson and Solak, 1997]. The cost-effective approach, which can be used in many other hydrometeorologic settings, was applied to Elkhead Reservoir on Elkhead Creek (531 km²) in northwestern Colorado; the regional study area was 10,900 km². Paleoflood data using bouldery flood deposits and noninundation surfaces for 88 streams were used to document maximum flood discharges that have occurred during the Holocene. Several relative dating methods were used to determine the age of paleoflood deposits and noninundation surfaces. No evidence of substantial flooding was found in the study area. The maximum paleoflood of 135 m³ s⁻¹ for Elkhead Creek is about 13% of the site-specific probable maximum flood of 1020 m³ s⁻¹. Flood-frequency relations using the expected moments algorithm, which better incorporates paleoflood data, were developed to assess the risk of extreme floods. Envelope curves encompassing maximum rainfall (181 sites) and floods (218 sites) were developed for northwestern Colorado to help place maximum contemporary and Holocene flooding in Elkhead Creek in a regional frequency context. Study results for Elkhead Reservoir were accepted by the Colorado State Engineer for dam safety certification.
da Silva Ferreira, Cristiane; Piedade, Maria Teresa Fernandez; Tiné, Marco Aurélio Silva; Rossatto, Davi Rodrigo; Parolin, Pia; Buckeridge, Marcos Silveira
2009-01-01
Background and Aims In the Amazonian floodplains plants withstand annual periods of flooding which can last 7 months. Under these conditions seedlings remain submerged in the dark for long periods since light penetration in the water is limited. Himatanthus sucuuba is a tree species found in the ‘várzea’ (VZ) floodplains and adjacent non-flooded ‘terra-firme’ (TF) forests. Biochemical traits which enhance flood tolerance and colonization success of H. sucuuba in periodically flooded environments were investigated. Methods Storage carbohydrates of seeds of VZ and TF populations were extracted and analysed by HPAEC/PAD. Starch was analysed by enzyme (glucoamylase) degradation followed by quantification of glucose oxidase. Carbohydrate composition of roots of VZ and TF seedlings was studied after experimental exposure to a 15-d period of submersion in light versus darkness. Key Results The endosperm contains a large proportion of the seed reserves, raffinose being the main non-structural carbohydrate. Around 93 % of the cell wall storage polysaccharides (percentage dry weight basis) in the endosperm of VZ seeds was composed of mannose, while soluble sugars accounted for 2·5%. In contrast, 74 % of the endosperm in TF seeds was composed of galactomannans, while 22 % of the endosperm was soluble sugars. This suggested a larger carbohydrate allocation to germination in TF populations whereas VZ populations allocate comparatively more to carbohydrates mobilized during seedling development. The concentration of root non-structural carbohydrates in non-flooded seedlings strongly decreased after a 15-d period of darkness, whereas flooded seedlings were less affected. These effects were more pronounced in TF seedlings, which showed significantly lower root non-structural carbohydrate concentrations. Conclusions There seem to be metabolic adjustments in VZ but not TF seedlings that lead to adaptation to the combined stresses of darkness and flooding. This seems to be important for the survival of the species in these contrasting environments, leading these populations to different directions during evolution. PMID:19770164
Hydraulics of epiphreatic flow of a karst aquifer
NASA Astrophysics Data System (ADS)
Gabrovšek, Franci; Peric, Borut; Kaufmann, Georg
2018-05-01
The nature of epiphreatic flow remains an important research challenge in karst hydrology. This study focuses on the flood propagation along the epiphreatic system of Reka-Timavo system (Kras/Carso Plateau, Slovenia/Italy). It is based on long-term monitoring of basic physical parameters (pressure/level, temperature, specific electric conductivity) of ground water in six active caves belonging to the flow system. The system vigorously responds to flood events, with stage rising >100 m in some of the caves. Besides presenting the response of the system to flood events of different scales, the work focuses on the interpretation of recorded hydrographs in view of the known distribution and size of conduits and basic hydraulic relations. Furthermore, the hydrographs were used to infer the unknown geometry between the observation points. This way, the main flow restrictors, overflow passages and large epiphreatic storages were identified. The assumptions were tested with a hydraulic model, where the inversion procedure was used for an additional parameter optimisation. Time series of temperature and specific electric conductivity were used to assess the apparent velocities of flow between consecutive points.
Hierarchical trie packet classification algorithm based on expectation-maximization clustering
Bi, Xia-an; Zhao, Junxia
2017-01-01
With the development of computer network bandwidth, packet classification algorithms that can handle large-scale rule sets are in urgent need. Among the existing algorithms, research on packet classification based on the hierarchical trie has become an important branch because of its wide practical use. Although the hierarchical trie saves considerable storage space, it has several shortcomings, such as backtracking and empty nodes. This paper proposes a new packet classification algorithm, the Hierarchical Trie Algorithm Based on Expectation-Maximization Clustering (HTEMC). Firstly, this paper uses a formalization method to deal with the packet classification problem by mapping the rules and data packets into a two-dimensional space. Secondly, it uses the expectation-maximization algorithm to cluster the rules based on their aggregate characteristics, thereby forming diversified clusters. Thirdly, it builds a hierarchical trie based on the results of the expectation-maximization clustering. Finally, it conducts simulation and real-environment experiments to compare the performance of our algorithm with other typical algorithms, and analyzes the results of the experiments. The hierarchical trie structure in our algorithm not only adopts trie path compression to eliminate backtracking, but also solves the problem of low efficiency of trie updates, which greatly improves the performance of the algorithm. PMID:28704476
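A minimal sketch of the clustering stage follows, assuming rules have already been mapped to points in a two-dimensional space; scikit-learn's GaussianMixture stands in for the EM step, and the rule set is synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# map classification rules into a 2-D space, e.g. (source prefix, destination
# prefix) midpoints scaled to [0, 1] -- hypothetical rule set
rules = np.vstack([rng.normal([0.2, 0.3], 0.05, (40, 2)),
                   rng.normal([0.7, 0.8], 0.05, (60, 2))])

gm = GaussianMixture(n_components=2, random_state=0).fit(rules)
labels = gm.predict(rules)            # each cluster gets its own trie
for c in range(2):
    print(f"cluster {c}: {np.sum(labels == c)} rules")
```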
A back-fitting algorithm to improve real-time flood forecasting
NASA Astrophysics Data System (ADS)
Zhang, Xiaojing; Liu, Pan; Cheng, Lei; Liu, Zhangjun; Zhao, Yan
2018-07-01
Real-time flood forecasting is important for decision-making with regard to flood control and disaster reduction. The conventional approach involves a postprocessor calibration strategy that first calibrates the hydrological model and then estimates errors. This procedure can simulate streamflow consistent with observations, but the parameters obtained are not optimal. Joint calibration strategies address this issue by refining hydrological model parameters jointly with the autoregressive (AR) model. In this study, five alternative schemes are used to forecast floods. Scheme I uses only the hydrological model, while scheme II adds an AR model for error correction. In scheme III, differencing is used to remove non-stationarity in the error series. A joint inference strategy employed in scheme IV calibrates the hydrological and AR models simultaneously. The back-fitting algorithm, a basic approach for training an additive model, is adopted in scheme V to alternately recalibrate the hydrological and AR model parameters. The performance of the five schemes is compared with a case study of 15 recorded flood events from China's Baiyunshan reservoir basin. Our results show that (1) schemes IV and V outperform scheme III during the calibration and validation periods and (2) scheme V is inferior to scheme IV in the calibration period, but provides better results in the validation period. Joint calibration strategies can therefore improve the accuracy of flood forecasting. Additionally, the back-fitting recalibration strategy produces weaker overcorrection and a more robust performance compared with the joint inference strategy.
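The back-fitting idea of scheme V can be conveyed with a toy additive model: alternately refit a linear rainfall-runoff model and an AR(1) error model. Everything below (the data and both model forms) is invented for illustration and is far simpler than the paper's hydrological model.

```python
import numpy as np

rng = np.random.default_rng(4)
rain = rng.gamma(2.0, 5.0, 200)
true_q = np.convolve(rain, [0.5, 0.3, 0.2])[:200]       # synthetic 'observed' flow
obs = true_q + np.sin(np.arange(200) / 5.0)             # autocorrelated error

h = np.array([0.3, 0.3, 0.3])                           # toy unit hydrograph
rho = 0.0                                               # AR(1) error coefficient

X = np.column_stack([np.roll(rain, k) for k in range(3)])  # lagged rainfall matrix
X[:3] = 0.0                                             # clear wrapped-around rows
for it in range(10):                                    # back-fitting iterations
    e = obs - X @ h                                     # current residual series
    rho = (e[:-1] @ e[1:]) / (e[:-1] @ e[:-1])          # refit AR(1) on residuals
    target = obs.copy()
    target[1:] -= rho * e[:-1]                          # remove AR-explained part
    h, *_ = np.linalg.lstsq(X, target, rcond=None)      # recalibrate runoff model
print("unit hydrograph:", h.round(3), "rho:", round(rho, 3))
```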
Liu, Chen-Yi; Goertzen, Andrew L
2013-07-21
An iterative position-weighted centre-of-gravity algorithm was developed and tested for positioning events in a silicon photomultiplier (SiPM)-based scintillation detector for positron emission tomography. The algorithm used a Gaussian-based weighting function centred at the current estimate of the event location. The algorithm was applied to the signals from a 4 × 4 array of SiPM detectors that used individual channel readout and a LYSO:Ce scintillator array. Three scintillator array configurations were tested: single layer with 3.17 mm crystal pitch, matched to the SiPM size; single layer with 1.5 mm crystal pitch; and dual layer with 1.67 mm crystal pitch and a ½ crystal offset in the X and Y directions between the two layers. The flood histograms generated by this algorithm were shown to be superior to those generated by the standard centre of gravity. The width of the Gaussian weighting function of the algorithm was optimized for different scintillator array setups. The optimal width of the Gaussian curve was found to depend on the amount of light spread. The algorithm required less than 20 iterations to calculate the position of an event. The rapid convergence of this algorithm will readily allow for implementation on a front-end detector processing field programmable gate array for use in improved real-time event positioning and identification.
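A minimal sketch of the iterative Gaussian-weighted centre-of-gravity update follows, with invented SiPM positions and a synthetic light-spread model; the width parameter plays the role of the optimized Gaussian width discussed above.

```python
import numpy as np

def iterative_weighted_cog(signals, xy, sigma=2.0, n_iter=20):
    """Iterative centre of gravity with a Gaussian weighting function centred
    on the current position estimate; `signals` are per-SiPM amplitudes and
    `xy` the (n, 2) array of SiPM centre coordinates (mm)."""
    pos = (signals[:, None] * xy).sum(0) / signals.sum()   # plain CoG start
    for _ in range(n_iter):
        w = signals * np.exp(-((xy - pos) ** 2).sum(1) / (2 * sigma ** 2))
        pos = (w[:, None] * xy).sum(0) / w.sum()
    return pos

# 4 x 4 SiPM array on a 3.17 mm pitch (positions assumed for illustration)
pitch = 3.17
grid = np.array([(i * pitch, j * pitch) for i in range(4) for j in range(4)])
true = np.array([4.0, 6.5])
amps = np.exp(-((grid - true) ** 2).sum(1) / (2 * 1.5 ** 2))  # synthetic light spread
print(iterative_weighted_cog(amps, grid))
```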
Research on compressive sensing reconstruction algorithm based on total variation model
NASA Astrophysics Data System (ADS)
Gao, Yu-xuan; Sun, Huayan; Zhang, Tinghua; Du, Lin
2017-12-01
Compressed sensing, which breaks through the Nyquist sampling theorem, provides a strong theoretical basis for carrying out compressive sampling of image signals. Applying compressed sensing theory to traditional imaging procedures not only reduces the required storage space but also greatly reduces the demand for detector resolution. By exploiting the sparsity of the image signal and solving the mathematical model of inverse reconstruction, super-resolution imaging can be realized. The reconstruction algorithm is the most critical part of compressive sensing and to a large extent determines the accuracy of the reconstructed image. A reconstruction algorithm based on the total variation (TV) model is well suited to the compressive reconstruction of two-dimensional images and preserves edge information well. To verify the performance of the algorithm, the reconstruction results of the TV-based algorithm are simulated and analysed under different coding modes to verify its stability, and typical reconstruction algorithms are compared under the same coding mode. On the basis of the minimum total variation algorithm, an augmented Lagrangian function term is added and the optimal value is solved by the alternating direction method. Experimental results show that, compared with traditional classical TV-based algorithms, the proposed reconstruction algorithm has great advantages and can quickly and accurately recover the target image at low measurement rates.
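A rough gradient-descent sketch of TV-regularized recovery follows, using a smoothed anisotropic TV penalty rather than the paper's augmented-Lagrangian/alternating-direction solver; the measurement matrix, image, and step size are all invented.

```python
import numpy as np

def tv_reconstruct(A, y, shape, lam=0.1, eps=1e-3, lr=0.05, n_iter=500):
    """Recover an image from compressive measurements y = A @ x by gradient
    descent on ||Ax - y||^2 + lam * smoothed anisotropic TV(x)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        img = x.reshape(shape)
        dx = np.diff(img, axis=1, append=img[:, -1:])
        dy = np.diff(img, axis=0, append=img[-1:, :])
        gx = dx / np.sqrt(dx ** 2 + eps)        # normalized gradient field
        gy = dy / np.sqrt(dy ** 2 + eps)
        div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))
        grad = 2 * A.T @ (A @ x - y) + lam * (-div).ravel()
        x -= lr * grad
    return x.reshape(shape)

rng = np.random.default_rng(5)
truth = np.zeros((16, 16)); truth[4:12, 4:12] = 1.0        # piecewise-constant target
A = rng.normal(size=(120, 256)) / np.sqrt(120)             # ~47% measurement rate
y = A @ truth.ravel()
rec = tv_reconstruct(A, y, (16, 16))
print("relative error:", np.linalg.norm(rec - truth) / np.linalg.norm(truth))
```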
NASA Astrophysics Data System (ADS)
Mahalakshmi; Murugesan, R.
2018-04-01
This paper addresses the minimization of the total cost of greenhouse gas (GHG) efficiency in an Automated Storage and Retrieval System (AS/RS). A mathematical model is constructed based on the tax cost, penalty cost, and discount cost of GHG emission of the AS/RS. A two-stage algorithm, namely the positive selection based clonal selection principle (PSBCSP), is used to find the optimal solution of the constructed model. In the first stage, the positive selection principle is used to reduce the search space of the optimal solution by fixing a threshold value. In the second stage, the clonal selection principle is used to generate the best solutions. The obtained results are compared with other existing algorithms in the literature, showing that the proposed algorithm yields better results.
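A toy clonal-selection sketch with a positive-selection prefilter conveys the two-stage idea; the cost function, threshold, and clone schedule are all invented and do not reflect the paper's GHG model.

```python
import numpy as np

rng = np.random.default_rng(6)

def cost(x):                       # hypothetical GHG-related dispatch cost
    return np.sum((x - 0.3) ** 2, axis=-1)

# stage 1: positive selection -- keep only candidates below a threshold
pop = rng.uniform(0, 1, (200, 5))
pop = pop[cost(pop) < np.median(cost(pop))]     # shrink the search space

# stage 2: clonal selection -- clone the best, hypermutate inversely to rank
for gen in range(100):
    order = np.argsort(cost(pop))
    pop = pop[order][:30]                       # select the best antibodies
    clones = []
    for rank, ab in enumerate(pop):
        n_clones = max(1, 10 - rank // 5)       # better rank -> more clones
        scale = 0.02 * (1 + rank)               # better rank -> smaller mutation
        clones.append(ab + rng.normal(0, scale, (n_clones, 5)))
    pop = np.vstack([pop] + clones)
best = pop[np.argmin(cost(pop))]
print("best cost:", cost(best).round(6))
```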
A Review of Flood-Related Storage and Remobilization of Heavy Metal Pollutants in River Systems.
Ciszewski, Dariusz; Grygar, Tomáš Matys
Recently observed rapid climate changes have focused the attention of researchers and river managers on the possible effects of increased flooding frequency on the mobilization and redistribution of historical pollutants within some river systems. This text summarizes regularities in the flood-related transport, channel-to-floodplain transfer, and storage and remobilization of heavy metals, which are the most persistent environmental pollutants in river systems. Metal-dispersal processes are essentially much more variable in alluvia than in soils of non-inundated areas due to the effects of flood-sediment sorting and the mixing of pollutants with grains of different origins in a catchment, resulting in changes of one to two orders of magnitude in metal content over distances of centimetres. Furthermore, metal remobilization can be more intensive in alluvia than in soils as a result of bank erosion, prolonged floodplain inundation associated with reducing conditions alternating with oxygen-driven processes of dry periods and frequent water-table fluctuations, which affect the distribution of metals at low-lying strata. Moreover, metal storage and remobilization are controlled by river channelization, but their influence depends on the period and extent of the engineering works. Generally, artificial structures such as groynes, dams or cut-off channels performed before pollution periods favour the entrapment of polluted sediments, whereas the floodplains of lined river channels that adjust to new, post-channelization hydraulic conditions become a permanent sink for fine polluted sediments, which accumulate solely during overbank flows. Metal mobilization in such floodplains takes place only by slow leaching, and their sediments, which accrete at a moderate rate, are the best archives of the catchment pollution with heavy metals.
Chang, Li-Chiu; Chen, Pin-An; Chang, Fi-John
2012-08-01
A reliable forecast of future events possesses great value. The main purpose of this paper is to propose an innovative learning technique for reinforcing the accuracy of two-step-ahead (2SA) forecasts. The real-time recurrent learning (RTRL) algorithm for recurrent neural networks (RNNs) can effectively model the dynamics of complex processes and has been used successfully in one-step-ahead forecasts for various time series. A reinforced RTRL algorithm for 2SA forecasts using RNNs is proposed in this paper, and its performance is investigated using two famous benchmark time series and streamflow during flood events in Taiwan. Results demonstrate that the proposed reinforced 2SA RTRL algorithm for RNNs can adequately forecast the benchmark (theoretical) time series, significantly improve the accuracy of flood forecasts, and effectively reduce time-lag effects.
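For intuition, here is a single-hidden-unit RNN trained online with RTRL sensitivities for a direct two-step-ahead forecast; it is a drastic simplification of the paper's reinforced 2SA scheme, with invented data and weights.

```python
import numpy as np

rng = np.random.default_rng(7)
series = np.sin(np.arange(400) * 0.2) + 0.05 * rng.normal(size=400)

# One-hidden-unit RNN trained online with RTRL for a direct two-step-ahead
# forecast: h_t = tanh(w*h_{t-1} + u*x_t + b), prediction y_t = v*h_t ~ x_{t+2}
w, u, b, v = 0.1, 0.1, 0.0, 0.1
h, dh_dw, dh_du, dh_db = 0.0, 0.0, 0.0, 0.0   # state and RTRL sensitivities
lr = 0.02
for t in range(len(series) - 2):
    x, target = series[t], series[t + 2]
    h_new = np.tanh(w * h + u * x + b)
    d = 1.0 - h_new ** 2                       # tanh derivative at the new state
    dh_dw = d * (h + w * dh_dw)                # the core RTRL recursions
    dh_du = d * (x + w * dh_du)
    dh_db = d * (1.0 + w * dh_db)
    h = h_new
    err = v * h - target                       # two-step-ahead error
    gw, gu, gb = err * v * dh_dw, err * v * dh_du, err * v * dh_db
    v -= lr * err * h
    w, u, b = w - lr * gw, u - lr * gu, b - lr * gb
print("trained weights:", [round(p, 3) for p in (w, u, b, v)])
```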
Dimitrov, I. K.; Zhang, X.; Solovyov, V. F.; ...
2015-07-07
Recent advances in second-generation (YBCO) high-temperature superconducting wire could potentially enable the design of super-high-performance energy storage devices that combine the high energy density of chemical storage with the high power of superconducting magnetic storage. However, the high aspect ratio and the considerable filament size of these wires require the concomitant development of dedicated optimization methods that account for the critical current density in type-II superconductors. In this study, we report on the novel application and results of a CPU-efficient semianalytical computer code based on the Radia 3-D magnetostatics software package. Our algorithm is used to simulate and optimize the energy density of a superconducting magnetic energy storage device model, based on design constraints such as overall size and number of coils. The rapid performance of the code rests on analytical calculations of the magnetic field based on an efficient implementation of the Biot-Savart law for a large variety of 3-D "base" geometries in the Radia package. The significantly reduced CPU time and simple data input, in conjunction with the consideration of realistic input variables such as material-specific, temperature-, and magnetic-field-dependent critical current densities, have enabled the Radia-based algorithm to outperform finite-element approaches in CPU time at the same accuracy levels. Comparative simulations of MgB₂ and YBCO-based devices are performed at 4.2 K in order to ascertain the realistic efficiency of the design configurations.
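The Biot-Savart evaluation at the heart of such codes can be illustrated numerically; Radia's implementation is analytical per base geometry, so this discretized loop is only a sanity check against the known on-axis formula, with invented loop parameters.

```python
import numpy as np

MU0 = 4e-7 * np.pi

def biot_savart(segments_mid, segments_dl, current, point):
    """Field of a discretized current path via the Biot-Savart law:
    B = (mu0 I / 4 pi) * sum dl x r / |r|^3."""
    r = point - segments_mid
    dist = np.linalg.norm(r, axis=1, keepdims=True)
    dB = np.cross(segments_dl, r) / dist ** 3
    return MU0 * current / (4 * np.pi) * dB.sum(axis=0)

# circular loop of radius 0.1 m in the xy-plane, 1000 segments, 100 A
n, radius, current = 1000, 0.1, 100.0
phi = np.linspace(0, 2 * np.pi, n, endpoint=False)
mid = radius * np.column_stack([np.cos(phi), np.sin(phi), np.zeros(n)])
dl = (2 * np.pi * radius / n) * np.column_stack([-np.sin(phi), np.cos(phi), np.zeros(n)])
Bz = biot_savart(mid, dl, current, np.array([0.0, 0.0, 0.0]))[2]
print(Bz, "vs analytic", MU0 * current / (2 * radius))   # should agree closely
```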
Souza, Sarah C R; Mazzafera, Paulo; Sodek, Ladaslav
2016-05-01
Nitrogen fixation in the soybean nodule is highly sensitive to oxygen deficiency such as that provoked by waterlogging of the root system. This study aimed to evaluate the effects of flooding on N metabolism in nodules of soybean. Flooding resulted in a marked decrease of asparagine (the most abundant amino acid) and a concomitant accumulation of γ-aminobutyric acid (GABA). Flooding also resulted in a strong reduction of the incorporation of ¹⁵N₂ into amino acids. Nodule amino acids labelled before flooding rapidly lost ¹⁵N during flooding, except for GABA, which initially increased and declined slowly thereafter. Both nitrogenase activity and the expression of nifH and nifD genes were strongly decreased on flooding. Expression of the asparagine synthetase genes SAS1 and SAS2 was reduced, especially the former. Expression of genes encoding the enzyme glutamic acid decarboxylase (GAD1, GAD4, GAD5) was also strongly suppressed, except for GAD2, which increased. Almost all changes observed during flooding were reversible after draining. Possible changes in asparagine and GABA metabolism that may explain the marked fluctuations of these amino acids during flooding are discussed. It is suggested that the accumulation of GABA has a storage role during flooding stress.
Performance of a system of reservoirs on futuristic front
NASA Astrophysics Data System (ADS)
Saha, Satabdi; Roy, Debasri; Mazumdar, Asis
2017-10-01
Application of the simulation model HEC-5 to analyze the performance of the DVC Reservoir System (a multipurpose system with a network of five reservoirs and one barrage) on the river Damodar in Eastern India, in meeting projected future demand as well as controlling floods under a synthetically generated future scenario, is addressed here with a view to developing an appropriate strategy for its operation. The Thomas-Fiering model (a Markov autoregressive model) has been adopted for generation of the synthetic scenario (monthly streamflow series), and the modeled monthly streamflow was subsequently downscaled to daily values. The performance of the system (analysed on a seasonal basis) in terms of 'performance indices' (viz., both quantity-based and time-based reliability, mean daily deficit, average failure period, resilience and maximum vulnerability indices) for the projected scenario with enhanced demand turned out to be poor compared to that for the historical scenario. However, judicious adoption of resource enhancement (marginal reallocation of reservoir storage capacity) and a demand management strategy (curtailment of projected high water requirements and trading off between demands) was found to be a viable option for appreciably improving the performance of the reservoir system [improvements being (1-51 %), (2-35 %), (16-96 %), (25-50 %), (8-36 %) and (12-30 %) for quantity-based reliability, time-based reliability, mean daily deficit, average failure period, resilience and maximum vulnerability, respectively] compared to that with normal storage and projected demand. Again, 100 % reliability for flood control was noted for current as well as future synthetically generated scenarios. The results from the study would assist the concerned authority in the successful operation of the reservoirs in the context of growing demand and dwindling resources.
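A minimal sketch of Thomas-Fiering monthly generation follows, using the standard recursion q_{j+1} = q̄_{j+1} + b_j (q_j − q̄_j) + t·s_{j+1}·√(1 − r_j²) with month-to-month lag-1 correlations; the historical record below is a synthetic stand-in for observed flows.

```python
import numpy as np

def thomas_fiering(monthly_flows, n_years, rng):
    """Generate synthetic monthly streamflow with the Thomas-Fiering model."""
    q = monthly_flows                                  # shape (years, 12), historical
    qbar, s = q.mean(0), q.std(0, ddof=1)
    r = np.array([np.corrcoef(q[:, j], q[:, (j + 1) % 12])[0, 1] for j in range(12)])
    b = r * np.roll(s, -1) / s                         # regression slope month j -> j+1
    out = np.empty(n_years * 12)
    prev, j = qbar[0], 0
    for k in range(out.size):
        jn = (j + 1) % 12
        prev = (qbar[jn] + b[j] * (prev - qbar[j])
                + rng.standard_normal() * s[jn] * np.sqrt(max(0.0, 1 - r[j] ** 2)))
        prev = max(prev, 0.0)                          # no negative flows
        out[k] = prev
        j = jn
    return out.reshape(n_years, 12)

rng = np.random.default_rng(8)
hist = np.abs(rng.normal(100, 30, (30, 12)))           # stand-in historical record
synth = thomas_fiering(hist, n_years=50, rng=rng)
print(synth.mean(0).round(1))
```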
NASA Astrophysics Data System (ADS)
Nghiem, S. V.; Brakenridge, G. R.; Nguyen, D. T.
2017-12-01
Hurricane Harvey inflicted historic catastrophic flooding across extensive regions around Houston and southeast Texas after making landfall on 25 August 2017. The Federal Emergency Management Agency (FEMA) requested urgent support for flood mapping and monitoring in an emergency response to the extreme flood situation. An innovative satellite remote sensing method, called the Depolarization Reduction Algorithm for Global Observations of inundatioN (DRAGON), has been developed and implemented for use with Sentinel synthetic aperture radar (SAR) satellite data at a resolution of 10 meters to identify, map, and monitor inundation, including pre-existing water bodies and newly flooded areas. Results from this new method are hydrologically consistent and have been verified against known surface waters (e.g., coastal ocean, rivers, lakes, reservoirs, etc.), against clear-sky high-resolution WorldView images (where waves can be seen on surface water in inundated areas within a small spatial coverage), and against other flood maps from the consortium of the Global Flood Partnership derived from multiple satellite datasets (including clear-sky Landsat and MODIS at lower resolutions). Figure 1 is a high-resolution (4K UHD) image of a composite inundation map for the region around Rosharon (in Brazoria County, south of Houston, Texas). This composite inundation map reveals extensive flooding on 29 August 2017 (four days after Hurricane Harvey made landfall); the inundation was still persistent in most of the area west and south of Rosharon one week later (5 September 2017), while flooding was reduced east of Rosharon. Hurricane Irma brought flooding to a number of areas in Florida. As of 10 September 2017, Sentinel SAR flood maps reveal inundation in the Florida Panhandle and over lowland surfaces on several islands in the Florida Keys. However, Sentinel SAR results indicate that flooding along the Florida coast was not extreme, even though Irma was a Category 5 hurricane that could have driven a strong storm surge. DRAGON flood mapping products over various regions in Texas and Florida were provided to FEMA. Figure 1. Composite inundation map derived from Sentinel SAR data for the region around Rosharon on 9/5/2017 (orange), inundation on 8/29/2017 (yellow), and pre-existing surface waters on 8/5/2017 (blue).
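The DRAGON algorithm itself is not reproduced here; a generic dual-pol open-water thresholding sketch (illustrative thresholds, synthetic backscatter) shows the kind of per-pixel decision involved, since smooth water is dark and weakly depolarizing in SAR imagery.

```python
import numpy as np

def sar_water_mask(vv_db, vh_db, vv_thresh=-17.0, vh_thresh=-24.0):
    """Generic open-water detector for dual-pol SAR backscatter (dB): flag
    pixels dark in both channels (threshold values are illustrative only)."""
    return (vv_db < vv_thresh) & (vh_db < vh_thresh)

rng = np.random.default_rng(9)
vv = rng.normal(-10, 2, (100, 100))
vh = rng.normal(-17, 2, (100, 100))
vv[40:60, 40:60] = -22; vh[40:60, 40:60] = -30   # synthetic inundated patch
flood = sar_water_mask(vv, vh)
print(flood.sum(), "water pixels")
```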
Hydro-meteorological risk reduction and climate change adaptation in the Sava River Basin
NASA Astrophysics Data System (ADS)
Brilly, Mitja; Šraj, Mojca; Kryžanowski, Andrej
2017-04-01
The Sava River Basin covers the territory of several countries. In the past thirty years there have been several flood events with return periods approaching one hundred years. Parts of the basin also suffer from severe droughts. In the presentation we cover the following questions: • Flood hazard in a complex hydrological structure • Landslides and flash floods in mountainous regions • Floods on karst poljes • Flood risk management under complex international and hydrological conditions • The impact of man-made structures: hydropower storages, inundation ponds, river regulation, alternate streams, levee systems, pumping stations, Natura 2000 areas, etc. • How to manage droughts in an international river basin. The basin is well covered by information and is managed by the international Sava River Basin Commission (http://savacommission.org/), which can help. We developed a study of climate change impacts on floods for the entire river basin, financed by UNECE. There is also a study of climate change impacts on water management provided by the World Bank, in which we take part. Recently the World Bank issued a call for a study, "Flood risk management plan for the SRB".
Hydrometeorological network for flood monitoring and modeling
NASA Astrophysics Data System (ADS)
Efstratiadis, Andreas; Koussis, Antonis D.; Lykoudis, Spyros; Koukouvinos, Antonis; Christofides, Antonis; Karavokiros, George; Kappos, Nikos; Mamassis, Nikos; Koutsoyiannis, Demetris
2013-08-01
Due to its highly fragmented geomorphology, Greece comprises hundreds of small- to medium-size hydrological basins, in which often the terrain is fairly steep and the streamflow regime ephemeral. These are typically affected by flash floods, occasionally causing severe damages. Yet, the vast majority of them lack flow-gauging infrastructure providing systematic hydrometric data at fine time scales. This has obvious impacts on the quality and reliability of flood studies, which typically use simplistic approaches for ungauged basins that do not consider local peculiarities in sufficient detail. In order to provide a consistent framework for flood design and to ensure realistic predictions of the flood risk -a key issue of the 2007/60/EC Directive- it is essential to improve the monitoring infrastructures by taking advantage of modern technologies for remote control and data management. In this context and in the research project DEUCALION, we have recently installed and are operating, in four pilot river basins, a telemetry-based hydro-meteorological network that comprises automatic stations and is linked to and supported by relevant software. The hydrometric stations measure stage, using 50-kHz ultrasonic pulses or piezometric sensors, or both stage (piezometric) and velocity via acoustic Doppler radar; all measurements are being temperature-corrected. The meteorological stations record air temperature, pressure, relative humidity, wind speed and direction, and precipitation. Data transfer is made via GPRS or mobile telephony modems. The monitoring network is supported by a web-based application for storage, visualization and management of geographical and hydro-meteorological data (ENHYDRIS), a software tool for data analysis and processing (HYDROGNOMON), as well as an advanced model for flood simulation (HYDROGEIOS). The recorded hydro-meteorological observations are accessible over the Internet through the www-application. The system is operational and its functionality has been implemented as open-source software for use in a wide range of applications in the field of water resources monitoring and management, such as the demonstration case study outlined in this work.
QPSO-Based Adaptive DNA Computing Algorithm
Karakose, Mehmet; Cigdem, Ugur
2013-01-01
DNA (deoxyribonucleic acid) computing, a computation model based on DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This approach aims to perform the DNA computing algorithm with parameters adapted towards the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are simultaneously tuned for the adaptive process; (2) the adaptive algorithm is performed using QPSO for goal-driven progress, faster operation, and flexibility in data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented in system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate the ability to provide effective optimization, considerable convergence speed, and high accuracy compared with the standard DNA computing algorithm. PMID:23935409
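A compact QPSO sketch with the characteristic attractor-and-logarithm update x = p ± β·|mbest − x|·ln(1/u) follows; the contraction-expansion coefficient beta, bounds, and test function are illustrative, not the paper's settings.

```python
import numpy as np

def qpso_minimize(f, dim, n_particles=30, n_iter=200, beta=0.75, seed=0):
    """Quantum-behaved PSO: particles are sampled around a local attractor
    using the characteristic update x = p +/- beta*|mbest - x|*ln(1/u)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    for _ in range(n_iter):
        gbest = pbest[pbest_val.argmin()]
        mbest = pbest.mean(axis=0)                     # mean of personal bests
        phi = rng.uniform(size=(n_particles, dim))
        p = phi * pbest + (1 - phi) * gbest            # local attractor
        u = rng.uniform(size=(n_particles, dim))
        sign = np.where(rng.uniform(size=(n_particles, dim)) < 0.5, -1.0, 1.0)
        x = p + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
    return pbest[pbest_val.argmin()], pbest_val.min()

sol, val = qpso_minimize(lambda z: np.sum(z ** 2), dim=4)
print(sol.round(4), val)
```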
Deglacial climate modulated by the storage and release of Arctic sea ice
NASA Astrophysics Data System (ADS)
Condron, A.; Coletti, A. J.; Bradley, R. S.
2017-12-01
Periods of abrupt climate cooling during the last deglaciation (20 - 8 kyr ago) are often attributed to glacial outburst floods slowing the Atlantic meridional overturning circulation (AMOC). Here, we present results from a series of climate model simulations showing that the episodic break-up and mobilization of thick, perennial, Arctic sea ice during this time would have released considerable volumes of freshwater directly to the Nordic Seas, where processes regulating large-scale climate occur. Massive sea ice export events to the North Atlantic are generated whenever the transport of sea ice is enhanced, either by changes in atmospheric circulation, rising sea level submerging the Bering land bridge, or glacial outburst floods draining into the Arctic Ocean from the Mackenzie River. We find that the volumes of freshwater released to the Nordic Seas are similar to, or larger than, those estimated to have come from terrestrial outburst floods, including the discharge at the onset of the Younger Dryas. Our results provide the first evidence that the storage and release of Arctic sea ice helped drive deglacial climate change by modulating the strength of the AMOC.
Ehsan, Shoaib; Clark, Adrian F.; ur Rehman, Naveed; McDonald-Maier, Klaus D.
2015-01-01
The integral image, an intermediate image representation, has found extensive use in multi-scale local feature detection algorithms, such as Speeded-Up Robust Features (SURF), allowing fast computation of rectangular features at constant speed, independent of filter size. For resource-constrained real-time embedded vision systems, computation and storage of integral image presents several design challenges due to strict timing and hardware limitations. Although calculation of the integral image only consists of simple addition operations, the total number of operations is large owing to the generally large size of image data. Recursive equations allow substantial decrease in the number of operations but require calculation in a serial fashion. This paper presents two new hardware algorithms that are based on the decomposition of these recursive equations, allowing calculation of up to four integral image values in a row-parallel way without significantly increasing the number of operations. An efficient design strategy is also proposed for a parallel integral image computation unit to reduce the size of the required internal memory (nearly 35% for common HD video). Addressing the storage problem of integral image in embedded vision systems, the paper presents two algorithms which allow substantial decrease (at least 44.44%) in the memory requirements. Finally, the paper provides a case study that highlights the utility of the proposed architectures in embedded vision systems. PMID:26184211
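The recursive equation underlying all of this is easy to state concretely; below is a plain serial Python version (not the proposed row-parallel hardware decomposition), together with the constant-time box sum that motivates the representation.

```python
import numpy as np

def integral_image(img):
    """One-pass integral image via the standard recursion
    s(x, y) = i(x, y) + s(x-1, y) + s(x, y-1) - s(x-1, y-1)."""
    h, w = img.shape
    s = np.zeros((h, w), dtype=np.int64)
    for y in range(h):
        for x in range(w):
            s[y, x] = (img[y, x]
                       + (s[y - 1, x] if y else 0)
                       + (s[y, x - 1] if x else 0)
                       - (s[y - 1, x - 1] if x and y else 0))
    return s

def box_sum(s, x0, y0, x1, y1):
    """Sum of img[y0:y1+1, x0:x1+1] with four lookups, independent of box size."""
    total = s[y1, x1]
    if x0: total -= s[y1, x0 - 1]
    if y0: total -= s[y0 - 1, x1]
    if x0 and y0: total += s[y0 - 1, x0 - 1]
    return total

img = np.arange(16).reshape(4, 4)
s = integral_image(img)
assert box_sum(s, 1, 1, 3, 2) == img[1:3, 1:4].sum()
print(s)
```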
NASA Astrophysics Data System (ADS)
Subiyanto, Sawitri
2017-12-01
One water body contaminated by industrial and domestic waste is the West Flood Canal in Semarang City, the estuary of a river system that passes through the western part of Semarang, an area dense with residential and industrial land use. Information on water quality in the estuary of the West Flood Canal is therefore needed. Remote sensing technology can relate the recorded spectral characteristics of water to water quality parameters. Two such parameters, chlorophyll-a and total suspended solids (TSS), can be estimated using multispectral Landsat-8 satellite images from April, June, and August 2017, with three selected algorithms. Based on the TSS and chlorophyll-a processing results, TSS values greater than or equal to 100 indicate that the West Flood Canal is damaged (hypertrophic), while chlorophyll-a values below 100 indicate eutrophic (threatened) status. This is caused by the amount of suspended material at the water surface and by the disturbance of aquatic vegetation in the form of weeds that impair the function of the West Flood Canal.
NASA Astrophysics Data System (ADS)
Banerjee, Kakoli; Prasad, R. A.
2014-10-01
The whole gamut of genetic data is increasing exponentially. The human genome in its base format occupies almost thirty terabytes of data and doubles in size every two and a half years. It is well known that computational resources are limited. The most important resource that genetic data requires for its collection, storage, and retrieval is storage space, and storage is limited. Computational performance is also dependent on storage and execution time, and transmission capability is directly dependent on the size of the data. Hence, data compression techniques become an issue of utmost importance when confronting the task of handling gigantic databases like GenBank. Decompression is also an issue when such huge databases are being handled. This paper is intended not only to provide genetic data compression but also to partially decompress the genetic sequences.
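As a baseline next to specialised genetic compressors, a simple 2-bit-per-base packing (four times smaller than ASCII for pure A/C/G/T sequences) can be sketched; it is not the paper's method.

```python
def pack_dna(seq: str) -> bytes:
    """2-bit-per-base packing of an A/C/G/T sequence."""
    code = {"A": 0, "C": 1, "G": 2, "T": 3}
    out = bytearray()
    acc, nbits = 0, 0
    for base in seq:
        acc = (acc << 2) | code[base]
        nbits += 2
        if nbits == 8:
            out.append(acc); acc, nbits = 0, 0
    if nbits:
        out.append(acc << (8 - nbits))            # pad the final byte
    return bytes(out)

def unpack_dna(data: bytes, n_bases: int) -> str:
    bases = "ACGT"
    seq = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            seq.append(bases[(byte >> shift) & 3])
    return "".join(seq[:n_bases])

s = "ACGTACGTTGCA"
assert unpack_dna(pack_dna(s), len(s)) == s
print(pack_dna(s).hex())
```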
NASA Astrophysics Data System (ADS)
Wang, Xiaoming; Lin, Yaguang; Zhang, Shanshan; Cai, Zhipeng
2017-05-01
Sudden disasters such as earthquakes, floods and hurricanes necessitate the employment of communication networks to carry out emergency response activities. Routing has a significant impact on the functionality, performance and flexibility of communication networks. In this article, the routing problem is studied considering the delivery ratio, overhead ratio and average delay of messages in mobile opportunistic networks (MONs) for enterprise-level emergency response communications in sudden disaster scenarios. Unlike traditional routing methods for MONs, this article presents a new two-stage spreading-and-forwarding dynamic routing algorithm based on the proposed social activity degree and physical contact factor of mobile users. A new modelling method describing the dynamically evolving topology of a MON is first proposed. Then a multi-copy spreading strategy based on the social activity degree of nodes and a single-copy forwarding strategy based on the physical contact factor between nodes are designed. Compared with the most relevant routing algorithms, such as Epidemic, Prophet, Labelled-sim, Dlife-comm and Distribute-sim, the proposed routing algorithm can significantly increase the delivery ratio of messages and decrease the overhead ratio and average delay of messages.
Comparison of two matrix data structures for advanced CSM testbed applications
NASA Technical Reports Server (NTRS)
Regelbrugge, M. E.; Brogan, F. A.; Nour-Omid, B.; Rankin, C. C.; Wright, M. A.
1989-01-01
The first section describes data storage schemes presently used by the Computational Structural Mechanics (CSM) testbed sparse matrix facilities and similar skyline (profile) matrix facilities. The second section contains a discussion of certain features required for the implementation of particular advanced CSM algorithms, and how these features might be incorporated into the data storage schemes described previously. The third section presents recommendations, based on the discussions of the prior sections, for directing future CSM testbed development to provide necessary matrix facilities for advanced algorithm implementation and use. The objective is to lend insight into the matrix structures discussed and to help explain the process of evaluating alternative matrix data structures and utilities for subsequent use in the CSM testbed.
Development of a piezoelectric sensing-based SHM system for nuclear dry storage systems
NASA Astrophysics Data System (ADS)
Ma, Linlin; Lin, Bin; Sun, Xiaoyi; Howden, Stephen; Yu, Lingyu
2016-04-01
In the US, there are over 1482 dry cask storage systems (DCSS) in use, storing 57,807 fuel assemblies. Monitoring is necessary to determine and predict the degradation state of these systems and structures. Nondestructive monitoring is therefore in urgent need and must be integrated into the fuel cycle to quantify the "state of health" for the safe operation of nuclear power plants (NPP) and radioactive waste storage systems (RWSS). Innovative approaches are desired to evaluate the degradation and damage of used fuel containers under extended storage. Structural health monitoring (SHM) is an emerging technology that uses in-situ sensory systems to perform rapid nondestructive detection of structural damage as well as long-term integrity monitoring; it has been extensively studied in aerospace engineering over the past two decades. This paper presents the development of an SHM and damage detection methodology based on piezoelectric sensor (PES) technologies for steel canisters in nuclear dry cask storage systems. Durability and survivability of piezoelectric sensors under temperature influence are first investigated by evaluating sensor capacitance and electromechanical admittance. Toward damage detection, the PES are configured in a pitch-catch setup to transmit and receive guided waves in plate-like structures. When the inspected structure has damage such as a surface defect, the incident guided waves are reflected or scattered, resulting in changes in the wave measurements. A sparse array algorithm is developed and implemented using multiple sensors to image the structure, and is also evaluated at elevated temperature.
Real Time Monitoring of Flooding from Microwave Satellite Observations
NASA Technical Reports Server (NTRS)
Galantowicz, John F.; Frey, Herb (Technical Monitor)
2002-01-01
We have developed a new method for making high-resolution flood extent maps (e.g., at the 30-100 m scale of digital elevation models) in real-time from low-resolution (20-70 km) passive microwave observations. The method builds a "flood-potential" database from elevations and historic flood imagery and uses it to create a flood-extent map consistent with the observed open water fraction. Microwave radiometric measurements are useful for flood monitoring because they sense surface water in clear-or-cloudy conditions and can provide more timely data (e.g., compared to radars) from relatively wide swath widths and an increasing number of available platforms (DMSP, ADEOS-II, Terra, NPOESS, GPM). The chief disadvantages for flood mapping are the radiometers' low resolution and the need for local calibration of the relationship between radiances and open-water fraction. We present our method for transforming microwave sensor-scale open water fraction estimates into high-resolution flood extent maps and describe 30-day flood map sequences generated during a retrospective study of the 1993 Great Midwest Flood. We discuss the method's potential improvement through as yet unimplemented algorithm enhancements and expected advancements in microwave radiometry (e.g., improved resolution and atmospheric correction).
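A minimal sketch of the downscaling idea, under the simplifying assumption that the lowest-lying DEM cells flood first (the paper's flood-potential database additionally incorporates historic flood imagery, which this toy version omits):

```python
import numpy as np

def downscale_flood(dem: np.ndarray, water_fraction: float) -> np.ndarray:
    """Flood the lowest-elevation cells until the sensor-scale open-water
    fraction is met (ties at the threshold may slightly overshoot)."""
    n_flooded = int(round(water_fraction * dem.size))
    if n_flooded == 0:
        return np.zeros_like(dem, dtype=bool)
    threshold = np.sort(dem, axis=None)[n_flooded - 1]
    return dem <= threshold

# e.g. a 20%-open-water radiometer footprint over a 100x100 DEM tile
flood_map = downscale_flood(np.random.rand(100, 100), water_fraction=0.2)
```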
Wheatcroft, R.A.; Stevens, A.W.; Hunt, L.M.; Milligan, T.G.
2006-01-01
Event-response coring on the Po River prodelta (northern Adriatic Sea), coupled with shipboard digital X-radiography, resistivity profiling, and grain-size analyses, permitted documentation of the initial distribution and physical properties of the October 2000 flood deposit. The digital X-radiography system comprises a constant-potential X-ray source and an amorphous silicon imager with an active area of 29 × 42 cm and 12-bit depth resolution. Objective image segmentation algorithms based on bulk density (brightness), layer contacts (edge detection) and small-scale texture (fabric) were used to identify the flood deposit. Results indicate that the deposit formed in water depths of 6-29 m immediately adjacent to the three main distributary mouths of the Po (Pila, Tolle and Gnocca/Goro). Maximal thickness was 36 cm at a 20-m site off the main mouth (Pila), but many other sites had thicknesses >20 cm. The Po flood deposit has a complex internal stratigraphy, with multiple layers, a diverse suite of physical sedimentary structures (e.g., laminations, ripple cross bedding, lenticular bedding, soft-sediment deformation structures), and dramatic changes in grain size that imply rapid deposition and fluctuations in energy during emplacement. Based on the flood deposit volume and well-constrained measurements of deposit bulk density, the mass of the flood deposit was estimated to be 16 × 10⁹ kg, which is about two-thirds of the estimated suspended sediment load delivered by the river during the event. The locus of deposition, overall thickness, and stratigraphic complexity of the flood deposit can best be explained by the relatively long sediment throughput times of the Po River, whereby sediment is delivered to the ocean during a range of conditions (i.e., the storm responsible for the precipitation is long gone), the majority of which reflect fair-weather conditions. Sediment is therefore deposited proximal to the river mouths, where it can form thick but stratigraphically complex deposits. In contrast, floods of small rivers such as the Eel (northern California) are coupled to storm conditions, which lead to high levels of sediment dispersion. © 2006 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Schreiner, K. M.; Carlin, J. A.; Sayers, L.; Swenson, J.
2017-12-01
Marine sediments are an important long-term reservoir for both recently fixed organic carbon (OC) and ancient rock-derived OC, much of which is delivered by rivers. The ratio between these two sources of OC in turn regulates atmospheric levels of oxygen and carbon dioxide over geologic time, making this riverine delivery of OC, primarily carried by sediments, an important flux in the global carbon cycle. However, while the overall magnitude of these fluxes is relatively well known, the importance of episodic events, like storms and floods, in the flux of OC from terrestrial to marine environments remains to be determined. Here, we present data from a 34 cm core collected from the Gulf of Mexico at a mid-shelf distal depocenter of the Brazos River in 2015, during a strong El Niño when that area of the country was experiencing 100-year flood events and anomalously high river flow. Based on analysis of the radioactive isotope 7Be, approximately the top 7-8 cm of the sediment in this core was deposited during this flood event. Bulk elemental (C, N, and stable carbon isotopes) and chemical biomarker (lignin-phenol) data have been combined to provide information on the origin and chemistry of the OC in this core both before and during flooding. C:N and δ13C indicate a mixture of marine-sourced and terrestrially-sourced OC throughout the length of the core, with very little variation between the flood layer and deeper sediments. However, lignin-phenol concentrations are higher in flood-deposited sediment, indicating that this sediment is likely terrestrially sourced. Lignin-phenol indicators of OC degradation state (acid:aldehyde ratios) indicate that flood sediment is fresher and less degraded than deeper sediments. Taken together, these results indicate that (1) bulk analyses are not enough to determine OC source and the importance of flood events in OC cycling, and (2) episodic events like floods could have an outsized impact on OC storage in marine sediments.
High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.
Simonyan, Vahan; Mazumder, Raja
2014-09-30
The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.
Large scale modelling of catastrophic floods in Italy
NASA Astrophysics Data System (ADS)
Azemar, Frédéric; Nicótina, Ludovico; Sassi, Maximiliano; Savina, Maurizio; Hilberts, Arno
2017-04-01
The RMS European Flood HD model® is a suite of country-scale flood catastrophe models covering 13 countries throughout continental Europe and the UK. The models are developed with the goal of supporting risk assessment analyses for the insurance industry. Within this framework RMS is developing a hydrologic and inundation model for Italy. The model aims at reproducing the hydrologic and hydraulic properties across the domain through a modeling chain: a semi-distributed hydrologic model that captures the spatial variability of the runoff formation processes is coupled with a one-dimensional river routing algorithm and a two-dimensional (depth-averaged) inundation model. This setup allows capturing flood risk from both pluvial (overland flow) and fluvial flooding. Here we describe the calibration and validation methodologies for this modelling suite applied to the Italian river basins. The variability that characterizes the domain (in terms of meteorology, topography and hydrologic regimes) requires a modeling approach able to represent a broad range of meteo-hydrologic regimes. The calibration of the rainfall-runoff and river routing models is performed by means of a genetic algorithm that identifies the set of best performing parameters within the search space over the last 50 years. We first establish the quality of the calibrated parameters on the full hydrologic balance and on individual discharge peaks by comparing extreme statistics to observations over the calibration period at several stations. The model is then used to analyze the major floods in the country; we discuss the different meteorological setups leading to the historical events and the physical mechanisms that induced these floods. We can thus assess the performance of RMS's hydrological model in view of the physical mechanisms leading to floods and highlight the main controls on flood risk modelling throughout the country. The model's ability to accurately simulate antecedent conditions and discharge hydrographs over the affected area is also assessed, showing that spatio-temporal correlation is retained through the modelling chain. Results show that our modelling approach can capture a wide range of conditions leading to major floods in the Italian peninsula. Under the umbrella of the RMS European Flood HD models this constitutes, to our knowledge, the only operational flood risk model applied at continental scale with a coherent methodology and a domain-wide Monte Carlo stochastic set.
NASA Astrophysics Data System (ADS)
Gaál, Ladislav; Szolgay, Ján; Bacigál, Tomáš; Kohnová, Silvia
2010-05-01
Copula-based estimation methods for hydro-climatological extremes have been gaining increasing attention from researchers and practitioners in recent years. Unlike the traditional estimation methods, which are based on bivariate cumulative distribution functions (CDFs), copulas are a relatively flexible statistical tool that allows for modelling dependencies between two or more variables, such as flood peaks and flood volumes, without making strict assumptions on the marginal distributions. The dependence structure and the reliability of the joint estimates of hydro-climatological extremes, mainly in the right tail of the joint CDF, depend not only on the particular copula adopted but also on the data available for estimating the marginal distributions of the individual variables. Generally, data samples for frequency modelling have limited temporal extent, which is a considerable drawback of frequency analyses in practice. Therefore, it is advisable to use statistical methods that improve any part of the copula construction process and result in more reliable design values of hydrological variables. The scarcity of the data sample, mostly in the extreme tail of the joint CDF, can be bypassed, e.g., by using a considerably larger amount of data simulated by rainfall-runoff analysis or by including historical information on the variables under study. The latter approach of data extension is used here to make the quantile estimates of the individual marginals of the copula more reliable. In the presented paper it is proposed to use historical information in the frequency analysis of the marginal distributions in the framework of Bayesian Markov chain Monte Carlo (MCMC) simulations. Generally, a Bayesian approach allows for a straightforward combination of different sources of information on floods (e.g., flood data from systematic measurements and historical flood records, respectively) in terms of a product of the corresponding likelihood functions, while the MCMC algorithm is a numerical approach for sampling from the likelihood distributions. Bayesian MCMC methods therefore provide an attractive way to estimate the uncertainty in parameters and quantile metrics of frequency distributions. The applicability of the method is demonstrated in a case study of the hydroelectric power station Orlík on the Vltava River. This site plays a key role in the flood protection of Prague, the capital city of the Czech Republic. The record length of the available flood data is 126 years, from the period 1877-2002; the flood event observed in 2002, which caused extensive damage and numerous casualties, is treated as a historic one. To estimate the joint probabilities of flood peaks and volumes, different copulas are fitted and their goodness-of-fit is evaluated by bootstrap simulations. Finally, selected quantiles of flood volumes conditioned on given flood peaks are derived and compared with those obtained by the traditional method used in the practice of water management specialists of the Vltava River.
NASA Astrophysics Data System (ADS)
Schellenberg, Graham; Stortz, Greg; Goertzen, Andrew L.
2016-02-01
A typical positron emission tomography detector comprises a scintillator crystal array coupled to a photodetector array or other position-sensitive detector. Detectors using light sharing to read out crystal elements require the creation of a crystal lookup table (CLUT) that maps the detector response to the crystal of interaction, based on the x-y position of the event calculated through Anger-type logic. It is vital for system performance that these CLUTs be accurate so that the location of events can be correctly identified and crystal-specific corrections, such as energy windowing or time alignment, can be applied. While manual segmentation of the flood image to create the CLUT is a simple and reliable approach, it is both tedious and time consuming for systems with large numbers of crystal elements. In this work we describe the development of an automated algorithm for CLUT generation that uses a Gaussian mixture model paired with thin plate splines (TPS) to iteratively fit a crystal layout template that includes the crystal numbering pattern. Starting from a region of stability, Gaussians are individually fit to data corresponding to crystal locations while simultaneously updating a TPS for predicting future Gaussian locations at the edge of a region of interest that grows as individual Gaussians converge to crystal locations. The algorithm was tested with flood image data collected from 16 detector modules, each consisting of a 409-crystal dual-layer offset LYSO crystal array read out by a 32-pixel SiPM array. For these detector flood images, depending on user-defined input parameters, the algorithm runtime ranged between 17.5-82.5 s per detector on a single core of an Intel i7 processor. The method maintained an accuracy above 99.8% across all tests, with the majority of errors localized to error-prone corner regions. This method can be easily extended for use with other detector types through adjustment of the initial template model.
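As a rough illustration of the Gaussian-mixture ingredient of such an approach (the iterative TPS template growing is the paper's contribution and is omitted here; the event data and component count below are synthetic stand-ins), scikit-learn's GaussianMixture can assign flood-image events to crystal-like clusters:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Fit a Gaussian mixture to event (x, y) positions so each component
# centre approximates one crystal spot; component labels then seed a
# crystal lookup table.  16 components stand in for the 409-crystal array.
events = np.random.rand(5000, 2)                # stand-in for measured events
gmm = GaussianMixture(n_components=16, covariance_type="diag", max_iter=200)
labels = gmm.fit_predict(events)                # crystal index per event
centres = gmm.means_                            # candidate crystal locations
```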
Lu, Zhonghua; Arikatla, Venkata S; Han, Zhongqing; Allen, Brian F; De, Suvranu
2014-12-01
High-frequency electricity is used in the majority of surgical interventions. However, modern computer-based training and simulation systems rely on physically unrealistic models that fail to capture the interplay of the electrical, mechanical and thermal properties of biological tissue. We present a real-time and physically realistic simulation of electrosurgery that models the electrical, thermal and mechanical properties as three iteratively solved finite element models. To provide sub-finite-element graphical rendering of vaporized tissue, a dual-mesh dynamic triangulation algorithm based on isotherms is proposed. The block compressed row storage (BCRS) structure is shown to be critical in allowing computationally efficient changes in the tissue topology due to vaporization. We have demonstrated our physics-based electrosurgery cutting algorithm through various examples. Our matrix manipulation algorithms designed for topology changes have shown low computational cost. Our simulator offers substantially greater physical fidelity than previous simulators that use simple geometry-based heat characterization. Copyright © 2013 John Wiley & Sons, Ltd.
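For readers unfamiliar with the storage scheme, block compressed row storage keeps nonzeros as small dense blocks, which is what makes localized topology edits cheap; a minimal sketch of the same idea using SciPy's BSR implementation (the matrix below is a toy, not the simulator's stiffness matrix):

```python
import numpy as np
from scipy.sparse import bsr_matrix

# Nonzeros are stored as dense 2x2 blocks; removing a vaporized element
# then touches whole blocks rather than individual scalars, which keeps
# topology changes computationally cheap.
indptr  = np.array([0, 1, 2])                   # block-row pointers
indices = np.array([0, 1])                      # block-column indices
data    = np.ones((2, 2, 2))                    # two dense 2x2 blocks
A = bsr_matrix((data, indices, indptr), shape=(4, 4))
print(A.toarray())
```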
Trajectory NG: portable, compressed, general molecular dynamics trajectories.
Spångberg, Daniel; Larsson, Daniel S D; van der Spoel, David
2011-10-01
We present general algorithms for the compression of molecular dynamics trajectories. The standard ways to store MD trajectories as text or as raw binary floating point numbers result in very large files when efficient simulation programs are used on supercomputers. Our algorithms are based on the observation that differences in atomic coordinates/velocities, in either time or space, are generally smaller than the absolute values of the coordinates/velocities. Also, it is often possible to store values at a lower precision. We apply several compression schemes to compress the resulting differences further. The most efficient algorithms developed here use a block sorting algorithm in combination with Huffman coding. Depending on the frequency of storage of frames in the trajectory, either space, time, or combinations of space and time differences are usually the most efficient. We compare the efficiency of our algorithms with each other and with other algorithms present in the literature for various systems: liquid argon, water, a virus capsid solvated in 15 mM aqueous NaCl, and solid magnesium oxide. We perform tests to determine how much precision is necessary to obtain accurate structural and dynamic properties, as well as benchmark a parallelized implementation of the algorithms. We obtain compression ratios (compared to single precision floating point) of 1:3.3-1:35 depending on the frequency of storage of frames and the system studied.
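A hedged sketch of the pipeline described, assuming a synthetic trajectory: quantize to fixed precision, difference in time, then entropy-code; Python's bz2 (a Burrows-Wheeler block-sorting compressor with Huffman coding) serves here as a convenient stand-in for the block-sorting-plus-Huffman stage:

```python
import bz2
import numpy as np

coords = np.cumsum(np.random.randn(100000, 3) * 0.01, axis=0)  # fake trajectory
quantized = np.round(coords / 0.001).astype(np.int32)          # fixed precision
deltas = np.diff(quantized, axis=0, prepend=quantized[:1])     # small time differences
ratio = deltas.nbytes / len(bz2.compress(deltas.tobytes()))
print(f"compression ratio {ratio:.1f}:1 vs raw int32")
```

Because consecutive frames are highly correlated, the deltas cluster near zero and compress far better than the absolute coordinates would.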
NASA Astrophysics Data System (ADS)
Huang, C.; Hsu, N.
2013-12-01
This study incorporates Low-Impact Development (LID) rainwater catchment technology into the Storm Water Management Model (SWMM) to design the spatial capacity and quantity of rain barrels for urban flood mitigation, and proposes a simulation-optimization model for effectively searching for the optimal design. In the simulation method, we design a series of regular spatial distributions of the capacity and quantity of rainwater catchment facilities and simulate the resulting reduction in flooding for a variety of design forms using SWMM. We further calculate the net benefit, equal to the decrease in inundation loss minus the facility cost, and the best solution of the simulation method becomes the initial solution of the optimization model. In the optimization method, we first use the simulation outcomes and a Back-Propagation Neural Network (BPNN) to develop a water level simulation model of the urban drainage system, replacing SWMM, whose graphical-user-interface operation is hard to combine with an optimization model and method. We then embed the BPNN-based simulation model into the developed optimization model, whose objective function minimizes the negative net benefit. Finally, we establish a tabu search-based algorithm to optimize the planning solution. The developed method is applied to Zhonghe District, Taiwan. Results showed that applying tabu search and the BPNN-based simulation model within the optimization model not only finds solutions 12.75% better than the simulation method but also resolves the limitations of previous studies. Furthermore, the optimized spatial rain barrel design can reduce inundation loss by 72% for historical flood events.
Compression of electromyographic signals using image compression techniques.
Costa, Marcus Vinícius Chaffim; Berger, Pedro de Azevedo; da Rocha, Adson Ferreira; de Carvalho, João Luiz Azevedo; Nascimento, Francisco Assis de Oliveira
2008-01-01
Despite the growing interest in the transmission and storage of electromyographic signals for long periods of time, few studies have addressed the compression of such signals. In this article we present an algorithm for compression of electromyographic signals based on the JPEG2000 coding system. Although the JPEG2000 codec was originally designed for compression of still images, we show that it can also be used to compress EMG signals for both isotonic and isometric contractions. For EMG signals acquired during isometric contractions, the proposed algorithm provided compression factors ranging from 75 to 90%, with an average PRD ranging from 3.75% to 13.7%. For isotonic EMG signals, the algorithm provided compression factors ranging from 75 to 90%, with an average PRD ranging from 3.4% to 7%. The compression results using the JPEG2000 algorithm were compared to those using other algorithms based on the wavelet transform.
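The PRD figures quoted above (percentage root-mean-square difference) are commonly computed as follows; this is one standard definition, and the paper may use a variant (e.g. with the signal mean removed):

```python
import numpy as np

def prd(original: np.ndarray, reconstructed: np.ndarray) -> float:
    """Percentage root-mean-square difference between two signals."""
    return 100.0 * np.sqrt(np.sum((original - reconstructed) ** 2)
                           / np.sum(original ** 2))
```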
PSO-Based Smart Grid Application for Sizing and Optimization of Hybrid Renewable Energy Systems
Mohamed, Mohamed A.; Eltamaly, Ali M.; Alolah, Abdulrahman I.
2016-01-01
This paper introduces an optimal sizing algorithm for a hybrid renewable energy system using smart grid load management application based on the available generation. This algorithm aims to maximize the system energy production and meet the load demand with minimum cost and highest reliability. This system is formed by photovoltaic array, wind turbines, storage batteries, and diesel generator as a backup source of energy. Demand profile shaping as one of the smart grid applications is introduced in this paper using load shifting-based load priority. Particle swarm optimization is used in this algorithm to determine the optimum size of the system components. The results obtained from this algorithm are compared with those from the iterative optimization technique to assess the adequacy of the proposed algorithm. The study in this paper is performed in some of the remote areas in Saudi Arabia and can be expanded to any similar regions around the world. Numerous valuable results are extracted from this study that could help researchers and decision makers. PMID:27513000
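A minimal particle swarm loop of the kind used for such sizing problems; the two decision variables and the cost function below are made-up placeholders, not the paper's economic and reliability model:

```python
import numpy as np

def cost(x):                         # hypothetical: capital cost + unmet-load penalty
    pv, batt = x
    return 300 * pv + 150 * batt + 5000.0 / (1 + pv + 0.5 * batt)

rng = np.random.default_rng(0)
pos = rng.uniform(0, 100, (30, 2)); vel = np.zeros_like(pos)
pbest, pcost = pos.copy(), np.apply_along_axis(cost, 1, pos)
for _ in range(200):
    g = pbest[pcost.argmin()]                       # global best particle
    vel = (0.7 * vel + 1.5 * rng.random(pos.shape) * (pbest - pos)
                     + 1.5 * rng.random(pos.shape) * (g - pos))
    pos = np.clip(pos + vel, 0, None)               # sizes cannot go negative
    c = np.apply_along_axis(cost, 1, pos)
    better = c < pcost
    pbest[better], pcost[better] = pos[better], c[better]
print(pbest[pcost.argmin()], pcost.min())           # optimum PV / battery sizes
```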
Dynamics of flood water infiltration and ground water recharge in hyperarid desert.
Dahan, Ofer; Tatarsky, Boaz; Enzel, Yehouda; Kulls, Christoph; Seely, Mary; Benito, Gererdo
2008-01-01
A study of flood water infiltration and ground water recharge of a shallow alluvial aquifer was conducted in the hyperarid section of the Kuiseb River, Namibia. The study site was selected to represent a typical desert ephemeral river. An instrumental setup allowed, for the first time, continuous monitoring of infiltration during a flood event through the channel bed and the entire vadose zone. The monitoring system included flexible time domain reflectometry probes designed to measure the temporal variation in vadose zone water content, and instruments to concurrently measure the levels of flood and ground water. A sequence of five individual floods was monitored during the rainy season in early summer 2006. These newly generated data served to elucidate the dynamics of flood water infiltration. Each flood initiated an infiltration event that was expressed in wetting of the vadose zone followed by a measurable rise in the water table. The data enabled direct calculation of the infiltration fluxes by various independent methods. The floods varied in their stages, peaks, and initial water contents. However, all floods produced very similar flux rates, suggesting that recharge rates are less affected by the flood stages and rather controlled by flow duration and the available aquifer storage beneath the channel. Large floods inundate the stream channel terraces and promote larger transmission losses; these, however, make only a negligible contribution to ground water recharge. It is the flood duration within the active streambed, which may increase with flood magnitude, that is important to the recharge process.
Injection and Monitoring at the Wallula Basalt Pilot Project
McGrail, B. Peter; Spane, Frank A.; Amonette, James E.; ...
2014-01-01
Continental flood basalts represent one of the largest geologic structures on Earth but have received comparatively little attention for geologic storage of CO2. Flood basalt lava flows have flow tops that are porous and permeable, with large potential capacity for storage of CO2. In appropriate geologic settings, interbedded sediment layers and dense, low-permeability basalt flow interior sections may act as effective seals, allowing time for mineralization reactions to occur. Previous laboratory experiments showed the relatively rapid chemical reaction of CO2-saturated pore water with basalts to form stable carbonate minerals. However, recent laboratory tests with water-saturated supercritical CO2 show that mineralization reactions occur in this phase as well, providing a second and potentially more important mineralization pathway than was previously understood. Field testing of these concepts is proceeding: drilling of the world's first supercritical CO2 injection well in flood basalt was completed in May 2009 near the township of Wallula in Washington State, and the corresponding CO2 injection permit was granted by the State of Washington in March 2011. Injection of a nominal 1000 MT of CO2 was completed in August 2013 and site monitoring is in progress. Well logging conducted immediately after injection termination confirmed the presence of CO2 predominantly within the upper flow top region, and showed no evidence of vertical CO2 migration outside the well casing. Shallow soil gas samples collected around the injection well show no evidence of leakage, and fluid and gas samples collected from the injection zone show strongly elevated concentrations of Ca, Mg, Mn, and Fe and 13C/18O isotopic shifts that are consistent with basalt-water chemical reactions. If proven viable by this field test and others in progress or being planned, major flood basalts in the U.S., India, and perhaps Australia would provide significant additional CO2 storage capacity and additional geologic sequestration options in regions of these countries where conventional storage options are limited.
Physical parameters of Fluvisols on flooded and non-flooded terraces
NASA Astrophysics Data System (ADS)
Kercheva, Milena; Sokołowska, Zofia; Hajnos, Mieczysław; Skic, Kamil; Shishkov, Toma
2017-01-01
The heterogeneity of the soil physical properties of Fluvisols, the lack of large pristine areas, and the different moisture regimes on non-flooded and flooded terraces make it difficult to find a soil profile that can serve as a baseline for estimating the impact of natural or anthropogenic factors on soil evolution. The aim of this study is to compare the pore size distribution of pristine Fluvisols on flooded and non-flooded terraces using the soil water retention curve method, mercury intrusion porosimetry, nitrogen adsorption isotherms, and water vapour sorption. The pore size distribution of the humic horizons of pristine Fluvisols on the non-flooded terrace differs from that of Fluvisols on the flooded terrace. The peaks of textural and structural pores are higher in the humic horizons under more humid conditions. The structural characteristics of the subsoil horizons depend on soil texture and evolution stage. The peaks of textural pores at about 1 mm diminish with decreasing soil organic content. Structureless horizons are characterized by a uni-modal pore size distribution. Although the content of structural pores in the subsoil horizons of Fluvisols on the non-flooded terrace is low, these pores are represented by biopores, as the coefficient of filtration is moderately high. The difference between non-flooded and flooded profiles is well expressed by the available water storage and by the volume and mean radius of pores, obtained by mercury intrusion porosimetry and water desorption, which are higher in the surface horizons of the frequently flooded Fluvisols.
Rapid Flood Map Generation from Spaceborne SAR Observations
NASA Astrophysics Data System (ADS)
Yun, S. H.; Liang, C.; Manipon, G.; Jung, J.; Gurrola, E. M.; Owen, S. E.; Hua, H.; Agram, P. S.; Webb, F.; Sacco, G. F.; Rosen, P. A.; Simons, M.
2016-12-01
The Advanced Rapid Imaging and Analysis (ARIA) team responded to the January 2016 US Midwest floods along the Mississippi River. Daily teleconferences with FEMA, NOAA, NGA, and USGS provided information on precipitation and flood crest migration, based on which we coordinated with the Japan Aerospace Exploration Agency (JAXA) through NASA headquarters for JAXA's timely ALOS-2 tasking over two paths. We produced flood extent maps using ALOS-2 SM3 mode Level 1.5 data provided through the International Charter and stored at the US Geological Survey's Hazards Data Distribution System (HDDS) archive. On January 6, the first four frames (70 km x 240 km) were acquired, which included the City of Memphis. We registered post-event SAR images to pre-event images, applied radiometric calibration, and took the logarithm of the ratio of the two images. Two thresholds were applied to identify flooded areas that became open water (colored in blue) and flooded areas with tall vegetation (colored in red). The second path was acquired on January 11 further down the Mississippi River; seven frames (70 km x 420 km) were acquired and flood maps were created in a similar fashion. The maps were delivered to FEMA as well as posted on ARIA's public website. FEMA stated that SAR provides inspection priority for optical imagery and ground response. The ALOS-2 data and products have been a very important source of information during this response as the flood crest moved downstream, and SAR continues to be an important resource at times when optical observations are not useful. In close collaboration with FEMA and USGS, we are also working on other flood events, including the June 2016 China floods using the European Space Agency's (ESA's) Sentinel-1 data, to produce flood extent maps and to identify the algorithmic needs and ARIA system requirements to automate and rapidly produce and deliver flood maps for future events. With the addition of the Sentinel-1B satellite, the composite expected wait time until a SAR satellite flies over a flooded area has become less than 12 hours. With more SAR missions, such as SAOCOM, the RADARSAT Constellation, Sentinel-1C/D, ALOS-3, and NISAR, SAR data are becoming ever more useful for rapid mapping of devastating floods, which are becoming more frequent and more severe around the world.
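A sketch of the described change-detection step, assuming calibrated, co-registered linear-power backscatter images; the threshold values are illustrative placeholders, not the ARIA team's calibrated ones:

```python
import numpy as np

def flood_classes(pre: np.ndarray, post: np.ndarray,
                  open_water_thr=-3.0, flooded_veg_thr=3.0):
    """Two-threshold classification of the post/pre log ratio (in dB)."""
    log_ratio = 10.0 * np.log10(post / pre)
    open_water = log_ratio < open_water_thr     # backscatter drop: open water ("blue")
    flooded_veg = log_ratio > flooded_veg_thr   # double-bounce rise: flooded vegetation ("red")
    return open_water, flooded_veg
```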
Characterization of Mediterranean hail-bearing storms using an operational polarimetric X-band radar
NASA Astrophysics Data System (ADS)
Vulpiani, G.; Baldini, L.; Roberto, N.
2015-07-01
This work documents the fruitful use of X-band radar observations for the monitoring of severe storms in an operational framework. More specifically, two severe hail-bearing Mediterranean storms that occurred in 2013 in southern Italy, flooding two important cities of Sicily, are described in terms of their polarimetric radar signatures and retrieved rainfall fields. The X-band dual-polarization radar operating inside the Catania airport (Sicily, Italy), managed by the Italian Department of Civil Protection, is used. A suitable processing is applied to the X-band radar measurements. The crucial procedural step relies on the differential phase processing, based on an iterative approach that uses a very short-length (1 km) moving window, allowing the observed high radial gradients of the differential phase to be properly captured. The parameterization of the attenuation correction algorithm, which uses the reconstructed differential phase shift, is derived from electromagnetic simulations based on 3 years of DSD observations collected in Rome (Italy). A fuzzy logic hydrometeor classification algorithm was also adopted to support the analysis of the storm characteristics. The precipitation field amounts were reconstructed using a combined polarimetric rainfall algorithm based on reflectivity and specific differential phase. The first storm was observed on 21 February, when a winter convective system that originated in the Tyrrhenian Sea hit only marginally the central-eastern coastline of Sicily, causing the flash flood of Catania. Due to the optimal radar location (the system is located just a few kilometers from the city center), it was possible to retrieve the storm characteristics well, including the amount of rainfall at the ground. Extemporaneous signal extinction, caused by a close-range hail core producing a significant differential phase shift over a very short range path, is documented. The second storm, which occurred on 21 August 2013, was a summer mesoscale convective system originated by the temperature gradient between sea and land surface; it lasted a few hours and eventually flooded the city of Siracusa. The underlying physical process, including the storm dynamics, is inferred by analysing the vertical sections of the polarimetric radar measurements. The high registered precipitation amount was fairly well reconstructed, though with a trend toward underestimation at increasing distances. Several episodes of signal extinction clearly manifested during the mature stage of the observed supercell.
Characterization of Mediterranean hail-bearing storms using an operational polarimetric X-band radar
NASA Astrophysics Data System (ADS)
Vulpiani, G.; Baldini, L.; Roberto, N.
2015-11-01
This work documents the effective use of X-band radar observations for monitoring severe storms in an operational framework. Two severe hail-bearing Mediterranean storms that occurred in 2013 in southern Italy, flooding two important Sicilian cities, are described in terms of their polarimetric radar signatures and retrieved rainfall fields. The X-band dual-polarization radar operating inside the Catania airport (Sicily, Italy), managed by the Italian Department of Civil Protection, is considered here. A suitable processing is applied to X-band radar measurements. The crucial procedural step relies on the differential phase processing, being preparatory for attenuation correction and rainfall estimation. It is based on an iterative approach that uses a very short-length (1 km) moving window, allowing proper capture of the observed high radial gradients of the differential phase. The parameterization of the attenuation correction algorithm, which uses the reconstructed differential phase shift, is derived from electromagnetic simulations based on 3 years of drop size distribution (DSD) observations collected in Rome (Italy). A fuzzy logic hydrometeor classification algorithm was also adopted to support the analysis of the storm characteristics. The precipitation field amounts were reconstructed using a combined polarimetric rainfall algorithm based on reflectivity and specific differential phase. The first storm was observed on 21 February when a winter convective system that originated in the Tyrrhenian Sea, marginally hit the central-eastern coastline of Sicily, causing a flash flood in Catania. Due to an optimal location (the system is located a few kilometers from the city center), it was possible to retrieve the storm characteristics fairly well, including the amount of rainfall field at the ground. Extemporaneous signal extinction, caused by close-range hail core causing significant differential phase shift in a very short-range path, is documented. The second storm, on 21 August 2013, was a summer mesoscale convective system that originated from a Mediterranean low pressure system lasting a few hours that eventually flooded the city of Syracuse. The undergoing physical process, including the storm dynamics, is inferred by analyzing the vertical sections of the polarimetric radar measurements. The high registered amount of precipitation was fairly well reconstructed, although with a trend toward underestimation at increasing distances. Several episodes of signal extinction were clearly manifested during the mature stage of the observed supercells.
Detection of dominant runoff generation processes in flood frequency analysis
NASA Astrophysics Data System (ADS)
Iacobellis, Vito; Fiorentino, Mauro; Gioia, Andrea; Manfreda, Salvatore
2010-05-01
The investigation of hydrologic similarity represents one of the most exciting challenges faced by hydrologists in the last few years, aiming to reduce uncertainty in flood prediction for ungauged basins (e.g., the IAHS Decade on Predictions in Ungauged Basins (PUB) - Sivapalan et al., 2003). In perspective, the identification of dominant runoff generation mechanisms may provide a strategy for catchment classification and the identification of hydrologically homogeneous regions. In this context, we exploited the framework of theoretically derived flood probability distributions in order to interpret the physical behavior of real basins. Recent developments in theoretically derived distributions have highlighted that in a given basin different runoff processes may coexist and modify or affect the shape of flood distributions. The identification of dominant runoff generation mechanisms represents a key signature of flood distributions, providing insight into hydrologic similarity. Iacobellis and Fiorentino (2000) introduced a novel distribution of flood peak annual maxima, the "IF" distribution, which exploited the variable source area concept coupled with a runoff threshold having scaling properties. More recently, Gioia et al. (2008) introduced the Two Component-IF (TCIF) distribution, generalizing the IF distribution based on two different threshold mechanisms, associated respectively with ordinary and extraordinary events. Indeed, ordinary floods are mostly due to rainfall events exceeding a threshold infiltration rate in a small source area, while the so-called outlier events, often responsible for the high skewness of flood distributions, are triggered by severe rainfall exceeding a threshold storage in a large portion of the basin. Within this scheme, we focused on the application of both models (IF and TCIF) over a considerable number of catchments belonging to different regions of Southern Italy. In particular, we stressed, as a case of strong general interest in the field of statistical hydrology, the role of procedures for parameter estimation and techniques for model selection in the case of nested distributions. References: Gioia, A., V. Iacobellis, S. Manfreda, M. Fiorentino, Runoff thresholds in derived flood frequency distributions, Hydrol. Earth Syst. Sci., 12, 1295-1307, 2008. Iacobellis, V., and M. Fiorentino (2000), Derived distribution of floods based on the concept of partial area coverage with a climatic appeal, Water Resour. Res., 36(2), 469-482. Sivapalan, M., Takeuchi, K., Franks, S. W., Gupta, V. K., Karambiri, H., Lakshmi, V., Liang, X., McDonnell, J. J., Mendiondo, E. M., O'Connell, P. E., Oki, T., Pomeroy, J. W., Schertzer, D., Uhlenbrook, S. and Zehe, E.: IAHS Decade on Predictions in Ungauged Basins (PUB), 2003-2012: Shaping an exciting future for the hydrological sciences, Hydrol. Sci. J., 48(6), 857-880, 2003.
Reliable data storage system design and implementation for acoustic logging while drilling
NASA Astrophysics Data System (ADS)
Hao, Xiaolong; Ju, Xiaodong; Wu, Xiling; Lu, Junqiang; Men, Baiyong; Yao, Yongchao; Liu, Dong
2016-12-01
Owing to the limitations of real-time transmission, reliable downhole data storage and fast ground reading have become key technologies in developing tools for acoustic logging while drilling (LWD). In order to improve the reliability of the downhole storage system in conditions of high temperature, intensive shake and periodic power supply, improvements were made in terms of hardware and software. In hardware, we integrated the storage system and data acquisition control module into one circuit board, to reduce the complexity of the storage process, by adopting the controller combination of digital signal processor and field programmable gate array. In software, we developed a systematic management strategy for reliable storage. Multiple-backup independent storage was employed to increase the data redundancy. A traditional error checking and correction (ECC) algorithm was improved and we embedded the calculated ECC code into all management data and waveform data. A real-time storage algorithm for arbitrary length data was designed to actively preserve the storage scene and ensure the independence of the stored data. The recovery procedure of management data was optimized to realize reliable self-recovery. A new bad block management idea of static block replacement and dynamic page mark was proposed to make the period of data acquisition and storage more balanced. In addition, we developed a portable ground data reading module based on a new reliable high speed bus to Ethernet interface to achieve fast reading of the logging data. Experiments have shown that this system can work stably below 155 °C with a periodic power supply. The effective ground data reading rate reaches 1.375 Mbps with 99.7% one-time success rate at room temperature. This work has high practical application significance in improving the reliability and field efficiency of acoustic LWD tools.
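The abstract does not spell out the improved ECC algorithm; as a reference point, the sketch below shows a classic single-error-correcting Hamming(7,4) code of the kind that downhole storage schemes build on (an illustration only, not the tool's implementation):

```python
def hamming74_encode(d):
    """Encode 4 data bits (list of 0/1) into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_correct(c):
    """Correct a single flipped bit in a received 7-bit codeword c
    (mutated in place) and return the 4 recovered data bits."""
    s = ((c[0] ^ c[2] ^ c[4] ^ c[6])
         | (c[1] ^ c[2] ^ c[5] ^ c[6]) << 1
         | (c[3] ^ c[4] ^ c[5] ^ c[6]) << 2)
    if s:
        c[s - 1] ^= 1              # syndrome is the 1-based error position
    return [c[2], c[4], c[5], c[6]]

cw = hamming74_encode([1, 0, 1, 1])
cw[4] ^= 1                         # simulate a single bit flip in storage
assert hamming74_correct(cw) == [1, 0, 1, 1]
```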
Flood damage estimation of companies: A comparison of Stage-Damage-Functions and Random Forests
NASA Astrophysics Data System (ADS)
Sieg, Tobias; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno
2017-04-01
The development of appropriate flood damage models plays an important role not only for damage assessment after an event but also for developing adaptation and risk mitigation strategies. So-called Stage-Damage-Functions (SDFs) are often applied as a standard approach to estimate flood damage. These functions assign a certain damage to the water depth, depending on the use or other characteristics of the exposed objects. Recent studies apply machine learning algorithms like Random Forests (RFs) to model flood damage. These algorithms usually consider more influencing variables, promise a more detailed insight into the damage processes, and provide an inherent validation scheme. Our study focuses on direct, tangible damage to single companies. The objective is to model and validate the flood damage suffered by single companies with SDFs and RFs. The data sets used are taken from two surveys conducted after the floods in the Elbe and Danube catchments in 2002 and 2013 in Germany. Damage to buildings (n = 430), equipment (n = 651) as well as goods and stock (n = 530) is taken into account. The model outputs are validated via comparison with the actual flood damage recorded by the surveys and are subsequently compared with each other. This study investigates the gain in model performance from the use of additional data, and the advantages and disadvantages of RFs compared to SDFs. RFs show an increase in model performance with an increasing number of data records over a comparatively large range, while the model performance of the SDFs already saturates for a small set of records. In addition, the RFs are able to identify damage-influencing variables, which improves the understanding of damage processes. Hence, RFs can slightly improve flood damage predictions and provide additional insight into the underlying mechanisms compared to SDFs.
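A minimal sketch of the RF set-up, with synthetic stand-ins for the survey data (the feature names in the comments are illustrative, not the survey's actual fields); it shows the inherent out-of-bag validation and the variable importances mentioned above:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

X = np.random.rand(430, 4)       # stand-ins: water depth, duration, precaution, size
y = np.random.rand(430)          # recorded building damage per company
rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=1)
rf.fit(X, y)
print(rf.oob_score_)             # inherent (out-of-bag) validation
print(rf.feature_importances_)   # ranks damage-influencing variables
```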
1982-09-01
and storage yard owned by the Burlington Northern Railroad. The latter tract is covered predominantly by invader plant species, with only a few trees ... vegetation. Beautification Measures - The planting of flood-tolerant species of trees and shrubs to restore this area is recommended. See plate G-4i for ... Plant species to be included in this area include the following trees: green ash, hackberry, red dogwood, viburnum, red maples, willow, staghorn sumac, hazel nut, river ...
NASA Astrophysics Data System (ADS)
Vilotte, J.-P.; Atkinson, M.; Michelini, A.; Igel, H.; van Eck, T.
2012-04-01
Increasingly dense seismic and geodetic networks are continuously transmitting a growing wealth of data from around the world. The multiple uses of these data led the seismological community to pioneer globally distributed open-access data infrastructures, standard services and formats, e.g., the Federation of Digital Seismograph Networks (FDSN) and the European Integrated Data Archives (EIDA). Our ability to acquire observational data outpaces our ability to manage, analyze and model them. Research in seismology today faces a fundamental paradigm shift: enabling advanced data-intensive analysis and modeling applications challenges conventional storage, computation and communication models and requires a new holistic approach. It is instrumental to exploit this cornucopia of data and to guarantee optimal operation and design of the high-cost monitoring facilities. The strategy of VERCE is driven by the needs of seismological data-intensive applications in data analysis and modeling. It aims to provide a comprehensive architecture and framework adapted to the scale and diversity of these applications, integrating the data infrastructures with Grid, Cloud and HPC infrastructures. It will allow prototyping solutions for new use cases as they emerge within the European Plate Observing System (EPOS), the ESFRI initiative of the solid Earth community. Computational seismology and information management increasingly revolve around massive amounts of data that stem from: (1) the flood of data from observational systems; (2) the flood of data from large-scale simulations and inversions; (3) the ability to economically store petabytes of data online; and (4) evolving Internet and data-aware computing capabilities. As data-intensive applications rapidly increase in scale and complexity, they require additional service-oriented architectures offering virtualization-based flexibility for complex and re-usable workflows. Scientific information management poses computer science challenges: acquisition, organization, query and visualization tasks scale almost linearly with the data volumes. The commonly used FTP-GREP metaphor allows gigabyte-sized datasets to be scanned today but will not work for terabyte-sized continuous waveform datasets. New data analysis and modeling methods, exploiting the signal coherence within dense network arrays, are nonlinear; pair-algorithms on N points scale as N². Waveform inversion and stochastic simulations raise computing and data handling challenges. These applications are unfeasible for tera-scale datasets without new parallel algorithms that use near-linear processing, storage and bandwidth, and that can exploit new computing paradigms enabled by the intersection of several technologies (HPC, parallel scalable database crawlers, data-aware HPC). These issues will be discussed on the basis of a number of core pilot data-intensive applications and use cases retained in VERCE. These core applications are related to: (1) data processing and data analysis methods based on correlation techniques; and (2) CPU-intensive applications such as large-scale simulation of synthetic waveforms in complex earth systems, and full waveform inversion and tomography. We shall analyze their workflow and data flow, and their requirements for a new service-oriented architecture and a data-aware platform with services and tools.
Finally, we will outline the importance of a new collaborative environment between seismology and computer science, together with the need for the emergence and the recognition of 'research technologists' mastering the evolving data-aware technologies and the data-intensive research goals in seismology.
NASA Astrophysics Data System (ADS)
Hadi, M. Z.; Djatna, T.; Sugiarto
2018-04-01
This paper develops a dynamic storage assignment model to solve the storage assignment problem (SAP) for beverage order picking in a drive-in rack warehousing system, determining the appropriate storage location and space for each beverage product dynamically so that system performance can be improved. The study constructs a graph model to represent drive-in rack storage positions, then combines association rule mining, class-based storage policies and an arrangement rule algorithm to determine an appropriate storage location and arrangement of products according to dynamic customer orders. The performance of the proposed model is measured in terms of rule adjacency accuracy, travel distance (for the picking process) and the probability of a product becoming expired, using a Last Come First Serve (LCFS) queue approach. Finally, the proposed model is implemented through computer simulation and its performance is compared with that of different storage assignment methods. The results indicate that the proposed model outperforms the other storage assignment methods.
NASA Astrophysics Data System (ADS)
Wilkinson, M. E.; Quinn, P. F.; Jonczyk, J.; Burke, S.; Nicholson, A.; Barber, N.; Owen, G.; Palmer, M.
2012-04-01
A number of studies have suggested that modern land-use management practices have increased surface runoff at the local scale. There is an urgent need for interventions that reduce the risk of flooding whilst also delivering multiple benefits (doing more for less). Many settlements that regularly suffer from flooding would benefit from upstream mitigation measures, as interventions at the source of runoff generation can have a positive impact on the flood hydrograph downstream. An integrated approach to managing runoff can also have multiple benefits for pollution and ecology, which could lead to beneficial impacts at the catchment scale. Belford, a small community in Northumberland, UK, has suffered an increased number of flood events over the past ten years. There is currently support within the Environment Agency for England and Wales for sustainable flood management solutions such as storage ponds, wetlands, beaver dams and willow riparian features, which are being trialled at Belford. These runoff attenuation features (RAFs) also benefit water quality, capture sediment and create new ecological zones. Although deploying numerous RAFs in Belford initially proved difficult within the existing regulatory framework, an efficient uptake process is now supported by local regulators, including several branches of the Environment Agency. The Belford runoff management framework provides a step-by-step guide to implementing mitigation measures in the Belford Burn catchment and could easily be applied to other catchments at a similar scale. The approach is based on implementing mitigation measures through engaging with catchment stakeholders and using solid field science and management protocols.
Hydrological Simulation of Flood Events At Large Basins Using Distributed Modelling
NASA Astrophysics Data System (ADS)
Vélez, J.; Vélez, I.; Puricelli, M.; Francés, F.
Recent advances in technology allow the scientific community to develop new procedures to reduce the risk associated with flood events. A conceptual distributed model, named TETIS, has been implemented to simulate the hydrological processes involved during floods. The basin is divided into rectangular cells, all of them connected according to the drainage network. The rainfall-runoff process is modelled using four linked tanks at each cell with different outflow relationships at each tank, which represent ET, direct runoff, interflow and base flow, respectively. The routing along the channel network uses basin geomorphologic characteristics coupled to the kinematic wave procedure. The vertical movement within each cell is described using simple relationships based on soil properties such as field capacity and saturated hydraulic conductivities, which were previously obtained from land use, lithology, edaphology and basin property maps. The vertical processes included at each cell are: capillary storage, infiltration, percolation and underground losses. Finally, snowmelt and reservoir routing have been included. TETIS has been implemented in the flood warning system of the Tagus River, with a basin of 59 200 km². The time discretization of the input data is 15 minutes, and the cell size is 500 x 500 m. The basic parameter maps were estimated for the entire basin, and calibration and validation were performed using recorded events in the upper part of the basin. Calibration confirmed the initial parameter estimation. Additionally, the validation in time and space showed the robustness of this type of model.
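The four linked tanks can be illustrated with a short per-cell update. This is a generic conceptual-tank sketch under assumed parameter names (hu for field capacity, ks and kp for the saturated conductivities, kb for a baseflow recession constant); it is not the actual TETIS code.

```python
def cell_step(rain, state, p):
    """One time step of a conceptual linked-tank cell (hedged sketch).
    state: dict with 'static' (capillary) and 'aquifer' storages [mm].
    p: parameters 'hu' (field capacity), 'ks', 'kp' (conductivities
    per step), 'kb' (baseflow recession constant)."""
    # the static (capillary) tank fills first, up to field capacity
    room = max(p["hu"] - state["static"], 0.0)
    to_static = min(rain, room)
    state["static"] += to_static
    excess = rain - to_static
    # infiltration is limited by the saturated hydraulic conductivity
    infiltration = min(excess, p["ks"])
    direct_runoff = excess - infiltration
    # percolation to the aquifer is limited by kp; the rest is interflow
    percolation = min(infiltration, p["kp"])
    interflow = infiltration - percolation
    state["aquifer"] += percolation
    base_flow = p["kb"] * state["aquifer"]
    state["aquifer"] -= base_flow
    return direct_runoff, interflow, base_flow

state = {"static": 0.0, "aquifer": 10.0}
params = {"hu": 50.0, "ks": 8.0, "kp": 3.0, "kb": 0.05}
print(cell_step(20.0, state, params))
```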
Fast non-interferometric iterative phase retrieval for holographic data storage.
Lin, Xiao; Huang, Yong; Shimura, Tsutomu; Fujimura, Ryushi; Tanaka, Yoshito; Endo, Masao; Nishimoto, Hajimu; Liu, Jinpeng; Li, Yang; Liu, Ying; Tan, Xiaodi
2017-12-11
Fast non-interferometric phase retrieval is a very important technique for phase-encoded holographic data storage and other phase-based applications due to its easy implementation, simple system setup, and robust noise tolerance. Here we present an iterative non-interferometric phase retrieval method for 4-level phase-encoded holographic data storage based on an iterative Fourier transform algorithm and a known portion of the encoded data, which increases the storage code rate to twice that of an amplitude-based method. Only a single image at the Fourier plane of the beam is captured for the iterative reconstruction. Since the beam intensity at the Fourier plane is more concentrated than in the reconstructed beam itself, the required diffraction efficiency of the recording media is reduced, which will significantly improve the dynamic range of the recording media. The phase retrieval requires only 10 iterations to achieve a phase data error rate of less than 5%, which we demonstrate experimentally by recording and reconstructing a test image. We believe our method will further advance holographic data storage in the era of big data.
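The iterative Fourier transform algorithm referred to above belongs to the Gerchberg-Saxton family. The sketch below is a generic member of that family under stated assumptions (4 phase levels and a mask of known "pilot" pixels standing in for the known data portion); the paper's exact constraints may differ.

```python
import numpy as np

def retrieve_phase(fourier_intensity, known_mask, known_phase,
                   levels=4, n_iter=10):
    """Gerchberg-Saxton-style retrieval (hedged sketch): alternate between
    the object plane (unit amplitude, quantized phase, known pilot pixels)
    and the Fourier plane (the single captured intensity image)."""
    target_amp = np.sqrt(fourier_intensity)
    step = 2 * np.pi / levels
    field = np.exp(1j * 2 * np.pi * np.random.rand(*fourier_intensity.shape))
    for _ in range(n_iter):
        # Fourier-plane constraint: impose the measured amplitude
        F = np.fft.fft2(field)
        F = target_amp * np.exp(1j * np.angle(F))
        obj = np.fft.ifft2(F)
        # object-plane constraints: quantize phase, re-impose known pixels
        phase = np.round(np.angle(obj) / step) * step
        phase[known_mask] = known_phase[known_mask]
        field = np.exp(1j * phase)
    return np.angle(field)
```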
Optimal Coordination of Building Loads and Energy Storage for Power Grid and End User Services
Hao, He; Wu, Di; Lian, Jianming; ...
2017-01-18
Demand response and energy storage play a profound role in the smart grid. The focus of this study is to evaluate the benefits of coordinating flexible loads and energy storage to provide power grid and end user services. We present a Generalized Battery Model (GBM) to describe the flexibility of building loads and energy storage. An optimization-based approach is proposed to characterize the parameters (power and energy limits) of the GBM for flexible building loads. We then develop optimal coordination algorithms to provide power grid and end user services such as energy arbitrage, frequency regulation, spinning reserve, as well as energy cost and demand charge reduction. Several case studies have been performed to demonstrate the efficacy of the GBM and coordination algorithms, and to evaluate the benefits of using their flexibility for power grid and end user services. We show that optimal coordination yields significant cost savings and revenue, and that the best option for power grid services is to provide energy arbitrage and frequency regulation. Furthermore, when coordinating flexible loads with energy storage to provide end user services, it is recommended to consider demand charge in addition to time-of-use price in order to flatten the aggregate power profile.
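A Generalized Battery Model reduces a flexible resource to power and energy limits, so a candidate dispatch can be screened with a simple feasibility check. The sketch below illustrates that abstraction; the parameter names and the single round-trip efficiency are assumptions, not the paper's formulation.

```python
def gbm_feasible(power_profile, dt, p_max, p_min, e_max, e0=0.0, eta=0.95):
    """Check whether a dispatch is feasible for a Generalized Battery
    Model (hedged sketch): power stays within [p_min, p_max] and the
    virtual state of charge within [0, e_max]. Positive power = charging."""
    e = e0
    for p in power_profile:
        if not (p_min <= p <= p_max):
            return False
        # charging incurs efficiency losses; discharging draws extra energy
        e += dt * (eta * p if p >= 0 else p / eta)
        if not (0.0 <= e <= e_max):
            return False
    return True

# e.g. one hour steps, +/- 50 kW limits, 200 kWh of virtual storage
print(gbm_feasible([30, 40, -20, -50], dt=1.0,
                   p_max=50, p_min=-50, e_max=200, e0=100))
```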
Reinforcement learning techniques for controlling resources in power networks
NASA Astrophysics Data System (ADS)
Kowli, Anupama Sunil
As power grids transition towards increased reliance on renewable generation, energy storage and demand response resources, an effective control architecture is required to harness the full functionalities of these resources. There is a critical need for control techniques that recognize the unique characteristics of the different resources and exploit the flexibility afforded by them to provide ancillary services to the grid. The work presented in this dissertation addresses these needs. Specifically, new algorithms are proposed, which allow control synthesis in settings wherein the precise distribution of the uncertainty and its temporal statistics are not known. These algorithms are based on recent developments in Markov decision theory, approximate dynamic programming and reinforcement learning. They impose minimal assumptions on the system model and allow the control to be "learned" based on the actual dynamics of the system. Furthermore, they can accommodate complex constraints such as capacity and ramping limits on generation resources, state-of-charge constraints on storage resources, comfort-related limitations on demand response resources and power flow limits on transmission lines. Numerical studies demonstrating applications of these algorithms to practical control problems in power systems are discussed. Results demonstrate how the proposed control algorithms can be used to improve the performance and reduce the computational complexity of the economic dispatch mechanism in a power network. We argue that the proposed algorithms are eminently suitable to develop operational decision-making tools for large power grids with many resources and many sources of uncertainty.
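As a concrete instance of learning control from observed dynamics alone, the sketch below shows standard tabular Q-learning; it is a textbook version, not the dissertation's algorithm, and env_step stands in for any dispatch simulator that returns the next state and a reward.

```python
import random
from collections import defaultdict

def q_learning(env_step, states, actions, episodes=500,
               alpha=0.1, gamma=0.95, eps=0.1):
    """Tabular Q-learning (hedged sketch): learns a dispatch policy from
    sampled transitions only, with no model of the uncertainty.
    env_step(s, a) -> (next_state, reward) is an assumed simulator."""
    Q = defaultdict(float)
    for _ in range(episodes):
        s = random.choice(states)
        for _ in range(24):  # e.g. one day of hourly decisions
            # epsilon-greedy exploration over the action set
            if random.random() < eps:
                a = random.choice(actions)
            else:
                a = max(actions, key=lambda act: Q[(s, act)])
            s2, r = env_step(s, a)
            best_next = max(Q[(s2, act)] for act in actions)
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s2
    return Q
```

Constraints such as state-of-charge or ramping limits would be enforced inside the simulator by restricting the admissible action set per state.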
Motor Control of Two Flywheels Enabling Combined Attitude Control and Bus Regulation
NASA Technical Reports Server (NTRS)
Kenny, Barbara H.
2004-01-01
This presentation discussed the flywheel technology development work that is ongoing at NASA GRC, with a particular emphasis on flywheel system control. The "field orientation" motor/generator control algorithm was discussed and explained. The position-sensorless angle and speed estimation algorithm was presented. The motor current response to a step change in command at low (10 kRPM) and high (60 kRPM) speed was discussed. The flywheel DC bus regulation control was explained and experimental results presented. Finally, the combined attitude control and energy storage algorithm that controls two flywheels simultaneously was presented, with experimental results verifying its operational capability: high-speed flywheel energy storage (60,000 RPM) and the successful simultaneous control of both energy storage and a single axis of attitude with two flywheels. Overall, the presentation demonstrated that GRC has an operational flywheel facility.
NASA Technical Reports Server (NTRS)
Kascak, Peter E.; Kenny, Barbara H.; Dever, Timothy P.; Santiago, Walter; Jansen, Ralph H.
2001-01-01
An experimental flywheel energy storage system is described. This system is being used to develop a flywheel based replacement for the batteries on the International Space Station (ISS). Motor control algorithms which allow the flywheel to interface with a simplified model of the ISS power bus, and function similarly to the existing ISS battery system, are described. Results of controller experimental verification on a 300 W-hr flywheel are presented.
Will climate change affect weather types associated with flooding in the Elbe river basin?
NASA Astrophysics Data System (ADS)
Nissen, Katrin M.; Pardowitz, Tobias; Ulbrich, Uwe; Nied, Manuela
2013-04-01
This study investigates the effects of anthropogenic climate change on weather types associated with flooding in the Elbe river basin. The study is based on an ensemble of 3 simulations with the ECHAM5 MPIOM coupled model forced with historical and SRES A1B greenhouse gas concentrations. Relevant weather types, occurring in association with recent flood events, are identified in the ERA40 reanalysis data set. The weather types are classified with the SANDRA cluster algorithm. Distributions of tropospheric humidity content, 500 hPa geopotential height and 500 hPa temperature over Europe are taken as input parameters. 8 (out of 40) weather types are found to be associated with flooding events in the Elbe river basin. The majority of these (6) typically occur during winter, while 2 are warm season patterns. Downscaling reveals characteristic precipitation anomalies associated with the individual patterns. The 8 flood-relevant weather types are then identified in the ECHAM5 simulations. The effect of climate change on these patterns is investigated by comparing the last 30 years of the previous century to the last 30 years of the 21st century. According to the model, the frequency of most patterns will not change. 5 patterns may experience a statistically significant increase in the mean precipitation over the catchment area and 4 patterns an increase in extreme precipitation. Persistence may slightly decrease for 2 patterns and remain unchanged for the others. Overall, this indicates a moderate increase in the risk of Elbe river flooding, related to changes in the weather patterns, in the coming decades.
NASA Astrophysics Data System (ADS)
Cifelli, R.; Johnson, L. E.; White, A. B.
2014-12-01
Advancements in monitoring and prediction of precipitation and severe storms can provide significant benefits for water resource managers, allowing them to mitigate flood damage risks, capture additional water supplies and offset drought impacts, and enhance ecosystem services. A case study for the San Francisco Bay area provides the context for quantification of the benefits of an Advanced Quantitative Precipitation Information (AQPI) system. The AQPI builds off more than a decade of NOAA research and applications of advanced precipitation sensors, data assimilation, numerical models of storms and storm runoff, and systems integration for real-time operations. An AQPI would dovetail with current National Weather Service forecast operations to provide higher-resolution monitoring of rainfall events and longer lead time forecasts. A regional resource accounting approach has been developed to quantify the incremental benefits assignable to the AQPI system; these benefits total $35 M/yr in the 9-county Bay region. Depending on the jurisdiction, large benefits for flood damage avoidance may accrue for locations having dense development in flood plains. In other locations, forecast-based reservoir operations can increase reservoir storage for water supplies. Ecosystem services benefits for fisheries may be obtained from increased reservoir storage and downstream releases. Benefits in the transportation sector are associated with increased safety and avoided delays. Compared to AQPI system implementation and O&M costs over a 10-year operations period, a benefit-cost (B/C) ratio is computed which ranges between 2.8 and 4. It is important to acknowledge that many of the benefits are dependent on appropriate and adequate response by the hazards and water resources management agencies and citizens.
Floodplain-mapping With Modern It-instruments
NASA Astrophysics Data System (ADS)
Bley, D.; Pasche, E.
Of all natural hazards, floods occur most frequently worldwide, claim the most casualties and cause the largest economic losses. Reasons are anthropogenic changes (river correction, land surface sealing, forest dieback, climatic changes) combined with a high population density. Counteractions must be the resettlement of human beings away from flood-prone areas, flood controls and environmental monitoring, as well as renaturalization and provision of retention basins and areas. The consequence, especially considering the recent flood events on the rivers Rhine, Odra and Danube, must be preventive and sustainable flood control. As a consequence, the legislator demanded in the Water Management Act nation-wide floodplain mapping, to preserve the necessary retention areas for high water flows and prevent misuse. In this context, water level calculations based on a one-dimensional steady-flow computer model are among the major tasks in hydraulic engineering practice. Bjoernsen Consulting Engineers developed, in cooperation with the Technical University of Hamburg-Harburg, the integrated software system WSPWIN. It is based upon state-of-the-art information technology and the latest developments in hydraulic research. WSPWIN consists of a pre-processing module, a calculation core, and GIS-based post-processing elements. As water level calculations require the recording and storage of large amounts of topographic and hydraulic data, it is helpful that WSPWIN includes an interactive graphical profile editor, which allows visual data checking and editing. The calculation program comprises water level calculations under steady uniform and steady non-uniform flow conditions using the formulas of Darcy-Weisbach and Gauckler-Manning-Strickler. Bridges, weirs, pipes as well as the effects of submerged vegetation are taken into account. Post-processing includes plotting facilities for cross-sectional and longitudinal profiles as well as map-oriented GIS-based data editing and result presentation. Import of digital elevation models and generation of profiles are possible. Furthermore, the intersection of the DEM with the calculated water level enables the creation of floodplain maps. WSPWIN is the official standard software for one-dimensional hydraulic modeling in six German Federal States, where it is used by all water-management agencies. Moreover, many private companies, universities and water associations employ WSPWIN as well. The program is presented showing the procedure and difficulties of floodplain mapping and flood control on a Bavarian river.
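The Gauckler-Manning-Strickler formula used in the calculation core can be illustrated with a small normal-depth computation. The bisection solver below is a hedged sketch for a rectangular cross-section, unrelated to WSPWIN's actual implementation.

```python
def gms_discharge(k_st, area, wetted_perimeter, slope):
    """Gauckler-Manning-Strickler: Q = k_st * A * R^(2/3) * sqrt(S),
    with hydraulic radius R = A / P."""
    r_h = area / wetted_perimeter
    return k_st * area * r_h ** (2.0 / 3.0) * slope ** 0.5

def water_level(q_target, width, k_st, slope, tol=1e-6):
    """Steady uniform-flow depth in a rectangular channel, found by
    bisection (discharge increases monotonically with depth)."""
    lo, hi = tol, 50.0
    while hi - lo > tol:
        h = 0.5 * (lo + hi)
        q = gms_discharge(k_st, width * h, width + 2 * h, slope)
        lo, hi = (h, hi) if q < q_target else (lo, h)
    return 0.5 * (lo + hi)

# e.g. 120 m3/s in a 40 m wide channel, k_st = 30 m^(1/3)/s, slope 0.1 %
print(water_level(120.0, 40.0, 30.0, 0.001))
```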
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowen, Benjamin; Ruebel, Oliver; Fischer, Curt R.
BASTet is an advanced software library written in Python. BASTet serves as the analysis and storage library for the OpenMSI project. BASTet is an integrated framework for: i) storage of spectral imaging data, ii) storage of derived analysis data, iii) provenance of analyses, iv) integration and execution of analyses via complex workflows. BASTet implements the API for the HDF5 storage format used by OpenMSI. Analyses that are developed using BASTet benefit from direct integration with the storage format, automatic tracking of provenance, and direct integration with command-line and workflow execution tools. BASTet also defines interfaces that enable developers to directly integrate their analysis with OpenMSI's web-based viewing infrastructure without having to know OpenMSI. BASTet also provides numerous helper classes and tools to assist with the conversion of data files, ease parallel implementation of analysis algorithms, ease interaction with web-based functions, and provide description methods for data reduction. BASTet also includes detailed developer documentation, user tutorials, iPython notebooks, and other supporting documents.
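To give a flavour of the kind of HDF5 layout such a library manages, here is a minimal h5py sketch. The group names, dataset shapes and provenance attributes are hypothetical and do not reflect BASTet's actual schema or API.

```python
import h5py
import numpy as np

# Hypothetical file layout; BASTet's real HDF5 schema may differ.
with h5py.File("experiment.h5", "w") as f:
    # raw spectral image: x, y, m/z axis, stored chunked and compressed
    raw = f.create_dataset("entry/msi_data",
                           data=np.random.rand(64, 64, 1000),
                           compression="gzip", chunks=True)
    # derived analysis stored alongside the raw data
    ana = f.create_group("entry/analysis/nmf_0")
    ana.create_dataset("components", data=np.random.rand(10, 1000))
    # provenance recorded as attributes of the derived data
    ana.attrs["source_dataset"] = raw.name
    ana.attrs["analysis_type"] = "nmf"
    ana.attrs["parameters"] = "n_components=10"
```

Keeping the provenance attributes next to each derived dataset is what lets a viewer or workflow engine trace any result back to its inputs without external bookkeeping.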
Code of Federal Regulations, 2014 CFR
2014-07-01
... response to climate change. Conservation. The protection, preservation, management, or restoration of... structure and/or function and changes resources, substrate availability, or the physical environment... carbon; climate regulation; water filtration, purification, and storage; soil stabilization; flood...
Code of Federal Regulations, 2013 CFR
2013-07-01
... response to climate change. Conservation. The protection, preservation, management, or restoration of... structure and/or function and changes resources, substrate availability, or the physical environment... carbon; climate regulation; water filtration, purification, and storage; soil stabilization; flood...
Code of Federal Regulations, 2012 CFR
2012-07-01
... response to climate change. Conservation. The protection, preservation, management, or restoration of... structure and/or function and changes resources, substrate availability, or the physical environment... carbon; climate regulation; water filtration, purification, and storage; soil stabilization; flood...
Content-aware network storage system supporting metadata retrieval
NASA Astrophysics Data System (ADS)
Liu, Ke; Qin, Leihua; Zhou, Jingli; Nie, Xuejun
2008-12-01
Nowadays, content-based network storage has become a hot research topic in academia and industry [1]. In order to solve the problem of hit-rate decline caused by migration and to achieve content-based query, we exploit a new content-aware storage system which supports metadata retrieval to improve query performance. Firstly, we extend the SCSI command descriptor block to enable the system to understand self-defined query requests. Secondly, the extracted metadata is encoded in extensible markup language to improve universality. Thirdly, according to the demands of information lifecycle management (ILM), we store data at different storage levels and use a corresponding query strategy to retrieve them. Fourthly, as the file content identifier plays an important role in locating data and calculating block correlation, we use it to fetch files and sort query results through a friendly user interface. Finally, experiments indicate that the retrieval strategy and sort algorithm enhance retrieval efficiency and precision.
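The XML metadata encoding and ILM-tiered retrieval can be sketched as follows. The element names, tier labels and search order are illustrative assumptions, not the system's actual on-disk format.

```python
import xml.etree.ElementTree as ET

def encode_metadata(file_id, attrs):
    """Encode extracted metadata as XML for portability (hedged sketch)."""
    root = ET.Element("file", id=file_id)
    for key, value in attrs.items():
        ET.SubElement(root, "meta", name=key).text = str(value)
    return ET.tostring(root, encoding="unicode")

TIERS = {"hot": 0, "warm": 1, "cold": 2}  # ILM storage levels

def query_plan(records, predicate):
    """Search faster tiers first, as an ILM-aware retrieval strategy
    might: recently active data is likely to live on the hot tier."""
    for tier in sorted(TIERS, key=TIERS.get):
        hits = [r for r in records if r["tier"] == tier and predicate(r)]
        if hits:
            return hits
    return []

print(encode_metadata("f42", {"type": "image", "owner": "lab"}))
```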
NASA Astrophysics Data System (ADS)
Unland, N. P.; Cartwright, I.; Cendón, D. I.; Chisari, R.
2014-12-01
Bank exchange processes within 50 m of the Tambo River, southeast Australia, have been investigated through the combined use of 3H and 14C. Groundwater residence times increase towards the Tambo River, which suggests the absence of significant bank storage. Major ion concentrations and δ2H and δ18O values of bank water also indicate that bank infiltration does not significantly impact groundwater chemistry under baseflow and post-flood conditions, suggesting that the gaining nature of the river may drive the return of bank storage water to the Tambo River within days of peak flood conditions. The covariance between 3H and 14C indicates leakage and mixing between old (~17 200 years) groundwater from a semi-confined aquifer and younger groundwater (<100 years) near the river, where confining layers are less prevalent. It is likely that the upward infiltration of deeper groundwater from the semi-confined aquifer during flooding limits bank infiltration. Furthermore, the more saline deeper groundwater likely controls the geochemistry of water in the river bank, minimising the chemical impact that bank infiltration has in this setting. These processes, coupled with the strongly gaining nature of the Tambo River, are likely to be the factors reducing the chemical impact of bank storage in this setting. This study illustrates the complex nature of river-groundwater interactions and the potential pitfall of assuming simple or idealised conditions when conducting hydrogeological studies.
NASA Astrophysics Data System (ADS)
Habert, J.; Ricci, S.; Le Pape, E.; Thual, O.; Piacentini, A.; Goutal, N.; Jonville, G.; Rochoux, M.
2016-01-01
This paper presents a data-driven hydrodynamic simulator based on a 1-D hydraulic solver dedicated to flood forecasting with lead times of one hour up to 24 h. The goal of the study is to reduce uncertainties in the hydraulic model and thus provide more reliable simulations and forecasts in real time for operational use by the national hydrometeorological flood forecasting center in France. Previous studies have shown that sequential assimilation of water level or discharge data makes it possible to adjust the inflows to the hydraulic network, resulting in a significant improvement of the discharge while leaving the water level state imperfect. Two strategies are proposed here to improve the water level-discharge relation in the model. First, a modeling strategy improves the description of the river bed geometry using topographic and bathymetric measurements. Second, an inverse modeling strategy locally corrects friction coefficients in the river bed and the flood plain through the assimilation of in situ water level measurements. This approach is based on an Extended Kalman filter algorithm that sequentially assimilates data to infer first the upstream and lateral inflows and then the friction coefficients. It provides a time-varying correction of the hydrological boundary conditions and hydraulic parameters. The merits of both strategies are demonstrated on the Marne catchment in France for eight validation flood events; the January 2004 flood event is used as an illustrative example throughout the paper. The Nash-Sutcliffe criterion for water level is improved from 0.135 to 0.832 for a 12-h forecast lead time with the data assimilation strategy. These developments have been implemented at the SAMA SPC (local flood forecasting service in the Haute-Marne French department) and used for operational forecasting since 2013. They were shown to provide an efficient tool for evaluating flood risk and to improve the flood early warning system. Complementary to the deterministic forecast of the hydraulic state, an uncertainty range is estimated relying on off-line and on-line diagnosis. The possibilities of further extending the control vector while limiting the computational cost and the equifinality problem are finally discussed.
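The analysis step of the Extended Kalman filter used for the inference can be written compactly. The sketch below is the standard EKF update; the composition of the control vector and the Jacobian computation (e.g. finite differences around the hydraulic model) are assumptions about how it would be wired up.

```python
import numpy as np

def ekf_update(x, P, z, h, H_jac, R):
    """One Extended Kalman filter analysis step (hedged sketch).
    x: control vector (e.g. inflow corrections, friction coefficients)
    P: control error covariance
    z: observed water levels; h(x): hydraulic model run mapping the
    controls to simulated water levels; H_jac(x): Jacobian of h at x,
    e.g. obtained by finite differences; R: observation error covariance."""
    H = H_jac(x)
    y = z - h(x)                          # innovation
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

Assimilating inflows first and friction coefficients second, as the abstract describes, would amount to running this update twice per cycle with two different control vectors.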
NASA Astrophysics Data System (ADS)
Weiler, M.
2016-12-01
Heavy-rain-induced flash floods are still a serious hazard and cause heavy damage in urban areas. In particular, in spatially complex urban areas the temporal and spatial pattern of runoff generation processes during extreme rainfall events needs to be predicted over a wide spatial range, including the specific effects of green infrastructure and urban forests. In addition, the initial conditions (soil moisture pattern, water storage of green infrastructure) and the effect of lateral redistribution of water (run-on effects and re-infiltration) have to be included in order to realistically predict flash flood generation. We further developed the distributed, process-based model RoGeR (Runoff Generation Research) to include the relevant features and processes in urban areas in order to test the effects of different settings, initial conditions and the lateral redistribution of water on the predicted flood response. The uncalibrated model RoGeR runs at a spatial resolution of 1 m × 1 m (LiDAR, degree of sealing, land use), with soil properties and geology at 1:50,000. In addition, different green infrastructures are included in the model, as well as the effect of trees on interception and transpiration. A hydraulic model was included in RoGeR to predict surface runoff, water redistribution, and re-infiltration. During rainfall events RoGeR predicts at 5 min temporal resolution, but the model also simulates evapotranspiration and groundwater recharge during rain-free periods at a longer time step. The model framework was applied to several case studies in Germany where intense rainfall events produced flash floods causing high damage in urban areas, and to a long-term research catchment in an urban setting (Vauban, Freiburg), where a variety of green infrastructures dominates the hydrology. Urban-RoGeR allowed us to study the effects of different green infrastructures on reducing the flood peak, but also their effect on the water balance (evapotranspiration and groundwater recharge). We could also show that infiltration of surface runoff from areas with low infiltration capacity (lateral redistribution) can reduce flood peaks by over 90% in certain areas and situations. Finally, we also evaluated the model against long-term runoff observations (surface runoff, ET, roof runoff) and against flood marks in the selected case studies.
NASA Astrophysics Data System (ADS)
Pappenberger, F.; Beven, K. J.; Frodsham, K.; Matgen, P.
2005-12-01
Flood inundation models play an increasingly important role in assessing flood risk. The growth of 2D inundation models that are intimately related to raster maps of floodplains is occurring at the same time as an increase in the availability of 2D remote sensing data (e.g. SAR images and aerial photographs), against which model performance can be evaluated. This requires new techniques to be explored in order to evaluate model performance in two-dimensional space. In this paper we present a fuzzified pattern matching algorithm which compares favorably to a set of traditional measures. However, we further argue that model calibration has to go beyond the comparison of physical properties and should demonstrate how a weighting towards consequences, such as loss of property, can enhance model focus and prediction. Indeed, it will be necessary to abandon a fully spatial comparison in many scenarios to concentrate the model calibration exercise on specific points such as hospitals, police stations or emergency response centers. It can be shown that such point evaluations lead to significantly different flood hazard maps due to the averaging effect of a spatial performance measure. A strategy to balance the different needs (accuracy at certain spatial points and acceptable spatial performance) has to be based on a public and political decision-making process.
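A fuzzified comparison of binary flood maps can be sketched as a neighbourhood-weighted score. The window-averaging form below is one plausible reading of such a measure rather than the paper's exact formulation; point_fit illustrates the point-based evaluation the abstract argues for.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fuzzy_fit(observed, modelled, window=3):
    """Fuzzified comparison of two binary flood maps (hedged sketch):
    a modelled wet cell scores by the fraction of wet observed cells in
    its neighbourhood, relaxing the exact cell-to-cell match."""
    local_obs = uniform_filter(observed.astype(float), size=window)
    scores = np.where(modelled == 1, local_obs, 1.0 - local_obs)
    return scores.mean()

def point_fit(observed, modelled, points):
    """Evaluation restricted to critical locations (e.g. hospitals,
    police stations), as argued for in the abstract."""
    return np.mean([observed[r, c] == modelled[r, c] for r, c in points])

obs = np.zeros((20, 20), dtype=int); obs[5:12, 5:12] = 1
mod = np.zeros((20, 20), dtype=int); mod[6:13, 6:13] = 1
print(fuzzy_fit(obs, mod), point_fit(obs, mod, [(8, 8), (2, 2)]))
```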
NASA Astrophysics Data System (ADS)
Changnon, Stanley A.
1999-03-01
A record-breaking 24-h rainstorm on 17-18 July 1996 was centered on south Chicago and its southern and western suburbs, areas with a population of 3.4 million. The resulting flash flooding in Chicago and 21 suburbs broke all-time records in the region and brought the Illinois and Mississippi Rivers above flood stage. More than 4300 persons were evacuated from the flooded zones and 35000 homes experienced flood damage. Six persons were killed, and the total estimated cost of the flood (losses and recovery actions) was $645 million, ranking as Illinois' second most costly weather disaster on record after the 1993 flood. Extensive damage and travel delays occurred on metropolitan transportation systems (highways and railroads). Commuters were unable to reach Chicago for up to three days and more than 300 freight trains were delayed or rerouted. Communities dealt with removal of flood-damaged materials, as well as damage to streets, bridges, and sewage treatment and water treatment plants. Reduced crop yields in adjacent rural areas represented a $67 million loss of farm income. Conflicts between communities developed over blame for the flooding due to inadequate storage capacity, resulting in new regional flood planning. Federal and state aid ultimately reached $265 million, 41% of the storm costs. More than 85000 individuals received assistance, and 222 structures have been relocated under the federal Hazard Mitigation Grant Program at a cost of $19.6 million.
Scan-Line Methods in Spatial Data Systems
1990-09-04
algorithms in detail to show some of the implementation issues. Data Compression: Storage and transmission times can be reduced by using compression ... goes through the data. Luckily, there are good one-directional compression algorithms, such as run-length coding, in which each scan line can be ... independently compressed. These are the algorithms to use in a parallel scan-line system. Data compression is usually only used for long-term storage of
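Run-length coding of individual scan lines, as referenced in the snippet, is straightforward to sketch; because every line is encoded independently, lines can be compressed in parallel.

```python
def rle_encode(scan_line):
    """Run-length encode one scan line as (value, run) pairs; each line
    is self-contained, so a parallel scan-line system can encode and
    decode lines independently."""
    runs = []
    for v in scan_line:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

def rle_decode(runs):
    return [v for v, n in runs for _ in range(n)]

line = [0, 0, 0, 1, 1, 0, 0, 0, 0]
assert rle_decode(rle_encode(line)) == line
print(rle_encode(line))  # [[0, 3], [1, 2], [0, 4]]
```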
Large space structures control algorithm characterization
NASA Technical Reports Server (NTRS)
Fogel, E.
1983-01-01
Feedback control algorithms are developed for sensor/actuator pairs on large space systems. These algorithms have been sized in terms of (1) floating point operation (FLOP) demands; (2) storage for variables; and (3) input/output data flow. FLOP sizing (per control cycle) was done as a function of the number of control states and the number of sensor/actuator pairs. Storage for variables and I/O sizing was done for specific structure examples.
Flood triggering in Switzerland: the role of daily to monthly preceding precipitation
NASA Astrophysics Data System (ADS)
Froidevaux, P.; Schwanbeck, J.; Weingartner, R.; Chevalier, C.; Martius, O.
2015-09-01
Determining the role of different precipitation periods for peak discharge generation is crucial for both projecting future changes in flood probability and for short- and medium-range flood forecasting. In this study, catchment-averaged daily precipitation time series are analyzed prior to annual peak discharge events (floods) in Switzerland. The high number of floods considered - more than 4000 events from 101 catchments have been analyzed - allows significant information to be derived about the role of antecedent precipitation for peak discharge generation. Based on the analysis of precipitation time series, a new separation of flood-related precipitation periods is proposed: (i) the period 0 to 1 day before flood days, when the maximum flood-triggering precipitation rates are generally observed, (ii) the period 2 to 3 days before flood days, when longer-lasting synoptic situations generate "significantly higher than normal" precipitation amounts, and (iii) the period from 4 days to 1 month before flood days, when previous wet episodes may have already preconditioned the catchment. The novelty of this study lies in the separation of antecedent precipitation into the precursor antecedent precipitation (4 days before floods or earlier, called PRE-AP) and the short-range precipitation (0 to 3 days before floods, a period when precipitation is often driven by one persistent weather situation such as a stationary low-pressure system). A precise separation of "antecedent" and "peak-triggering" precipitation is not attempted. Instead, the strict definition of antecedent precipitation periods permits a direct comparison of all catchments. The precipitation accumulating 0 to 3 days before an event is the most relevant for floods in Switzerland. PRE-AP precipitation has only a weak and region-specific influence on flood probability. Floods were significantly more frequent after wet PRE-AP periods only in the Jura Mountains, in the western and eastern Swiss plateau, and at the outlet of large lakes. As a general rule, wet PRE-AP periods enhance the flood probability in catchments with gentle topography, high infiltration rates, and large storage capacity (karstic cavities, deep soils, large reservoirs). In contrast, floods were significantly less frequent after wet PRE-AP periods in glacial catchments because of reduced melt. For the majority of catchments, however, no significant correlation between precipitation amounts and flood occurrences is found when the last 3 days before floods are omitted from the precipitation amounts. Moreover, the PRE-AP was not higher for extreme floods than for annual floods with a high frequency, and was very close to climatology for all floods. The fact that floods are not significantly more frequent nor more intense after wet PRE-AP periods is a clear indicator of a short discharge memory of Pre-Alpine, Alpine and South Alpine Swiss catchments. Our study poses the question of whether the impact of long-term precursory precipitation on floods in such catchments is overestimated in the general perception. The results suggest that the consideration of a 3-4 day precipitation period should be sufficient to represent (understand, reconstruct, model, project) Swiss Alpine floods.
Flood triggering in Switzerland: the role of daily to monthly preceding precipitation
NASA Astrophysics Data System (ADS)
Froidevaux, P.; Schwanbeck, J.; Weingartner, R.; Chevalier, C.; Martius, O.
2015-03-01
Determining the role of different precipitation periods for peak discharge generation is crucial for both projecting future changes in flood probability and for short- and medium-range flood forecasting. We analyze catchment-averaged daily precipitation time series prior to annual peak discharge events (floods) in Switzerland. The high number of floods considered - more than 4000 events from 101 catchments have been analyzed - allows significant information to be derived about the role of antecedent precipitation for peak discharge generation. Based on the analysis of precipitation time series, we propose a new separation of flood-related precipitation periods: (i) the period 0 to 1 day before flood days, when the maximum flood-triggering precipitation rates are generally observed, (ii) the period 2 to 3 days before flood days, when longer-lasting synoptic situations generate "significantly higher than normal" precipitation amounts, and (iii) the period from 4 days to one month before flood days, when previous wet episodes may have already preconditioned the catchment. The novelty of this study lies in the separation of antecedent precipitation into the precursor antecedent precipitation (4 days before floods or earlier, called PRE-AP) and the short-range precipitation (0 to 3 days before floods, a period when precipitation is often driven by one persistent weather situation such as a stationary low-pressure system). Because we consider a high number of events and because we work with daily precipitation values, we do not separate the "antecedent" and "peak-triggering" precipitation. The whole precipitation recorded during the flood day is included in the short-range antecedent precipitation. The precipitation accumulating 0 to 3 days before an event is the most relevant for floods in Switzerland. PRE-AP precipitation has only a weak and region-specific influence on flood probability. Floods were significantly more frequent after wet PRE-AP periods only in the Jura Mountains, in the western and eastern Swiss plateau, and at the outlet of large lakes. As a general rule, wet PRE-AP periods enhance the flood probability in catchments with gentle topography, high infiltration rates, and large storage capacity (karstic cavities, deep soils, large reservoirs). In contrast, floods were significantly less frequent after wet PRE-AP periods in glacial catchments because of reduced melt. For the majority of catchments, however, no significant correlation between precipitation amounts and flood occurrences is found when the last three days before floods are omitted from the precipitation amounts. Moreover, the PRE-AP was not higher for extreme floods than for annual floods with a high frequency, and was very close to climatology for all floods. The weak influence of PRE-AP is a clear indicator of a short discharge memory of Prealpine, Alpine and Southalpine Swiss catchments. Our study nevertheless poses the question of whether the impact of long-term precursory precipitation on floods in such catchments is overestimated in the general perception. We conclude that the consideration of a 3-4 day precipitation period should be sufficient to represent (understand, reconstruct, model, project) Swiss Alpine floods.
Systems aspects of COBE science data compression
NASA Technical Reports Server (NTRS)
Freedman, I.; Boggess, E.; Seiler, E.
1993-01-01
A general approach to compression of diverse data from large scientific projects has been developed; this paper addresses the appropriate system and scientific constraints together with the algorithm development and test strategy. This framework has been implemented for the COsmic Background Explorer spacecraft (COBE) by retrofitting the existing VAX-based data management system with high-performance compression software permitting random access to the data. Algorithms which incorporate scientific knowledge and consume relatively few system resources are preferred over ad hoc methods. COBE exceeded its planned storage by a large and growing factor, and the retrieval of data significantly affects the processing, delaying the availability of data for scientific use and software testing. Embedded compression software is planned to make the project tractable by reducing the data storage volume to an acceptable level during normal processing.
Delivery of video-on-demand services using local storages within passive optical networks.
Abeywickrama, Sandu; Wong, Elaine
2013-01-28
At present, distributed storage systems have been widely studied to alleviate Internet traffic build-up caused by high-bandwidth, on-demand applications. Distributed storage arrays located locally within the passive optical network were previously proposed to deliver Video-on-Demand services. As an added feature, a popularity-aware caching algorithm was also proposed to dynamically maintain the most popular videos in the storage arrays of such local storages. In this paper, we present a new dynamic bandwidth allocation algorithm to improve Video-on-Demand services over passive optical networks using local storages. The algorithm exploits the use of standard control packets to reduce the time taken for the initial request communication between the customer and the central office, and to maintain the set of popular movies in the local storage. We conduct packet level simulations to perform a comparative analysis of the Quality-of-Service attributes between two passive optical networks, namely the conventional passive optical network and one that is equipped with a local storage. Results from our analysis highlight that strategic placement of a local storage inside the network enables the services to be delivered with improved Quality-of-Service to the customer. We further formulate power consumption models of both architectures to examine the trade-off between enhanced Quality-of-Service performance versus the increased power requirement from implementing a local storage within the network.
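The popularity-aware caching idea can be sketched as a counter-based store. The eviction rule below (replace the least-requested stored title once a newcomer overtakes it) is an illustrative assumption, not the paper's exact algorithm.

```python
from collections import Counter

class PopularityCache:
    """Popularity-aware local video store (hedged sketch): keep the most
    requested titles; evict the least popular when a newcomer's request
    count overtakes it."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.counts = Counter()   # request counts for all titles
        self.stored = set()       # titles currently held locally

    def request(self, title):
        self.counts[title] += 1
        if title in self.stored:
            return "local hit"            # served from the local storage
        if len(self.stored) < self.capacity:
            self.stored.add(title)
        else:
            coldest = min(self.stored, key=lambda t: self.counts[t])
            if self.counts[title] > self.counts[coldest]:
                self.stored.remove(coldest)
                self.stored.add(title)
        return "fetched from central office"

cache = PopularityCache(capacity=2)
for t in ["a", "b", "a", "c", "a"]:
    print(t, cache.request(t))
```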
Reconstructing the 2015 Flash Flood event of Salgar Colombia, The Case of a Poor Gauged Basin
NASA Astrophysics Data System (ADS)
Velasquez, N.; Zapata, E.; Hoyos Ortiz, C. D.; Velez, J. I.
2017-12-01
Flash flood events associated with severe precipitation are highly destructive, often resulting in significant human and economic losses. Due to their nature, flash floods tend to occur in medium to small basins located within complex high mountainous regions. In the Colombian Andean region these basins are very common, with the aggravating factor that vulnerability is considerably high, as some important human settlements are located within these basins, frequently occupying flood plains and other flash-flood prone areas. During the dawn of May 18, 2015, two severe rainfall events generated a flash flood in the municipality of Salgar, La Liboriana basin, located in the northwestern Colombian Andes, resulting in more than 100 human casualties and significant economic losses. The present work is a reconstruction of the hydrological processes that took place before and during the La Liboriana flash flood event, analyzed as a case of a poorly gauged basin. The event conditions were recreated based on radar retrievals and a distributed hydrological model, linked with a proposed 1D hydraulic model and a simple shallow landslide model. Results suggest that the flash flood was caused by the occurrence of two successive severe convective events over the same basin, with an important modulation associated with soil characteristics and water storage. Despite its simplicity, the proposed hydraulic model achieves a good representation of the flooded area during the event, with limitations due to the adopted spatial scale (12.7 m, from ALOS PALSAR images). Observed landslides were obtained from satellite images; the model skillfully simulates the landslide occurrence regions, with small differences in the exact locations. Radar data proved key to this case because of the location of specific convective cores and the estimation of rainfall intensity. In mountainous regions there exists a significant number of settlements with similar vulnerability and the same gauging conditions; the use of a low-cost modelling strategy could represent a good risk management tool in these regions with low planning capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frailey, Scott M.; Krapac, Ivan G.; Damico, James R.
2012-03-30
The Midwest Geological Sequestration Consortium (MGSC) carried out a small-scale carbon dioxide (CO2) injection test in a sandstone within the Clore Formation (Mississippian System, Chesterian Series) in order to gauge the large-scale CO2 storage that might be realized from enhanced oil recovery (EOR) of mature Illinois Basin oil fields via miscible liquid CO2 flooding.
NASA Astrophysics Data System (ADS)
Vathsala, H.; Koolagudi, Shashidhar G.
2017-01-01
In this paper we discuss a data mining application for predicting peninsular Indian summer monsoon rainfall and propose an algorithm that combines data mining and statistical techniques. We select likely predictors based on association rules that have the highest confidence levels. We then cluster the selected predictors to reduce their dimensionality and use cluster membership values for classification. We derive the predictors from local conditions in southern India, including mean sea level pressure, wind speed, and maximum and minimum temperatures. The global condition variables include southern oscillation and Indian Ocean dipole conditions. The algorithm predicts rainfall in five categories: Flood, Excess, Normal, Deficit and Drought. We use closed itemset mining, cluster membership calculations and a multilayer perceptron function in the algorithm to predict monsoon rainfall in peninsular India. Using Indian Institute of Tropical Meteorology data, we found the prediction accuracy of our proposed approach to be exceptionally good.
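The selection-clustering-classification pipeline can be sketched with standard tools. In this sketch a simple correlation filter stands in for the confidence-ranked association rules, the data are random placeholders, and the five rainfall classes are encoded 0-4; it shows the shape of the pipeline, not the paper's implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier

# Placeholder data: rows = years, columns = candidate predictors
rng = np.random.default_rng(0)
X = rng.random((60, 12))
y = rng.integers(0, 5, 60)   # Flood/Excess/Normal/Deficit/Drought as 0-4

# 1) predictor selection (correlation filter as a stand-in for
#    confidence-ranked association rules)
keep = [j for j in range(X.shape[1])
        if abs(np.corrcoef(X[:, j], y)[0, 1]) > 0.1]

# 2) dimensionality reduction via cluster membership of the selected
#    predictors (distances to the cluster centres act as features)
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X[:, keep])
membership = km.transform(X[:, keep])

# 3) multilayer perceptron classification on the membership features
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(membership, y)
print(clf.score(membership, y))
```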
Instrumentation for detailed bridge-scour measurements
Landers, Mark N.; Mueller, David S.; Trent, Roy E.; ,
1993-01-01
A portable instrumentation system is being developed to obtain channel bathymetry during floods for detailed bridge-scour measurements. Portable scour measuring systems have four components: sounding instrument, horizontal positioning instrument, deployment mechanisms, and data storage device. The sounding instrument will be a digital fathometer. Horizontal position will be measured using a range-azimuth based hydrographic survey system. The deployment mechanism designed for this system is a remote-controlled boat using a small waterplane area, twin-hull design. An on-board computer and radio will monitor the vessel instrumentation, record measured data, and telemeter data to shore.
Origin of the Colorado River experimental flood in Grand Canyon
Andrews, E.D.; Pizzi, L.A.
2000-01-01
The Colorado River is one of the most highly regulated and extensively utilized rivers in the world. Total reservoir storage is approximately four times the mean annual runoff of ~17 × 10⁹ m³ year⁻¹. Reservoir storage and regulation have decreased annual peak discharges and hydroelectric power generation has increased daily flow variability. In recent years, the incidental impacts of this development have become apparent, especially along the Colorado River through Grand Canyon National Park downstream from Glen Canyon Dam, and have caused widespread concern. Since the completion of Glen Canyon Dam, the number and size of sand bars, which are used by recreational river runners and form the habitat for native fishes, have decreased substantially. Following an extensive hydrological and geomorphic investigation, an experimental flood release from the Glen Canyon Dam was proposed to determine whether sand bars would be rebuilt by a relatively brief period of flow substantially greater than the normal operating regime. This proposed release, however, was constrained by the Law of the River, the body of law developed over 70 years to control and distribute Colorado River water, the needs of hydropower users and those dependent upon hydropower revenues, and the physical constraints of the dam itself. A compromise was reached following often difficult negotiations and an experimental flood to rebuild sand bars was released in 1996. This flood, and the process by which it came about, gives hope to resolving the difficult and pervasive problem of allocation of water resources among competing interests.
Digital signal processing algorithms for automatic voice recognition
NASA Technical Reports Server (NTRS)
Botros, Nazeih M.
1987-01-01
Current digital signal analysis algorithms implemented in automatic voice recognition are investigated. Automatic voice recognition means the capability of a computer to recognize and interact with verbal commands. The focus is on digital signal analysis rather than linguistic analysis of the speech signal. Several digital signal processing algorithms are available for voice recognition, including Linear Predictive Coding (LPC), short-time Fourier analysis, and cepstrum analysis. Among these algorithms, LPC is the most widely used: it has a short execution time and does not require large memory storage. However, it has several limitations due to the assumptions used to develop it. The other two algorithms are frequency-domain algorithms with fewer assumptions, but they are not widely implemented or investigated. With the recent advances in digital technology, namely signal processors, these two frequency-domain algorithms may be investigated in order to implement them in voice recognition. This research is concerned with real-time, microprocessor-based recognition algorithms.
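LPC, the most widely used of the algorithms above, fits an all-pole model to a windowed speech frame. The sketch below is the textbook autocorrelation method solved with the Levinson-Durbin recursion.

```python
import numpy as np

def lpc(frame, order=10):
    """LPC coefficients via the autocorrelation method and the
    Levinson-Durbin recursion (textbook sketch)."""
    # autocorrelation of the frame, lags 0..len(frame)-1
    r = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    a = np.zeros(order + 1)
    a[0] = 1.0
    e = r[0]                      # prediction error energy
    for i in range(1, order + 1):
        # reflection coefficient from the current residual
        k = -(r[i] + np.dot(a[1:i], r[i - 1:0:-1])) / e
        a_prev = a.copy()
        for j in range(1, i):
            a[j] = a_prev[j] + k * a_prev[i - j]
        a[i] = k
        e *= (1.0 - k * k)
    return a, e

# usage: a Hamming-windowed frame of (placeholder) speech samples
frame = np.hamming(256) * np.random.randn(256)
coeffs, err = lpc(frame, order=10)
print(coeffs[:4], err)
```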
Fingerprint recognition of wavelet-based compressed images by neuro-fuzzy clustering
NASA Astrophysics Data System (ADS)
Liu, Ti C.; Mitra, Sunanda
1996-06-01
Image compression plays a crucial role in many important and diverse applications requiring efficient storage and transmission. This work mainly focuses on a wavelet transform (WT) based compression of fingerprint images and the subsequent classification of the reconstructed images. The algorithm developed involves multiresolution wavelet decomposition, uniform scalar quantization, an entropy and run-length encoder/decoder, and K-means clustering of the invariant moments as fingerprint features. The performance of the WT-based compression algorithm has been compared with the current JPEG image compression standard. Simulation results show that WT outperforms JPEG in the high compression ratio region and that the reconstructed fingerprint image yields proper classification.
NASA Astrophysics Data System (ADS)
Wang, Wu; Huang, Wei; Zhang, Yongjun
2018-03-01
The grid integration of photovoltaic-storage systems brings some undefined factors to the network. In order to make full use of the adjusting ability of a Photovoltaic-Storage System (PSS), this paper puts forward a reactive power optimization model whose objective function is based on power loss and device adjusting cost, including the energy storage adjusting cost. Solving this optimization problem with a Cataclysmic Genetic Algorithm and comparing against other optimization methods, the results show that the dynamic extended reactive power optimization proposed in this article can enhance the effect of reactive power optimization, reducing power loss and device adjusting cost while giving consideration to voltage safety.
Exploring the capacity of radar remote sensing to estimate wetland marshes water storage.
Grings, F; Salvia, M; Karszenbaum, H; Ferrazzoli, P; Kandus, P; Perna, P
2009-05-01
This paper focuses on the use of radar remote sensing for water storage estimation in wetland marshes of the Paraná River Delta in Argentina. The approach followed is based on the analysis of a temporal set of ENVISAT ASAR data which includes images acquired under different polarizations and incidence angles as well as different environmental conditions (water level, precipitation, and vegetation condition). Two marsh species, named junco and cortadera, were monitored. This overall data set gave us the possibility of studying and understanding the basic interactions between the radar, the soil under different flood conditions, and the vegetation structure. The comprehension of the observed features was addressed through electromagnetic models developed for these ecosystems. The procedure used in this work to estimate water level within marshes combines a direct electromagnetic model, field work data specifically obtained to feed the model, the actual ASAR measurements and a well known retrieval scheme based on a cost function. Results are validated with water level evaluations at specific points. A map showing an estimation of the water storage capacity and its error in junco and cortadera areas for the date where the investigation was done is also presented.
CoGI: Towards Compressing Genomes as an Image.
Xie, Xiaojing; Zhou, Shuigeng; Guan, Jihong
2015-01-01
Genomic science is now facing an explosive increase of data thanks to the fast development of sequencing technology. This situation poses serious challenges to genomic data storage and transfer. It is desirable to compress data to reduce storage and transfer costs, and thus to boost data distribution and utilization efficiency. Up to now, a number of algorithms/tools have been developed for compressing genomic sequences. Unlike the existing algorithms, most of which treat genomes as one-dimensional text strings and compress them based on dictionaries or probability models, this paper proposes a novel approach called CoGI (the abbreviation of Compressing Genomes as an Image) for genome compression, which transforms the genomic sequences to a two-dimensional binary image (or bitmap), then applies a rectangular partition coding algorithm to compress the binary image. CoGI can be used as either a reference-based compressor or a reference-free compressor. For the former, we develop two entropy-based algorithms to select a proper reference genome. Performance evaluation is conducted on various genomes. Experimental results show that the reference-based CoGI significantly outperforms two state-of-the-art reference-based genome compressors, GReEn and RLZ-opt, in both compression ratio and compression efficiency. It also achieves a comparable compression ratio but two orders of magnitude higher compression efficiency in comparison with XM, one state-of-the-art reference-free genome compressor. Furthermore, our approach performs much better than Gzip, a general-purpose and widely used compressor, in both compression speed and compression ratio. Thus, CoGI can serve as an effective and practical genome compressor. The source code and other related documents of CoGI are available at: http://admis.fudan.edu.cn/projects/cogi.htm.
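The genome-to-bitmap transform at the heart of CoGI is easy to illustrate. The 2-bit-per-base mapping and fixed row width below are assumptions for the sketch; CoGI's actual encoding and its rectangular partition coder are more involved.

```python
import numpy as np

BITS = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "T": (1, 1)}

def genome_to_image(seq, width=64):
    """Map a genomic sequence to a two-dimensional binary image (hedged
    sketch): each base becomes two bits, packed into rows of fixed
    width. The resulting bitmap can be fed to any binary-image coder."""
    bits = [b for base in seq if base in BITS for b in BITS[base]]
    rows = len(bits) // width
    return np.array(bits[: rows * width], dtype=np.uint8).reshape(rows, width)

img = genome_to_image("ACGT" * 64)
print(img.shape)  # (8, 64) for this toy 256-base input
```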
Detecting Pulsing Denial-of-Service Attacks with Nondeterministic Attack Intervals
NASA Astrophysics Data System (ADS)
Luo, Xiapu; Chan, Edmond W. W.; Chang, Rocky K. C.
2009-12-01
This paper addresses the important problem of detecting pulsing denial of service (PDoS) attacks which send a sequence of attack pulses to reduce TCP throughput. Unlike previous works which focused on a restricted form of attacks, we consider a very broad class of attacks. In particular, our attack model admits any attack interval between two adjacent pulses, whether deterministic or not. It also includes the traditional flooding-based attacks as a limiting case (i.e., zero attack interval). Our main contribution is Vanguard, a new anomaly-based detection scheme for this class of PDoS attacks. The Vanguard detection is based on three traffic anomalies induced by the attacks, and it detects them using a CUSUM algorithm. We have prototyped Vanguard and evaluated it on a testbed. The experiment results show that Vanguard is more effective than the previous methods that are based on other traffic anomalies (after a transformation using wavelet transform, Fourier transform, and autocorrelation) and detection algorithms (e.g., dynamic time warping).
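The CUSUM detection step can be sketched generically. Below is a standard one-sided CUSUM over an anomaly statistic; the drift k and threshold h are illustrative tuning parameters, not Vanguard's settings.

```python
def cusum(samples, mean_normal, k=0.5, h=5.0):
    """One-sided CUSUM change detector (hedged sketch): accumulate
    positive deviations of an anomaly statistic above its normal mean
    and raise an alarm when the cumulative sum crosses threshold h.
    The drift term k absorbs small fluctuations under normal traffic."""
    s = 0.0
    for t, x in enumerate(samples):
        s = max(0.0, s + (x - mean_normal - k))
        if s > h:
            return t          # index of the alarm sample
    return None               # no change detected

# normal statistic around 1.0, then a sustained shift from index 10
trace = [1.0] * 10 + [4.0] * 5
print(cusum(trace, mean_normal=1.0))  # alarms shortly after the shift
```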
The effects of floodplain forest restoration and logjams on flood risk and flood hydrology
NASA Astrophysics Data System (ADS)
Dixon, Simon; Sear, David A.; Sykes, Tim; Odoni, Nicholas
2015-04-01
Flooding is the most common natural catastrophe, accounting for around half of all natural disaster related deaths and causing economic losses in Europe estimated at over €2bn per year. In addition, flooding is expected to increase in magnitude and frequency with climate change, effectively shortening the return period for a given magnitude of flood. Increasing the height and extent of hard engineered defences in response to increased risk is both unsustainable and undesirable. Thus alternative approaches to flood mitigation are needed, such as harnessing vegetation processes to slow the passage of flood waves and increase local flood storage. However, our understanding of these effects at the catchment scale is limited. In this presentation we demonstrate the effects of two river restoration approaches upon catchment-scale flood hydrology. The addition of large wood to river channels during river restoration projects is a popular method of attempting to improve physical and biological conditions in degraded river systems. Projects utilising large wood can involve the installation of engineered logjams (ELJs), the planting and enhancement of riparian forests, or a combination of both. Altering the wood loading of a channel through installation of ELJs and increasing floodplain surface complexity through encouraging mature woodland could be expected to increase the local hydraulic resistance, increasing the timing and duration of overbank events locally and therefore increasing the travel time of a flood wave through a reach. This reach-scale effect has been documented in models and the field; however, the impacts of these local changes at a catchment scale remain to be illustrated. Furthermore, there is limited knowledge of how the changing successional stages of a restored riparian forest through time may affect its influence on hydromorphic processes. We present results of a novel paired numerical modelling study. We model changes in flood hydrology for a 98 km² catchment using OVERFLOW, a simplified hydrological model using a spatially distributed unit hydrograph approach. Restoration scenarios for the hydrological modelling are informed by the development of a new conceptual model of riparian forest succession, including quantitative estimates of deadwood inputs to the system, using a numerical forest growth model. We explore scenarios using ELJs alone as well as managed and unmanaged riparian forest restoration at scales from reach to sub-catchment. We demonstrate that changes to catchment flood hydrology with restoration are highly location dependent, and downstream flood peaks can in some cases increase through synchronisation of sub-catchment flood waves. We constrain magnitude estimates for increases and decreases in flood peaks for modelled restoration scenarios and scales. Finally, we analyse the potential for using riparian forest restoration as part of an integrated flood risk management strategy, including specific examples of the type and extent of restoration which may prove most beneficial.
MODELING PLUMES IN SMALL STREAMS
Pesticides accumulate on land surfaces from agricultural, commercial, and domestic application, and wash into streams and rivers during dry and wet weather. Flood water retention basins or structures often collect this contaminated runoff, providing intermediate storage and limit...
18 CFR 1304.405 - Fuel storage tanks and handling facilities.
Code of Federal Regulations, 2012 CFR
2012-04-01
... State showing how the tank will be anchored so that it does not float during flooding; and (5) Evidence, where applicable, that the applicant has complied with all spill prevention, control and countermeasures...
18 CFR 1304.405 - Fuel storage tanks and handling facilities.
Code of Federal Regulations, 2013 CFR
2013-04-01
... State showing how the tank will be anchored so that it does not float during flooding; and (5) Evidence, where applicable, that the applicant has complied with all spill prevention, control and countermeasures...
18 CFR 1304.405 - Fuel storage tanks and handling facilities.
Code of Federal Regulations, 2014 CFR
2014-04-01
... State showing how the tank will be anchored so that it does not float during flooding; and (5) Evidence, where applicable, that the applicant has complied with all spill prevention, control and countermeasures...
18 CFR 1304.405 - Fuel storage tanks and handling facilities.
Code of Federal Regulations, 2011 CFR
2011-04-01
... State showing how the tank will be anchored so that it does not float during flooding; and (5) Evidence, where applicable, that the applicant has complied with all spill prevention, control and countermeasures...
18 CFR 1304.405 - Fuel storage tanks and handling facilities.
Code of Federal Regulations, 2010 CFR
2010-04-01
... State showing how the tank will be anchored so that it does not float during flooding; and (5) Evidence, where applicable, that the applicant has complied with all spill prevention, control and countermeasures...
49 CFR 379.5 - Protection and storage of records.
Code of Federal Regulations, 2013 CFR
2013-10-01
... to this part from fires, floods, and other hazards, and safeguard the records from unnecessary... notify the Secretary if prescribed records are substantially destroyed or damaged before the term of the prescribed retention periods. ...