Sample records for alkaline flooding methods

  1. Alkaline flooding for enhanced oil recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gittler, W.E.

    1983-09-01

    There are over 12 active projects of varying size using one of 3 major types of alkaline agents. These include sodium silicate, caustic soda, and soda ash. Among the largest pilots currently is the THUMS project in the Wilmington field, California. Plans called for the injection of a 4% weight concentration of sodium orthosilicate over a 60% PV. Through the first 3 yr, over 27 million bbl of chemicals have been injected. Gulf Oil is operating several alkaline floods, one of which is located offshore in the Quarantine Bay field, Louisiana. In this pilot, sodium hydroxide in a weight concentration of 5 to 12% is being injected. Belco Petroleum Corp. has reported that their pilot operating in the Isenhour Unit in Wyoming is using a 0.5% weight concentration of soda ash in conjunction with a polymer. Other uses for alkaline agents in chemical flooding include the use of silicate as a preflush or sacrificial agent in micellar/polymer and surfactant recovery systems. In addition, caustic has been tested in the surface-mixed caustic emulsion process while orthosilicate has been tested in a recovery method known as mobility-controlled caustic floods.

  2. Surfactant-enhanced alkaline flooding: Buffering at intermediate alkaline pH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudin, J.; Wasan, D.T.

    1993-11-01

    The alkaline flooding process involves injecting alkaline agents into the reservoir to produce more oil than is produced through conventional waterflooding. The interaction of the alkali in the flood water with the naturally occurring acids in the reservoir oil results in in-situ formation of soaps, which are partially responsible for lowering IFT and improving oil recovery. The extent to which IFT is lowered depends on the specific oil and injection water properties. Numerous investigators have attempted to clarify the relationship between system chemical composition and IFT. An experimental investigation of buffered alkaline flooding system chemistry was undertaken to determine the influence of various species present on interfacial tension (IFT) as a function of pH and ionic strength. IFT was found to go through an ultralow minimum in certain pH ranges. This synergism results from simultaneous adsorption of un-ionized and ionized acid species on the interface.

  3. Interfacial activity in alkaline flooding enhanced oil recovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chan, M.K.

    1981-01-01

    The ionization of long-chained organic acids in the crude oil to form soaps was shown to be primarily responsible for the lowering of oil-water interfacial tension at alkaline pH. These active acids can be concentrated by silica gel chromatography into a minor polar fraction. An equilibrium chemical model was proposed based on 2 competing reactions: the ionization of acids to form active anions, and the formation of undissociated soap between acid anions and sodium ions. It correlates the interfacial activity with the interfacial concentration of active acid anions which is expressed in terms of the concentrations of the chemical species in the system. The model successfully predicts the observed oil-alkaline solution interfacial phenomenon, including its dependence on pH, alkali and salt concentrations, type of acid present and type of soap formed. Flooding at different alkali concentrations to activate different acid species present in the crude was shown to give better recovery than flooding at a single high alkali concentration. Treating the crude oil with a dilute solution of mineral acids liberates additional free active acids and yields better interfacial activity during subsequent alkali contact.
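
    A minimal numerical sketch of this competing-reaction picture is given below; the equilibrium constants and total acid concentration are hypothetical, chosen only to illustrate how the active anion concentration (and hence interfacial activity) rises with pH and falls as sodium ties the anion up as undissociated soap.

    ```python
    # Hypothetical equilibrium constants and concentration -- illustrative only.
    KA = 1e-5        # acid ionization:  HA <=> H+ + A-
    K_SOAP = 50.0    # soap association: A- + Na+ <=> NaA   (L/mol)
    C_ACID = 1e-3    # total active-acid concentration, mol/L

    def active_anion_conc(pH, na_conc):
        """Concentration of the active anion A- from the two competing reactions.

        Mass balance C = [HA] + [A-] + [NaA] with [HA] = [H+][A-]/KA and
        [NaA] = K_SOAP*[A-][Na+] gives [A-] = C / (1 + [H+]/KA + K_SOAP*[Na+])."""
        h = 10.0 ** (-pH)
        return C_ACID / (1.0 + h / KA + K_SOAP * na_conc)

    for pH in (8.0, 10.0, 12.0):
        for na in (0.01, 0.1, 1.0):
            print(f"pH={pH:4.1f}  [Na+]={na:5.2f} M  [A-]={active_anion_conc(pH, na):.2e} M")
    ```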

  4. Aqueous flooding methods for tertiary oil recovery

    DOEpatents

    Peru, Deborah A.

    1989-01-01

    A method of aqueous flooding of a subterranean oil-bearing formation for tertiary oil recovery involves injecting through a well into the formation a low-alkaline-pH aqueous sodium bicarbonate flooding solution. The flooding solution has a pH ranging from about 8.25 to 9.25 and comprises from 0.25 to 5 weight percent, and preferably about 0.75 to 3.0 weight percent, of sodium bicarbonate, together with a petroleum recovery surfactant at 0.05 to 1.0 weight percent and between 1 and 20 weight percent of sodium chloride. After flooding, an oil and water mixture is withdrawn from the well and the oil is separated from the oil and water mixture.
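
    As a quick worked example of these composition ranges, the sketch below computes additive masses per cubic metre of flooding solution; the mid-range values and the approximation of the solution as roughly 1000 kg of water are assumptions for illustration only.

    ```python
    # Rough mass of each additive per cubic metre of flooding solution, taking
    # the solution as ~1000 kg of water and mid-range values from the ranges above.
    water_kg_per_m3 = 1000.0
    recipe_wt_percent = {
        "sodium bicarbonate": 1.5,     # preferred range 0.75-3.0 wt%
        "recovery surfactant": 0.5,    # range 0.05-1.0 wt%
        "sodium chloride": 10.0,       # range 1-20 wt%
    }
    for component, wt in recipe_wt_percent.items():
        print(f"{component:>20}: {wt / 100 * water_kg_per_m3:6.1f} kg per m3")
    ```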

  5. Seychelles alkaline suite records the culmination of Deccan Traps continental flood volcanism

    NASA Astrophysics Data System (ADS)

    Owen-Smith, T. M.; Ashwal, L. D.; Torsvik, T. H.; Ganerød, M.; Nebel, O.; Webb, S. J.; Werner, S. C.

    2013-12-01

    Silhouette and North Islands in the Seychelles represent an alkaline plutonic-volcanic complex, dated at 63 to 63.5 Ma by U-Pb zircon and 40Ar/39Ar methods. This magmatism coincides with the final stages of the cataclysmic Deccan Traps continental flood volcanism in India (67 to 63 Ma), and thus a causal link has been suggested. Recent reconstructions have placed the Seychelles islands adjacent to the Laxmi Ridge and at the western margin of the Réunion mantle plume at the time of formation of the complex. Here we present geochemical evidence in support of the notion that the Seychelles alkaline magmatism was initiated by the peripheral activity of the Réunion mantle plume and is thus part of the Deccan magmatic event. Positive εNd (0.59 to 3.76) and εHf (0.82 to 6.79) and initial Sr of 0.703507 to 0.705643 at 65 Ma indicate derivation of the Seychelles alkaline magmas from a Réunion-like mantle source with an additional minor enriched component, suggesting entrainment of sub-continental lithospheric mantle. The similarity in trace element composition between the Seychelles suite and Deccan alkaline felsic and mafic rocks provides additional evidence for a common mantle source for the Seychelles and Deccan magmatism. Furthermore, we demonstrate the role of fractional crystallisation in the evolution of the alkaline suite. Modelling using major elements suggests that fractional crystallisation and varying degrees of accumulation of olivine, plagioclase, ilmenite, clinopyroxene, alkali feldspar and apatite can describe the spectrum of rock types, from gabbro, through syenite, to granite.

  6. Speciation and Release Kinetics of Cadmium in an Alkaline Paddy Soil Under Various Flooding Periods and Draining Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    S Khaokaew; R Chaney; G Landrot

    2011-12-31

    This study determined Cd speciation and release kinetics in a Cd-Zn cocontaminated alkaline paddy soil, under various flooding periods and draining conditions, by employing synchrotron-based techniques and a stirred-flow kinetic method. Results revealed that varying flooding periods and draining conditions affected Cd speciation and its release kinetics. Linear least-squares fitting (LLSF) of bulk X-ray absorption fine structure (XAFS) spectra of the air-dried and the 1-day-flooded soil samples showed that at least 50% of Cd was bound to humic acid. Cadmium carbonates were found as the major species at most flooding periods, while a small amount of cadmium sulfide was found after the soils were flooded for longer periods. Under all flooding and draining conditions, at least 14 mg/kg Cd was desorbed from the soil after a 2-hour desorption experiment. The results obtained by micro X-ray fluorescence (μ-XRF) spectroscopy showed that Cd was less associated with Zn than with Ca in most soil samples. Therefore, it is more likely that Cd and Ca will be present in the same mineral phases rather than Cd and Zn, although the source of these two latter elements may originate from the same surrounding Zn mines in the Mae Sot district.

  7. Oil recovery by alkaline waterflooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooke, C.E. Jr.; Williams, R.E.; Kolodzie, P.A.

    1974-01-01

    Flooding of oil containing organic acids with alkaline water under favorable conditions can result in recovery of around 50% of the residual oil left in a watered-out model. A high recovery efficiency results from the formation of a bank of viscous water-in-oil emulsion as surface active agents (soaps) are created by reactions of base in the water with the organic acids in the oil. The type and amount of organic acids in the oil, the pH and salt content of the water, and the amount of fines in the porous medium are the primary factors which determine the amount of additional oil recovered by this method. Interaction of alkaline water with reservoir rock largely determines the amount of chemical needed to flood a reservoir. Laboratory investigations using synthetic oils and crude oils show the importance of oil-water and liquid-solid interfacial properties to the results of an alkaline waterflood. A small field test demonstrated that emulsion banks can be formed in the reservoir and that chemical costs can be reasonable in selected reservoirs. Although studies have provided many qualitative guide lines for evaluating the feasibility of alkaline waterflooding, the economic attractiveness of the process must be considered on an individual reservoir.

  8. Modelling and scale-up of chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G.A.; Lake, L.W.; Sepehrnoori, K.

    1990-03-01

    The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes components of laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. We have continued to develop, test, and apply our chemical flooding simulator (UTCHEM) to a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents. Part I is an update on the Application of Higher-Order Methods in Chemical Flooding Simulation. This update focuses on the comparison of grid orientation effects for four different numerical methods implemented in UTCHEM. Part II is on Simulation Design Studies and is a continuation of Saad's Big Muddy surfactant pilot simulation study reported last year. Part III reports on the Simulation of Gravity Effects under conditions similar to those of some of the oil reservoirs in the North Sea. Part IV is on Determining Oil Saturation from Interwell Tracers. UTCHEM is used for large-scale interwell tracer tests. A systematic procedure for estimating oil saturation from interwell tracer data is developed and a specific example based on the actual field data provided by Sun E P Co. is given. Part V reports on the Application of Vectorization and Microtasking for Reservoir Simulation. Part VI reports on Alkaline Simulation. The alkaline/surfactant/polymer flood compositional simulator (UTCHEM) reported last year is further extended to include reactions involving chemical species containing magnesium, aluminium and silicon as constituent elements. Part VII reports on permeability and trapping of microemulsion.

  9. Development of an alkaline/surfactant/polymer compositional reservoir simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhuyan, D.

    1989-01-01

    The mathematical formulation of a generalized three-dimensional compositional reservoir simulator for high-pH chemical flooding processes is presented in this work. The model assumes local thermodynamic equilibrium with respect to both reaction chemistry and phase behavior and calculates equilibrium electrolyte and phase compositions as a function of time and position. The reaction chemistry considers aqueous electrolytic chemistry, precipitation/dissolution of minerals, ion exchange reactions on matrix surface, reaction of acidic components of crude oil with the bases in the aqueous solution and cation exchange reactions with the micelles. The simulator combines this detailed reaction chemistry associated with these processes with the extensive physical and flow property modeling schemes of an existing chemical flood simulator (UTCHEM) to model the multiphase, multidimensional displacement processes. The formulation of the chemical equilibrium model is quite general and is adaptable to simulate a variety of chemical descriptions. In addition to its use in the simulation of high-pH chemical flooding processes, the model will find application in the simulation of other reactive flow problems like ground water contamination, reinjection of produced water, chemical waste disposal, etc. in one, two or three dimensions and under multiphase flow conditions. In this work, the model is used to simulate several hypothetical cases of high-pH chemical floods, which include cases from a simple alkaline preflush of a micellar/polymer flood to surfactant enhanced alkaline-polymer flooding and the results are analyzed. Finally, a few published alkaline, alkaline-polymer and surfactant-alkaline-polymer corefloods are simulated and compared with the experimental results.

  10. Mathematical modeling of high-pH chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhuyan, D.; Lake, L.W.; Pope, G.A.

    1990-05-01

    This paper describes a generalized compositional reservoir simulator for high-pH chemical flooding processes. This simulator combines the reaction chemistry associated with these processes with the extensive physical- and flow-property modeling schemes of an existing micellar/polymer flood simulator, UTCHEM. Application of the model is illustrated for cases from a simple alkaline preflush to surfactant-enhanced alkaline-polymer flooding.

  11. The index-flood and the GRADEX methods combination for flood frequency analysis.

    NASA Astrophysics Data System (ADS)

    Fuentes, Diana; Di Baldassarre, Giuliano; Quesada, Beatriz; Xu, Chong-Yu; Halldin, Sven; Beven, Keith

    2017-04-01

    Flood frequency analysis is used in many applications, including flood risk management, design of hydraulic structures, and urban planning. However, such analysis requires long series of observed discharge data, which are often not available in many basins around the world. In this study, we tested the usefulness of combining regional discharge and local precipitation data to estimate the event flood volume frequency curve for 63 catchments in Mexico, Central America and the Caribbean. This was achieved by combining two existing flood frequency analysis methods: the regionalization index-flood approach and the GRADEX method. For return periods of up to 10 years, a similar shape of the scaled flood frequency curve was assumed for catchments with similar flood behaviour, following the index-flood approach. For return periods larger than 10 years, the probability distributions of rainfall and discharge volumes were assumed to be asymptotically exponential with the same scale parameter, following the GRADEX method. Results showed that if the mean annual flood (MAF), used as the index flood, is known, the index-flood approach performed well for return periods of up to 10 years, resulting in a 25% mean relative error in prediction. For larger return periods the prediction capability decreased but could be improved by the use of the GRADEX method. As the MAF is unknown in ungauged basins and basins with short records, we tested predicting the MAF using catchment climate-physical characteristics and discharge statistics, the latter when observations were available for only 8 years. Only the use of discharge statistics resulted in acceptable predictions.
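
    A minimal sketch of the combination described above, with entirely hypothetical numbers: the index-flood step scales a regional dimensionless growth curve by the MAF for return periods up to 10 years, and the GRADEX step extrapolates beyond that with a Gumbel-type tail whose scale parameter (the "gradex") is taken from the rainfall-volume distribution.

    ```python
    import numpy as np

    # --- Hypothetical inputs (illustrative only) --------------------------------
    MAF = 120.0                                # mean annual flood (index flood), m3/s
    growth_curve = {2: 0.9, 5: 1.3, 10: 1.6}   # regional dimensionless growth factors
    rain_gradex = 35.0                         # Gumbel scale of rainfall volumes,
                                               # expressed in discharge units (m3/s)

    def gumbel_reduced_variate(T):
        """Gumbel reduced variate u_T = -ln(-ln(1 - 1/T))."""
        return -np.log(-np.log(1.0 - 1.0 / T))

    def flood_quantile(T):
        """Index-flood estimate up to 10 years, GRADEX extrapolation beyond."""
        if T <= 10:
            return MAF * growth_curve[T]
        q10 = MAF * growth_curve[10]
        # GRADEX assumption: beyond T = 10 yr the discharge distribution is
        # parallel (in Gumbel space) to the rainfall-volume distribution.
        return q10 + rain_gradex * (gumbel_reduced_variate(T) - gumbel_reduced_variate(10))

    for T in (2, 5, 10, 50, 100, 1000):
        print(f"T = {T:5d} yr  ->  Q = {flood_quantile(T):7.1f} m3/s")
    ```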

  12. Bayesian flood forecasting methods: A review

    NASA Astrophysics Data System (ADS)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and most widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantifying and reducing the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 till now. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting, then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way of flood estimation; it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, and thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been
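
    As an illustration of the basic idea only (not of any specific BFS implementation reviewed in the paper), the sketch below turns a single deterministic stage forecast into a predictive distribution by combining a prior on the true stage with a Gaussian model-error likelihood; all distributions and numbers are assumed.

    ```python
    import numpy as np

    # --- Assumed (illustrative) quantities ---------------------------------------
    prior_mean, prior_sd = 3.0, 1.0      # climatological prior on river stage, m
    model_forecast = 4.2                 # deterministic hydrologic model output, m
    model_error_sd = 0.5                 # calibrated spread of past model errors, m

    # Conjugate normal-normal update: the predictive distribution of the actual
    # stage given the model forecast is again Gaussian.
    post_var = 1.0 / (1.0 / prior_sd**2 + 1.0 / model_error_sd**2)
    post_mean = post_var * (prior_mean / prior_sd**2 + model_forecast / model_error_sd**2)
    post_sd = np.sqrt(post_var)

    print(f"predictive stage: mean = {post_mean:.2f} m, sd = {post_sd:.2f} m")

    # A 90% predictive interval instead of a single number:
    lo, hi = post_mean - 1.645 * post_sd, post_mean + 1.645 * post_sd
    print(f"90% predictive interval: [{lo:.2f}, {hi:.2f}] m")
    ```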

  13. Two mantle sources, two plumbing systems: Tholeiitic and alkaline magmatism of the Maymecha River basin, Siberian flood volcanic province

    USGS Publications Warehouse

    Arndt, N.; Chauvel, C.; Czamanske, G.; Fedorenko, V.

    1998-01-01

    Rocks of two distinctly different magma series are found in a ~4000-m-thick sequence of lavas and tuffs in the Maymecha River basin which is part of the Siberian flood-volcanic province. The tholeiites are typical low-Ti continental flood basalts with remarkably restricted, petrologically evolved compositions. They have basaltic MgO contents, moderate concentrations of incompatible trace elements, moderate fractionation of incompatible from compatible elements, distinct negative Ta(Nb) anomalies, and εNd values of 0 to +2. The primary magmas were derived from a relatively shallow mantle source, and evolved in large crustal magma chambers where they acquired their relatively uniform compositions and became contaminated with continental crust. An alkaline series, in contrast, contains a wide range of rock types, from meymechite and picrite to trachytes, with a wide range of compositions (MgO from 0.7 to 38 wt%, SiO2 from 40 to 69 wt%, Ce from 14 to 320 ppm), high concentrations of incompatible elements and extreme fractionation of incompatible from compatible elements (Al2O3/TiO2 ≈ 1; Sm/Yb up to 11). These rocks lack Ta(Nb) anomalies and have a broad range of εNd values, from -2 to +5. The parental magmas are believed to have formed by low-degree melting at extreme mantle depths (>200 km). They bypassed the large crustal magma chambers and ascended rapidly to the surface, a consequence, perhaps, of high volatile contents in the primary magmas. The tholeiitic series dominates the lower part of the sequence and the alkaline series the upper part; at the interface, the two types are interlayered. The succession thus provides evidence of a radical change in the site of mantle melting, and the simultaneous operation of two very different crustal plumbing systems, during the evolution of this flood-volcanic province. © Springer-Verlag 1998.

  14. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods are historically preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with the design values for extreme rainfall and floods. The objective of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), Agregee method (Margoum, 1992) and Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013; Garavaglia et al., 2010) and Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimates than standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with return period, while staying relatively moderate up to 100-year return levels. Results and discussions are here illustrated throughout with the example

  15. Coupling the Alkaline-Surfactant-Polymer Technology and The Gelation Technology to Maximize Oil Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malcolm Pitts; Jie Qi; Dan Wilson

    2005-10-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or more efficient areal sweep efficiency for those with high permeability contrast ''thief zones''. The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more oil than waterflooding from swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or those with thief zones because much of injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. A prior fluid-fluid report discussed interaction of different gel chemical compositions and alkaline-surfactant-polymer solutions. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions as in the fluid-fluid analyses. Aluminum-polyacrylamide, flowing gels are not stable to alkaline-surfactant-polymer solutions of either pH 10.5 or 12.9. Chromium acetate-polyacrylamide flowing and rigid flowing gels are stable to subsequent alkaline-surfactant-polymer solution injection. Rigid flowing chromium acetate-polyacrylamide gels maintained permeability reduction better than flowing chromium acetate-polyacrylamide gels. Silicate-polyacrylamide gels are not stable with subsequent injection of either a pH 10.5 or a 12.9 alkaline-surfactant-polymer solution. Chromium acetate-xanthan gum rigid gels are not stable to subsequent alkaline-surfactant-polymer solution injection. Resorcinol-formaldehyde gels were stable to subsequent alkaline-surfactant-polymer solution injection. When evaluated in a dual core configuration, injected fluid flows into the core with the greatest effective permeability to the injected fluid. The same gel stability trends to

  16. A method for making an alkaline battery electrode plate

    NASA Technical Reports Server (NTRS)

    Chida, K.; Ezaki, T.

    1983-01-01

    A method is described for making an alkaline battery electrode plate where the desired active substances are filled into a nickel foam substrate. In this substrate an electrolytic oxidation reduction occurs in an alkaline solution containing lithium hydroxide.

  17. A method for mapping flood hazard along roads.

    PubMed

    Kalantari, Zahra; Nickman, Alireza; Lyon, Steve W; Olofsson, Bo; Folkeson, Lennart

    2014-01-15

    A method was developed for estimating and mapping flood hazard probability along roads using road and catchment characteristics as physical catchment descriptors (PCDs). The method uses a Geographic Information System (GIS) to derive candidate PCDs and then identifies those PCDs that significantly predict road flooding using a statistical modelling approach. The method thus allows flood hazards to be estimated and also provides insights into the relative roles of landscape characteristics in determining road-related flood hazards. The method was applied to an area in western Sweden where severe road flooding had occurred during an intense rain event as a case study to demonstrate its utility. The results suggest that for this case study area three categories of PCDs are useful for prediction of critical spots prone to flooding along roads: i) topography, ii) soil type, and iii) land use. The main drivers among the PCDs considered were a topographical wetness index, road density in the catchment, soil properties in the catchment (mainly the amount of gravel substrate) and local channel slope at the site of a road-stream intersection. These can be proposed as strong indicators for predicting the flood probability in ungauged river basins in this region, but some care is needed in generalising the case study results, as other potential factors are also likely to influence the flood hazard probability. Overall, the method proposed represents a straightforward and consistent way to estimate flooding hazards to inform both the planning of future roadways and the maintenance of existing roadways. Copyright © 2013 Elsevier Ltd. All rights reserved.
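
    The statistical modelling step can be pictured as a regression of observed road flooding on candidate PCDs; the sketch below uses an ordinary logistic regression on synthetic data, with feature names mirroring the descriptors in the abstract, and is not the authors' calibrated model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 200

    # Synthetic physical catchment descriptors (PCDs) for road-stream sites.
    X = np.column_stack([
        rng.normal(8.0, 2.0, n),      # topographical wetness index
        rng.uniform(0.5, 3.0, n),     # road density in catchment, km/km2
        rng.uniform(0.0, 0.4, n),     # gravel fraction of catchment soils
        rng.uniform(0.001, 0.05, n),  # local channel slope at road-stream crossing
    ])
    # Synthetic flooding outcome, loosely driven by the wetness index (illustration only).
    y = (X[:, 0] + rng.normal(0, 2, n) > 9.0).astype(int)

    model = LogisticRegression().fit(X, y)
    site = [[9.5, 2.0, 0.1, 0.01]]            # a hypothetical road-stream crossing
    print("flood probability:", model.predict_proba(site)[0, 1].round(2))
    print("PCD coefficients:", model.coef_.round(2))
    ```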

  18. Asymmetric membranes for destabilization of oil droplets in produced water from alkaline-surfactant-polymer (ASP) flooding

    NASA Astrophysics Data System (ADS)

    Ramlee, Azierah; Chiam, Chel-Ken; Sarbatly, Rosalam

    2018-05-01

    This work presents a study of the destabilization of oil droplets in the produced water from alkaline-surfactant-polymer (ASP) flooding using four types of laboratory-fabricated polyvinylidene fluoride (PVDF) membranes. The PVDF membranes were fabricated via the immersion precipitation method with ethanol (0 - 30 %, v/v) as the coagulant. The membranes, with an effective area of 17.35 cm2, were tested with a synthesized ASP solution as the feed in a cross-flow microfiltration process. The ASP feed solution initially contained oil droplets with radii ranging from 40 to 100 nm and a mean radius of 61 nm. Results have shown that the concentration of ethanol in the coagulation bath affects the formation of the membrane structure and the corresponding porosity, while having no significant influence on the membrane thickness. Coalescence of the oil droplets occurred when the ASP solution permeated through the asymmetric PVDF membranes. Through this coalescence, the oil droplets were destabilized: the radius of the oil droplets in the permeate increased to 1.5-4 µm, with the corresponding mean radius ranging from 2.4 to 2.7 µm.

  19. Phosphorus dynamics in long-term flooded, drained and reflooded soils

    USDA-ARS?s Scientific Manuscript database

    In flooded areas, soils are often exposed to standing water and subsequent drainage, thus over fertilization can release excess phosphorus (P) into surface water and groundwater. To investigate P release and transformation processes in flooded alkaline soils, we flooded-drained-reflooded two soils f...

  20. Why does Japan use the probability method to set design flood?

    NASA Astrophysics Data System (ADS)

    Nakamura, S.; Oki, T.

    2015-12-01

    A design flood is a hypothetical flood used to make a flood prevention plan. In Japan, a probability method based on precipitation data is used to define the scale of the design flood: the Tone River, the biggest river in Japan, is designed for a 1-in-200-year flood, the Shinano River for a 1-in-150-year flood, and so on. How to set a reasonable and acceptable design flood in a changing world is an important socio-hydrological issue. The methods used to set design floods vary among countries. The probability method is also used in the Netherlands, but the base data are water levels or discharges and the probability is 1 in 1250 years (in the fresh water section). The USA and China, on the other hand, apply the maximum flood method, which sets the design flood based on the historical or probable maximum flood. These cases lead to the questions: "why do the methods vary among countries?" and "why does Japan use the probability method?" The purpose of this study is to clarify, based on the literature, the historical process by which the probability method was developed in Japan. In the late 19th century, the concept of "discharge" and modern river engineering were imported by Dutch engineers, and modern flood prevention plans were developed in Japan. In these plans, the design floods were set based on the historical maximum method. The historical maximum method was used until World War 2, but after the war it was changed to the probability method because of its limitations under the specific socio-economic situation: (1) budget limitations due to the war and the GHQ occupation, and (2) historical floods (the Makurazaki typhoon in 1945, the Kathleen typhoon in 1947, the Ione typhoon in 1948, and so on) that struck Japan, broke the records of historical maximum discharge in main rivers, and made the flood prevention projects difficult to complete. Then, Japanese hydrologists imported the hydrological probability statistics from the West to take account of

  1. Method of increasing the sulfation capacity of alkaline earth sorbents

    DOEpatents

    Shearer, J.A.; Turner, C.B.; Johnson, I.

    1980-03-13

    A system and method for increasing the sulfation capacity of alkaline earth carbonates to scrub sulfur dioxide produced during the fluidized bed combustion of coal in which partially sulfated alkaline earth carbonates are hydrated in a fluidized bed to crack the sulfate coating and convert the alkaline earth oxide to the hydroxide. Subsequent dehydration of the sulfate-hydroxide to a sulfate-oxide particle produces particles having larger pore size, increased porosity, decreased grain size and additional sulfation capacity. A continuous process is disclosed.

  2. Method of increasing the sulfation capacity of alkaline earth sorbents

    DOEpatents

    Shearer, John A.; Turner, Clarence B.; Johnson, Irving

    1982-01-01

    A system and method for increasing the sulfation capacity of alkaline earth carbonates to scrub sulfur dioxide produced during the fluidized bed combustion of coal in which partially sulfated alkaline earth carbonates are hydrated in a fluidized bed to crack the sulfate coating and convert the alkaline earth oxide to the hydroxide. Subsequent dehydration of the sulfate-hydroxide to a sulfate-oxide particle produces particles having larger pore size, increased porosity, decreased grain size and additional sulfation capacity. A continuous process is disclosed.

  3. Alkaline solution absorption of carbon dioxide method and apparatus

    DOEpatents

    Hobbs, D.T.

    1991-01-01

    Disclosed is a method for measuring the concentration of hydroxides (or pH) in alkaline solutions, using the tendency of hydroxides to adsorb CO2. The method comprises passing CO2 over the surface of an alkaline solution in a remote tank before and after measurements of the CO2 concentration. Comparison of the measurements yields the adsorption fraction, from which the hydroxide concentration can be calculated using a correlation of hydroxide or pH to adsorption fraction. A schematic is given of a process system according to a preferred embodiment of the invention. 2 figs.

  4. An active monitoring method for flood events

    NASA Astrophysics Data System (ADS)

    Chen, Zeqiang; Chen, Nengcheng; Du, Wenying; Gong, Jianya

    2018-07-01

    Timely and active detection and monitoring of a flood event are critical for a quick response, effective decision-making and disaster reduction. To this end, this paper proposes an active service framework for flood monitoring based on Sensor Web services and an active model for the concrete implementation of the active service framework. The framework consists of two core components: active warning and active planning. The active warning component is based on a publish-subscribe mechanism implemented by the Sensor Event Service. The active planning component employs the Sensor Planning Service to control the execution of the schemes and models and plans the model input data. The active model, called SMDSA, defines the quantitative calculation method for five elements (scheme, model, data, sensor, and auxiliary information) as well as their associations. Experimental monitoring of the Liangzi Lake flood in the summer of 2010 was conducted to test the proposed framework and model. The results show that (1) the proposed active service framework is efficient for timely and automated flood monitoring; (2) the active model, SMDSA, is a quantitative calculation method that moves flood monitoring from manual intervention to automatic computation; and (3) as much preliminary work as possible should be done to take full advantage of the active service framework and the active model.

  5. Alkaline electrochemical cells and method of making

    NASA Technical Reports Server (NTRS)

    Hoyt, H. E.; Pfluger, H. L. (Inventor)

    1970-01-01

    Equilibrated cellulose ether membranes of increased electrolytic conductivity for use as separators in concentrated alkaline electrochemical cells are investigated. The method of making such membranes by equilibration to the degree desired in an aqueous alkali solution maintained at a temperature below about 10 C is described.

  6. Comparing the index-flood and multiple-regression methods using L-moments

    NASA Astrophysics Data System (ADS)

    Malekinezhad, H.; Nachtnebel, H. P.; Klik, A.

    In arid and semi-arid regions, the length of records is usually too short to ensure reliable quantile estimates. Comparing index-flood and multiple-regression analyses based on L-moments was the main objective of this study. Factor analysis was applied to determine the main variables influencing flood magnitude. Ward’s cluster and L-moments approaches were applied to several sites in the Namak-Lake basin in central Iran to delineate homogeneous regions based on site characteristics. The homogeneity test was done using L-moment-based measures. Several distributions were fitted to the regional flood data, and the index-flood and multiple-regression methods were compared as two regional flood frequency methods. The results of the factor analysis showed that the length of the main waterway, compactness coefficient, mean annual precipitation, and mean annual temperature were the main variables affecting flood magnitude. The study area was divided into three regions based on the Ward’s method of clustering approach. The homogeneity test based on L-moments showed that all three regions were acceptably homogeneous. Five distributions were fitted to the annual peak flood data of the three homogeneous regions. Using the L-moment ratios and the Z-statistic criteria, the GEV distribution was identified as the most robust of the five candidate distributions for all the proposed sub-regions of the study area, and in general it was concluded that the generalised extreme value distribution was the best-fit distribution for all three regions. The relative root mean square error (RRMSE) measure was applied to evaluate the performance of the index-flood and multiple-regression methods in comparison with the curve fitting (plotting position) method. In general, the index-flood method gives more reliable estimates for various flood magnitudes of different recurrence intervals. Therefore, this method should be adopted as the regional flood frequency method for the study area and the Namak-Lake basin
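
    For reference, the L-moment machinery underlying this kind of comparison can be condensed as follows: sample L-moments are computed from probability-weighted moments, a GEV distribution is fitted with Hosking's approximation, and quantiles follow from the fitted parameters (an index-flood application would scale these by the MAF). The formulas are standard L-moment results; the peak-flow sample is invented.

    ```python
    import numpy as np
    from math import gamma, log

    def sample_l_moments(x):
        """First three sample L-moments via unbiased probability-weighted moments."""
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        j = np.arange(1, n + 1)
        b0 = x.mean()
        b1 = np.sum((j - 1) / (n - 1) * x) / n
        b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
        l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
        return l1, l2, l3 / l2            # mean, L-scale, L-skewness (tau3)

    def gev_from_l_moments(l1, l2, tau3):
        """GEV parameters (location xi, scale alpha, shape k), Hosking's approximation."""
        c = 2.0 / (3.0 + tau3) - log(2) / log(3)
        k = 7.8590 * c + 2.9554 * c * c
        alpha = l2 * k / ((1 - 2.0 ** (-k)) * gamma(1 + k))
        xi = l1 - alpha * (1 - gamma(1 + k)) / k
        return xi, alpha, k

    def gev_quantile(T, xi, alpha, k):
        return xi + alpha / k * (1 - (-log(1 - 1.0 / T)) ** k)

    # Hypothetical annual peak floods at one site of a homogeneous region (m3/s).
    peaks = [85, 120, 96, 143, 110, 170, 98, 132, 155, 104, 126, 149, 90, 118, 162]
    l1, l2, tau3 = sample_l_moments(peaks)
    xi, alpha, k = gev_from_l_moments(l1, l2, tau3)
    for T in (10, 50, 100):
        print(f"T = {T:4d} yr  Q = {gev_quantile(T, xi, alpha, k):7.1f} m3/s")
    ```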

  7. Assessment of Three Flood Hazard Mapping Methods: A Case Study of Perlis

    NASA Astrophysics Data System (ADS)

    Azizat, Nazirah; Omar, Wan Mohd Sabki Wan

    2018-03-01

    Flood is a common natural disaster that affects all states in Malaysia. According to the Drainage and Irrigation Department (DID) in 2007, about 29,270 km2, or 9 percent of the area of the country, is prone to flooding. Floods can be devastating catastrophes that affect people, the economy and the environment. Flood hazard mapping is an important part of flood assessment, used to define the high-risk areas prone to flooding. The purposes of this study are to prepare a flood hazard map of Perlis and to evaluate flood hazard using the frequency ratio, statistical index and Poisson methods. The six factors affecting the occurrence of flood, including elevation, distance from the drainage network, rainfall, soil texture, geology and erosion, were created using ArcGIS 10.1 software. The flood location map in this study was generated based on the areas flooded in 2010 from DID. These parameters and the flood location map were analysed to prepare a flood hazard map representing the probability of flooding. The results of the analysis were verified using flood location data from 2013, 2014 and 2015. The comparison showed that the statistical index method predicts flood areas better than the frequency ratio and Poisson methods.
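
    The frequency-ratio step can be illustrated with a toy example: for each class of a conditioning factor, FR is the share of flooded cells falling in that class divided by the share of all cells in that class, and a cell's hazard index is the sum of the FR values of its classes over all factors. The counts below are invented.

    ```python
    # Toy frequency-ratio (FR) computation for one conditioning factor (elevation).
    # FR(class) = (% of flooded cells in class) / (% of all cells in class);
    # a cell's hazard index is the sum of FR values over all factors.

    classes = {            # class: (flooded cells, total cells) -- invented counts
        "0-10 m":   (420, 1500),
        "10-50 m":  (300, 2500),
        "50-200 m": (60,  3000),
        ">200 m":   (20,  3000),
    }

    flooded_total = sum(f for f, _ in classes.values())
    cells_total = sum(t for _, t in classes.values())

    for name, (flooded, total) in classes.items():
        fr = (flooded / flooded_total) / (total / cells_total)
        print(f"{name:>9}: FR = {fr:.2f}")
    ```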

  8. Flood Hazard Mapping by Applying Fuzzy TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.

    2017-12-01

    There are many technical methods for integrating various factors in flood hazard mapping. The purpose of this study is to suggest a methodology for integrated flood hazard mapping using MCDM (Multi Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to assessing flood risk, maximum flood depth, maximum velocity, and maximum travel time are considered as criteria, and each element unit is considered as an alternative. A scheme that finds the alternative closest to an ideal value is an appropriate way to assess the flood risk of many element units (alternatives) based on various flood indices. Therefore, TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) are uncertain because the simulated values vary with the flood scenario and topographical conditions. This ambiguity in the indices can cause uncertainty in the flood hazard map. To account for the ambiguity and uncertainty of the criteria, fuzzy logic, which can handle ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. We identified the areas with the highest hazard grade in the integrated flood hazard map, and the produced flood hazard map can then be compared with the existing flood risk maps. We also expect that, if the flood hazard map methodology suggested in this paper is applied to producing the current flood risk maps, a new flood hazard map can be made that considers priorities for hazard areas and contains more varied and important information than before. Keywords : Flood hazard map; levee break analysis; 2D analysis; MCDM; Fuzzy TOPSIS
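
    The core ranking step can be sketched with a crisp TOPSIS over the three criteria named in the abstract; the fuzzy variant would replace the crisp entries with triangular fuzzy numbers and a fuzzy distance measure. The decision matrix and weights below are invented for illustration.

    ```python
    import numpy as np

    # Decision matrix: rows = grid cells (alternatives), columns = criteria
    # [max flood depth (m), max velocity (m/s), max travel time (h)].
    # Larger depth and velocity mean more hazard; larger travel time means less.
    X = np.array([
        [2.1, 1.5, 1.0],
        [0.6, 0.4, 6.0],
        [1.4, 0.9, 3.0],
    ])
    weights = np.array([0.5, 0.3, 0.2])          # invented criterion weights
    hazard_increases = np.array([True, True, False])

    # 1. vector-normalize and weight
    V = weights * X / np.linalg.norm(X, axis=0)

    # 2. most-hazardous ("ideal") and least-hazardous points per criterion
    ideal = np.where(hazard_increases, V.max(axis=0), V.min(axis=0))
    anti  = np.where(hazard_increases, V.min(axis=0), V.max(axis=0))

    # 3. relative closeness to the most-hazardous point -> hazard score in [0, 1]
    d_plus = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti, axis=1)
    hazard = d_minus / (d_plus + d_minus)
    print("hazard scores:", hazard.round(2))     # larger = closer to worst case
    ```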

  9. Development of method for evaluating estimated inundation area by using river flood analysis based on multiple flood scenarios

    NASA Astrophysics Data System (ADS)

    Ono, T.; Takahashi, T.

    2017-12-01

    Non-structural mitigation measures such as flood hazard maps based on the estimated inundation area have become more important because heavy rains exceeding the design rainfall have occurred frequently in recent years. However, conventional methods may lead to an underestimation of the area because the assumed locations of dike breaches in river flood analysis are limited to the sections where the water level exceeds the high-water level. The objective of this study is to consider the uncertainty in the estimated inundation area associated with the location of the dike breach in river flood analysis. This study proposes multiple flood scenarios that automatically set multiple dike breach locations in the river flood analysis. The major premise of this method is that the location of a dike breach cannot be predicted correctly. The proposed method utilizes a dike breach interval, i.e. the distance between dike breaches placed next to each other; multiple breach locations are set at every interval. The 2D shallow water equations were adopted as the governing equations of the river flood analysis, and a leap-frog scheme with a staggered grid was used. The river flood analysis was verified against the 2015 Kinugawa River flooding, and the proposed multiple flood scenarios were applied to the Akutagawa River in Takatsuki city. As a result of the computation for the Akutagawa River, a comparison of the computed maximum inundation depths for adjacent dike breaches showed that the proposed method prevents underestimation of the estimated inundation area. Further, the analyses of the spatial distribution of inundation class and of the maximum inundation depth at each measurement point also identified the optimum dike breach interval that can evaluate the maximum inundation area using the minimum number of assumed breach locations. In brief, this study found the optimum dike breach interval for the Akutagawa River, which enabled the estimated maximum inundation area
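
    Stripped to its essence, the scenario-generation idea is to place a candidate breach every fixed interval of levee chainage, run one inundation simulation per breach, and keep the per-cell maximum depth; the sketch below uses a placeholder simulation call and assumed interval and chainage values.

    ```python
    # Minimal sketch of the multiple-flood-scenario idea: place a candidate dike
    # breach every 'interval_m' metres of levee chainage and keep, for each grid
    # cell, the maximum inundation depth over all scenarios. The 2-D flood model
    # call is a placeholder, not a real solver.

    def run_flood_simulation(breach_chainage_m):
        """Placeholder for a 2-D shallow-water inundation run with one breach."""
        ...  # a real implementation would return {cell_id: max depth}
        return {}

    levee_length_m = 12_000          # assumed levee length
    interval_m = 500                 # assumed interval of dike breach

    envelope = {}                    # cell_id -> max depth over all scenarios
    for chainage in range(0, levee_length_m + 1, interval_m):
        depths = run_flood_simulation(chainage)
        for cell, depth in depths.items():
            envelope[cell] = max(depth, envelope.get(cell, 0.0))

    print(f"{levee_length_m // interval_m + 1} breach scenarios evaluated")
    ```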

  10. Comparison of floods non-stationarity detection methods: an Austrian case study

    NASA Astrophysics Data System (ADS)

    Salinas, Jose Luis; Viglione, Alberto; Blöschl, Günter

    2016-04-01

    Non-stationarities in flood regimes have a huge impact on any mid- and long-term flood management strategy. In particular, the estimation of design floods is very sensitive to any kind of flood non-stationarity, as design floods should be linked to a return period, a concept that can be ill-defined in a non-stationary context. It is therefore crucial, when analyzing existing flood time series, to detect and, where possible, attribute flood non-stationarities to changing hydroclimatic and land-use processes. This work presents the preliminary results of applying different non-stationarity detection methods to annual peak discharge time series from more than 400 gauging stations in Austria. The kinds of non-stationarity analyzed include trends (linear and non-linear), breakpoints, clustering beyond stochastic randomness, and the detection of flood-rich/flood-poor periods. Austria presents a large variety of landscapes, elevations and climates that allow us to interpret the spatial patterns obtained with the non-stationarity detection methods in terms of the dominant flood generation mechanisms.
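
    One of the simplest tools in such a detection toolbox is the Mann-Kendall test for a monotonic trend in annual peaks; a self-contained implementation on an invented series is sketched below (breakpoint and flood-rich/flood-poor analyses would use other statistics).

    ```python
    import numpy as np
    from math import erf, sqrt

    def mann_kendall(x):
        """Mann-Kendall test for a monotonic trend; returns S statistic and p-value.

        Uses the normal approximation without a tie correction, which is adequate
        for continuous discharge data."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / sqrt(var_s) if s != 0 else 0.0
        p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))   # two-sided p-value
        return s, p

    rng = np.random.default_rng(1)
    peaks = 100 + 0.8 * np.arange(50) + rng.normal(0, 15, 50)  # invented series with trend
    s, p = mann_kendall(peaks)
    print(f"S = {s:.0f}, p = {p:.3f}  ->", "trend detected" if p < 0.05 else "no trend")
    ```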

  11. COUPLING THE ALKALINE-SURFACTANT-POLYMER TECHNOLOGY AND THE GELATION TECHNOLOGY TO MAXIMIZE OIL PRODUCTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malcolm Pitts; Jie Qi; Dan Wilson

    2005-04-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or more efficient areal sweep efficiency for those with high permeability contrast ''thief zones''. The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more oil than waterflooding from swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or those with thief zones because much of injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. A prior fluid-fluid report discussed interaction of different gel chemical compositions and alkaline-surfactant-polymer solutions. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions as in the fluid-fluid analyses. Aluminum-polyacrylamide, flowing gels are not stable to alkaline-surfactant-polymer solutions of either pH 10.5 or 12.9. Chromium acetate-polyacrylamide flowing and rigid flowing gels are stable to subsequent alkaline-surfactant-polymer solution injection. Rigid flowing chromium acetate-polyacrylamide gels maintained permeability reduction better than flowing chromium acetate-polyacrylamide gels. Silicate-polyacrylamide gels are not stable with subsequent injection of either a pH 10.5 or a 12.9 alkaline-surfactant-polymer solution. Chromium acetate-xanthan gum rigid gels are not stable to subsequent alkaline-surfactant-polymer solution injection. Resorcinol-formaldehyde gels were stable to subsequent alkaline-surfactant-polymer solution injection. When evaluated in a dual core configuration, injected fluid flows into the core with the greatest effective permeability to the injected fluid. The same gel stability trends to

  12. Coupling the Alkaline-Surfactant-Polymer Technology and The Gelation Technology to Maximize Oil Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malcolm Pitts; Jie Qi; Dan Wilson

    2005-12-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or reservoirs with different sand lenses with high permeability contrast. The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more crude oil than waterflooding from swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or reservoirs with high permeability contrast zones because much of injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. Fluid-fluid interactions between different gel chemical compositions and alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9 have been tested. Aluminum-polyacrylamide gels are not stable to alkaline-surfactant-polymer solutions at any pH. Chromium-polyacrylamide gels with polymer to chromium ion ratios of 25 or greater were stable to alkaline-surfactant-polymer solutions if solution pH was 10.6 or less. When the polymer to chromium ion ratio was 15 or less, chromium-polyacrylamide gels were stable to alkaline-surfactant-polymer solutions with pH values up to 12.9. Chromium-xanthan gum gels were stable to alkaline-surfactant-polymer solutions with pH values of 12.9 at the polymer to chromium ion ratios tested. Silicate-polyacrylamide, resorcinol-formaldehyde, and sulfomethylated resorcinol-formaldehyde gels were also stable to alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Iron-polyacrylamide gels were immediately destroyed when contacted with any of the alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions

  13. A dimension reduction method for flood compensation operation of multi-reservoir system

    NASA Astrophysics Data System (ADS)

    Jia, B.; Wu, S.; Fan, Z.

    2017-12-01

    Cooperative compensation operations of multiple reservoirs coping with uncontrolled floods play a vital role in real-time flood mitigation. This paper proposes a reservoir flood compensation operation index (ResFCOI), formed from the flood control storage, flood inflow volume, flood transmission time and cooperative operation period, and establishes a flood cooperative compensation operation model of a multi-reservoir system. The ResFCOI is used to determine the computational order of the reservoirs, and the differential evolution algorithm is then applied to the single-reservoir flood compensation optimization of each reservoir in turn, so that a dimension reduction method is formed to reduce computational complexity. The Shiguan River Basin, with two large reservoirs and an extensive uncontrolled flood area, is used as a case study. The results show that (a) the reservoirs' flood discharges and the uncontrolled flood are superimposed at Jiangjiaji Station, while the resulting flood peak flow is kept as small as possible; (b) cooperative compensation operations slightly increase the usage of flood storage capacity in the reservoirs compared with rule-based operations; and (c) computing a cooperative compensation operation scheme takes 50 seconds on average. The dimension reduction method for guiding flood compensation operations of a multi-reservoir system allows each reservoir to adjust its flood discharge strategy dynamically according to the magnitude and pattern of the uncontrolled flood, so as to mitigate the downstream flood disaster.
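
    The dimension-reduction loop can be sketched as: score each reservoir with a ResFCOI-like index, sort, and optimize one reservoir's releases at a time with differential evolution. The index formula, reservoir data and objective function below are placeholders, not the paper's formulation.

    ```python
    from scipy.optimize import differential_evolution

    # Hypothetical reservoirs with the four ResFCOI ingredients named in the
    # abstract; the index formula below is an assumed placeholder.
    reservoirs = [
        {"name": "A", "storage": 4.2e8, "inflow": 2.9e8, "travel_h": 6.0, "period_h": 48.0},
        {"name": "B", "storage": 1.5e8, "inflow": 1.8e8, "travel_h": 14.0, "period_h": 48.0},
    ]

    def res_fcoi(r):
        # placeholder index: more spare storage and shorter travel time -> handled earlier
        return (r["storage"] / r["inflow"]) * (r["period_h"] / (r["period_h"] + r["travel_h"]))

    def peak_at_control_station(release_schedule, reservoir):
        """Placeholder objective: downstream peak proxy given one reservoir's hourly
        release fractions (a real model would route the flows downstream)."""
        return sum(release_schedule)

    # Dimension reduction: optimize one reservoir at a time, in ResFCOI order.
    for r in sorted(reservoirs, key=res_fcoi, reverse=True):
        result = differential_evolution(peak_at_control_station,
                                        bounds=[(0.0, 1.0)] * 12, args=(r,),
                                        maxiter=50, seed=0)
        print(f"reservoir {r['name']}: optimized peak proxy = {result.fun:.2f}")
    ```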

  14. Quality assurance flood source and method of making

    DOEpatents

    Fisher, Darrell R [Richland, WA]; Alexander, David L [West Richland, WA]; Satz, Stanley [Surfside, FL]

    2002-12-03

    Disclosed is an improved flood source, and a method of making the same, which emits an evenly distributed flow of energy from a gamma-emitting radionuclide dispersed throughout the volume of the flood source. The flood source is formed by filling a bottom pan with a mix of epoxy resin and cobalt-57, preferably at 10 to 20 millicuries, and then adding a hardener. The pan is secured to a flat, level surface to prevent the pan from warping and to act as a heat sink for removal of heat from the pan during the curing of the resin-hardener mixture.

  15. Optical and Physical Methods for Mapping Flooding with Satellite Imagery

    NASA Technical Reports Server (NTRS)

    Fayne, Jessica Fayne; Bolten, John; Lakshmi, Venkat; Ahamed, Aakash

    2016-01-01

    Flood and surface water mapping is becoming increasingly necessary, as extreme flooding events worldwide can damage crop yields and contribute to billions of dollars in economic damage as well as social effects including fatalities and destroyed communities (Xaio et al. 2004; Kwak et al. 2015; Mueller et al. 2016). Utilizing earth observing satellite data to map standing water from space is indispensable to flood mapping for disaster response, mitigation, prevention, and warning (McFeeters 1996; Brakenridge and Anderson 2006). Since the early 1970s (Landsat, USGS 2013), researchers have been able to remotely sense surface processes such as extreme flood events to help offset some of these problems. Researchers have demonstrated countless methods and modifications of those methods to help increase knowledge of areas at risk and areas that are flooded using remote sensing data from optical and radar systems, as well as free publicly available and costly commercial datasets.

  16. Generation of synthetic flood hydrographs by hydrological donors (SHYDONHY method)

    NASA Astrophysics Data System (ADS)

    Paquet, Emmanuel

    2017-04-01

    For the design of hydraulic infrastructures like dams, a design hydrograph is required in most cases. Some of its features (e.g. peak value, duration, volume) corresponding to a given return period are computed with a wide range of methods: historical records, mono- or multivariate statistical analysis, stochastic simulation, etc. Various methods have then been proposed to construct design hydrographs having such characteristics, ranging from the traditional unit hydrograph to statistical methods (Yue et al., 2002). A new method to build design hydrographs (or more generally synthetic hydrographs) is introduced here, named SHYDONHY, a French acronym for "Synthèse d'HYdrogrammes par DONneurs HYdrologiques". It is based on an extensive database of 100 000 flood hydrographs recorded at hourly time-step at 1300 gauging stations in France and Switzerland, covering a wide range of catchment sizes and climatologies. For each station, an average of two hydrographs per year of record has been selected by a peak-over-threshold (POT) method with independence criteria (Lang et al., 1999). This sampling ensures that only hydrographs of intense floods are gathered in the dataset. For a given catchment, where few or no hydrographs are available at the outlet, a sub-set of 10 "donor stations" is selected within the complete dataset, considering several criteria: proximity, size, mean annual values and regimes for both total runoff and POT-selected floods. This sub-set of stations (and their corresponding flood hydrographs) allows one to: • Estimate a characteristic duration of flood hydrographs (e.g. the duration for which the discharge is above 50% of the peak value). • For a given duration (e.g. one day), estimate the average peak-to-volume ratio of floods. • For a given duration and peak-to-volume ratio, generate a synthetic reference hydrograph by combining appropriate hydrographs of the sub-set. • For a given daily discharge sequence, being observed or generated

  17. Flood Discharge Analysis with Nakayasu Method Using Combination of HEC-RAS Method on Deli River in Medan City

    NASA Astrophysics Data System (ADS)

    Harahap, Rumilla; Jeumpa, Kemala; Hadibroto, Bambang

    2018-03-01

    The problem addressed in this research is how to ensure that the water does not overflow and cause flooding in the rainy season and that drought does not occur in the dry season, taking into account the condition of the Deli River around Medan city. Deli River floods occur frequently, caused by a channel capacity smaller than the existing discharge, lack of maintenance, and drainage and disposal systems that do not suit the environment, resulting in flooding every year. The purpose of this research is to determine the flood discharge of the Deli River for flood control in Medan city. This research applies several analysis methods, such as Log Pearson, Gumbel and the unit hydrograph, while the HEC-RAS method is used to model the water profile of the Deli River. Furthermore, the return period flood discharges are calculated using the Nakayasu method. For the Deli River, with a catchment area of 14.8 km2, the calculated 2-year return period flood hydrograph gives a total of 26.79 m3/sec at the 4th hour. The 5-year return period flood hydrograph gives a total of 73.44 m3/sec, while the 25-year return period total flood hydrograph is 146.50 m3/sec. Flood analysis can reduce and minimize the risk of losses, and land can be mapped where flooding occurs in the area.
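
    For reference, a commonly quoted form of the Nakayasu synthetic-unit-hydrograph peak-discharge relation is sketched below; the catchment area is the 14.8 km2 from the abstract, but the river length, unit rainfall and coefficients are assumed, so this is an illustration rather than a reproduction of the authors' calculation.

    ```python
    # Hedged sketch of the Nakayasu synthetic-unit-hydrograph peak discharge,
    # using commonly quoted forms of the equations. Coefficients and the river
    # length are assumed values, NOT the authors' calibration.

    def nakayasu_peak(area_km2, length_km, r0_mm=1.0, c=1.0, alpha=2.0):
        # lag time tg (hours) -- standard Nakayasu relations by river length
        tg = 0.21 * length_km ** 0.7 if length_km < 15 else 0.4 + 0.058 * length_km
        tr = 0.75 * tg                      # unit-rainfall duration, often 0.5-1.0 tg
        tp = tg + 0.8 * tr                  # time to peak, hours
        t03 = alpha * tg                    # time for discharge to drop to 0.3 Qp
        qp = c * area_km2 * r0_mm / (3.6 * (0.3 * tp + t03))   # m3/s per mm of rain
        return qp, tp, t03

    qp, tp, t03 = nakayasu_peak(area_km2=14.8, length_km=8.0)
    print(f"Qp = {qp:.2f} m3/s per mm, Tp = {tp:.2f} h, T0.3 = {t03:.2f} h")
    ```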

  18. Method of determining pH by the alkaline absorption of carbon dioxide

    DOEpatents

    Hobbs, David T.

    1992-01-01

    A method for measuring the concentration of hydroxides in alkaline solutions in a remote location using the tendency of hydroxides to absorb carbon dioxide. The method includes the passing of carbon dioxide over the surface of an alkaline solution in a remote tank before and after measurements of the carbon dioxide solution. A comparison of the measurements yields the absorption fraction from which the hydroxide concentration can be calculated using a correlation of hydroxide or pH to absorption fraction.

  19. Method of determining pH by the alkaline absorption of carbon dioxide

    DOEpatents

    Hobbs, D.T.

    1992-10-06

    A method is described for measuring the concentration of hydroxides in alkaline solutions in a remote location using the tendency of hydroxides to absorb carbon dioxide. The method includes passing carbon dioxide over the surface of an alkaline solution in a remote tank and measuring the carbon dioxide concentration before and after contact with the solution. A comparison of the measurements yields the absorption fraction, from which the hydroxide concentration can be calculated using a correlation of hydroxide concentration or pH to absorption fraction. 2 figs.

  20. Flood hazard assessment in areas prone to flash flooding

    NASA Astrophysics Data System (ADS)

    Kvočka, Davor; Falconer, Roger A.; Bray, Michaela

    2016-04-01

    Contemporary climate projections suggest that there will be an increase in the occurrence of high-intensity rainfall events in the future. These precipitation extremes are usually the main cause for the emergence of extreme flooding, such as flash flooding. Flash floods are among the most unpredictable, violent and fatal natural hazards in the world. Furthermore, it is expected that flash flooding will occur even more frequently in the future due to more frequent development of extreme weather events, which will greatly increase the danger to people caused by flash flooding. This being the case, there will be a need for high resolution flood hazard maps in areas susceptible to flash flooding. This study investigates what type of flood hazard assessment methods should be used for assessing the flood hazard to people caused by flash flooding. Two different types of flood hazard assessment methods were tested: (i) a widely used method based on an empirical analysis, and (ii) a new, physically based and experimentally calibrated method. Two flash flood events were considered herein, namely: the 2004 Boscastle flash flood and the 2007 Železniki flash flood. The results obtained in this study suggest that in areas susceptible to extreme flooding, the flood hazard assessment should be conducted using mechanics-based methods. In comparison to standard flood hazard assessment methods, these physically based methods: (i) take into account all of the physical forces acting on a human body in floodwater, (ii) successfully adapt to abrupt changes in the flow regime, which often occur for flash flood events, and (iii) assess a flood hazard index in a relatively short period of time.
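
    As a loose illustration of the distinction drawn here, the sketch below contrasts a commonly used empirical hazard rating of the form depth x (velocity + 0.5) + debris factor with a crude mechanics-style stability check on the depth-velocity product. The coefficients and thresholds are generic illustrative values, not those of the calibrated method evaluated in the study.

```python
def empirical_hazard_rating(depth_m, velocity_ms, debris_factor=0.0):
    """Widely used empirical rating: HR = d * (v + 0.5) + DF."""
    return depth_m * (velocity_ms + 0.5) + debris_factor

def simple_mechanics_check(depth_m, velocity_ms,
                           person_height_m=1.7, dv2_critical=1.2):
    """Very crude stability check: flag a person as unstable when the
    depth * velocity**2 product (a proxy for the hydrodynamic toppling moment)
    exceeds a critical value, or when the water is deeper than about half the
    body height. Both thresholds are illustrative placeholders only."""
    toppling = depth_m * velocity_ms ** 2 > dv2_critical
    depth_hazard = depth_m > 0.5 * person_height_m
    return toppling or depth_hazard

for d, v in [(0.3, 3.0), (0.9, 1.0), (0.2, 0.5)]:
    print(f"d={d} m, v={v} m/s -> HR={empirical_hazard_rating(d, v):.2f}, "
          f"mechanics flag={simple_mechanics_check(d, v)}")
```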

  1. Probabilistic Design Storm Method for Improved Flood Estimation in Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Berk, Mario; Špačková, Olga; Straub, Daniel

    2017-12-01

    The design storm approach with event-based rainfall-runoff models is a standard method for design flood estimation in ungauged catchments. The approach is conceptually simple and computationally inexpensive, but the underlying assumptions can lead to flawed design flood estimations. In particular, the implied average recurrence interval (ARI) neutrality between rainfall and runoff neglects uncertainty in other important parameters, leading to an underestimation of design floods. The selection of a single representative critical rainfall duration in the analysis leads to an additional underestimation of design floods. One way to overcome these nonconservative approximations is the use of a continuous rainfall-runoff model, which is associated with significant computational cost and requires rainfall input data that are often not readily available. As an alternative, we propose a novel Probabilistic Design Storm method that combines event-based flood modeling with basic probabilistic models and concepts from reliability analysis, in particular the First-Order Reliability Method (FORM). The proposed methodology overcomes the limitations of the standard design storm approach, while utilizing the same input information and models without excessive computational effort. Additionally, the Probabilistic Design Storm method allows deriving so-called design charts, which summarize representative design storm events (combinations of rainfall intensity and other relevant parameters) for floods with different return periods. These can be used to study the relationship between rainfall and runoff return periods. We demonstrate, investigate, and validate the method by means of an example catchment located in the Bavarian Pre-Alps, in combination with a simple hydrological model commonly used in practice.
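
    To make the FORM ingredient concrete, the sketch below solves a toy reliability problem: it finds the design point that minimizes the distance to the origin in standard normal space subject to a limit-state constraint, and converts that distance into a failure probability. The rational-formula limit state, its parameters and the distributions are invented for the illustration and are not the catchment model used in the paper.

```python
import numpy as np
from scipy import optimize, stats

AREA_KM2 = 25.0   # catchment area (illustrative)
Q_CAP = 120.0     # channel capacity in m3/s (illustrative)

def physical_vars(u):
    """Map standard normal variables to (rainfall intensity, runoff coefficient)."""
    intensity = np.exp(3.0 + 0.4 * u[0])   # lognormal rainfall intensity, mm/h
    runoff_c = 0.5 + 0.1 * u[1]            # normally distributed runoff coefficient
    return intensity, runoff_c

def limit_state(u):
    """g(u) <= 0 means the channel capacity is exceeded (failure)."""
    intensity, runoff_c = physical_vars(u)
    q_peak = runoff_c * intensity * AREA_KM2 / 3.6   # rational formula, m3/s
    return Q_CAP - q_peak

# FORM: find the point on g(u) = 0 that is closest to the origin in u-space.
res = optimize.minimize(lambda u: float(np.dot(u, u)),   # squared distance
                        x0=np.array([1.0, 1.0]),
                        constraints=[{"type": "eq", "fun": limit_state}],
                        method="SLSQP")
beta = np.linalg.norm(res.x)        # reliability index
print("design point (u-space):", np.round(res.x, 2))
print("beta =", round(beta, 2), "  Pf ~", stats.norm.cdf(-beta))
```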

  2. Hot-Alkaline DNA Extraction Method for Deep-Subseafloor Archaeal Communities

    PubMed Central

    Terada, Takeshi; Hoshino, Tatsuhiko; Inagaki, Fumio

    2014-01-01

    A prerequisite for DNA-based microbial community analysis is even and effective cell disruption for DNA extraction. With a commonly used DNA extraction kit, roughly two-thirds of subseafloor sediment microbial cells remain intact on average (i.e., the cells are not disrupted), indicating that microbial community analyses may be biased at the DNA extraction step, prior to subsequent molecular analyses. To address this issue, we standardized a new DNA extraction method using alkaline treatment and heating. Upon treatment with 1 M NaOH at 98°C for 20 min, over 98% of microbial cells in subseafloor sediment samples collected at different depths were disrupted. However, DNA integrity tests showed that such strong alkaline and heat treatment also cleaved DNA molecules into short fragments that could not be amplified by PCR. Subsequently, we optimized the alkaline and temperature conditions to minimize DNA fragmentation and retain high cell disruption efficiency. The best conditions produced a cell disruption rate of 50 to 80% in subseafloor sediment samples from various depths and retained sufficient DNA integrity for amplification of the complete 16S rRNA gene (i.e., ∼1,500 bp). The optimized method also yielded higher DNA concentrations in all samples tested compared with extractions using a conventional kit-based approach. Comparative molecular analysis using real-time PCR and pyrosequencing of bacterial and archaeal 16S rRNA genes showed that the new method produced an increase in archaeal DNA and its diversity, suggesting that it provides better analytical coverage of subseafloor microbial communities than conventional methods. PMID:24441163

  3. A green method of graphene preparation in an alkaline environment.

    PubMed

    Štengl, Václav; Henych, Jiří; Bludská, Jana; Ecorchard, Petra; Kormunda, Martin

    2015-05-01

    We present a new, simple, quick and ecologically friendly method of exfoliating graphite to produce graphene. The method is based on the intercalation of a manganate M2MnO4 (M=K, Na, Li), which is formed by the reaction of a permanganate MMnO4 with an alkali metal hydroxide MOH. The quality of exfoliation and the morphology were determined using X-ray photoelectron spectroscopy, X-ray diffraction and microscopic techniques, including transmission electron microscopy and atomic force microscopy. We observed that a stable graphene suspension could be prepared under strongly alkaline conditions in the presence of permanganate and ultrasound assistance. The use of only an alkaline environment for the direct preparation of graphene from graphite structures has not been previously described or applied. It was found that such a method of preparation leads to surprisingly high yields and a stable product for hydrophilic graphene applications. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Flooding and Flood Management

    USGS Publications Warehouse

    Brooks, K.N.; Fallon, J.D.; Lorenz, D.L.; Stark, J.R.; Menard, Jason; Easter, K.W.; Perry, Jim

    2011-01-01

    Floods result in great human disasters globally and nationally, causing an average of $4 billion of damages each year in the United States. Minnesota has its share of floods and flood damages, and the state has awarded nearly $278 million to local units of government for flood mitigation projects through its Flood Hazard Mitigation Grant Program. Since 1995, flood mitigation in the Red River Valley has exceeded $146 million. Considerable local and state funding has been provided to manage and mitigate problems of excess stormwater in urban areas, flooding of farmlands, and flood damages at road crossings. The cumulative costs involved with floods and flood mitigation in Minnesota are not known precisely, but it is safe to conclude that flood mitigation is a costly business. This chapter begins with a description of floods in Minnesota to provide examples and contrasts across the state. Background material is presented to provide a basic understanding of floods and flood processes, prediction, and management and mitigation. Methods of analyzing and characterizing floods are presented because they affect how we respond to flooding and can influence relevant practices. The understanding and perceptions of floods and flooding commonly differ between those who work in flood forecasting, flood protection, or water resource management and the citizens and businesses affected by floods. These differences can become magnified following a major flood, pointing to the need for better understanding of flooding as well as common language to describe flood risks and the uncertainty associated with determining such risks. Expectations of accurate and timely flood forecasts and our ability to control floods do not always match reality. Striving for clarity is important in formulating policies that can help avoid recurring flood damages and costs.

  5. An at-site flood estimation method in the context of nonstationarity II. Statistical analysis of floods in Quebec

    NASA Astrophysics Data System (ADS)

    Gado, Tamer A.; Nguyen, Van-Thanh-Van

    2016-04-01

    This paper, the second of a two-part paper, investigates the nonstationary behaviour of flood peaks in Quebec (Canada) by analyzing the annual maximum flow series (AMS) available for the common 1966-2001 period from a network of 32 watersheds. Temporal trends in the mean of flood peaks were examined by the nonparametric Mann-Kendall test. The significance of the detected trends over the whole province is also assessed by a bootstrap test that preserves the cross-correlation structure of the network. Furthermore, the LM-NS method (introduced in the first part) is used to parametrically model the AMS, investigating its applicability to real data, to account for temporal trends in the moments of the time series. In this study two probability distributions (GEV & Gumbel) were selected to model four different types of time-varying moments of the historical time series considered, yielding eight competing models. The selected models are: two stationary models (GEV0 & Gumbel0), two nonstationary models in the mean as a linear function of time (GEV1 & Gumbel1), two nonstationary models in the mean as a parabolic function of time (GEV2 & Gumbel2), and two nonstationary models in the mean and the log standard deviation as linear functions of time (GEV11 & Gumbel11). The eight models were applied to flood data available for each watershed and their performance was compared to identify the best model for each location. The comparative methodology involves two phases: (1) a descriptive ability based on likelihood-based optimality criteria such as the Bayesian Information Criterion (BIC) and the deviance statistic; and (2) a predictive ability based on the residual bootstrap. According to the Mann-Kendall test and the LM-NS method, a quarter of the analyzed stations show significant trends in the AMS. All of the significant trends are negative, indicating decreasing flood magnitudes in Quebec. It was found that the LM-NS method could provide accurate flood estimates in the
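
    A compact version of the Mann-Kendall trend test used here can be written directly from its definition. The sketch below (sample series invented) computes the S statistic, its variance under the no-trend null (ignoring ties for simplicity), and the corresponding Z score and two-sided p-value.

```python
import numpy as np
from scipy import stats

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction, for brevity)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S = sum of signs of all pairwise forward differences.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0  # continuity correction
    p = 2.0 * stats.norm.sf(abs(z))
    return s, z, p

rng = np.random.default_rng(1)
peaks = 300.0 - 2.0 * np.arange(36) + rng.normal(0.0, 40.0, 36)  # declining AMS
print(mann_kendall(peaks))
```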

  6. Estimation of design floods in ungauged catchments using a regional index flood method. A case study of Lake Victoria Basin in Kenya

    NASA Astrophysics Data System (ADS)

    Nobert, Joel; Mugo, Margaret; Gadain, Hussein

    Reliable estimation of flood magnitudes corresponding to required return periods, vital for structural design purposes, is impacted by the lack of hydrological data in the study area of Lake Victoria Basin in Kenya. Use of regional information, derived from data at gauged sites and regionalized for use at any location within a homogeneous region, would improve the reliability of the design flood estimation. Therefore, the regional index flood method has been applied. Based on data from 14 gauged sites, a delineation of the basin into two homogeneous regions was achieved using elevation variation (90-m DEM), spatial annual rainfall pattern and Principal Component Analysis of seasonal rainfall patterns (from 94 rainfall stations). At-site annual maximum series were modelled using the Log Normal (LN) (3P), Log Logistic Distribution (LLG), Generalized Extreme Value (GEV) and Log Pearson Type 3 (LP3) distributions. The parameters of the distributions were estimated using the method of probability weighted moments. Goodness of fit tests were applied and the GEV was identified as the most appropriate model for each site. Based on the GEV model, flood quantiles were estimated and regional frequency curves derived from the averaged at-site growth curves. Using the least squares regression method, relationships were developed between the index flood, which is defined as the Mean Annual Flood (MAF), and catchment characteristics. The relationships indicated that area, mean annual rainfall and altitude were the three significant variables that greatly influence the index flood. Thereafter, flood magnitudes in ungauged catchments within a homogeneous region were estimated from the derived equations for the index flood and the quantiles from the regional curves. These estimates will improve flood risk estimation and support water management and engineering decisions and actions.
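
    The final estimation step of the index flood approach reduces to scaling a regional growth curve by an index flood predicted from catchment characteristics. A minimal sketch follows; the regression coefficients and growth factors are placeholders, not the values fitted for the Lake Victoria Basin.

```python
import numpy as np

# Hypothetical log-linear regression for the Mean Annual Flood (index flood):
# ln(MAF) = a + b_area*ln(area) + b_rain*ln(mean annual rainfall) + b_alt*ln(altitude)
COEF = dict(a=-7.0, b_area=0.80, b_rain=1.10, b_alt=-0.20)   # placeholder coefficients

# Hypothetical regional growth factors (quantile / MAF) for one homogeneous region.
GROWTH = {2: 0.95, 5: 1.35, 10: 1.70, 25: 2.20, 50: 2.65, 100: 3.15}

def index_flood(area_km2, rain_mm, altitude_m, coef=COEF):
    return np.exp(coef["a"]
                  + coef["b_area"] * np.log(area_km2)
                  + coef["b_rain"] * np.log(rain_mm)
                  + coef["b_alt"] * np.log(altitude_m))

def design_flood(return_period, area_km2, rain_mm, altitude_m):
    """T-year flood at an ungauged site = regional growth factor x index flood."""
    return GROWTH[return_period] * index_flood(area_km2, rain_mm, altitude_m)

print(round(design_flood(25, area_km2=450.0, rain_mm=1400.0, altitude_m=1200.0), 1))
```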

  7. Flood maps in Europe - methods, availability and use

    NASA Astrophysics Data System (ADS)

    de Moel, H.; van Alphen, J.; Aerts, J. C. J. H.

    2009-03-01

    To support the transition from traditional flood defence strategies to a flood risk management approach at the basin scale in Europe, the EU adopted a new Directive (2007/60/EC) at the end of 2007. One of the major tasks which member states must carry out in order to comply with this Directive is to map flood hazards and risks in their territory, which will form the basis of future flood risk management plans. This paper gives an overview of existing flood mapping practices in 29 countries in Europe and shows what maps are already available and how such maps are used. Roughly half of the countries considered have maps covering virtually their entire territory, and another third have maps covering significant parts of their territory. Only five countries have very limited or no flood maps available yet. Of the different flood maps distinguished, it appears that flood extent maps are the most commonly produced flood maps (in 23 countries), but flood depth maps are also regularly created (in seven countries). Very few countries have developed flood risk maps that include information on the consequences of flooding. The available flood maps are mostly developed by governmental organizations and primarily used for emergency planning, spatial planning, and awareness raising. In spatial planning, flood zones delimited on flood maps mainly serve as guidelines and are not binding. Even in the few countries (e.g. France, Poland) where there is a legal basis to regulate floodplain developments using flood zones, practical problems are often faced which reduce the mitigating effect of such binding legislation. Flood maps, also mainly extent maps, are also created by the insurance industry in Europe and used to determine insurability, differentiate premiums, or assess long-term financial solvency. Finally, flood maps are also produced by international river commissions. With respect to the EU Flood Directive, many countries already have a good starting point to map

  8. Development of a test method against hot alkaline chemical splashes.

    PubMed

    Mäkinen, Helena; Nieminen, Kalevi; Mäki, Susanna; Siiskonen, Sirkku

    2008-01-01

    High temperature alkaline chemical liquids have caused injuries and hazardous situations in Finnish pulp manufacturing mills. There are no requirements and/or test method standards concerning protection against high temperature alkaline chemical splashes. This paper describes the test method development process to test and identify materials appropriate for hot liquid chemical hazard protection. In the first phase, the liquid was spilled through a stainless steel funnel and the protection performance was evaluated using a polyvinyl chloride (PVC) film under the test material. After several tentative improvements, a graphite crucible was used for heating and spilling the chemical, and a copper-coated K-type thermometer with 4 independent measuring areas was designed to measure the temperature under the material samples. The thermometer was designed to respond quickly so that peak temperatures could be measured. The main problem was to keep the spilled amount of chemical constant, which unfortunately resulted in significant variability in data.

  9. Coupling the Alkaline-Surfactant-Polymer Technology and the Gelation Technology to Maximize Oil Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malcolm Pitts; Jie Qi; Dan Wilson

    2005-12-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or reservoirs with different sand lenses with high permeability contrast. The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more crude oil than waterflooding from the swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or reservoirs with high permeability contrast zones because much of the injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. Fluid-fluid interactions between different gel chemical compositions and alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9 have been tested. Aluminum-polyacrylamide gels are not stable to alkaline-surfactant-polymer solutions at any pH. Chromium-polyacrylamide gels with polymer to chromium ion ratios of 25 or greater were stable to alkaline-surfactant-polymer solutions if solution pH was 10.6 or less. When the polymer to chromium ion ratio was 15 or less, chromium-polyacrylamide gels were stable to alkaline-surfactant-polymer solutions with pH values up to 12.9. Chromium-xanthan gum gels were stable to alkaline-surfactant-polymer solutions with pH values of 12.9 at the polymer to chromium ion ratios tested. Silicate-polyacrylamide, resorcinol-formaldehyde, and sulfomethylated resorcinol-formaldehyde gels were also stable to alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Iron-polyacrylamide gels were immediately destroyed when contacted with any of the alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions

  10. Alkaline battery operational methodology

    DOEpatents

    Sholklapper, Tal; Gallaway, Joshua; Steingart, Daniel; Ingale, Nilesh; Nyce, Michael

    2016-08-16

    Methods of using specific operational charge and discharge parameters to extend the life of alkaline batteries are disclosed. The methods can be used with any commercial primary or secondary alkaline battery, as well as with newer alkaline battery designs, including batteries with flowing electrolyte. The methods include cycling batteries within a narrow operating voltage window, with minimum and maximum cut-off voltages that are set based on battery characteristics and environmental conditions. The narrow voltage window decreases available capacity but allows the batteries to be cycled for hundreds or thousands of times.
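
    A schematic rendering of the operating principle, cycling only between a narrow pair of cut-off voltages; the cut-off values, the voltage drift and the control interface below are entirely hypothetical and are not taken from the patent.

```python
# Illustrative control loop: cycle a cell only within a narrow voltage window.
V_MIN_CUTOFF = 1.00   # hypothetical discharge cut-off, volts
V_MAX_CUTOFF = 1.55   # hypothetical charge cut-off, volts

def next_mode(mode, voltage):
    """Return the next mode ('charge' or 'discharge') given the cell voltage."""
    if mode == "discharge" and voltage <= V_MIN_CUTOFF:
        return "charge"
    if mode == "charge" and voltage >= V_MAX_CUTOFF:
        return "discharge"
    return mode

# Toy simulation of the cell voltage drifting between the two cut-offs.
mode, v = "discharge", 1.40
for _ in range(10):
    v += 0.05 if mode == "charge" else -0.07
    mode = next_mode(mode, v)
    print(f"{mode:9s} V={v:.2f}")
```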

  11. Global river flood hazard maps: hydraulic modelling methods and appropriate uses

    NASA Astrophysics Data System (ADS)

    Townend, Samuel; Smith, Helen; Molloy, James

    2014-05-01

    Flood hazard is not well understood or documented in many parts of the world. Consequently, the (re-)insurance sector now needs to better understand where the potential for considerable river flooding aligns with significant exposure. For example, international manufacturing companies are often attracted to countries with emerging economies, meaning that events such as the 2011 Thailand floods have resulted in many multinational businesses with assets in these regions incurring large, unexpected losses. This contribution addresses and critically evaluates the hydraulic methods employed to develop a consistent global-scale set of river flood hazard maps, used to fill the knowledge gap outlined above. The basis of the modelling approach is an innovative, bespoke 1D/2D hydraulic model (RFlow) which has been used to model a global river network of over 5.3 million kilometres. Estimated flood peaks at each of these model nodes are determined using an empirically based rainfall-runoff approach linking design rainfall to design river flood magnitudes. The hydraulic model is used to determine extents and depths of floodplain inundation following river bank overflow. From this, deterministic flood hazard maps are calculated for several design return periods between 20 and 1,500 years. Firstly, we will discuss the rationale behind the appropriate hydraulic modelling methods and inputs chosen to produce a consistent global-scale river flood hazard map. This will highlight how a model designed to work with global datasets can be more favourable for hydraulic modelling at the global scale, and why innovative techniques customised for broad-scale use are preferable to modifying existing hydraulic models. Similarly, the advantages and disadvantages of both 1D and 2D modelling will be explored and balanced against the time, computer and human resources available, particularly when using a Digital Surface Model at 30m resolution. Finally, we will suggest some

  12. Do regional methods really help reduce uncertainties in flood frequency analyses?

    NASA Astrophysics Data System (ADS)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered as statistically homogeneous to build large regional data samples. Nevertheless, the advantage of the regional analyses, the important increase of the size of the studied data sets, may be counterbalanced by the possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods to two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distributions, number of sites and records) to evaluate to what extent the results obtained on these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, these results show that the use of information on extreme events, either historical flood events at gauged

  13. Temporal clustering of floods in Germany: Do flood-rich and flood-poor periods exist?

    NASA Astrophysics Data System (ADS)

    Merz, Bruno; Nguyen, Viet Dung; Vorogushyn, Sergiy

    2016-10-01

    The repeated occurrence of exceptional floods within a few years, such as the Rhine floods in 1993 and 1995 and the Elbe and Danube floods in 2002 and 2013, suggests that floods in Central Europe may be organized in flood-rich and flood-poor periods. This hypothesis is studied by testing the significance of temporal clustering in flood occurrence (peak-over-threshold) time series for 68 catchments across Germany for the period 1932-2005. To assess the robustness of the results, different methods are used: Firstly, the index of dispersion, which quantifies the departure from a homogeneous Poisson process, is investigated. Further, the time-variation of the flood occurrence rate is derived by non-parametric kernel implementation and the significance of clustering is evaluated via parametric and non-parametric tests. Although the methods give consistent overall results, the specific results differ considerably. Hence, we recommend applying different methods when investigating flood clustering. For flood estimation and risk management, it is of relevance to understand whether clustering changes with flood severity and time scale. To this end, clustering is assessed for different thresholds and time scales. It is found that the majority of catchments show temporal clustering at the 5% significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
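
    The first of the methods listed, the index of dispersion, can be stated in a few lines. The sketch below counts POT flood occurrences per year and compares the variance-to-mean ratio of the counts against the homogeneous Poisson expectation of 1; the chi-square approximation used for the significance check is the standard dispersion test, and the flood-date sample is synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic years of occurrence for POT flood events over 1932-2005.
event_years = rng.choice(np.arange(1932, 2006), size=110, replace=True)

years = np.arange(1932, 2006)
counts = np.array([(event_years == y).sum() for y in years])

n = len(counts)
dispersion = counts.var(ddof=1) / counts.mean()   # ~1 for a homogeneous Poisson process
# Under a homogeneous Poisson process, (n-1)*D is approximately chi-square(n-1).
stat = (n - 1) * dispersion
p_value = 2 * min(stats.chi2.cdf(stat, n - 1), stats.chi2.sf(stat, n - 1))
print(round(dispersion, 2), round(p_value, 3))
```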

  14. Removal of dissolved actinides from alkaline solutions by the method of appearing reagents

    DOEpatents

    Krot, Nikolai N.; Charushnikova, Iraida A.

    1997-01-01

    A method of reducing the concentration of neptunium and plutonium from alkaline radwastes containing plutonium and neptunium values along with other transuranic values produced during the course of plutonium production. The OH- concentration of the alkaline radwaste is adjusted to between about 0.1 M and about 4 M. [UO2(O2)3]4- ion is added to the radwastes in the presence of catalytic amounts of Cu2+, Co2+ or Fe2+ with heating to a temperature in excess of about 60 °C or 85 °C, depending on the catalyst, to coprecipitate plutonium and neptunium from the radwaste. Thereafter, the coprecipitate is separated from the alkaline radwaste.

  15. Single well surfactant test to evaluate surfactant floods using multi tracer method

    DOEpatents

    Sheely, Clyde Q.

    1979-01-01

    Data useful for evaluating the effectiveness of or designing an enhanced recovery process said process involving mobilizing and moving hydrocarbons through a hydrocarbon bearing subterranean formation from an injection well to a production well by injecting a mobilizing fluid into the injection well, comprising (a) determining hydrocarbon saturation in a volume in the formation near a well bore penetrating formation, (b) injecting sufficient mobilizing fluid to mobilize and move hydrocarbons from a volume in the formation near the well bore, and (c) determining the hydrocarbon saturation in a volume including at least a part of the volume of (b) by an improved single well surfactant method comprising injecting 2 or more slugs of water containing the primary tracer separated by water slugs containing no primary tracer. Alternatively, the plurality of ester tracers can be injected in a single slug said tracers penetrating varying distances into the formation wherein the esters have different partition coefficients and essentially equal reaction times. The single well tracer method employed is disclosed in U.S. Pat. No. 3,623,842. This method designated the single well surfactant test (SWST) is useful for evaluating the effect of surfactant floods, polymer floods, carbon dioxide floods, micellar floods, caustic floods and the like in subterranean formations in much less time and at much reduced cost compared to conventional multiwell pilot tests.

  16. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    NASA Astrophysics Data System (ADS)

    Lee, G.; Jun, K. S.; Chung, E.-S.

    2015-04-01

    This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with data fuzzification to quantify spatial flood vulnerability across multiple criteria. In general, the GDM method is an effective tool for formulating a compromise solution that involves various decision makers, since various stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise-based method can be used to obtain a solution that is nearly ideal according to all established criteria. The approach can effectively propose compromise decisions by combining the GDM method and the fuzzy VIKOR method. The spatial flood vulnerability of the southern Han River obtained with the GDM approach combined with the fuzzy VIKOR method was compared with that obtained using general MCDM methods, such as the fuzzy TOPSIS and classical GDM methods (i.e., Borda, Condorcet, and Copeland). As a result, the proposed fuzzy GDM approach can reduce the uncertainty in the data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
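
    For reference, the core (crisp) VIKOR ranking that the fuzzy variant builds on can be sketched in a few lines; the alternatives, criterion values and weights below are made up, and the data fuzzification and group-aggregation steps of the paper are omitted.

```python
import numpy as np

# Rows = alternatives (e.g. sub-basins), columns = flood vulnerability criteria.
F = np.array([[0.6, 120.0, 3.2],
              [0.8,  95.0, 4.1],
              [0.4, 150.0, 2.5]])
weights = np.array([0.5, 0.3, 0.2])
v = 0.5   # weight of the "group utility" strategy

# Here the "ideal" is taken as the most vulnerable value of each criterion
# (all criteria assumed "larger means more vulnerable"), so a low Q means an
# alternative close to that ideal, i.e. highly vulnerable.
f_best = F.max(axis=0)
f_worst = F.min(axis=0)

norm = (f_best - F) / (f_best - f_worst)
S = (weights * norm).sum(axis=1)          # group utility
R = (weights * norm).max(axis=1)          # individual regret
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))

print("ranking (most vulnerable first):", np.argsort(Q))
```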

  17. Radar-based Quantitative Precipitation Forecasting using Spatial-scale Decomposition Method for Urban Flood Management

    NASA Astrophysics Data System (ADS)

    Yoon, S.; Lee, B.; Nakakita, E.; Lee, G.

    2016-12-01

    Recent climate changes and abnormal weather phenomena have resulted in increased occurrences of localized torrential rainfall. Urban areas in Korea have suffered from localized heavy rainfall, including the notable Seoul flood disasters in 2010 and 2011. The urban hydrological environment has changed in relation to precipitation, with reduced concentration time, a decreased storage rate, and increased peak discharge. These changes have altered and accelerated the severity of damage to urban areas. In order to prevent such urban flash flood damage, we have to secure lead time for evacuation by improving radar-based quantitative precipitation forecasting (QPF). The purpose of this research is to improve QPF products using a spatial-scale decomposition method that accounts for the lifetime of storms, and to compare the accuracy of the traditional QPF method and the proposed method in terms of urban flood management. The outline of this research is as follows. First, image filtering is applied to separate the spatial scales of the rainfall field. Second, the separated small- and large-scale rainfall fields are extrapolated by different forecasting methods. Third, the forecasted rainfall fields are combined at each lead time. Finally, the results of this method are evaluated and compared with the results of a uniform advection model for urban flood modeling. It is expected that urban flood information based on the improved QPF will help reduce casualties and property damage caused by urban flooding.
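
    A bare-bones version of the scale-separation step can be written with a Gaussian low-pass filter: the smoothed field is taken as the large-scale (longer-lived) component and the residual as the small-scale component, each of which would then be extrapolated separately. The rainfall field and filter length below are synthetic placeholders.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
# Synthetic radar rainfall field (mm/h) on a 200 x 200 grid.
rain = np.clip(gaussian_filter(rng.normal(2.0, 4.0, (200, 200)), 3), 0.0, None)

SIGMA_PIXELS = 10.0   # illustrative cut-off between "large" and "small" scales

large_scale = gaussian_filter(rain, SIGMA_PIXELS)   # smooth, long-lived component
small_scale = rain - large_scale                    # convective, short-lived residual

# Each component would be forecast separately (e.g. advection for the large
# scale, a faster-decaying extrapolation for the small scale) and recombined.
print(rain.mean(), large_scale.mean(), small_scale.mean())
```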

  18. Modeling and simulation of surfactant-polymer flooding using a new hybrid method

    NASA Astrophysics Data System (ADS)

    Daripa, Prabir; Dutta, Sourav

    2017-04-01

    Chemical enhanced oil recovery by surfactant-polymer (SP) flooding has been studied in two space dimensions. A new global pressure for incompressible, immiscible, multicomponent two-phase porous media flow has been derived in the context of SP flooding. This has been used to formulate a system of flow equations that incorporates the effect of capillary pressure and also the effect of polymer and surfactant on viscosity, interfacial tension and relative permeabilities of the two phases. The coupled system of equations for pressure, water saturation, polymer concentration and surfactant concentration has been solved using a new hybrid method in which the elliptic global pressure equation is solved using a discontinuous finite element method and the transport equations for water saturation and concentrations of the components are solved by a Modified Method Of Characteristics (MMOC) in the multicomponent setting. Numerical simulations have been performed to validate the method, both qualitatively and quantitatively, and to evaluate the relative performance of the various flooding schemes for several different heterogeneous reservoirs.

  19. The 40Ar/39Ar age record and geodynamic significance of Indo-Madagascar and Deccan flood basalt volcanism in the Sarnu-Dandali alkaline complex, Rajasthan, northwestern India

    NASA Astrophysics Data System (ADS)

    Vijayan, Anjali; Pande, Kanchan; Sheth, Hetu; Kant Sharma, Kamal

    2017-04-01

    The Sarnu-Dandali alkaline complex in Rajasthan, northwestern India, is considered to represent early, pre-tholeiite magmatism in the Deccan Traps continental flood basalt (CFB) province, based on a single 40Ar/39Ar age of 68.57 Ma. Rhyolites found in the complex are considered to be 750 Ma Malani basement. Our new 40Ar/39Ar ages of 88.9-86.8 Ma (for syenites, nephelinite, phonolite and rhyolite) and 66.3 ± 0.4 Ma (2σ, melanephelinite) provide clear evidence that whereas the Sarnu-Dandali complex has Deccan-age components, it is dominantly an older (by ˜20 million years) alkaline complex, with rhyolites included. Sarnu-Dandali is thus an alkaline igneous center active at least twice in the Late Cretaceous, and also much before as suggested by a basalt flow underlying the Early Cretaceous Sarnu Sandstone. The 89-86 Ma 40Ar/39Ar ages fully overlap with those for the Indo-Madagascar CFB province formed during continental break-up between India (plus Seychelles) and Madagascar. Recent 40Ar/39Ar work has shown polychronous emplacement (over ≥ 45 million years) of the Mundwara alkaline complex in Rajasthan, 100 km from Sarnu-Dandali, and 84-80 Ma ages obtained from Mundwara also arguably represent late stages of the Indo-Madagascar CFB volcanism. Remnants of the Indo-Madagascar CFB province are known from several localities in southern India but hitherto unknown from northwestern India 2000 km away. Additional equivalents buried under the vast Deccan Traps are highly likely. We relate the Sarnu-Dandali and Mundwara complexes to decompression melting of ancient, subduction-fluxed, enriched mantle lithosphere due to periodic lithospheric extension during much of the Cretaceous, and hundreds of kilometers inland from the India-Madagascar and India-Seychelles rifted margins.

  20. Stability indicating methods for the analysis of cefprozil in the presence of its alkaline induced degradation product

    NASA Astrophysics Data System (ADS)

    Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed

    2016-04-01

    Three simple, specific, accurate and precise spectrophotometric methods were developed for the determination of cefprozil (CZ) in the presence of its alkaline induced degradation product (DCZ). The first method was the bivariate method, while the two other multivariate methods were partial least squares (PLS) and spectral residual augmented classical least squares (SRACLS). The multivariate methods were applied with and without variable selection procedure (genetic algorithm GA). These methods were tested by analyzing laboratory prepared mixtures of the above drug with its alkaline induced degradation product and they were applied to its commercial pharmaceutical products.

  1. Comparison of methods for estimating flood magnitudes on small streams in Georgia

    USGS Publications Warehouse

    Hess, Glen W.; Price, McGlone

    1989-01-01

    The U.S. Geological Survey has collected flood data for small, natural streams at many sites throughout Georgia during the past 20 years. Flood-frequency relations were developed for these data using four methods: (1) observed (log-Pearson Type III analysis) data, (2) rainfall-runoff model, (3) regional regression equations, and (4) map-model combination. The results of the latter three methods were compared to the analyses of the observed data in order to quantify the differences in the methods and determine if the differences are statistically significant.

  2. Compound flooding: examples, methods, and challenges

    NASA Astrophysics Data System (ADS)

    Wahl, T.

    2017-12-01

    When different climatic extremes occur simultaneously or in close succession, the impacts on the environment, built infrastructure and society at large are often significantly escalated. These events are collectively referred to as "compound" events. Although they are typically regarded as highly "surprising" when they occur, the dependencies and multi-scale nature of many climate phenomena mean that such events are much more likely to occur than would be expected by random chance alone. However, despite their high impacts, compound extremes are not, or only poorly, covered in current risk analysis frameworks and policy agendas. Floods in particular, which are among the most dangerous and costly natural hazards, are rarely a function of just one driver. Rather, they often arise through the joint occurrence of different source mechanisms. This can include oceanographic drivers such as tides, storm surges, or waves, as well as hydrologic drivers such as rainfall runoff (pluvial) or river discharge (fluvial). Often, two or more of these flood drivers affect the same region and are correlated with each other, which needs to be accounted for in flood risk assessments. This presentation will briefly introduce the different types of compound flooding along with recent examples from around the globe where those high impact events led to substantial damage and loss of life. A broad overview will be provided of existing statistical modelling tools to identify and simulate dependencies between flood drivers, for example when calculating joint probabilities. Finally, some of the most pressing challenges in developing improved strategies to assess and mitigate the risks of climatic compound extremes, and compound flooding in particular, will be discussed.
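
    One of the basic building blocks mentioned here, quantifying dependence between two flood drivers and the resulting joint exceedance probability, can be illustrated directly from paired observations; the surge/discharge sample below is synthetic and the 95th-percentile thresholds are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 2000
# Synthetic, positively dependent daily skew surge (m) and river discharge (m3/s).
z = rng.multivariate_normal([0, 0], [[1.0, 0.6], [0.6, 1.0]], size=n)
surge = stats.genextreme.ppf(stats.norm.cdf(z[:, 0]), c=-0.1, loc=0.3, scale=0.15)
discharge = stats.genextreme.ppf(stats.norm.cdf(z[:, 1]), c=-0.1, loc=400, scale=150)

tau, p = stats.kendalltau(surge, discharge)            # rank dependence between drivers

u_s, u_q = np.quantile(surge, 0.95), np.quantile(discharge, 0.95)
p_joint = np.mean((surge > u_s) & (discharge > u_q))   # empirical joint exceedance
p_indep = 0.05 * 0.05                                  # what independence would imply
print(round(tau, 2), p_joint, p_indep)
```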

  3. Flood-frequency prediction methods for unregulated streams of Tennessee, 2000

    USGS Publications Warehouse

    Law, George S.; Tasker, Gary D.

    2003-01-01

    Up-to-date flood-frequency prediction methods for unregulated, ungaged rivers and streams of Tennessee have been developed. Prediction methods include the regional-regression method and the newer region-of-influence method. The prediction methods were developed using stream-gage records from unregulated streams draining basins having from 1 percent to about 30 percent total impervious area. These methods, however, should not be used in heavily developed or storm-sewered basins with impervious areas greater than 10 percent. The methods can be used to estimate 2-, 5-, 10-, 25-, 50-, 100-, and 500-year recurrence-interval floods of most unregulated rural streams in Tennessee. A computer application was developed that automates the calculation of flood frequency for unregulated, ungaged rivers and streams of Tennessee. Regional-regression equations were derived by using both single-variable and multivariable regional-regression analysis. Contributing drainage area is the explanatory variable used in the single-variable equations. Contributing drainage area, main-channel slope, and a climate factor are the explanatory variables used in the multivariable equations. Deleted-residual standard error for the single-variable equations ranged from 32 to 65 percent. Deleted-residual standard error for the multivariable equations ranged from 31 to 63 percent. These equations are included in the computer application to allow easy comparison of results produced by the different methods. The region-of-influence method calculates multivariable regression equations for each ungaged site and recurrence interval using basin characteristics from 60 similar sites selected from the study area. Explanatory variables that may be used in regression equations computed by the region-of-influence method include contributing drainage area, main-channel slope, a climate factor, and a physiographic-region factor. Deleted-residual standard error for the region-of-influence method tended to be only

  4. Degradation Kinetics Study of Alogliptin Benzoate in Alkaline Medium by Validated Stability-Indicating HPTLC Method.

    PubMed

    Bodiwala, Kunjan Bharatkumar; Shah, Shailesh; Thakor, Jeenal; Marolia, Bhavin; Prajapati, Pintu

    2016-11-01

    A rapid, sensitive, and stability-indicating high-performance thin-layer chromatographic method was developed and validated to study degradation kinetics of Alogliptin benzoate (ALG) in an alkaline medium. ALG was degraded under acidic, alkaline, oxidative, and thermal stress conditions. The degraded samples were chromatographed on silica gel 60F254-TLC plates, developed using a quaternary-solvent system (chloroform-methanol-ethyl acetate-triethyl amine, 9+1+1+0.5, v/v/v/v), and scanned at 278 nm. The developed method was validated per International Conference on Harmonization guidelines using validation parameters such as specificity, linearity and range, precision, accuracy, LOD, and LOQ. The linearity range for ALG was 100-500 ng/band (correlation coefficient = 0.9997) with an average recovery of 99.47%. The LOD and LOQ for ALG were 9.8 and 32.7 ng/band, respectively. The developed method was successfully applied for the quantitative estimation of ALG in its synthetic mixture with common excipients. Degradation kinetics of ALG in an alkaline medium was studied by degrading it under three different temperatures and three different concentrations of alkali. Degradation of ALG in the alkaline medium was found to follow first-order kinetics. Contour plots have been generated to predict degradation rate constant, half-life, and shelf life of ALG in various combinations of temperature and concentration of alkali using Design Expert software.
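
    Since the degradation was found to follow first-order kinetics, the rate constant, half-life and shelf life follow directly from the slope of ln(concentration) versus time; the sketch below uses invented assay values, not the study's data.

```python
import numpy as np

# Hypothetical remaining-concentration data (%) of ALG under one alkali/temperature
# condition, sampled over time (minutes). Values are illustrative only.
t_min = np.array([0.0, 15.0, 30.0, 60.0, 90.0, 120.0])
conc = np.array([100.0, 88.0, 77.5, 60.5, 47.0, 36.5])

# First-order kinetics: ln(C) = ln(C0) - k * t, so k is minus the fitted slope.
slope, intercept = np.polyfit(t_min, np.log(conc), 1)
k = -slope
t_half = np.log(2.0) / k           # half-life
t90 = np.log(10.0 / 9.0) / k       # shelf life (time to reach 90% of initial content)

print(f"k = {k:.4f} 1/min, t1/2 = {t_half:.0f} min, t90 = {t90:.0f} min")
```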

  5. Comparison of enteral and parenteral methods of urine alkalinization in patients receiving high-dose methotrexate.

    PubMed

    Rouch, Jamie A; Burton, Bradley; Dabb, Alix; Brown, Vicky; Seung, Amy H; Kinsman, Katharine; Holdhoff, Matthias

    2017-01-01

    Purpose Hyperhydration and urinary alkalinization is implemented with all high-dose (HD)-methotrexate infusions to promote excretion and prevent precipitation of methotrexate in the renal tubules. Our institution utilized enteral alkalinizing agents (sodium bicarbonate tablets and sodium citrate/citric acid solution) to alkalinize the urine of patients receiving HD-methotrexate during a parenteral sodium bicarbonate and sodium acetate shortage. The purpose of this study is to establish the safety and efficacy of the enteral route for urine alkalinization. Methods A single-center, retrospective, cohort study was conducted comparing cycles of HD-methotrexate using enteral alkalinizing agents to parenteral sodium bicarbonate. The primary objective was to compare the time, in hours, from administration of first inpatient administered dose of alkalinizing agent to time of achieving goal urine pH. Secondary objectives evaluated total dose of sodium bicarbonate required to achieve goal urine pH, time from start of urine alkalinizing agent until time of achieving methotrexate level safe for discharge, and toxicities associated with methotrexate and the alkalinizing agents. Results A total of 118 patients were included in this study, equally divided into two cohorts based on parenteral versus enteral routes of administration. No statistical difference was determined between the two cohorts regarding time to goal urine pH (6.5 h versus 7.9 h, P = 0.051) or regarding time to methotrexate level deemed safe for discharge (63.5 h versus 62.5 h, p = 0.835). There were no significant differences in methotrexate-induced toxicities. Conclusion Our study found enteral routes of urine alkalinization to be a viable alternative to the traditional parenteral sodium bicarbonate, especially during parenteral sodium bicarbonate and acetate shortages.

  6. Predicting Flood Hazards in Systems with Multiple Flooding Mechanisms

    NASA Astrophysics Data System (ADS)

    Luke, A.; Schubert, J.; Cheng, L.; AghaKouchak, A.; Sanders, B. F.

    2014-12-01

    Delineating flood zones in systems that are susceptible to flooding from a single mechanism (riverine flooding) is a relatively well defined procedure with specific guidance from agencies such as FEMA and USACE. However, there is little guidance on delineating flood zones in systems that are susceptible to flooding from multiple mechanisms such as storm surge, waves, tidal influence, and riverine flooding. In this study, a new flood mapping method which accounts for multiple extremes occurring simultaneously is developed and exemplified. The study site in which the method is employed is the Tijuana River Estuary (TRE) located in Southern California adjacent to the U.S./Mexico border. TRE is an intertidal coastal estuary that receives freshwater flows from the Tijuana River. Extreme discharge from the Tijuana River is the primary driver of flooding within TRE; however, tide level and storm surge also play a significant role in flooding extent and depth. A comparison between measured flows at the Tijuana River and ocean levels revealed a correlation between extreme discharge and ocean height. Using a novel statistical method based upon extreme value theory, ocean heights were predicted conditioned upon extreme discharge occurring within the Tijuana River. This statistical technique could also be applied to other systems in which different factors are identified as the primary drivers of flooding, such as significant wave height conditioned upon tide level, for example. Using the predicted ocean levels conditioned upon varying return levels of discharge as forcing parameters for the 2D hydraulic model BreZo, the 100-, 50-, 20-, and 10-year floodplains were delineated. The results will then be compared to floodplains delineated using the standard methods recommended by FEMA for riverine zones with a downstream ocean boundary.

  7. Alkaline resistant phosphate glasses and method of preparation and use thereof

    DOEpatents

    Brow, Richard K.; Reis, Signo T.; Velez, Mariano; Day, Delbert E.

    2010-01-26

    A substantially alkaline resistant calcium-iron-phosphate (CFP) glass and methods of making and using thereof. In one application, the CFP glass is drawn into a fiber and dispersed in cement to produce glass fiber reinforced concrete (GFRC) articles having the high compressive strength of concrete with the high impact, flexural and tensile strength associated with glass fibers.

  8. Anodes for alkaline electrolysis

    DOEpatents

    Soloveichik, Grigorii Lev [Latham, NY

    2011-02-01

    A method of making an anode for alkaline electrolysis cells includes adsorption of precursor material on a carbonaceous material, conversion of the precursor material to hydroxide form and conversion of precursor material from hydroxide form to oxy-hydroxide form within the alkaline electrolysis cell.

  9. Analysis of flood modeling through innovative geomatic methods

    NASA Astrophysics Data System (ADS)

    Zazo, Santiago; Molina, José-Luis; Rodríguez-Gonzálvez, Pablo

    2015-05-01

    Suitable assessment and management of exposure to natural flood risks necessarily requires exhaustive knowledge of the terrain. This study, primarily aimed at evaluating flood risk, first assesses the suitability of an innovative technique called Reduced Cost Aerial Precision Photogrammetry (RC-APP), based on a motorized ultra-light aircraft (ULM, Ultra-Light Motor) combined with reduced-cost sensors, for the acquisition of geospatial information. The RC-APP technique is found to yield a more accurate and precise, more economical and less time-consuming geomatic product. The technique is applied in river engineering for geometric modeling and flood risk assessment. Through the application of RC-APP, a high spatial resolution image (2.5 cm orthophoto) and a Digital Elevation Model (DEM) with a 0.10 m mesh size and high point density (about 100 points/m2), with an altimetric accuracy of -0.02 ± 0.03 m, have been obtained. These products have provided detailed knowledge of the terrain, afterwards used for the hydraulic simulation, which has allowed a better definition of the inundated area, with important implications for flood risk assessment and management. In this sense, it should be noted that the achieved DEM resolution of 0.10 m is especially interesting and useful in hydraulic simulations with 2D software. According to the results, the developed methodology and technology allow for a more accurate riverbed representation compared with other traditional techniques such as Light Detection and Ranging (LiDAR), which has a Root-Mean-Square Error (RMSE) of ± 0.50 m. This comparison revealed that RC-APP has an error one order of magnitude lower than the LiDAR method. Consequently, this technique arises as an efficient and appropriate tool, especially in areas with high exposure to flood risk. In hydraulic terms, the degree of detail achieved in the 3D model

  10. Characterization of rice starch and protein obtained by a fast alkaline extraction method.

    PubMed

    Souza, Daiana de; Sbardelotto, Arthur Francisco; Ziegler, Denize Righetto; Marczak, Ligia Damasceno Ferreira; Tessaro, Isabel Cristina

    2016-01-15

    This study evaluated the characteristics of rice starch and protein obtained by a fast alkaline extraction method applied to rice flour (RF) derived from broken rice. The extraction was conducted using 0.18% NaOH at 30°C for 30 min, followed by centrifugation to separate the starch-rich and protein-rich fractions. This fast extraction method made it possible to obtain an isoelectric precipitation protein concentrate (IPPC) with 79% protein and a starchy product with low protein content. The amino acid content of the IPPC was practically unchanged compared to the protein in RF. The proteins of the IPPC underwent denaturation during extraction, and some of the starch underwent cold gelatinization due to the alkaline treatment. With some modifications, the fast method can be interesting from a technological point of view, as it enables process cost reduction and the production of useful ingredients for the food and chemical industries. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Methods for estimating flood frequency in Montana based on data through water year 1998

    USGS Publications Warehouse

    Parrett, Charles; Johnson, Dave R.

    2004-01-01

    Annual peak discharges having recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years (T-year floods) were determined for 660 gaged sites in Montana and in adjacent areas of Idaho, Wyoming, and Canada, based on data through water year 1998. The updated flood-frequency information was subsequently used in regression analyses, either ordinary or generalized least squares, to develop equations relating T-year floods to various basin and climatic characteristics, equations relating T-year floods to active-channel width, and equations relating T-year floods to bankfull width. The equations can be used to estimate flood frequency at ungaged sites. Montana was divided into eight regions, within which flood characteristics were considered to be reasonably homogeneous, and the three sets of regression equations were developed for each region. A measure of the overall reliability of the regression equations is the average standard error of prediction. The average standard errors of prediction for the equations based on basin and climatic characteristics ranged from 37.4 percent to 134.1 percent. Average standard errors of prediction for the equations based on active-channel width ranged from 57.2 percent to 141.3 percent. Average standard errors of prediction for the equations based on bankfull width ranged from 63.1 percent to 155.5 percent. In most regions, the equations based on basin and climatic characteristics generally had smaller average standard errors of prediction than equations based on active-channel or bankfull width. An exception was the Southeast Plains Region, where all equations based on active-channel width had smaller average standard errors of prediction than equations based on basin and climatic characteristics or bankfull width. Methods for weighting estimates derived from the basin- and climatic-characteristic equations and the channel-width equations also were developed. The weights were based on the cross correlation of residuals from the

  12. An at-site flood estimation method in the context of nonstationarity I. A simulation study

    NASA Astrophysics Data System (ADS)

    Gado, Tamer A.; Nguyen, Van-Thanh-Van

    2016-04-01

    The stationarity of annual flood peak records is the traditional assumption of flood frequency analysis. In some cases, however, as a result of land-use and/or climate change, this assumption is no longer valid. Therefore, new statistical models are needed to capture dynamically the change of probability density functions over time, in order to obtain reliable flood estimation. In this study, an innovative method for nonstationary flood frequency analysis was presented. Here, the new method is based on detrending the flood series and applying the L-moments along with the GEV distribution to the transformed "stationary" series (hereafter, this is called the LM-NS). The LM-NS method was assessed through a comparative study with the maximum likelihood (ML) method for the nonstationary GEV model, as well as with the stationary (S) GEV model. The comparative study, based on Monte Carlo simulations, was carried out for three nonstationary GEV models: a linear dependence of the mean on time (GEV1), a quadratic dependence of the mean on time (GEV2), and linear dependence in both the mean and log standard deviation on time (GEV11). The simulation results indicated that the LM-NS method performs better than the ML method for most of the cases studied, whereas the stationary method provides the least accurate results. An additional advantage of the LM-NS method is to avoid the numerical problems (e.g., convergence problems) that may occur with the ML method when estimating parameters for small data samples.
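
    The LM-NS idea, removing the fitted trend and then applying L-moments with the GEV to the detrended series, can be sketched compactly. The sample L-moment formulas and the GEV parameter approximation below are the standard Hosking-style ones; the synthetic trending series, the linear trend and the re-referencing to the final year are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from math import gamma, log

def sample_l_moments(x):
    """First three sample L-moments via probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(n)                       # 0-based ranks of the ordered sample
    b0 = x.mean()
    b1 = np.sum(j / (n - 1.0) * x) / n
    b2 = np.sum(j * (j - 1.0) / ((n - 1.0) * (n - 2.0)) * x) / n
    l1, l2, l3 = b0, 2.0 * b1 - b0, 6.0 * b2 - 6.0 * b1 + b0
    return l1, l2, l3 / l2                 # lambda1, lambda2, L-skewness tau3

def gev_from_l_moments(l1, l2, t3):
    """GEV parameters (location, scale, shape k), Hosking's approximation."""
    c = 2.0 / (3.0 + t3) - log(2.0) / log(3.0)
    k = 7.8590 * c + 2.9554 * c ** 2
    alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * gamma(1.0 + k))
    xi = l1 - alpha * (1.0 - gamma(1.0 + k)) / k
    return xi, alpha, k

def gev_quantile(T, xi, alpha, k):
    """T-year quantile in Hosking's parameterization."""
    return xi + alpha / k * (1.0 - (-np.log(1.0 - 1.0 / T)) ** k)

# Synthetic annual maximum series with a linear downward trend (illustrative).
rng = np.random.default_rng(3)
years = np.arange(1966, 2002)
ams = 500.0 - 3.0 * (years - years[0]) + rng.gumbel(0.0, 60.0, size=len(years))

# 1) Remove the fitted linear trend, re-referencing the series to the final year
#    (one possible convention); 2) fit the GEV by L-moments to the detrended series.
trend = np.polyval(np.polyfit(years, ams, 1), years)
detrended = ams - trend + trend[-1]
xi, alpha, k = gev_from_l_moments(*sample_l_moments(detrended))
print("100-year flood under end-of-record conditions:",
      round(gev_quantile(100, xi, alpha, k), 1))
```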

  13. Methods for delineating flood-prone areas in the Great Basin of Nevada and adjacent states

    USGS Publications Warehouse

    Burkham, D.E.

    1988-01-01

    The Great Basin is a region of about 210,000 square miles having no surface drainage to the ocean; it includes most of Nevada and parts of Utah, California, Oregon, Idaho, and Wyoming. The area is characterized by many parallel mountain ranges and valleys trending north-south. Stream channels usually are well defined and steep within the mountains, but on reaching the alluvial fan at the canyon mouth, they may diverge into numerous distributary channels, be discontinuous near the apex of the fan, or be deeply entrenched in the alluvial deposits. Larger rivers normally have well-defined channels to or across the valley floors, but all terminate at lakes or playas. Major floods occur in most parts of the Great Basin and result from snowmelt, frontal-storm rainfall, and localized convective rainfall. Snowmelt floods typically occur during April-June. Floods resulting from frontal rain and frontal rain on snow generally occur during November-March. Floods resulting from convective-type rainfall during localized thunderstorms occur most commonly during the summer months. Methods for delineating flood-prone areas are grouped into five general categories: Detailed, historical, analytical, physiographic, and reconnaissance. The detailed and historical methods are comprehensive methods; the analytical and physiographic are intermediate; and the reconnaissance method is only approximate. Other than the reconnaissance method, each method requires determination of a T-year discharge (the peak rate of flow during a flood with long-term average recurrence interval of T years) and T-year profile and the development of a flood-boundary map. The procedure is different, however, for each method. Appraisal of the applicability of each method included consideration of its technical soundness, limitations and uncertainties, ease of use, and costs in time and money. Of the five methods, the detailed method is probably the most accurate, though most expensive. It is applicable to

  14. The model of flood control using servqual method and importance performance analysis in Surakarta City – Indonesia

    NASA Astrophysics Data System (ADS)

    Titi Purwantini, V.; Sutanto, Yusuf

    2018-05-01

    This research aims to create a model of flood control in the city of Surakarta using the Servqual method and Importance Performance Analysis. Service quality is generally defined as the overall assessment of a service by the customers, or the extent to which a service meets customers' needs or expectations. The first purpose of this study is to find a model of flood control that is appropriate to the condition of the community of Surakarta, that is, a model that can provide satisfactory service for the people of Surakarta who are in flood locations. The second is to find the right model to improve the service performance of the Surakarta City Government in serving the people in flood locations. The method used to determine public satisfaction with service quality is to measure the difference between the quality of service expected by the community and the service actually experienced; this is the Servqual method. The performance of city government officials is assessed by comparing their actual performance with the quality of services provided; this is Importance Performance Analysis. Samples were people living in flooded areas in the city of Surakarta. The result of this research is Satisfaction = Responsiveness + Reliability + Assurance + Empathy + Tangibles (Servqual model), with the Importance Performance Analysis results derived from a Cartesian diagram.

  15. Mapping flood hazards under uncertainty through probabilistic flood inundation maps

    NASA Astrophysics Data System (ADS)

    Stephens, T.; Bledsoe, B. P.; Miller, A. J.; Lee, G.

    2017-12-01

    Changing precipitation, rapid urbanization, and population growth interact to create unprecedented challenges for flood mitigation and management. Standard methods for estimating risk from flood inundation maps generally involve simulations of floodplain hydraulics for an established regulatory discharge of specified frequency. Hydraulic model results are then geospatially mapped and depicted as a discrete boundary of flood extents and a binary representation of the probability of inundation (in or out) that is assumed constant over a project's lifetime. Consequently, existing methods utilized to define flood hazards and assess risk management are hindered by deterministic approaches that assume stationarity in a nonstationary world, failing to account for spatio-temporal variability of climate and land use as they translate to hydraulic models. This presentation outlines novel techniques for portraying flood hazards and the results of multiple flood inundation maps spanning hydroclimatic regions. Flood inundation maps generated through modeling of floodplain hydraulics are probabilistic reflecting uncertainty quantified through Monte-Carlo analyses of model inputs and parameters under current and future scenarios. The likelihood of inundation and range of variability in flood extents resulting from Monte-Carlo simulations are then compared with deterministic evaluations of flood hazards from current regulatory flood hazard maps. By facilitating alternative approaches of portraying flood hazards, the novel techniques described in this presentation can contribute to a shifting paradigm in flood management that acknowledges the inherent uncertainty in model estimates and the nonstationary behavior of land use and climate.
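
    To make the probabilistic-mapping idea concrete, the toy sketch below runs a Monte Carlo over an uncertain Manning roughness, converts each sample to a water level using the normal-depth approximation for a wide channel, and reports the fraction of realizations that inundate each point of a synthetic floodplain profile. It is only a schematic analogue of the hydraulic modelling described in the abstract; the discharge, slope, roughness distribution, and terrain values are invented for illustration.

```python
import numpy as np

# Synthetic cross-valley ground profile (m above channel bed) and flow assumptions.
ground = np.array([0.0, 0.4, 0.7, 1.1, 1.6, 2.2, 3.0])    # hypothetical elevations
Q, B, S = 250.0, 80.0, 0.001                                # discharge, width, slope (assumed)

rng = np.random.default_rng(42)
n_samples = rng.normal(loc=0.035, scale=0.007, size=5000)   # uncertain Manning's n
n_samples = np.clip(n_samples, 0.015, 0.08)

# Normal depth for a wide rectangular channel: h = (n*Q / (B*sqrt(S)))**(3/5).
depths = (n_samples * Q / (B * np.sqrt(S))) ** 0.6

# Probability of inundation at each profile point = fraction of realizations
# whose water level exceeds the ground elevation there.
prob_inundated = (depths[:, None] > ground[None, :]).mean(axis=0)
for z, p in zip(ground, prob_inundated):
    print(f"ground {z:4.1f} m  ->  P(inundated) = {p:.2f}")
```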

  16. A study of farmers' flood perceptions based on the entropy method: an application from Jianghan Plain, China.

    PubMed

    Luo, Xiaofeng; Lone, Todd; Jiang, Songying; Li, Rongrong; Berends, Patrick

    2016-07-01

    Using survey data from 280 farmers in Jianghan Plain, China, this paper establishes an evaluation index system for three dimensions of farmers' flood perceptions and then uses the entropy method to estimate their overall flood perception. Farmers' flood perceptions exhibit the following characteristics: (i) their flood-occurrence, flood-prevention, and overall flood perceptions gradually increase with age, whereas their flood-effects perception gradually decreases; (ii) their flood-occurrence and flood-effects perceptions gradually increase with a higher level of education, whereas their flood-prevention perception gradually decreases and their overall flood perception shows nonlinear change; (iii) flood-occurrence, flood-effects, and overall flood perceptions are higher among farmers who serve in public offices than among those who do not do so; (iv) the flood-occurrence, flood-effects, and overall flood perceptions of farmers who work off-farm are higher than those of farmers who work solely on-farm, contrary to the flood-prevention perception; and (v) the flood-effects and flood-prevention perceptions of male farmers are lower than those of female farmers, but the flood-occurrence and overall flood perceptions of male farmers are higher than those of female farmers. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  17. Probabilistic flood extent estimates from social media flood observations

    NASA Astrophysics Data System (ADS)

    Brouwer, Tom; Eilander, Dirk; van Loenen, Arnejan; Booij, Martijn J.; Wijnberg, Kathelijne M.; Verkade, Jan S.; Wagemaker, Jurjen

    2017-05-01

    The increasing number and severity of floods, driven by phenomena such as urbanization, deforestation, subsidence and climate change, create a growing need for accurate and timely flood maps. In this paper we present and evaluate a method to create deterministic and probabilistic flood maps from Twitter messages that mention locations of flooding. A deterministic flood map created for the December 2015 flood in the city of York (UK) showed good performance (F(2) = 0.69; a statistic ranging from 0 to 1, with 1 expressing a perfect fit with validation data). The probabilistic flood maps we created showed that, in the York case study, the uncertainty in flood extent was mainly induced by errors in the precise locations of flood observations as derived from Twitter data. Errors in the terrain elevation data or in the parameters of the applied algorithm contributed less to flood extent uncertainty. Although these maps tended to overestimate the actual probability of flooding, they gave a reasonable representation of flood extent uncertainty in the area. This study illustrates that inherently uncertain data from social media can be used to derive information about flooding.
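
    Assuming F(2) here denotes the F-beta score with beta = 2 (recall weighted more heavily than precision), it can be written in terms of the precision P and recall R of the derived flood map against the validation flood extent:

$$ F_2 = \frac{(1 + 2^2)\,P\,R}{2^2\,P + R} = \frac{5\,P\,R}{4\,P + R} $$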

  18. Integrated flood hazard assessment based on spatial ordered weighted averaging method considering spatial heterogeneity of risk preference.

    PubMed

    Xiao, Yangfan; Yi, Shanzhen; Tang, Zhongqian

    2017-12-01

    Flood is the most common natural hazard in the world and has caused serious loss of life and property. Assessment of flood prone areas is of great importance for watershed management and reduction of potential loss of life and property. In this study, a framework of multi-criteria analysis (MCA) incorporating geographic information system (GIS), fuzzy analytic hierarchy process (AHP) and spatial ordered weighted averaging (OWA) method was developed for flood hazard assessment. The factors associated with geographical, hydrological and flood-resistant characteristics of the basin were selected as evaluation criteria. The relative importance of the criteria was estimated through the fuzzy AHP method. The OWA method was utilized to analyze the effects of different risk attitudes of the decision maker on the assessment result. The spatial ordered weighted averaging method with spatially variable risk preference was implemented in the GIS environment to integrate the criteria. The advantage of the proposed method is that it has considered spatial heterogeneity in assigning risk preference in the decision-making process. The presented methodology has been applied to the area including Hanyang, Caidian and Hannan of Wuhan, China, where flood events occur frequently. The outcome of flood hazard distribution presents a tendency of high risk towards populated and developed areas, especially the northeast part of Hanyang city, which has suffered frequent floods in history. The result indicates where the enhancement projects should be carried out first under the condition of limited resources. Finally, sensitivity of the criteria weights was analyzed to measure the stability of results with respect to the variation of the criteria weights. The flood hazard assessment method presented in this paper is adaptable for hazard assessment of a similar basin, which is of great significance for establishing countermeasures to mitigate life and property losses. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. The weighted function method: A handy tool for flood frequency analysis or just a curiosity?

    NASA Astrophysics Data System (ADS)

    Bogdanowicz, Ewa; Kochanek, Krzysztof; Strupczewski, Witold G.

    2018-04-01

    The idea of the Weighted Function (WF) method for estimation of the Pearson type 3 (Pe3) distribution, introduced by Ma in 1984, has been revised and successfully applied to the shifted inverse Gaussian (IGa3) distribution. The conditions of WF applicability to a shifted distribution have also been formulated. The accuracy of WF flood quantiles for both the Pe3 and IGa3 distributions was assessed by Monte Carlo simulations under the true and false distribution assumptions versus the maximum likelihood (MLM), moment (MOM) and L-moments (LMM) methods. Three datasets of annual peak flows from Polish catchments serve as case studies to compare the performance of the WF, MOM, MLM and LMM methods on real flood data. For the hundred-year flood, the WF method revealed explicit superiority only over the MLM, surpassing the MOM and especially the LMM, both for the true and false distributional assumptions with respect to relative bias and relative root mean square error values. Generally, the WF method performs well for hydrological sample sizes and constitutes a good alternative for the estimation of upper flood quantiles.

  20. Passive aerobic treatment of net-alkaline, iron-laden drainage from a flooded underground anthracite mine, Pennsylvania, USA

    USGS Publications Warehouse

    Cravotta, C.A.

    2007-01-01

    This report evaluates the results of a continuous 4.5-day laboratory aeration experiment and the first year of passive, aerobic treatment of abandoned mine drainage (AMD) from a typical flooded underground anthracite mine in eastern Pennsylvania, USA. During 1991-2006, the AMD source, locally known as the Otto Discharge, had flows from 20 to 270 L/s (median 92 L/s) and water quality that was consistently suboxic (median 0.9 mg/L O2) and circumneutral (pH ≥ 6.0; net alkalinity >10) with moderate concentrations of dissolved iron and manganese and low concentrations of dissolved aluminum (medians of 11, 2.2, and <0.2 mg/L, respectively). In 2001, the laboratory aeration experiment demonstrated rapid oxidation of ferrous iron (Fe2+) without supplemental alkalinity; the initial Fe2+ concentration of 16.4 mg/L decreased to less than 0.5 mg/L within 24 h; pH values increased rapidly from 5.8 to 7.2, ultimately attaining a steady-state value of 7.5. The increased pH coincided with a rapid decrease in the partial pressure of carbon dioxide (PCO2) from an initial value of 10^-1.1 atm to a steady-state value of 10^-3.1 atm. From these results, a staged aerobic treatment system was conceptualized consisting of a 2 m deep pond with innovative aeration and recirculation to promote rapid oxidation of Fe2+, two 0.3 m deep wetlands to facilitate iron solids removal, and a supplemental oxic limestone drain for dissolved manganese and trace-metal removal. The system was constructed, but without the aeration mechanism, and began operation in June 2005. During the first 12 months of operation, estimated detention times in the treatment system ranged from 9 to 38 h. However, in contrast with 80-100% removal of Fe2+ over similar elapsed times during the laboratory aeration experiment, the treatment system typically removed less than 35% of the influent Fe2+. Although concentrations of dissolved CO2 decreased progressively within the treatment system, the PCO2 values for treated effluent

  1. Evaluation of Alkaline Cleaner Materials

    NASA Technical Reports Server (NTRS)

    Partz, Earl

    1998-01-01

    Alkaline cleaners used to process aluminum substrates have contained chromium as the corrosion inhibitor. Chromium is a hazardous substance whose use and control are described by environmental laws. Replacement materials that have the characteristics of chromated alkaline cleaners need to be found that address both the cleaning requirements and environmental impacts. This report will review environmentally friendly candidates evaluated as non-chromium alkaline cleaner replacements and methods used to compare those candidates one versus another. The report will also list characteristics used to select candidates based on their declared contents. It will also describe and evaluate methods used to discriminate among the large number of prospective candidates.

  2. Estimated value of insurance premium due to Citarum River flood by using Bayesian method

    NASA Astrophysics Data System (ADS)

    Sukono; Aisah, I.; Tampubolon, Y. R. H.; Napitupulu, H.; Supian, S.; Subiyanto; Sidi, P.

    2018-03-01

    Flooding of the Citarum river in South Bandung, West Java, Indonesia, happens almost every year. It causes property damage, producing economic loss. The risk of loss can be mitigated by joining a flood insurance program. In this paper, we discuss the estimated value of insurance premiums due to Citarum river floods using a Bayesian method. It is assumed that the flood loss data follow a Pareto distribution with a fat right tail. The estimation of the distribution model parameters is done using the Bayesian method. First, parameter estimation is done under the assumption that the prior comes from the Gamma distribution family, while the observation data follow a Pareto distribution. Second, flood loss data are simulated based on the probability of damage in each flood-affected area. The analysis shows that the estimated insurance premiums based on the pure premium principle are as follows: for a loss value of IDR 629.65 million, a premium of IDR 338.63 million; for a loss of IDR 584.30 million, a premium of IDR 314.24 million; and for a loss of IDR 574.53 million, a premium of IDR 308.95 million. The premium estimator can be used as a reference for determining a reasonable premium that neither burdens the insured nor results in a loss for the insurer.
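
    A minimal sketch of the conjugate setup the abstract describes, under the assumption that losses above a known threshold x_m follow a Pareto(alpha) distribution and that alpha has a Gamma(a, b) prior: the posterior of alpha is then Gamma(a + n, b + sum(log(x_i / x_m))), and a pure premium can be approximated by plugging the posterior-mean alpha into the Pareto mean. The loss values, threshold, and prior hyperparameters below are illustrative, not those of the study.

```python
import numpy as np

def pareto_gamma_posterior(losses, x_m, a_prior=2.0, b_prior=1.0):
    """Posterior Gamma(shape, rate) for the Pareto tail index alpha (conjugate update)."""
    losses = np.asarray(losses, dtype=float)
    n = losses.size
    a_post = a_prior + n
    b_post = b_prior + np.sum(np.log(losses / x_m))
    return a_post, b_post

def pure_premium(a_post, b_post, x_m):
    """Pure premium = expected loss, using the posterior-mean alpha (requires alpha > 1)."""
    alpha_hat = a_post / b_post          # posterior mean of Gamma(shape a_post, rate b_post)
    if alpha_hat <= 1.0:
        raise ValueError("Pareto mean is infinite for alpha <= 1")
    return alpha_hat * x_m / (alpha_hat - 1.0)

# Hypothetical flood losses (IDR million) above an assumed threshold of 100.
losses = np.array([120.0, 150.0, 180.0, 240.0, 310.0, 450.0, 620.0])
a_post, b_post = pareto_gamma_posterior(losses, x_m=100.0)
print(f"pure premium estimate: IDR {pure_premium(a_post, b_post, 100.0):.1f} million")
```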

  3. Improving the quantification of flash flood hydrographs and reducing their uncertainty using noncontact streamgauging methods

    NASA Astrophysics Data System (ADS)

    Branger, Flora; Dramais, Guillaume; Horner, Ivan; Le Boursicaud, Raphaël; Le Coz, Jérôme; Renard, Benjamin

    2015-04-01

    Continuous river discharge data are crucial for the study and management of floods. In most river discharge monitoring networks, these data are obtained at gauging stations, where the stage-discharge relation is modelled with a rating curve to derive discharge from the measurement of water level in the river. Rating curves are usually established using individual ratings (or gaugings). However, using traditional gauging methods during flash floods is challenging for many reasons including hazardous flow conditions (for both equipment and people), short duration of the flood events, transient flows during the time needed to perform the gauging, etc. The lack of gaugings implies that the rating curve is often extrapolated well beyond the gauged range for the highest floods, inducing large uncertainties in the computed discharges. We deployed two remote techniques for gauging floods and improving stage-discharge relations for high flow conditions at several hydrometric stations throughout the Ardèche river catchment in France: (1) permanent video-recording stations enabling the implementation of the image analysis LS-PIV technique (Large Scale Particle Image Velocimetry); and (2) mobile gaugings using handheld Surface Velocity Radars (SVR). These gaugings were used to estimate the rating curve and its uncertainty using the Bayesian method BaRatin (Le Coz et al., 2014). Importantly, this method explicitly accounts for the uncertainty of individual gaugings, which is especially relevant for remote gaugings since their uncertainty is generally much higher than that of standard intrusive gauging methods. Then, the uncertainty of streamflow records was derived by combining the uncertainty of the rating curve and the uncertainty of stage records. We assessed the impact of these methodological developments for peak flow estimation and for flood descriptors at various time steps. The combination of field measurement innovation and statistical developments allows

  4. Reinforcing flood-risk estimation.

    PubMed

    Reed, Duncan W

    2002-07-15

    Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin?

  5. A simple statistical method for analyzing flood susceptibility with incorporating rainfall and impervious surface

    NASA Astrophysics Data System (ADS)

    Chiang, Shou-Hao; Chen, Chi-Farn

    2016-04-01

    Flood, known as the most frequent natural hazard in Taiwan, has caused severe damage to residents and properties in urban areas. Flood risk has become even more severe in Tainan since the 1990s, with significant urban development over recent decades. Previous studies have indicated that the characteristics and the vulnerability of floods are affected by the increase of impervious surface area (ISA) and changing climate conditions. Tainan City, in southern Taiwan, is selected as the study area. This study uses logistic regression to functionalize the relationship between rainfall variables, ISA and historical flood events. Specifically, rainfall records from 2001 to 2014 were collected and mapped, and Landsat images from 2001, 2004, 2007, 2010 and 2014 were used to generate the ISA with an SVM (support vector machine) classifier. The results show that rainfall variables and ISA are significantly correlated with flood occurrence in Tainan City. By applying the logistic function, the likelihood of flood occurrence can be estimated and mapped over the study area. This study suggests the method is simple and feasible for rapid flood susceptibility mapping when real-time rainfall observations are available, and it has potential for future flood assessment, incorporating climate change projections and urban growth predictions.
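
    The sketch below illustrates the kind of logistic model described above, relating a binary historical flood indicator to rainfall variables and an impervious-surface fraction; the feature names, coefficients, and synthetic data are assumptions for illustration, not the study's dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
rain_total = rng.gamma(shape=2.0, scale=40.0, size=n)     # event rainfall depth (mm), synthetic
rain_max1h = rain_total * rng.uniform(0.2, 0.6, size=n)   # peak 1-h intensity (mm/h), synthetic
isa = rng.uniform(0.0, 0.9, size=n)                       # impervious surface fraction, synthetic

# Synthetic "historical flood" labels: probability rises with rainfall and imperviousness.
logit = -6.0 + 0.03 * rain_total + 0.05 * rain_max1h + 3.0 * isa
flooded = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([rain_total, rain_max1h, isa])
model = LogisticRegression(max_iter=1000).fit(X, flooded)

# Flood susceptibility (probability of occurrence) for a hypothetical new event/location.
new_case = np.array([[150.0, 60.0, 0.7]])
print("estimated flood probability:", model.predict_proba(new_case)[0, 1].round(2))
```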

  6. New method for assessing the susceptibility of glacial lakes to outburst floods in the Cordillera Blanca, Peru

    NASA Astrophysics Data System (ADS)

    Emmer, A.; Vilímek, V.

    2014-09-01

    This paper presents a new and easily repeatable method for assessing the susceptibility of glacial lakes to outburst floods (GLOFs) within the Peruvian region of the Cordillera Blanca. The presented method was designed to: (a) be repeatable (from the point of view of the demands on input data), (b) be reproducible (to provide an instructive guide for different assessors), (c) provide multiple results for different GLOF scenarios and (d) be regionally focused on the lakes of the Cordillera Blanca. Based on the input data gained from remotely sensed images and digital terrain models/topographical maps, the susceptibility of glacial lakes to outburst floods is assessed using a combination of decision trees for clarity and numerical calculation for repeatability and reproducibility. A total of seventeen assessed characteristics are used, of which seven have not been used in this context before. Also, several ratios and calculations are defined for the first time. We assume that it is not relevant to represent the overall susceptibility of a particular lake to outburst floods by one result (number), thus it is described in the presented method by five separate results (representing five different GLOF scenarios). These are potentials for (a) dam overtopping resulting from a fast slope movement into the lake, (b) dam overtopping following the flood wave originating in a lake situated upstream, (c) dam failure resulting from a fast slope movement into the lake, (d) dam failure following the flood wave originating in a lake situated upstream and (e) dam failure following a strong earthquake. All of these potentials include two or three components and theoretically range from 0 to 1. The presented method was verified on the basis of assessing the pre-flood conditions of seven lakes which have produced ten glacial lake outburst floods in the past and ten lakes which have not. A comparison of these results showed that the presented method successfully identified lakes

  7. Establishing a rainfall threshold for flash flood warnings based on the DFFG method in Yunnan province, China

    NASA Astrophysics Data System (ADS)

    Ma, M.; Wang, H.; Chen, Y.; Tang, G.; Hong, Z.; Zhang, K.; Hong, Y.

    2017-12-01

    Flash floods, one of the deadliest natural hazards worldwide due to their multidisciplinary nature, rank highly in terms of heavy damage and casualties. In the United States, for example, flash floods are the No. 1 cause of death among all storm-related hazards and the No. 2 most deadly weather-related hazard, with approximately 100 lives lost each year. According to the China Floods and Droughts Disasters Bulletin of 2015 (http://www.mwr.gov.cn/zwzc/hygb/zgshzhgb), about 935 deaths per year on average were caused by flash floods from 2000 to 2015, accounting for 73% of the fatalities due to floods. Although significant efforts have been made toward understanding flash flood processes as well as modeling and forecasting them, the task still remains challenging because of their short response time and limited monitoring capacity. This study advances the use of high-resolution Global Precipitation Measurement forecasts (GPMs), disaster data obtained from government officials in 2011 and 2016, and an improved Distributed Flash Flood Guidance (DFFG) method combining a distributed hydrologic model and Soil Conservation Service curve numbers. The objectives of this paper are (1) to examine changes in flash flood occurrence, (2) to estimate the effect of rainfall spatial variability, (3) to improve the lead time of flash flood warnings and derive the rainfall threshold, (4) to assess the applicability of the DFFG method in the Dongchuan catchments, and (5) to yield probabilistic information about the forecast hydrologic response that accounts for the locational uncertainties of the GPMs. Results indicate that: (1) flash flood occurrence increased in the study region, (2) the occurrence of predicted flash floods shows high sensitivity to total infiltration and soil water content, (3) the DFFG method is generally capable of making accurate predictions of flash flood events in terms of their locations and time of occurrence, and (4) the accumulative rainfall over a certain time span is an
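
    As a rough illustration of how a rainfall threshold can be tied to curve-number runoff (one ingredient of the DFFG approach as described), the sketch below inverts the SCS-CN relation to find the accumulated rainfall P whose runoff equals a critical runoff depth Q_crit for a given curve number. The bankfull runoff value and curve number are placeholders, and the actual method also involves the distributed hydrologic model and antecedent soil moisture state.

```python
import math

def scs_runoff(P_mm, CN):
    """SCS curve-number runoff depth (mm) from accumulated rainfall P (mm)."""
    S = 25400.0 / CN - 254.0          # potential maximum retention (mm)
    Ia = 0.2 * S                      # initial abstraction
    return 0.0 if P_mm <= Ia else (P_mm - Ia) ** 2 / (P_mm + 0.8 * S)

def rainfall_threshold(Q_crit_mm, CN):
    """Rainfall depth whose CN runoff equals the critical (e.g. bankfull) runoff depth."""
    S = 25400.0 / CN - 254.0
    # Positive root of (P - 0.2*S)**2 = Q_crit * (P + 0.8*S) solved for P.
    return 0.5 * (0.4 * S + Q_crit_mm + math.sqrt(Q_crit_mm ** 2 + 4.0 * Q_crit_mm * S))

CN, Q_crit = 75, 30.0                 # placeholder curve number and critical runoff (mm)
P_thr = rainfall_threshold(Q_crit, CN)
print(f"rainfall threshold ~ {P_thr:.1f} mm; check: runoff = {scs_runoff(P_thr, CN):.1f} mm")
```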

  8. floodX: urban flash flood experiments monitored with conventional and alternative sensors

    NASA Astrophysics Data System (ADS)

    Moy de Vitry, Matthew; Dicht, Simon; Leitão, João P.

    2017-09-01

    The data sets described in this paper provide a basis for developing and testing new methods for monitoring and modelling urban pluvial flash floods. Pluvial flash floods are a growing hazard to property and inhabitants' well-being in urban areas. However, the lack of appropriate data collection methods is often cited as an impediment for reliable flood modelling, thereby hindering the improvement of flood risk mapping and early warning systems. The potential of surveillance infrastructure and social media is starting to draw attention for this purpose. In the floodX project, 22 controlled urban flash floods were generated in a flood response training facility and monitored with state-of-the-art sensors as well as standard surveillance cameras. With these data, it is possible to explore the use of video data and computer vision for urban flood monitoring and modelling. The floodX project stands out as the largest documented flood experiment of its kind, providing both conventional measurements and video data in parallel and at high temporal resolution. The data set used in this paper is available at https://doi.org/10.5281/zenodo.830513.

  9. Adaptive finite volume methods with well-balanced Riemann solvers for modeling floods in rugged terrain: Application to the Malpasset dam-break flood (France, 1959)

    USGS Publications Warehouse

    George, D.L.

    2011-01-01

    The simulation of advancing flood waves over rugged topography, by solving the shallow-water equations with well-balanced high-resolution finite volume methods and block-structured dynamic adaptive mesh refinement (AMR), is described and validated in this paper. The efficiency of block-structured AMR makes large-scale problems tractable, and allows the use of accurate and stable methods developed for solving general hyperbolic problems on quadrilateral grids. Features indicative of flooding in rugged terrain, such as advancing wet-dry fronts and non-stationary steady states due to balanced source terms from variable topography, present unique challenges and require modifications such as special Riemann solvers. A well-balanced Riemann solver for inundation and general (non-stationary) flow over topography is tested in this context. The difficulties of modeling floods in rugged terrain, and the rationale for and efficacy of using AMR and well-balanced methods, are presented. The algorithms are validated by simulating the Malpasset dam-break flood (France, 1959), which has served as a benchmark problem previously. Historical field data, laboratory model data and other numerical simulation results (computed on static fitted meshes) are shown for comparison. The methods are implemented in GEOCLAW, a subset of the open-source CLAWPACK software. All the software is freely available at. Published in 2010 by John Wiley & Sons, Ltd.

  10. Pediatric reference intervals for alkaline phosphatase.

    PubMed

    Zierk, Jakob; Arzideh, Farhad; Haeckel, Rainer; Cario, Holger; Frühwald, Michael C; Groß, Hans-Jürgen; Gscheidmeier, Thomas; Hoffmann, Reinhard; Krebs, Alexander; Lichtinghagen, Ralf; Neumann, Michael; Ruf, Hans-Georg; Steigerwald, Udo; Streichert, Thomas; Rascher, Wolfgang; Metzler, Markus; Rauh, Manfred

    2017-01-01

    Interpretation of alkaline phosphatase activity in children is challenging due to extensive changes with growth and puberty leading to distinct sex- and age-specific dynamics. Continuous percentile charts from birth to adulthood allow accurate consideration of these dynamics and seem reasonable for an analyte as closely linked to growth as alkaline phosphatase. However, the ethical and practical challenges unique to pediatric reference intervals have restricted the creation of such percentile charts, resulting in limitations when clinical decisions are based on alkaline phosphatase activity. We applied an indirect method to generate percentile charts for alkaline phosphatase activity using clinical laboratory data collected during the clinical care of patients. A total of 361,405 samples from 124,440 patients from six German tertiary care centers and one German laboratory service provider measured between January 2004 and June 2015 were analyzed. Measurement of alkaline phosphatase activity was performed on Roche Cobas analyzers using the IFCC's photometric method. We created percentile charts for alkaline phosphatase activity in girls and boys from birth to 18 years which can be used as reference intervals. Additionally, data tables of age- and sex-specific percentile values allow the incorporation of these results into laboratory information systems. The percentile charts provided enable the appropriate differential diagnosis of changes in alkaline phosphatase activity due to disease and changes due to physiological development. After local validation, integration of the provided percentile charts into result reporting facilitates precise assessment of alkaline phosphatase dynamics in pediatrics.

  11. Theoretical considerations and a simple method for measuring alkalinity and acidity in low-pH waters by gran titration

    USGS Publications Warehouse

    Barringer, J.L.; Johnsson, P.A.

    1996-01-01

    Titrations for alkalinity and acidity using the technique described by Gran (1952, Determination of the equivalence point in potentiometric titrations, Part II: The Analyst, v. 77, p. 661-671) have been employed in the analysis of low-pH natural waters. This report includes a synopsis of the theory and calculations associated with Gran's technique and presents a simple and inexpensive method for performing alkalinity and acidity determinations. However, potential sources of error introduced by the chemical character of some waters may limit the utility of Gran's technique. Therefore, the cost- and time-efficient method for performing alkalinity and acidity determinations described in this report is useful for exploring the suitability of Gran's technique in studies of water chemistry.
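
    For reference, a common form of the Gran function used for alkalinity is sketched below: after adding a volume $V$ of strong acid of concentration $C_a$ to a sample of initial volume $V_0$, the function $F_1$ becomes linear in $V$ once the equivalence point is passed, and extrapolating the linear branch back to $F_1 = 0$ yields the equivalence volume $V_e$. This is the standard textbook form, given only for orientation; the report's exact notation and working equations may differ.

$$ F_1 = (V_0 + V)\,10^{-\mathrm{pH}}, \qquad \text{Alkalinity} \approx \frac{C_a\,V_e}{V_0} $$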

  12. Remote-sensing-based rapid assessment of flood crop loss to support USDA flooding decision-making

    NASA Astrophysics Data System (ADS)

    Di, L.; Yu, G.; Yang, Z.; Hipple, J.; Shrestha, R.

    2016-12-01

    Floods often cause significant crop loss in the United States. Timely and objective assessment of flood-related crop loss is very important for crop monitoring and risk management in agricultural and disaster-related decision-making in USDA. Among all flood-related information, crop yield loss is particularly important; decisions on proper mitigation, relief, and monetary compensation rely on it. Currently, USDA mostly relies on field surveys to obtain crop loss information and compensate farmers' loss claims. Such methods are expensive, labor-intensive, and time-consuming, especially for a large flood that affects a large geographic area. Recent studies have demonstrated that Earth observation (EO) data are useful for assessing post-flood crop loss over a large geographic area objectively, timely, accurately, and cost-effectively. There are three stages of flood damage assessment: rapid assessment, early recovery assessment, and in-depth assessment. EO-based flood assessment methods currently rely on time series of vegetation indices to assess the yield loss. Such methods are suitable for in-depth assessment but are less suitable for rapid assessment, since the after-flood vegetation index time series is not yet available. This presentation describes a new EO-based method for the rapid assessment of crop yield loss immediately after a flood event to support USDA flood decision-making. The method is based on historic records of flood severity, flood duration, flood date, crop type, EO-based crop conditions both before and immediately after the flood, and corresponding crop yield loss. It hypothesizes that a flood of the same severity occurring at the same phenological stage of a crop will cause similar damage to the crop yield regardless of the flood year. With this hypothesis, a regression-based rapid assessment algorithm can be developed by learning from historic records of flood events and corresponding crop yield loss. In this study, historic records of

  13. Efficient GIS-based model-driven method for flood risk management and its application in central China

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zhou, J.; Song, L.; Zou, Q.; Guo, J.; Wang, Y.

    2014-02-01

    In recent years, an important development in flood management has been the focal shift from flood protection towards flood risk management. This change greatly promoted the progress of flood control research in a multidisciplinary way. Moreover, given the growing complexity and uncertainty in many decision situations of flood risk management, traditional methods, e.g., tight-coupling integration of one or more quantitative models, are not enough to provide decision support for managers. Within this context, this paper presents a beneficial methodological framework to enhance the effectiveness of decision support systems, through the dynamic adaptation of support regarding the needs of the decision-maker. In addition, we illustrate a loose-coupling technical prototype for integrating heterogeneous elements, such as multi-source data, multidisciplinary models, GIS tools and existing systems. The main innovation is the application of model-driven concepts, which put the system in a state of continuous iterative optimization. We define the new system as a model-driven decision support system (MDSS ). Two characteristics that differentiate the MDSS are as follows: (1) it is made accessible to non-technical specialists; and (2) it has a higher level of adaptability and compatibility. Furthermore, the MDSS was employed to manage the flood risk in the Jingjiang flood diversion area, located in central China near the Yangtze River. Compared with traditional solutions, we believe that this model-driven method is efficient, adaptable and flexible, and thus has bright prospects of application for comprehensive flood risk management.

  14. Group decision-making approach for flood vulnerability identification using the fuzzy VIKOR method

    NASA Astrophysics Data System (ADS)

    Lee, G.; Jun, K. S.; Cung, E. S.

    2014-09-01

    This study proposes an improved group decision making (GDM) framework that combines the VIKOR method with fuzzified data to quantify spatial flood vulnerability including multi-criteria evaluation indicators. In general, the GDM method is an effective tool for formulating a compromise solution that involves various decision makers, since various stakeholders may have different perspectives on their flood risk/vulnerability management responses. The GDM approach is designed to achieve consensus building that reflects the viewpoints of each participant. The fuzzy VIKOR method was developed to solve multi-criteria decision making (MCDM) problems with conflicting and noncommensurable criteria. This compromise method can be used to obtain a nearly ideal solution according to all established criteria. Triangular fuzzy numbers are used to consider the uncertainty of weights and the crisp data of proxy variables. This approach can effectively propose compromise decisions by combining the GDM method and the fuzzy VIKOR method. The spatial flood vulnerability of the south Han River evaluated using the GDM approach combined with the fuzzy VIKOR method was compared with the results from general MCDM methods, such as fuzzy TOPSIS, and classical GDM methods, such as those developed by Borda, Condorcet, and Copeland. The evaluated priorities were significantly dependent on the employed decision-making method. The proposed fuzzy GDM approach can reduce the uncertainty in the data confidence and weight derivation techniques. Thus, the combination of the GDM approach with the fuzzy VIKOR method can provide robust prioritization because it actively reflects the opinions of various groups and considers uncertainty in the input data.
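
    For orientation, the sketch below implements the crisp (non-fuzzy) VIKOR ranking step on a small hypothetical decision matrix; the fuzzy variant in the study additionally carries triangular fuzzy numbers for the weights and criterion values through these operations before defuzzification. The criterion values, weights, orientations, and the compromise parameter v = 0.5 are illustrative assumptions.

```python
import numpy as np

def vikor(F, w, benefit, v=0.5):
    """Crisp VIKOR: group utility S, individual regret R, and compromise index Q per row."""
    F = np.asarray(F, float)
    f_best = np.where(benefit, F.max(axis=0), F.min(axis=0))
    f_worst = np.where(benefit, F.min(axis=0), F.max(axis=0))
    d = w * (f_best - F) / (f_best - f_worst)       # weighted normalized distance to the ideal
    S, R = d.sum(axis=1), d.max(axis=1)
    Q = v * (S - S.min()) / (S.max() - S.min()) + (1 - v) * (R - R.min()) / (R.max() - R.min())
    return S, R, Q

# Rows: spatial units; columns: three hypothetical vulnerability indicators.
F = [[0.62, 0.40, 0.55],
     [0.80, 0.35, 0.30],
     [0.45, 0.70, 0.60]]
w = np.array([0.40, 0.35, 0.25])                    # criterion weights (e.g. from stakeholders)
benefit = np.array([True, True, False])             # False: lower values are preferable
S, R, Q = vikor(F, w, benefit)
print("compromise ranking (lower Q = closer to the ideal):", np.argsort(Q))
```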

  15. Numerical Analysis of Flood modeling of upper Citarum River under Extreme Flood Condition

    NASA Astrophysics Data System (ADS)

    Siregar, R. I.

    2018-02-01

    This paper focuses on numerical methods and computation for analysing flood parameters. Water level and flood discharge are the flood parameters solved by the numerical approach. The numerical methods applied in this paper for unsteady flow conditions have strengths and weaknesses; among their strengths, they are easily applied to cases with irregular flow boundaries. The study area is the upper Citarum watershed, Bandung, West Java. This paper uses a computational approach with Force2 programming and HEC-RAS to solve the flow problem in the upper Citarum River and to investigate and forecast extreme flood conditions. The numerical analysis is based on extreme flood events that have occurred in the upper Citarum watershed. The modelled water levels and extreme flood discharges are compared with measurement data for validation. The inundation area of the flood that happened in 2010 is about 75.26 square kilometres. Comparing the two methods shows that the FEM analysis with the Force2 program fits the validation data best, with a Nash index of 0.84 versus 0.76 for HEC-RAS for water level. For discharge data, the Nash index obtained with Force2 is 0.80 and with HEC-RAS is 0.79.
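
    The Nash index reported above is presumably the Nash–Sutcliffe efficiency; a short sketch of how it is computed from observed and simulated series (with made-up numbers) is given below.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Hypothetical observed vs. modelled water levels (m) at a validation section.
obs = [1.2, 1.8, 2.9, 3.4, 2.6, 1.9]
sim = [1.1, 1.9, 2.7, 3.6, 2.4, 2.0]
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}")
```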

  16. Separator for alkaline batteries and method of making same

    NASA Technical Reports Server (NTRS)

    Hoyt, H. E.; Pfluger, H. L. (Inventor)

    1970-01-01

    The preparation of membranes suitable for use as separators in concentrated alkaline battery cells, by selective solvolysis of copolymers of methacrylate esters with acrylate esters followed by addition of a base, and the resultant products are described. The method consists of first copolymerizing a methacrylate ester (or esters) with a more readily hydrolyzable ester, followed by a selective saponification in which the methacrylate ester moieties remain essentially intact while the readily hydrolyzable ester moiety is saponified, and then partial or complete neutralization of the relatively brittle copolymer acid with a base to make membranes that are sufficiently flexible in the dry state to be wrapped around electrodes without damage from handling.

  17. Net alkalinity and net acidity 2: Practical considerations

    USGS Publications Warehouse

    Kirby, C.S.; Cravotta, C.A.

    2005-01-01

    The pH, alkalinity, and acidity of mine drainage and associated waters can be misinterpreted because of the chemical instability of samples and possible misunderstandings of standard analytical method results. Synthetic and field samples of mine drainage having various initial pH values and concentrations of dissolved metals and alkalinity were titrated by several methods, and the results were compared to alkalinity and acidity calculated based on dissolved solutes. The pH, alkalinity, and acidity were compared between fresh, unoxidized and aged, oxidized samples. Data for Pennsylvania coal mine drainage indicates that the pH of fresh samples was predominantly acidic (pH 2.5-4) or near neutral (pH 6-7); less than 25% of the samples had pH values between 5 and 6. Following oxidation, no samples had pH values between 5 and 6. The Standard Method Alkalinity titration is constrained to yield values >0. Most calculated and measured alkalinities for samples with positive alkalinities were in close agreement. However, for low-pH samples, the calculated alkalinity can be negative due to negative contributions by dissolved metals that may oxidize and hydrolyze. The Standard Method hot peroxide treatment titration for acidity determination (Hot Acidity) accurately indicates the potential for pH to decrease to acidic values after complete degassing of CO2 and oxidation of Fe and Mn, and it indicates either the excess alkalinity or that required for neutralization of the sample. The Hot Acidity directly measures net acidity (= -net alkalinity). Samples that had near-neutral pH after oxidation had negative Hot Acidity; samples that had pH < 6.3 after oxidation had positive Hot Acidity. Samples with similar pH values before oxidation had dissimilar Hot Acidities due to variations in their alkalinities and dissolved Fe, Mn, and Al concentrations. Hot Acidity was approximately equal to net acidity calculated based on initial pH and dissolved concentrations of Fe, Mn, and Al minus the

  18. On the objective identification of flood seasons

    NASA Astrophysics Data System (ADS)

    Cunderlik, Juraj M.; Ouarda, Taha B. M. J.; Bobée, Bernard

    2004-01-01

    The determination of seasons of high and low probability of flood occurrence is a task with many practical applications in contemporary hydrology and water resources management. Flood seasons are generally identified subjectively by visually assessing the temporal distribution of flood occurrences and, then at a regional scale, verified by comparing the temporal distribution with distributions obtained at hydrologically similar neighboring sites. This approach is subjective, time consuming, and potentially unreliable. The main objective of this study is therefore to introduce a new, objective, and systematic method for the identification of flood seasons. The proposed method tests the significance of flood seasons by comparing the observed variability of flood occurrences with the theoretical flood variability in a nonseasonal model. The method also addresses the uncertainty resulting from sampling variability by quantifying the probability associated with the identified flood seasons. The performance of the method was tested on an extensive number of samples with different record lengths generated from several theoretical models of flood seasonality. The proposed approach was then applied on real data from a large set of sites with different flood regimes across Great Britain. The results show that the method can efficiently identify flood seasons from both theoretical and observed distributions of flood occurrence. The results were used for the determination of the main flood seasonality types in Great Britain.

  19. Recurrent Early Cretaceous, Indo-Madagascar (89-86 Ma) and Deccan (66 Ma) alkaline magmatism in the Sarnu-Dandali complex, Rajasthan: 40Ar/39Ar age evidence and geodynamic significance

    NASA Astrophysics Data System (ADS)

    Sheth, Hetu; Pande, Kanchan; Vijayan, Anjali; Sharma, Kamal Kant; Cucciniello, Ciro

    2017-07-01

    The Sarnu-Dandali alkaline complex in Rajasthan, northwestern India, is considered to represent early, pre-flood basalt magmatism in the Deccan Traps province, based on a single 40Ar/39Ar age of 68.57 Ma. Rhyolites found in the complex are considered to be 750 Ma Malani basement. Our new 40Ar/39Ar ages of 88.9-86.8 Ma (for syenites, nephelinite, phonolite and rhyolite) and 66.3 ± 0.4 Ma (2σ, melanephelinite) provide clear evidence that whereas the complex has Deccan-age (66 Ma) components, it is dominantly an older (by 20 million years) alkaline complex, with rhyolites included. Basalt is also known to underlie the Early Cretaceous Sarnu Sandstone. Sarnu-Dandali is thus a periodically rejuvenated alkaline igneous centre, active twice in the Late Cretaceous and also earlier. Many such centres with recurrent continental alkaline magmatism (sometimes over hundreds of millions of years) are known worldwide. The 88.9-86.8 Ma 40Ar/39Ar ages for Sarnu-Dandali rocks fully overlap with those for the Indo-Madagascar flood basalt province formed during continental breakup between India (plus Seychelles) and Madagascar. Recent 40Ar/39Ar work on the Mundwara alkaline complex in Rajasthan, 120 km southeast of Sarnu-Dandali, has also shown polychronous emplacement (over ≥ 45 million years), and 84-80 Ma ages obtained from Mundwara also arguably represent post-breakup stages of the Indo-Madagascar flood basalt volcanism. Remnants of the Indo-Madagascar province are known from several localities in southern India but hitherto unknown from northwestern India 2000 km away. Additional equivalents buried under the vast Deccan Traps are highly likely.

  20. Designing a Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul flooded; the insurance sector received around 1,200 claim notices during that period, and insurance companies had to pay a total of $40 million in claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million in claims for damages. To solve these kinds of problems, modern tools such as GIS and remote sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 meter spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing an object-oriented nearest-neighbor classification with image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi-Criteria Decision Analysis (MCDA) part of this software. The criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Also, daily flood data were collected from the Florya meteorological station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service-Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. The obtained results were verified using ground truth data, and it has been clearly
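
    As a sketch of the AHP weighting step mentioned above, the code below derives criterion weights from a pairwise comparison matrix via the principal eigenvector and reports the consistency ratio; the Saaty-scale judgments for the five criteria (slope, aspect, elevation, geology, land use) are invented for illustration and are not those used in the study.

```python
import numpy as np

# Hypothetical reciprocal pairwise comparisons for slope, aspect, elevation, geology, land use.
A = np.array([[1.0, 3.0, 2.0, 4.0, 1.0],
              [1/3, 1.0, 1/2, 2.0, 1/3],
              [1/2, 2.0, 1.0, 3.0, 1/2],
              [1/4, 1/2, 1/3, 1.0, 1/4],
              [1.0, 3.0, 2.0, 4.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                   # principal eigenvalue (Perron root)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # normalized priority vector

n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)          # consistency index
RI = 1.12                                     # Saaty's random index for n = 5
print("weights:", weights.round(3), " consistency ratio:", round(CI / RI, 3))
```

    A consistency ratio below about 0.1 is conventionally taken to mean the pairwise judgments are acceptably consistent.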

  1. Beyond 'flood hotspots': Modelling emergency service accessibility during flooding in York, UK

    NASA Astrophysics Data System (ADS)

    Coles, Daniel; Yu, Dapeng; Wilby, Robert L.; Green, Daniel; Herring, Zara

    2017-03-01

    This paper describes the development of a method that couples flood modelling with network analysis to evaluate the accessibility of city districts by emergency responders during flood events. We integrate numerical modelling of flood inundation with geographical analysis of service areas for the Ambulance Service and the Fire & Rescue Service. The method was demonstrated for two flood events in the City of York, UK, to assess the vulnerability of care homes and sheltered accommodation. We determine the feasibility of emergency services gaining access within the statutory 8- and 10-min targets for high-priority, life-threatening incidents 75% of the time, during flood episodes. A hydrodynamic flood inundation model (FloodMap) simulates the 2014 pluvial and 2015 fluvial flood events. Predicted floods (with depth >25 cm and areas >100 m2) were overlain on the road network to identify sites with potentially restricted access. Accessibility of the city to emergency responders during flooding was quantified and mapped using: (i) spatial coverage from individual emergency nodes within the legislated timeframes, and (ii) response times from individual emergency service nodes to vulnerable care homes and sheltered accommodation under flood and non-flood conditions. Results show that, during the 2015 fluvial flood, the area covered by two of the three Fire & Rescue Service stations reduced by 14% and 39% respectively, while the remaining station needed to increase its coverage by 39%. This amounts to an overall reduction of 6% and 20% for modelled and observed floods respectively. During the 2014 surface water flood, 7 out of 22 care homes (32%) and 15 out of 43 sheltered accommodation nodes (35%) had modelled response times above the 8-min threshold from any Ambulance station. Overall, modelled surface water flooding has a larger spatial footprint than fluvial flood events. Hence, accessibility of emergency services may be impacted differently depending on flood mechanism.
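
    A stripped-down sketch of the coupling described above: given a road graph with travel times on its edges, drop the links predicted to be flooded (depth above 25 cm) and recompute which care-home nodes remain reachable from an ambulance station within the 8-minute target. It assumes a small networkx graph with hypothetical node names and 'time_min'/'flood_depth' edge attributes rather than the FloodMap and GIS data used in the paper.

```python
import networkx as nx

# Hypothetical road network: edges carry travel time (minutes) and a flood-depth estimate (m).
G = nx.Graph()
G.add_edge("ambulance_stn", "junction_a", time_min=3.0, flood_depth=0.05)
G.add_edge("junction_a", "care_home_1", time_min=4.0, flood_depth=0.10)
G.add_edge("junction_a", "junction_b", time_min=2.0, flood_depth=0.40)    # flooded link
G.add_edge("junction_b", "care_home_2", time_min=2.5, flood_depth=0.00)
G.add_edge("ambulance_stn", "junction_b", time_min=9.0, flood_depth=0.00)  # long detour

def reachable_within(G, source, limit_min, depth_threshold=0.25):
    """Travel times to nodes reachable from `source` within `limit_min`, excluding flooded links."""
    passable = [(u, v) for u, v, d in G.edges(data=True) if d["flood_depth"] <= depth_threshold]
    usable = G.edge_subgraph(passable)
    return nx.single_source_dijkstra_path_length(usable, source, cutoff=limit_min,
                                                 weight="time_min")

times = reachable_within(G, "ambulance_stn", limit_min=8.0)
for home in ["care_home_1", "care_home_2"]:
    status = f"{times[home]:.1f} min" if home in times else "NOT reachable within 8 min"
    print(home, "->", status)
```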

  2. Novel Flood Detection and Analysis Method Using Recurrence Property

    NASA Astrophysics Data System (ADS)

    Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert

    2016-04-01

    Temporal changes in flood hazard are known to be difficult to detect and attribute due to multiple drivers that include processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defence, river training, or land use change, could impact variably on space-time scales and influence or mask each other. Flood time series may show complex behavior that vary at a range of time scales and may cluster in time. This study focuses on the application of recurrence based data analysis techniques (recurrence plot) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase space trajectories i.e. constructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The emphasis will be on the identification of characteristic recurrence properties that could associate typical dynamic behavior to certain flood situations.
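
    A minimal sketch of how a recurrence plot is built from a flood-related time series: delay-embed the series, compute pairwise distances between the embedded state vectors, and threshold them. The embedding dimension, delay, threshold quantile, and synthetic data below are arbitrary choices for illustration, not those used in the study.

```python
import numpy as np

def recurrence_matrix(x, dim=3, delay=1, threshold_quantile=0.1):
    """Binary recurrence matrix of a time series after time-delay embedding."""
    x = np.asarray(x, dtype=float)
    n_vectors = len(x) - (dim - 1) * delay
    # Phase-space reconstruction: each row is an embedded state vector.
    emb = np.column_stack([x[i * delay: i * delay + n_vectors] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    eps = np.quantile(dist, threshold_quantile)     # recurrence threshold
    return (dist <= eps).astype(int)

# Synthetic annual peak-flow series with a flood-rich episode inserted.
rng = np.random.default_rng(3)
peaks = rng.gumbel(loc=100, scale=30, size=120)
peaks[60:70] += 80                                   # hypothetical clustering of large floods
R = recurrence_matrix(peaks, dim=2, delay=1)
print("recurrence rate:", R.mean().round(3))
```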

  3. Method of improving heterogeneous oil reservoir polymer flooding effect by positively-charged gel profile control

    NASA Astrophysics Data System (ADS)

    Zhao, Ling; Xia, Huifen

    2018-01-01

    Polymer flooding projects have achieved great success in the Daqing oilfield, where recovery from the main oil reservoirs can be improved by more than 15%. However, when polymer flooding is carried out in reservoirs with strong heterogeneity, the polymer solution circulates inefficiently and ineffectively through the high-permeability layers, which increases the required polymer volume and significantly reduces the efficiency of polymer flooding. Aiming at this problem, this study investigates a method for improving the polymer flooding performance of heterogeneous oil reservoirs by positively-charged gel profile control. The results show that the physical and chemical reaction of the positively-charged gel with the residual polymer in the high-permeability layer can generate a three-dimensional polymer network, plugging the high-permeability layer and increasing the injection pressure gradient, thereby improving the effect of polymer flooding development. At the same dosage, positively-charged gel profile control can improve the polymer flooding recovery factor by 2.3∼3.8 percentage points. For the same increase in polymer flooding recovery factor, positively-charged gel profile control can reduce the polymer volume by 50%. The positively-charged gel profile control technology is feasible, saves cost, is simple to implement, and causes no environmental pollution, and therefore has good application prospects.

  4. Technetium recovery from high alkaline solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nash, Charles A.

    2016-07-12

    Disclosed are methods for recovering technetium from a highly alkaline solution. The highly alkaline solution can be a liquid waste solution from a nuclear waste processing system. Methods can include combining the solution with a reductant capable of reducing technetium at the high pH of the solution and adding to or forming in the solution an adsorbent capable of adsorbing the precipitated technetium at the high pH of the solution.

  5. Looking at flood trends with different eyes: the value of a fuzzy flood classification scheme

    NASA Astrophysics Data System (ADS)

    Sikorska, A. E.; Viviroli, D.; Brunner, M. I.; Seibert, J.

    2016-12-01

    Natural floods can be governed by several processes such as heavy rainfall or intense snow- or glacier-melt. These processes result in different flood characteristics in terms of flood shape and magnitude. Pooling floods of different types might therefore impair the analyses of flood frequencies and trends. Thus, the categorization of flood events into different flood type classes and the determination of their respective frequencies is essential for a better understanding and for the prediction of floods. In reality however most flood events are caused by a mix of processes and a unique determination of a flood type per event often becomes difficult. This study proposes an innovative method for a more reliable categorization of floods according to similarities in flood drivers. The categorization of floods into subgroups relies on a fuzzy decision tree. While the classical (crisp) decision tree allows for the identification of only one flood type per event, the fuzzy approach enables the detection of mixed types. Hence, events are represented as a spectrum of six possible flood types, while a degree of acceptance attributed to each of them specifies the importance of each type during the event formation. Considered types are flash, short rainfall, long rainfall, snow-melt, rainfall-on-snow, and, in high altitude watersheds, also glacier-melt floods. The fuzzy concept also enables uncertainty present in the identification of flood processes and in the method to be incorporated into the flood categorization process. We demonstrate, for a set of nine Swiss watersheds and 30 years of observations, that this new concept provides more reliable flood estimates than the classical approach as it allows for a more dedicated flood prevention technique adapted to a specific flood type.

  6. Multiple flood vulnerability assessment approach based on fuzzy comprehensive evaluation method and coordinated development degree model.

    PubMed

    Yang, Weichao; Xu, Kui; Lian, Jijian; Bin, Lingling; Ma, Chao

    2018-05-01

    Flood is a serious challenge that increasingly affects the residents as well as policymakers. Flood vulnerability assessment is becoming gradually relevant in the world. The purpose of this study is to develop an approach to reveal the relationship between exposure, sensitivity and adaptive capacity for better flood vulnerability assessment, based on the fuzzy comprehensive evaluation method (FCEM) and coordinated development degree model (CDDM). The approach is organized into three parts: establishment of index system, assessment of exposure, sensitivity and adaptive capacity, and multiple flood vulnerability assessment. Hydrodynamic model and statistical data are employed for the establishment of index system; FCEM is used to evaluate exposure, sensitivity and adaptive capacity; and CDDM is applied to express the relationship of the three components of vulnerability. Six multiple flood vulnerability types and four levels are proposed to assess flood vulnerability from multiple perspectives. Then the approach is applied to assess the spatiality of flood vulnerability in Hainan's eastern area, China. Based on the results of multiple flood vulnerability, a decision-making process for rational allocation of limited resources is proposed and applied to the study area. The study shows that multiple flood vulnerability assessment can evaluate vulnerability more completely, and help decision makers learn more information about making decisions in a more comprehensive way. In summary, this study provides a new way for flood vulnerability assessment and disaster prevention decision. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    NASA Astrophysics Data System (ADS)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments, and the major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan, where a dam-failure-induced flood provides the upstream boundary conditions for flood routing. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydrosystem model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models, and the variability of outputs can be quantified. Probabilistic flood inundation mapping for dam-break-induced floods can be developed with consideration of output variability using the commonly used HEC-RAS model. Different probabilistic flood inundation maps are discussed and compared, and are expected to provide new physical insights in support of the evaluation of flooded areas downstream of the reservoir.

  8. A flood map based DOI decoding method for block detector: a GATE simulation study.

    PubMed

    Shi, Han; Du, Dong; Su, Zhihong; Peng, Qiyu

    2014-01-01

    Positron Emission Tomography (PET) systems using detectors with Depth of Interaction (DOI) capabilities can achieve higher spatial resolution and better image quality than those without DOI. To date, however, most DOI methods developed are not cost-efficient for a whole-body PET system. In this paper, we present a DOI decoding method based on the flood map for a low-cost conventional block detector with four-PMT readout. Using this method, the DOI information can be extracted directly from the DOI-related crystal spot deformation in the flood map. GATE simulations are carried out to validate the method, confirming a DOI sorting accuracy of 85.27%. We therefore conclude that this method has the potential to be applied in conventional detectors to achieve a reasonable DOI measurement without dramatically increasing the complexity and cost of an entire PET system.

  9. Characterization and quantification of biochar alkalinity.

    PubMed

    Fidel, Rivka B; Laird, David A; Thompson, Michael L; Lawrinenko, Michael

    2017-01-01

    Lack of knowledge regarding the nature of biochar alkalis has hindered understanding of pH-sensitive biochar-soil interactions. Here we investigate the nature of biochar alkalinity and present a cohesive suite of methods for its quantification. Biochars produced from cellulose, corn stover and wood feedstocks had significant low-pKa organic structural (0.03-0.34 meq g-1), other organic (0-0.92 meq g-1), carbonate (0.02-1.5 meq g-1), and other inorganic (0-0.26 meq g-1) alkalinities. All four categories of biochar alkalinity contributed to total biochar alkalinity and are therefore relevant to pH-sensitive soil processes. Total biochar alkalinity was strongly correlated with base cation concentration, but biochar alkalinity was not a simple function of elemental composition, soluble ash, fixed carbon, or volatile matter content. More research is needed to characterize soluble biochar alkalis other than carbonates and to establish predictive relationships among biochar production parameters and the composition of biochar alkalis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Flood Impact Modelling and Natural Flood Management

    NASA Astrophysics Data System (ADS)

    Owen, Gareth; Quinn, Paul; ODonnell, Greg

    2016-04-01

    Local implementation of Natural Flood Management (NFM) methods is now being proposed in many flood schemes. In principle, it offers a cost-effective solution to a number of catchment-based problems, as NFM tackles both flood risk and WFD issues. However, within larger catchments there is the issue of which subcatchments to target first and how much NFM to implement. If each catchment has its own configuration of subcatchments and rivers, how can the issues of flood synchronisation and strategic investment be addressed? In this study we show two key aspects to resolving these issues. Firstly, a multi-scale network of water level recorders is placed throughout the system to capture the flow concentration and travel times operating in the catchment being studied. The second is a Flood Impact Model (FIM), a subcatchment-based model that can generate runoff in any location using any hydrological model. The key aspect of the model is that it has a function to represent the impact of NFM in any subcatchment and the ability to route that flood wave to the outfall. This function allows a realistic representation of the synchronisation issues for that catchment. By running the model in interactive mode, the user can define an appropriate scheme that minimises or removes the risk of synchronisation and gives confidence that the NFM investment is having a good level of impact downstream in large flood events.

  11. Modeling urban coastal flood severity from crowd-sourced flood reports using Poisson regression and Random Forest

    NASA Astrophysics Data System (ADS)

    Sadler, J. M.; Goodall, J. L.; Morsy, M. M.; Spencer, K.

    2018-04-01

    Sea level rise has already caused more frequent and severe coastal flooding and this trend will likely continue. Flood prediction is an essential part of a coastal city's capacity to adapt to and mitigate this growing problem. Complex coastal urban hydrological systems, however, do not always lend themselves easily to physically based flood prediction approaches. This paper presents a method for using a data-driven approach to estimate flood severity in an urban coastal setting using crowd-sourced data, a non-traditional but growing data source, along with environmental observation data. Two data-driven models, Poisson regression and Random Forest regression, are trained to predict the number of flood reports per storm event as a proxy for flood severity, given extensive environmental data (i.e., rainfall, tide, groundwater table level, and wind conditions) as input. The method is demonstrated using data from Norfolk, Virginia, USA from September 2010 to October 2016. Quality-controlled, crowd-sourced street flooding reports ranging from 1 to 159 per storm event for 45 storm events are used to train and evaluate the models. Random Forest performed better than Poisson regression at predicting the number of flood reports and had a lower false negative rate. From the Random Forest model, total cumulative rainfall was by far the most dominant input variable in predicting flood severity, followed by low tide and lower low tide. These methods serve as a first step toward using data-driven methods for spatially and temporally detailed coastal urban flood prediction.
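
    As a rough illustration of the two data-driven models named above (Poisson regression and Random Forest regression), the sketch below trains both on synthetic storm-event data with scikit-learn. The feature set mirrors the inputs listed in the abstract, but all values, coefficients and hyperparameters are invented for the example.

```python
# Minimal sketch of the two data-driven models compared in the study
# (Poisson regression and Random Forest), trained on synthetic storm-event data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import PoissonRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 45  # storm events
X = np.column_stack([
    rng.gamma(2.0, 20.0, n),   # total cumulative rainfall (mm)
    rng.normal(0.5, 0.2, n),   # tide level (m)
    rng.normal(1.0, 0.3, n),   # groundwater table level (m)
    rng.normal(5.0, 2.0, n),   # wind speed (m/s)
])
# Synthetic flood-report counts driven mostly by rainfall and tide.
lam = np.exp(0.02 * X[:, 0] + 1.5 * X[:, 1] - 1.0)
y = rng.poisson(lam)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

poisson = PoissonRegressor(alpha=1e-3, max_iter=1000).fit(X_tr, y_tr)
forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# Note: score() is the deviance-based D^2 for the Poisson GLM, R^2 for the forest.
print("Poisson test score (D^2):", round(poisson.score(X_te, y_te), 3))
print("Forest test score (R^2) :", round(forest.score(X_te, y_te), 3))
print("Forest feature importances:", np.round(forest.feature_importances_, 3))
```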

  12. Technical Note: Initial assessment of a multi-method approach to spring-flood forecasting in Sweden

    NASA Astrophysics Data System (ADS)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2016-02-01

    Hydropower is a major energy source in Sweden, and proper reservoir management prior to the spring-flood onset is crucial for optimal production. This requires accurate forecasts of the accumulated discharge in the spring-flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialized set-up of the HBV model. In this study, a number of new approaches to spring-flood forecasting that reflect the latest developments with respect to analysis and modelling on seasonal timescales are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for the Swedish river Vindelälven over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogous years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring-flood period. In the third approach, statistical relationships between the SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for early forecasts improvements of up to 25 % are found. This potential is reasonably well realized in a multi-method system, which over all forecast dates reduced the error in SFV by ~4 %. This improvement is limited but potentially significant for e.g. energy trading.

  13. Consistency of extreme flood estimation approaches

    NASA Astrophysics Data System (ADS)

    Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf

    2017-04-01

    Estimates of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study provides useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used to assess the potential strengths and weaknesses of each method, as well as to evaluate the consistency of these methods.

  14. Quantifying peak discharges for historical floods

    USGS Publications Warehouse

    Cook, J.L.

    1987-01-01

    It is usually advantageous to use information regarding historical floods, if available, to define the flood-frequency relation for a stream. Peak stages can sometimes be determined for outstanding floods that occurred many years ago, before systematic gaging of streams began. In the United States, this information is usually not available for more than 100-200 years, but in countries with long cultural histories, such as China, historical flood data are available at some sites as far back as 2,000 years or more. It is important in flood studies to be able to assign a maximum discharge rate and an associated error range to the historical flood. This paper describes the significant characteristics and uncertainties of four commonly used methods for estimating the peak discharge of a flood. These methods are: (1) rating curve (stage-discharge relation) extension; (2) slope conveyance; (3) slope area; and (4) step backwater. Logarithmic extensions of rating curves are based on theoretical plotting techniques that result in straight-line extensions, provided that channel shape and roughness do not change significantly. The slope-conveyance and slope-area methods are based on the Manning equation, which requires specific data on channel size, shape and roughness, as well as the water-surface slope, for one or more cross-sections in a relatively straight reach of channel. The slope-conveyance method is used primarily for shaping and extending rating curves, whereas the slope-area method is used for specific floods. The step-backwater method, also based on the Manning equation, requires more cross-section data than the slope-area method, but has a water-surface profile convergence characteristic that negates the need for a known or estimated water-surface slope. Uncertainties in calculating peak discharge for historical floods may be quite large. Various investigations have shown that errors in calculating peak discharges by the slope-area method under ideal conditions for
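
    The slope-area and slope-conveyance methods mentioned above rest on the Manning equation. The sketch below evaluates it for a single, assumed rectangular cross-section; real slope-area computations use several surveyed cross-sections and conveyance-weighted averaging, so this is only a minimal illustration.

```python
# Sketch of a slope-area style peak-discharge estimate from the Manning equation,
# simplified to one rectangular cross-section. All channel properties are assumed.
def manning_discharge(width_m, depth_m, slope, n_roughness):
    """Q = (1/n) * A * R^(2/3) * S^(1/2), SI units (m^3/s)."""
    area = width_m * depth_m                    # flow area A
    wetted_perimeter = width_m + 2.0 * depth_m  # rectangular channel
    hydraulic_radius = area / wetted_perimeter  # R = A / P
    return (1.0 / n_roughness) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

# Example: 30 m wide channel, 2.5 m high-water depth inferred from flood marks,
# water-surface slope 0.001, Manning's n of 0.035.
print(f"Peak discharge ~ {manning_discharge(30.0, 2.5, 0.001, 0.035):.0f} m^3/s")
```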

  15. Forecasting snowmelt flooding over Britain using the Grid-to-Grid model: a review and assessment of methods

    NASA Astrophysics Data System (ADS)

    Dey, Seonaid R. A.; Moore, Robert J.; Cole, Steven J.; Wells, Steven C.

    2017-04-01

    In many regions of high annual snowfall, snowmelt modelling can prove to be a vital component of operational flood forecasting and warning systems. Although Britain as a whole does not experience prolonged periods of lying snow, with the exception of the Scottish Highlands, the inclusion of snowmelt modelling can still have a significant impact on the skill of flood forecasts. Countrywide operational flood forecasts over Britain are produced using the national Grid-to-Grid (G2G) distributed hydrological model. For Scotland, snowmelt is included in these forecasts through a G2G snow hydrology module involving temperature-based snowfall/rainfall partitioning and functions for temperature-excess snowmelt, snowpack storage and drainage. Over England and Wales, the contribution of snowmelt is included by pre-processing the precipitation prior to input into G2G. This removes snowfall diagnosed from weather model outputs and adds snowmelt from an energy budget land surface scheme to form an effective liquid water gridded input to G2G. To review the operational options for including snowmelt modelling in G2G over Britain, a project was commissioned by the Environment Agency through the Flood Forecasting Centre (FFC) for England and Wales and in partnership with the Scottish Environment Protection Agency (SEPA) and Natural Resources Wales (NRW). Results obtained from this snowmelt review project will be reported on here. The operational methods used by the FFC and SEPA are compared on past snowmelt floods, alongside new alternative methods of treating snowmelt. Both case study and longer-term analyses are considered, covering periods selected from the winters 2009-2010, 2012-2013, 2013-2014 and 2014-2015. Over Scotland, both of the snowmelt methods used operationally by FFC and SEPA provided a clear improvement to the river flow simulations. Over England and Wales, fewer and less significant snowfall events occurred, leading to less distinction in the results between the
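
    The Scottish configuration described above uses temperature-based snowfall/rainfall partitioning and temperature-excess (degree-day) melt. The toy sketch below shows that general bookkeeping; the thresholds and melt factor are placeholders, not the operational G2G parameters.

```python
# Toy temperature-excess (degree-day) snow accounting in the spirit of the
# snowfall/rainfall partitioning and snowmelt functions described above.
def snow_step(precip_mm, temp_c, pack_mm,
              t_snow=0.0, t_melt=0.0, melt_factor=3.0):
    """Advance the snowpack one day; return (liquid water input, new pack)."""
    if temp_c <= t_snow:
        pack_mm += precip_mm      # precipitation falls as snow and is stored
        rain_mm = 0.0
    else:
        rain_mm = precip_mm       # precipitation falls as rain
    melt_mm = min(pack_mm, max(0.0, melt_factor * (temp_c - t_melt)))
    return rain_mm + melt_mm, pack_mm - melt_mm

pack = 0.0
for day, (p, t) in enumerate([(10, -2), (15, -4), (0, 2), (5, 5), (0, 7)]):
    liquid, pack = snow_step(p, t, pack)
    print(f"day {day}: liquid input {liquid:.1f} mm, snowpack {pack:.1f} mm")
```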

  16. Floods and Flash Flooding

    MedlinePlus

    Floods and flash flooding Now is the time to determine your area’s flood risk. If you are not sure whether you live in ... If you are in a floodplain, consider buying flood insurance. Do not drive around barricades. If your ...

  17. Floods and food security: A method to estimate the effect of inundation on crops availability

    NASA Astrophysics Data System (ADS)

    Pacetti, Tommaso; Caporali, Enrica; Rulli, Maria Cristina

    2017-12-01

    The inner connections between floods and food security are extremely relevant, especially in developing countries where food availability can be highly jeopardized by extreme events that damage the primary access to food, i.e. agriculture. A method for evaluating the effects of floods on food supply, consisting of the integration of remote sensing data, agricultural statistics and water footprint databases, is proposed and applied to two different case studies. Based on the existing literature on extreme floods, the events in Bangladesh (2007) and in Pakistan (2010) have been selected as exemplary case studies. Results show that the use of remote sensing data combined with other sources of onsite information is particularly useful to assess the effects of flood events on food availability. The damage caused by floods to agricultural areas is estimated in terms of crop losses and then converted into lost calories and water footprint as complementary indicators. The method is fully repeatable; the remote sensing data sources are valid worldwide, whereas the data on land use and crop characteristics are strongly site-specific and need to be carefully evaluated. A sensitivity analysis has been carried out on the critical water depth for the crops in Bangladesh, varying the assumed level by ±20%. The results show a difference of 12% in the estimated energy content losses, underlining the importance of an accurate choice of data.
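
    The conversion chain described above (flooded area to crop losses, then to lost calories and water footprint) is simple arithmetic once the inputs are known. The sketch below illustrates it with placeholder numbers; none of the values refer to the Bangladesh or Pakistan case studies.

```python
# Illustrative conversion of flooded cropland into lost production, lost calories
# and lost water footprint, following the logic described above. All numbers are
# placeholders, not values from the case studies.
flooded_area_ha = 120_000          # flooded rice area from remote sensing
crop_yield_t_per_ha = 2.8          # agricultural statistics
calories_per_t = 3.6e6             # kcal per tonne of rice (approximate)
water_footprint_m3_per_t = 1_600   # crop water footprint (approximate)

lost_production_t = flooded_area_ha * crop_yield_t_per_ha
lost_calories = lost_production_t * calories_per_t
lost_water_footprint_m3 = lost_production_t * water_footprint_m3_per_t

print(f"Lost production : {lost_production_t:,.0f} t")
print(f"Lost calories   : {lost_calories:.3e} kcal")
print(f"Water footprint : {lost_water_footprint_m3:,.0f} m^3")
```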

  18. Physical parameters of Fluvisols on flooded and non-flooded terraces

    NASA Astrophysics Data System (ADS)

    Kercheva, Milena; Sokołowska, Zofia; Hajnos, Mieczysław; Skic, Kamil; Shishkov, Toma

    2017-01-01

    The heterogeneity of soil physical properties of Fluvisols, lack of large pristine areas, and different moisture regimes on non-flooded and flooded terraces impede the possibility to find a soil profile which can serve as a baseline for estimating the impact of natural or anthropogenic factors on soil evolution. The aim of this study is to compare the pore size distribution of pristine Fluvisols on flooded and non-flooded terraces using the method of the soil water retention curve, mercury intrusion porosimetry, nitrogen adsorption isotherms, and water vapour sorption. The pore size distribution of humic horizons of pristine Fluvisols on the non-flooded terrace differs from pore size distribution of Fluvisols on the flooded terrace. The peaks of textural and structural pores are higher in the humic horizons under more humid conditions. The structural characteristics of subsoil horizons depend on soil texture and evolution stage. The peaks of textural pores at about 1 mm diminish with lowering of the soil organic content. Structureless horizons are characterized by uni-modal pore size distribution. Although the content of structural pores of the subsoil horizons of Fluvisols on the non-flooded terrace is low, these pores are represented by biopores, as the coefficient of filtration is moderately high. The difference between non-flooded and flooded profiles is well expressed by the available water storage, volume and mean radius of pores, obtained by mercury intrusion porosimetry and water desorption, which are higher in the surface horizons of frequently flooded Fluvisols.

  19. Flood replenishment: a new method of processor control.

    PubMed

    Frank, E D; Gray, J E; Wilken, D A

    1980-01-01

    In mechanized radiographic film processors that process medium to low volumes of film, roll films, and those that process single-emulsion films from nuclear medicine scans, computed tomography, and ultrasound, it is difficult to maintain the developer solution at a stable processing level. We describe our experience using flood replenishment, which is a method in which developer replenisher containing starter solution is introduced in the processor at timed intervals, independent of the number of films being processed. By this process, a stable level of developer activity is maintained in a processor used to develop a medium to low volume of single-emulsion film.

  20. Hydrophilic Electrode For An Alkaline Electrochemical Cell, And Method Of Manufacture

    DOEpatents

    Senyarich, Stephane; Cocciantelli, Jean-Michel

    2000-03-07

    A negative electrode for an alkaline electrochemical cell. The electrode comprises an active material and a hydrophilic agent constituted by small cylindrical rods of polyolefin provided with hydrophilic groups. The mean length of the rods is less than 50 microns and the mean diameter thereof is less than 20 microns. A method of manufacturing a negative electrode in which hydrophilic rods are made by fragmenting long polyolefin fibers having a mean diameter of less than 20 microns by oxidizing them, with the rods being mixed with the active material and the mixture being applied to a current conductor.

  1. Adverse effects of mineral-alkali reactions in alkaline flooding: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thornton, S.D.

    1988-01-01

    Two slim-tube experiments and supporting bottle tests were performed for a study of sandstone-alkali reactions. The two samples of reservoir sandstone used are from oilfields in the People's Republic of China. The first sandstone contains 16 percent clay and is from the Gu-Dao oilfield. The second sandstone contains 12 percent clay and is from the Liao-He oilfield. These two sandstones were allowed to react with alkaline solutions in 6-month bottle tests. Each sandstone consumed the most alkali from 0.5 N NaOH solution, an intermediate amount of alkali from 0.5 N Na₂SiO₃ solution, and the least amount of alkali from 0.5 N Na₂CO₃ solution. 59 refs., 14 figs., 20 tabs.

  2. Flood Frequency Curves - Use of information on the likelihood of extreme floods

    NASA Astrophysics Data System (ADS)

    Faber, B.

    2011-12-01

    Investment in the infrastructure that reduces flood risk for flood-prone communities must incorporate information on the magnitude and frequency of flooding in that area. Traditionally, that information has been a probability distribution of annual maximum streamflows developed from the historical gaged record at a stream site. Practice in the United States fits a log-Pearson Type III distribution to the annual maximum flows of an unimpaired streamflow record, using the method of moments to estimate distribution parameters. The procedure makes the assumptions that annual peak streamflow events are (1) independent, (2) identically distributed, and (3) form a representative sample of the overall probability distribution. Each of these assumptions can be challenged. We rarely have enough data to form a representative sample, and therefore must compute and display the uncertainty in the estimated flood distribution. But is there a wet/dry cycle that makes precipitation less than independent between successive years? Are the peak flows caused by different types of events from different statistical populations? How do changes in the watershed or climate over time (non-stationarity) affect the probability distribution of floods? Potential approaches to avoid these assumptions vary from estimating trend and shift and removing them from early data (and so forming a homogeneous data set), to methods that estimate statistical parameters that vary with time. A further issue in estimating a probability distribution of flood magnitude (the flood frequency curve) is whether a purely statistical approach can accurately capture the range and frequency of floods that are of interest. A meteorologically based analysis produces "probable maximum precipitation" (PMP) and subsequently a "probable maximum flood" (PMF) that attempts to describe an upper bound on flood magnitude in a particular watershed. This analysis can help constrain the upper tail of the probability distribution, well
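
    As an illustration of the US practice described above, the sketch below fits a log-Pearson Type III distribution to a short synthetic record of annual peaks by the method of moments on the log-transformed flows. It omits the regional skew weighting and uncertainty treatment that full guidance requires.

```python
# Sketch of fitting a log-Pearson Type III distribution to annual peak flows by
# the method of moments on the log-transformed record. The record is synthetic.
import numpy as np
from scipy import stats

annual_peaks = np.array([812, 1040, 655, 1490, 930, 770, 2100, 1180,
                         890, 1320, 610, 1750, 980, 1420, 860])  # m^3/s

logq = np.log10(annual_peaks)
mean, std = logq.mean(), logq.std(ddof=1)
skew = stats.skew(logq, bias=False)   # station skew only (no regional weighting)

lp3 = stats.pearson3(skew, loc=mean, scale=std)
for T in (2, 10, 100):
    q = 10 ** lp3.ppf(1.0 - 1.0 / T)  # quantile of log-flows, back-transformed
    print(f"{T:>3}-year flood ~ {q:,.0f} m^3/s")
```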

  3. Visual Sensing for Urban Flood Monitoring

    PubMed Central

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-01-01

    With the increasing climatic extremes, the frequency and severity of urban flood events have intensified worldwide. In this study, image-based automated monitoring of flood formation and analyses of water level fluctuation were proposed as value-added intelligent sensing applications to turn a passive monitoring camera into a visual sensor. Combined with the proposed visual sensing method, traditional hydrological monitoring cameras have the ability to sense and analyze the local situation of flood events. This can solve the current problem that image-based flood monitoring heavily relies on continuous manned monitoring. Conventional sensing networks can only offer one-dimensional physical parameters measured by gauge sensors, whereas visual sensors can acquire dynamic image information of monitored sites and provide disaster prevention agencies with actual field information for decision-making to relieve flood hazards. The visual sensing method established in this study provides spatiotemporal information that can be used for automated remote analysis for monitoring urban floods. This paper focuses on the determination of flood formation based on image-processing techniques. The experimental results suggest that the visual sensing approach may be a reliable way for determining the water fluctuation and measuring its elevation and flood intrusion with respect to real-world coordinates. The performance of the proposed method has been confirmed; it has the capability to monitor and analyze the flood status, and therefore, it can serve as an active flood warning system. PMID:26287201

  4. Flood information for flood-plain planning

    USGS Publications Warehouse

    Bue, Conrad D.

    1967-01-01

    Floods are natural and normal phenomena. They are catastrophic simply because man occupies the flood plain, the high-water channel of a river. Man occupies flood plains because it is convenient and profitable to do so, but he must purchase his occupancy at a price: either sustain flood damage or provide flood-control facilities. Although large sums of money have been, and are being, spent for flood control, flood damage continues to mount. However, neither complete flood control nor abandonment of the flood plain is practicable. Flood plains are a valuable resource and will continue to be occupied, but the nature and degree of occupancy should be compatible with the risk involved and with the degree of protection that is practicable to provide. It is primarily to meet the need for defining the risk that the flood-inundation maps of the U.S. Geological Survey are prepared.

  5. Dynamic Properties of the Alkaline Vesicle Population at Hippocampal Synapses

    PubMed Central

    Röther, Mareike; Brauner, Jan M.; Ebert, Katrin; Welzel, Oliver; Jung, Jasmin; Bauereiss, Anna; Kornhuber, Johannes; Groemer, Teja W.

    2014-01-01

    In compensatory endocytosis, scission of vesicles from the plasma membrane to the cytoplasm is a prerequisite for intravesicular reacidification and accumulation of neurotransmitter molecules. Here, we provide time-resolved measurements of the dynamics of the alkaline vesicle population which appears upon endocytic retrieval. Using fast perfusion pH-cycling in live-cell microscopy, synapto-pHluorin expressing rat hippocampal neurons were electrically stimulated. We found that the relative size of the alkaline vesicle population depended significantly on the electrical stimulus size: with increasing number of action potentials the relative size of the alkaline vesicle population expanded. In contrast, increasing the stimulus frequency reduced the relative size of the population of alkaline vesicles. Measurement of the time constant for reacidification and calculation of the time constant for endocytosis revealed that both time constants were variable with regard to the stimulus condition. Furthermore, we show that the dynamics of the alkaline vesicle population can be predicted by a simple mathematical model. In conclusion, here a novel methodical approach to analyze dynamic properties of alkaline vesicles is presented and validated as a convenient method for the detection of intracellular events. Using this method we show that the population of alkaline vesicles is highly dynamic and depends both on stimulus strength and frequency. Our results imply that determination of the alkaline vesicle population size may provide new insights into the kinetics of endocytic retrieval. PMID:25079223

  6. Extreme flood estimation by the SCHADEX method in a snow-driven catchment: application to Atnasjø (Norway)

    NASA Astrophysics Data System (ADS)

    Paquet, Emmanuel; Lawrence, Deborah

    2013-04-01

    The SCHADEX method for extreme flood estimation was developed by Paquet et al. (2006, 2013), and since 2008 it has been the reference method used by Electricité de France (EDF) for dam spillway design. SCHADEX is a so-called "semi-continuous" stochastic simulation method in that flood events are simulated on an event basis and are superimposed on a continuous simulation of the catchment saturation hazard using rainfall-runoff modelling. The MORDOR hydrological model (Garçon, 1999) has thus far been used for the rainfall-runoff modelling. MORDOR is a conceptual, lumped, reservoir model with daily areal rainfall and air temperature as the driving input data. The principal hydrological processes represented are evapotranspiration, direct and indirect runoff, ground water, snow accumulation and melt, and routing. The model has been intensively used at EDF for more than 15 years, in particular for inflow forecasts for French mountainous catchments. SCHADEX has now also been applied to the Atnasjø catchment (463 km²), a well-documented inland catchment in south-central Norway, dominated by snowmelt flooding during spring/early summer. To support this application, a weather pattern classification based on extreme rainfall was first established for Norway (Fleig, 2012). This classification scheme was then used to build a Multi-Exponential Weather Pattern distribution (MEWP), as introduced by Garavaglia et al. (2010) for extreme rainfall estimation. The MORDOR model was then calibrated relative to daily discharge data for Atnasjø. Finally, a SCHADEX simulation was run to build a daily discharge distribution with a sufficient number of simulations for assessing the extreme quantiles. Detailed results are used to illustrate how SCHADEX handles the complex and interacting hydrological processes driving flood generation in this snow-driven catchment. Seasonal and monthly distributions, as well as statistics for several thousand simulated events reaching a 1000-year return level

  7. Evaluation of design flood frequency methods for Iowa streams : final report, June 2009.

    DOT National Transportation Integrated Search

    2009-06-01

    The objective of this project was to assess the predictive accuracy of flood frequency estimation for small Iowa streams based on the Rational Method, the NRCS curve number approach, and the Iowa Runoff Chart. The evaluation was based on comparis...
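
    Of the three methods named in the abstract, the Rational Method is the simplest, Q = C·i·A. The sketch below applies it with illustrative values for the runoff coefficient, design rainfall intensity and drainage area; none of the numbers come from the Iowa evaluation.

```python
# Sketch of the Rational Method referenced above: Q = C * i * A.
# In SI units with i in mm/h and A in km^2, Q (m^3/s) = 0.278 * C * i * A.
# The runoff coefficient, intensity and area below are illustrative only.
def rational_method_q(c_runoff, intensity_mm_per_h, area_km2):
    return 0.278 * c_runoff * intensity_mm_per_h * area_km2

# Example: small rural catchment, C = 0.35, design intensity of 60 mm/h,
# drainage area of 2.5 km^2.
print(f"Design peak flow ~ {rational_method_q(0.35, 60.0, 2.5):.1f} m^3/s")
```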

  8. Back analysis of Swiss flood danger map to define local flood hazards

    NASA Astrophysics Data System (ADS)

    Choffet, Marc; Derron, Marc-Henri; Jaboyedoff, Michel; Leroi, Eric; Mayis, Arnaud

    2010-05-01

    The flood hazard maps for the entire Switzerland will be available at the end of 2011. Furthermore, the Swiss territory has been covered by aerial laser scanning (ALS) providing a high resolution digital elevation model (HR-DEM). This paper describes the development of a method for analyzing the local flood hazard based on the Swiss hazard maps and the HR-DEM. In their original state, Swiss hazard maps are constructed from an aggregation of information in an intensity-frequency matrix. The degree of danger represented by the yellow, blue and red zones gives no information on the water level at each point of the territory. The developed method is based on a superposition of the danger map with the HR-DEM to determine the water level in a hazard area. To apply this method, (1) a triangulation is performed on the intersection of the hazard map with the HR-DEM, using the limits of the areas where information is constrained. The hazard map perimeter and the boundaries of the hazard areas give information on the widest possible overflow in case of flooding, and can also be associated with a return period. (2) Based on these areas and the difference with the DEM, it is possible to calibrate the highest flood level and to extract water levels for the entire area. This analysis of existing documents opens up interesting perspectives for understanding how infrastructure is threatened by flood hazard, by predicting water levels and potential damage to buildings while proposing remedial measures. Indeed, this method allows the water level to be estimated at each point of a building in case of flooding. It is designed to provide spatial information on water height levels; this offers a different approach to buildings in danger zones. Indeed, it is possible to discern several elements, such as areas of water accumulation involving longer flood duration, possible structural damage to buildings due to high hydrostatic pressure, determination of a local hazard, or the display of water

  9. Net alkalinity and net acidity 1: Theoretical considerations

    USGS Publications Warehouse

    Kirby, C.S.; Cravotta, C.A.

    2005-01-01

    Net acidity and net alkalinity are widely used, poorly defined, and commonly misunderstood parameters for the characterization of mine drainage. The authors explain theoretical expressions of 3 types of alkalinity (caustic, phenolphthalein, and total) and acidity (mineral, CO2, and total). Except for rarely-invoked negative alkalinity, theoretically defined total alkalinity is closely analogous to measured alkalinity and presents few practical interpretation problems. Theoretically defined "CO2-acidity" is closely related to most standard titration methods with an endpoint pH of 8.3 used for determining acidity in mine drainage, but it is unfortunately named because CO2 is intentionally driven off during titration of mine-drainage samples. Using the proton condition/mass-action approach and employing graphs to illustrate speciation with changes in pH, the authors explore the concept of principal components and how to assign acidity contributions to aqueous species commonly present in mine drainage. Acidity is defined in mine drainage based on aqueous speciation at the sample pH and on the capacity of these species to undergo hydrolysis to pH 8.3. Application of this definition shows that the computed acidity in mg L-1 as CaCO3 (based on pH and analytical concentrations of dissolved FeII, FeIII, Mn, and Al in mg L-1), acidity_calculated = 50 · {1000 · 10^(-pH) + [2·(FeII) + 3·(FeIII)]/56 + 2·(Mn)/55 + 3·(Al)/27}, underestimates contributions from HSO4- and H+, but overestimates the acidity due to Fe3+ and Al3+. However, these errors tend to approximately cancel each other. It is demonstrated that "net alkalinity" is a valid mathematical construction based on theoretical definitions of alkalinity and acidity. Further, it is shown that, for most mine-drainage solutions, a useful net alkalinity value can be derived from: (1) alkalinity and acidity values based on aqueous speciation, (2) measured alkalinity minus calculated acidity, or (3) taking the negative of the value obtained in a
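
    For convenience, the calculated-acidity expression quoted in the abstract can be evaluated directly; the sketch below does so for an arbitrary example water (the concentrations are made up).

```python
# Direct evaluation of the calculated-acidity expression quoted above
# (result in mg/L as CaCO3). The example concentrations are arbitrary.
def acidity_calculated(ph, fe2_mgL, fe3_mgL, mn_mgL, al_mgL):
    return 50.0 * (1000.0 * 10.0 ** (-ph)
                   + (2.0 * fe2_mgL + 3.0 * fe3_mgL) / 56.0
                   + 2.0 * mn_mgL / 55.0
                   + 3.0 * al_mgL / 27.0)

print(f"{acidity_calculated(ph=3.5, fe2_mgL=20, fe3_mgL=5, mn_mgL=10, al_mgL=8):.1f} "
      "mg/L as CaCO3")
```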

  10. Enhancing flood hazard estimation methods on alluvial fans using an integrated hydraulic, geological and geomorphological approach

    NASA Astrophysics Data System (ADS)

    Mollaei, Zeinab; Davary, Kamran; Majid Hasheminia, Seyed; Faridhosseini, Alireza; Pourmohamad, Yavar

    2018-04-01

    Due to the uncertainty concerning the location of flow paths on active alluvial fans, alluvial fan floods can be more dangerous than riverine floods. The United States Federal Emergency Management Agency (FEMA) has used a simple stochastic model named FAN for this purpose, which has been in practice for many years. In the last decade, this model has been criticized as a consequence of the development of more complex computer models. This study was conducted on three alluvial fans located in northeast and southeast Iran using a combination of the FAN model, the hydraulic portion of the FLO-2D model, and geomorphological information. The initial stage included three steps: (a) identifying the alluvial fans' landforms, (b) determining the active and inactive areas of the alluvial fans, and (c) delineating the 100-year flood within these selected areas. This information was used as input to the three approaches: (i) the FLO-2D model, (ii) the geomorphological method, and (iii) the FAN model. Thereafter, the results of each model were obtained, and geographical information system (GIS) layers were created and overlaid. Afterwards, using a scoring system, the results were evaluated and compared. The goal of this research was to introduce a simple but effective solution for estimating flood hazards. It was concluded that the integrated method proposed in this study is superior at projecting alluvial fan flood hazards with minimum required input data, simplicity, and affordability, which are the primary goals of such comprehensive studies. These advantages are especially pronounced in underdeveloped and developing countries, which may lack detailed data and cannot financially support costly projects. Furthermore, such a highly cost-effective method could be greatly advantageous and pragmatic for developed countries.

  11. Prophylactic treatment with alkaline phosphatase in cardiac surgery induces endogenous alkaline phosphatase release.

    PubMed

    Kats, Suzanne; Brands, Ruud; Hamad, Mohamed A Soliman; Seinen, Willem; Scharnhorst, Volkher; Wulkan, Raymond W; Schönberger, Jacques P; Oeveren, Wim van

    2012-02-01

    Laboratory and clinical data have implicated endotoxin as an important factor in the inflammatory response to cardiopulmonary bypass. We assessed the effects of the administration of bovine intestinal alkaline phosphatase (bIAP), an endotoxin detoxifier, on alkaline phosphatase levels in patients undergoing coronary artery bypass grafting. A total of 63 patients undergoing coronary artery bypass grafting were enrolled and prospectively randomized. Bovine intestinal alkaline phosphatase (n=32) or placebo (n=31) was administered as an intravenous bolus followed by continuous infusion for 36 hours. The primary endpoint was to evaluate alkaline phosphatase levels in both groups and to find out if administration of bIAP to patients undergoing CABG would lead to endogenous alkaline phosphatase release. No significant adverse effects were identified in either group. In all the 32 patients of the bIAP-treated group, we found an initial rise of plasma alkaline phosphatase levels due to bolus administration (464.27±176.17 IU/L). A significant increase of plasma alkaline phosphatase at 4-6 hours postoperatively was observed (354.97±95.00 IU/L) as well. Using LHA inhibition, it was shown that this second peak was caused by the generation of tissue non specific alkaline phosphatase (TNSALP-type alkaline phosphatase). Intravenous bolus administration plus 8 hours continuous infusion of alkaline phosphatase in patients undergoing coronary artery bypass grafting with cardiopulmonary bypass results in endogenous alkaline phosphatase release. This endogenous alkaline phosphatase may play a role in the immune defense system.

  12. Identification of flood-rich and flood-poor periods in flood series

    NASA Astrophysics Data System (ADS)

    Mediero, Luis; Santillán, David; Garrote, Luis

    2015-04-01

    Recently, a general concern about the non-stationarity of flood series has arisen, as changes in catchment response can be driven by several factors, such as climatic and land-use changes. Several studies to detect trends in flood series at either national or trans-national scales have been conducted. Trends are usually detected by the Mann-Kendall test. However, the results of this test depend on the starting and ending year of the series, which can lead to different results depending on the period considered. The results can be conditioned by flood-poor and flood-rich periods located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods when stationarity is assumed and the observed variability of floods in a given series. The methodology is applied to a set of long series of annual maximum floods, peaks over threshold and counts of annual occurrences in peaks-over-threshold series observed in Spain in the period 1942-2009. Mediero et al. (2014) found a general decreasing trend in flood series in some parts of Spain that could be caused by a flood-rich period observed in 1950-1970, placed at the beginning of the flood series. The results of this study support the findings of Mediero et al. (2014), as a flood-rich period in 1950-1970 was identified at most of the selected sites. References: Mediero, L., Santillán, D., Garrote, L., Granados, A. Detection and attribution of trends in magnitude, frequency and timing of floods in Spain, Journal of Hydrology, 517, 1072-1088, 2014.
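
    The abstract notes that Mann-Kendall results depend on the starting and ending year of the series. The sketch below implements the basic test (normal approximation, no tie correction) on a synthetic series containing an artificial flood-rich period and shows how the apparent trend changes with the chosen start year; it is not the sampling-variability methodology of the paper.

```python
# Sketch of the Mann-Kendall trend test and its sensitivity to the chosen start
# year, which motivates the flood-rich/flood-poor analysis above. The flood
# series is synthetic and the test uses the normal approximation without ties.
import numpy as np
from scipy import stats

def mann_kendall_z(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    return z, 2 * (1 - stats.norm.cdf(abs(z)))

rng = np.random.default_rng(1)
floods = rng.gumbel(100, 30, 68)
floods[8:28] += 60          # an artificial flood-rich period early in the record

for start in (0, 10, 30):   # different starting years give different "trends"
    z, p = mann_kendall_z(floods[start:])
    print(f"start index {start:>2}: z = {z:+.2f}, p = {p:.3f}")
```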

  13. Estimating flood hydrographs and volumes for Alabama streams

    USGS Publications Warehouse

    Olin, D.A.; Atkins, J.B.

    1988-01-01

    The hydraulic design of highway drainage structures involves an evaluation of the effect of the proposed highway structures on lives, property, and stream stability. Flood hydrographs and associated flood volumes are useful tools in evaluating these effects. For design purposes, the Alabama Highway Department needs information on flood hydrographs and volumes associated with flood peaks of specific recurrence intervals (design floods) at proposed or existing bridge crossings. This report will provide the engineer with a method to estimate flood hydrographs, volumes, and lagtimes for rural and urban streams in Alabama with drainage areas less than 500 sq mi. Existing computer programs and methods to estimate flood hydrographs and volumes for ungaged streams have been developed in Georgia. These computer programs and methods were applied to streams in Alabama. The report gives detailed instructions on how to estimate flood hydrographs for ungaged rural or urban streams in Alabama with drainage areas less than 500 sq mi, without significant in-channel storage or regulations. (USGS)

  14. Fluctuation Flooding Method (FFM) for accelerating conformational transitions of proteins.

    PubMed

    Harada, Ryuhei; Takano, Yu; Shigeta, Yasuteru

    2014-03-28

    A powerful conformational sampling method for accelerating structural transitions of proteins, "Fluctuation Flooding Method (FFM)," is proposed. In FFM, cycles of the following steps enhance the transitions: (i) extractions of largely fluctuating snapshots along anisotropic modes obtained from trajectories of multiple independent molecular dynamics (MD) simulations and (ii) conformational re-sampling of the snapshots via re-generations of initial velocities when re-starting MD simulations. In an application to bacteriophage T4 lysozyme, FFM successfully accelerated the open-closed transition with the 6 ns simulation starting solely from the open state, although the 1-μs canonical MD simulation failed to sample such a rare event.

  15. Fluctuation Flooding Method (FFM) for accelerating conformational transitions of proteins

    NASA Astrophysics Data System (ADS)

    Harada, Ryuhei; Takano, Yu; Shigeta, Yasuteru

    2014-03-01

    A powerful conformational sampling method for accelerating structural transitions of proteins, "Fluctuation Flooding Method (FFM)," is proposed. In FFM, cycles of the following steps enhance the transitions: (i) extractions of largely fluctuating snapshots along anisotropic modes obtained from trajectories of multiple independent molecular dynamics (MD) simulations and (ii) conformational re-sampling of the snapshots via re-generations of initial velocities when re-starting MD simulations. In an application to bacteriophage T4 lysozyme, FFM successfully accelerated the open-closed transition with the 6 ns simulation starting solely from the open state, although the 1-μs canonical MD simulation failed to sample such a rare event.

  16. Understanding flood-induced water chemistry variability extracting temporal patterns with the LDA method

    NASA Astrophysics Data System (ADS)

    Aubert, A. H.; Tavenard, R.; Emonet, R.; De Lavenne, A.; Malinowski, S.; Guyet, T.; Quiniou, R.; Odobez, J.; Merot, P.; Gascuel-odoux, C.

    2013-12-01

    events. The patterns themselves are carefully studied, as well as their distribution over the year and over the 12 years of the dataset. We would recommend the use of such a model for any study based on pattern or signature extraction. It could be well suited to comparing different geographical locations and analyzing the resulting differences in pattern distributions. (1) Aubert, A.H., Gascuel-Odoux, C., Gruau, G., Akkal, N., Faucheux, M., Fauvel, Y., Grimaldi, C., Hamon, Y., Jaffrezic, A., Lecoz Boutnik, M., Molenat, J., Petitjean, P., Ruiz, L., Merot, Ph. (2013), Solute transport dynamics in small, shallow groundwater-dominated agricultural catchments: insights from a high-frequency, multisolute 10 yr-long monitoring study. Hydrol. Earth Syst. Sci., 17(4): 1379-1391. (2) Aubert, A.H., Tavenard, R., Emonet, R., de Lavenne, A., Malinowski, S., Guyet, T., Quiniou, R., Odobez, J.-M., Merot, Ph., Gascuel-Odoux, C., submitted to WRR. Clustering with a probabilistic method newly applied in hydrology: application on flood events from water quality time-series.

  17. Chemical Methods for Ugnu Viscous Oils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kishore Mohanty

    2012-03-31

    The North Slope of Alaska has large (about 20 billion barrels) deposits of viscous oil in the Ugnu, West Sak and Shraeder Bluff reservoirs. These shallow reservoirs overlie existing productive reservoirs such as Kuparuk and Milne Point. The viscosity of the Ugnu reservoir on top of Milne Point varies from 200 cp to 10,000 cp and the depth is about 3300 ft. The same reservoir extends to the west on top of the Kuparuk River Unit and onto the Beaufort Sea. The depth of the reservoir decreases and the viscosity increases towards the west. Currently, the operators are testing cold heavy oil production with sand (CHOPS) in Ugnu, but oil recovery is expected to be low (< 10%). Improved oil recovery techniques must be developed for these reservoirs. The proximity to the permafrost is an issue for thermal methods; thus nonthermal methods must be considered. The objective of this project is to develop chemical methods for the Ugnu reservoir on top of Milne Point. An alkaline-surfactant-polymer (ASP) formulation was developed for a viscous oil (330 cp), whereas an alkaline-surfactant formulation was developed for a heavy oil (10,000 cp). These formulations were tested in one-dimensional and quarter five-spot Ugnu sand packs. Micromodel studies were conducted to determine the mechanisms of high viscosity ratio displacements. Laboratory displacements were modeled and transport parameters (such as relative permeability) were determined that can be used in reservoir simulations. Ugnu oil is suitable for chemical flooding because it is biodegraded and contains some organic acids. The acids react with injected alkali to produce soap. This soap helps in lowering the interfacial tension between water and oil, which in turn helps in the formation of macro- and microemulsions. A lower amount of synthetic surfactant is needed because of the presence of organic acids in the oil. Tertiary ASP flooding is very effective for the 330 cp viscous oil in a 1D sand pack. This chemical

  18. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    PubMed Central

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. The Urban Flood Simulation Model (UFSM) and the Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, and the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control by relying solely on structural measures. The R-D function is suitable to describe the changes of flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation. PMID:27527202
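
    The S-shaped return-period/damage (R-D) relation mentioned above can be illustrated with a logistic curve; integrating damage over annual exceedance probability then gives an expected annual damage. The functional form, its parameters and the integration step below are assumptions for illustration, not the paper's fitted curves.

```python
# Sketch of an S-shaped return-period/damage (R-D) curve and the expected annual
# damage obtained by integrating damage over exceedance probability.
import numpy as np

def damage(return_period, d_max=5e9, t_mid=66.0, steepness=0.1):
    """Logistic (S-shaped) damage curve; saturates at d_max (currency units)."""
    return d_max / (1.0 + np.exp(-steepness * (return_period - t_mid)))

T = np.logspace(0, 3, 400)        # return periods from 1 to 1000 years
p = 1.0 / T                       # annual exceedance probabilities (decreasing)
d = damage(T)

# Expected annual damage: trapezoidal integration of damage over probability.
ead = float(np.sum(0.5 * (d[1:] + d[:-1]) * (p[:-1] - p[1:])))
print(f"Damage at T = 66 yr    : {damage(66.0):.3e}")
print(f"Expected annual damage ~ {ead:.3e}")
```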

  19. A simple solvent method for the recovery of LixCoO2 and its applications in alkaline rechargeable batteries

    NASA Astrophysics Data System (ADS)

    Xu, Yanan; Song, Dawei; Li, Li; An, Cuihua; Wang, Yijing; Jiao, LiFang; Yuan, Huatang

    2014-04-01

    A simple solvent method is proposed for the recovery of waste LixCoO2 from lithium-ion batteries, which employs inexpensive DMF to remove the PVDF binder. This method is convenient to carry out and low-cost to apply. Electrochemical investigations indicate that the recovered LixCoO2 materials with a small amount of S-doping exhibit excellent properties as negative materials for alkaline rechargeable Ni/Co batteries. At a discharge current density of 100 mA g-1, the LixCoO2 + 1% S electrode displays a maximum discharge capacity of 357 mAh g-1 and an outstanding capacity retention rate of 85.5% after 100 cycles. The method overcomes not only the sophistication and energy intensity of conventional recycling methods, but also the high-cost restriction on alkaline rechargeable Ni/Co batteries.

  20. Anti-transpirant activity in xylem sap from flooded tomato (Lycopersicon esculentum Mill.) plants is not due to pH-mediated redistributions of root- or shoot-sourced ABA.

    PubMed

    Else, Mark A; Taylor, June M; Atkinson, Christopher J

    2006-01-01

    In flooded soils, the rapid effects of decreasing oxygen availability on root metabolic activity are likely to generate many potential chemical signals that may impact on stomatal apertures. Detached leaf transpiration tests showed that filtered xylem sap, collected at realistic flow rates from plants flooded for 2 h and 4 h, contained one or more factors that reduced stomatal apertures. The closure could not be attributed to increased root output of the glucose ester of abscisic acid (ABA-GE), since concentrations and deliveries of ABA conjugates were unaffected by soil flooding. Although xylem sap collected from the shoot base of detopped flooded plants became more alkaline within 2 h of flooding, this rapid pH change of 0.5 units did not alter partitioning of root-sourced ABA sufficiently to prompt a transient increase in xylem ABA delivery. More shoot-sourced ABA was detected in the xylem when excised petiole sections were perfused with pH 7 buffer, compared with pH 6 buffer. Sap collected from the fifth oldest leaf of "intact" well-drained plants and plants flooded for 3 h was more alkaline, by approximately 0.4 pH units, than sap collected from the shoot base. Accordingly, xylem [ABA] was increased 2-fold in sap collected from the fifth oldest petiole compared with the shoot base of flooded plants. However, water loss from transpiring, detached leaves was not reduced when the pH of the feeding solution containing 3-h-flooded [ABA] was increased from 6.7 to 7.1. Thus, the extent of the pH-mediated, shoot-sourced ABA redistribution was not sufficient to raise xylem [ABA] to physiologically active levels. Using a detached epidermis bioassay, significant non-ABA anti-transpirant activity was also detected in xylem sap collected at intervals during the first 24 h of soil flooding.

  1. Global coastal flood hazard mapping

    NASA Astrophysics Data System (ADS)

    Eilander, Dirk; Winsemius, Hessel; Ward, Philip; Diaz Loaiza, Andres; Haag, Arjen; Verlaan, Martin; Luo, Tianyi

    2017-04-01

    Over 10% of the world's population lives in low-lying coastal areas (up to 10 m elevation). Many of these areas are prone to flooding from tropical storm surges or extra-tropical high sea levels in combination with high tides. A 1-in-100-year extreme sea level is estimated to expose 270 million people and 13 trillion USD worth of assets to flooding. Coastal flood risk is expected to increase due to drivers such as ground subsidence, intensification of tropical and extra-tropical storms, sea level rise and socio-economic development. For a better understanding of the hazard and the drivers of global coastal flood risk, a globally consistent analysis of coastal flooding is required. In this contribution we present a comprehensive global coastal flood hazard mapping study. Coastal flooding is estimated using a modular inundation routine, based on a vegetation-corrected SRTM elevation model and forced by extreme sea levels. Per tile, either a simple GIS inundation routine or a hydrodynamic model can be selected. The GIS inundation method projects extreme sea levels onto land, taking into account physical obstructions and dampening of the surge level landwards. For coastlines with steep slopes, or where local dynamics play a minor role in flood behavior, this fast GIS method can be applied. Extreme sea levels are derived from the Global Tide and Surge Reanalysis (GTSR) dataset. Future sea level projections are based on probabilistic sea level rise for the RCP 4.5 and RCP 8.5 scenarios. The approach is validated against observed flood extents from ground and satellite observations. The results will be made available through the online Aqueduct Global Flood Risk Analyzer of the World Resources Institute.
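
    The simple GIS inundation routine described above projects an extreme sea level onto the elevation model while dampening the surge level landwards. The sketch below mimics that idea on a tiny invented DEM with a linear attenuation rate; it is not the study's modular routine.

```python
# Sketch of the simple GIS inundation idea: project an extreme sea level onto a
# DEM and attenuate the water level with distance from the coast. The DEM,
# attenuation rate and sea level are invented for illustration.
import numpy as np

dem = np.array([[0.5, 1.0, 2.0, 3.5],
                [0.2, 0.8, 1.5, 3.0],
                [0.1, 0.6, 1.2, 2.5]])          # elevation (m); coast at column 0
cell_km = 1.0                                   # grid spacing
extreme_sea_level = 2.2                         # m, e.g. a 1-in-100-year level
attenuation_m_per_km = 0.5                      # dampening of the surge landwards

dist_km = np.arange(dem.shape[1]) * cell_km     # distance from the coastline
water_level = extreme_sea_level - attenuation_m_per_km * dist_km
depth = np.clip(water_level - dem, 0.0, None)   # flood depth where level > ground

print(depth)
```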

  2. 'System-Risk' Flood Task Force

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Ridder, Nina; Tavares da Costa, Ricardo; Diederen, Dirk; Viglione, Alberto

    2017-04-01

    Current scientific methods and engineering practice in flood risk assessment do not consider the full complexity of flood risk systems. Fundamental spatio-temporal dependencies, interactions and feedbacks need to be addressed to comprehensively quantify the effects of measures at various levels, ranging from local technical to high-level policy options. As each flood is unique, each event offers an unparalleled opportunity to collect data and to gain insights into a system's behavior under extreme conditions, potentially revealing exceptional circumstances, unexpected failures and cascading effects, and thus a chance to learn and to improve methods and models. To make use of this, the Marie-Skłodowska-Curie European Training Network 'System-Risk' (www.system-risk.eu) establishes a Flood Task Force (FTF) that aims to learn about successful practical approaches, but also potential pitfalls and failures, in the management of real flood events. The FTF consists of an interdisciplinary group of researchers who will apply their latest methods and knowledge in situ, examining, for example, how the event developed, how the risk management responded, and what the consequences were. This multi-layered perspective is intended to deepen the understanding of the complexity of flood risk systems, for instance in terms of interactions between the hazard, the natural and the built environment, societal institutions and coping capacities. This contribution gives an overview of the conceptual approach of the System-Risk FTF.

  3. Acidity and alkalinity in mine drainage: Theoretical considerations

    USGS Publications Warehouse

    Kirby, Carl S.; Cravotta,, Charles A.

    2004-01-01

    Acidity, net acidity, and net alkalinity are widely used parameters for the characterization of mine drainage, but these terms are not well defined and are often misunderstood. Incorrect interpretation of acidity, alkalinity, and derivative terms can lead to inadequate treatment design or poor regulatory decisions. We briefly explain derivations of theoretical expressions of three types of alkalinities (caustic, phenolphthalein, and total) and acidities (mineral, CO2, and total). Theoretically defined total alkalinity is closely analogous to measured alkalinity and presents few practical interpretation problems. Theoretically defined "CO2-acidity" is closely related to most standard titration methods used for mine drainage with an endpoint pH of 8.3, but it presents numerous interpretation problems, and it is unfortunately named because CO2 is intentionally driven off during titration of mine-drainage samples. Using the proton condition/mass-action approach and employing graphs for visualization, we explore the concept of principal components and how to assign acidity contributions to solution species, including aqueous complexes, commonly found in mine drainage. We define a comprehensive theoretical definition of acidity in mine drainage on the basis of aqueous speciation at the sample pH and the capacity of these species to undergo hydrolysis to pH 8.3. This definition indicates that the computed acidity in milligrams per liter (mg L-1) as CaCO3 (based on pH and analytical concentrations of dissolved FeIII, FeII, Mn, and Al in mg L-1), Acidity_computed = 50 · (10^(3-pH) + 3·C_FeIII/55.8 + 2·C_FeII/55.8 + 2·C_Mn/54.9 + 3·C_Al/27.0), underestimates contributions from HSO4- and H+, but overestimates the acidity due to Fe3+. These errors tend to approximately cancel each other. We demonstrate that "net alkalinity" is a valid mathematical construction based on theoretical definitions of alkalinity and acidity. We demonstrate that, for most mine-drainage solutions, a

  4. Real-time simulation of large-scale floods

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water situations, real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust two-dimensional shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
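
    As a rough illustration of the ingredients mentioned above (a Godunov-type finite volume update with a wet/dry threshold), the following is a minimal first-order 1-D shallow-water sketch with an HLL flux. It is not the paper's 2-D unstructured model; the dam-break setup, thresholds and CFL number are arbitrary choices made for the example.

    ```python
    import numpy as np

    g = 9.81
    H_DRY = 1e-6  # wet/dry threshold: cells shallower than this are treated as dry

    def physical_flux(h, hu):
        u = hu / h if h > H_DRY else 0.0
        return np.array([hu, hu * u + 0.5 * g * h * h])

    def hll_flux(hL, huL, hR, huR):
        """Approximate HLL Riemann flux for the 1-D shallow-water equations."""
        uL = huL / hL if hL > H_DRY else 0.0
        uR = huR / hR if hR > H_DRY else 0.0
        cL, cR = np.sqrt(g * max(hL, 0.0)), np.sqrt(g * max(hR, 0.0))
        sL, sR = min(uL - cL, uR - cR), max(uL + cL, uR + cR)
        FL, FR = physical_flux(hL, huL), physical_flux(hR, huR)
        if sL >= 0.0:
            return FL
        if sR <= 0.0:
            return FR
        qL, qR = np.array([hL, huL]), np.array([hR, huR])
        return (sR * FL - sL * FR + sL * sR * (qR - qL)) / (sR - sL)

    # First-order Godunov update for a dam break on a flat, frictionless channel.
    nx, dx, t_end = 200, 1.0, 10.0
    h = np.where(np.arange(nx) < nx // 2, 2.0, 0.5)  # initial depths (m)
    hu = np.zeros(nx)                                # initially at rest
    t = 0.0
    while t < t_end:
        c = np.sqrt(g * np.maximum(h, H_DRY))
        dt = 0.4 * dx / np.max(np.abs(hu / np.maximum(h, H_DRY)) + c)  # CFL condition
        flux = np.zeros((nx + 1, 2))
        flux[0] = [0.0, 0.5 * g * h[0] ** 2]    # solid-wall boundaries: zero mass flux,
        flux[nx] = [0.0, 0.5 * g * h[-1] ** 2]  # pressure term only
        for i in range(1, nx):
            flux[i] = hll_flux(h[i - 1], hu[i - 1], h[i], hu[i])
        h = h - dt / dx * (flux[1:, 0] - flux[:-1, 0])
        hu = hu - dt / dx * (flux[1:, 1] - flux[:-1, 1])
        hu[h < H_DRY] = 0.0  # simple wet/dry fix: no momentum in (near-)dry cells
        t += dt
    print("depth range after %.1f s: %.3f-%.3f m" % (t, h.min(), h.max()))
    ```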

  5. Flood area and damage estimation in Zhejiang, China.

    PubMed

    Liu, Renyi; Liu, Nan

    2002-09-01

    A GIS-based method to estimate flood area and damage is presented in this paper. It is oriented to developing countries like China, where labor is readily available for GIS data collection but tools such as HEC-GeoRAS may not be readily available; at present, local authorities in developing countries are often not prepared to pay for commercial GIS platforms. To calculate flood area, two cases, non-source flooding and source flooding, are distinguished, and a seed-spread algorithm suitable for source flooding is described. Flood damage is estimated in raster format by overlaying the flood area with thematic maps and relating this to other socioeconomic data. Several measures used to improve geometric accuracy and computing efficiency are presented. Management issues related to the application of this method, including the cost-effectiveness of the approximate method in practice and the complementary use of two technical approaches (in-house programming and commercial GIS software), are also discussed. The applications show that this approach has practical significance for flood fighting and control in developing countries like China.
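
    A seed-spread algorithm for source flooding can be sketched as a breadth-first fill over a DEM raster, flooding only cells that are both below the water level and hydraulically connected to a seed (water source) cell. The sketch below is a generic illustration of that idea, not the paper's implementation; the DEM, seeds and water level are made up.

    ```python
    from collections import deque

    def source_flood(dem, seeds, water_level):
        """Seed-spread (breadth-first) flooding on a DEM raster: a cell is flooded only
        if it lies below the water level AND is connected to a seed cell."""
        rows, cols = len(dem), len(dem[0])
        flooded = [[False] * cols for _ in range(rows)]
        queue = deque()
        for r, c in seeds:
            if dem[r][c] < water_level:
                flooded[r][c] = True
                queue.append((r, c))
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # 4-connected spread
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and not flooded[nr][nc] \
                        and dem[nr][nc] < water_level:
                    flooded[nr][nc] = True
                    queue.append((nr, nc))
        return flooded

    # Tiny example: a ridge (elevation 9) blocks the spread from the seed in the top-left.
    dem = [[1, 2, 9, 1],
           [2, 3, 9, 1],
           [1, 2, 9, 2]]
    print(source_flood(dem, seeds=[(0, 0)], water_level=4.0))
    ```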

  6. Cyber Surveillance for Flood Disasters

    PubMed Central

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-01-01

    Regional heavy rainfall is usually caused by the influence of extreme weather conditions. Intense rainfall often results in the flooding of rivers and neighboring low-lying areas, which is responsible for a large number of casualties and considerable property loss. Existing precipitation forecast systems mostly focus on the analysis and forecasting of large-scale areas but do not provide precise, instant, automatic monitoring and alert feedback for individual river areas and sections. Therefore, in this paper, we propose a simple method for automatically monitoring flooding in a specific area, based on the currently widespread remote cyber surveillance systems and image processing methods, in order to obtain instant flooding and waterlogging event feedback. The intrusion detection mode of these surveillance systems is used in this study, wherein floodwater is treated as a possible intruding object. Through the detection and verification of flood objects, automatic flood risk-level monitoring of specific river segments, as well as automatic urban inundation detection, becomes possible. The proposed method can better meet the practical needs of disaster prevention than large-area forecasting. It also has several other advantages, such as flexibility in location selection, no requirement for a standard water-level ruler, and a relatively large field of view, when compared with traditional water-level measurements using video screens. The results can offer prompt reference for appropriate disaster warning actions in small areas, making them more accurate and effective. PMID:25621609
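
    The core idea, treating rising water as an intruding object inside a fixed camera view, can be illustrated with a very simple frame-differencing check over a region of interest. The sketch below is purely conceptual: the thresholds, ROI and synthetic frames are invented, and the paper's actual detection and verification chain is considerably richer.

    ```python
    import numpy as np

    def flood_alert(reference, current, roi, diff_thresh=30, area_frac=0.5):
        """Compare the current frame with a dry reference frame inside a region of
        interest (ROI) and raise an alert when a large fraction of the ROI has changed.
        Frames are grayscale 2-D uint8 arrays; roi = (row0, row1, col0, col1)."""
        r0, r1, c0, c1 = roi
        ref_patch = reference[r0:r1, c0:c1].astype(np.int16)
        cur_patch = current[r0:r1, c0:c1].astype(np.int16)
        changed = np.abs(cur_patch - ref_patch) > diff_thresh
        frac = changed.mean()
        return frac >= area_frac, frac

    # Synthetic example: water (dark pixels) covers the lower half of the ROI.
    ref = np.full((120, 160), 200, dtype=np.uint8)
    cur = ref.copy()
    cur[60:120, :] = 80
    alert, frac = flood_alert(ref, cur, roi=(40, 120, 0, 160))
    print(alert, round(frac, 2))
    ```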

  7. A Mixed Method to Evaluate Burden of Malaria Due to Flooding and Waterlogging in Mengcheng County, China: A Case Study

    PubMed Central

    Ding, Guoyong; Gao, Lu; Li, Xuewen; Zhou, Maigeng; Liu, Qiyong; Ren, Hongyan; Jiang, Baofa

    2014-01-01

    Background Malaria is a highly climate-sensitive vector-borne infectious disease that still represents a significant public health problem in the Huaihe River Basin. However, little comprehensive information about the burden of malaria caused by flooding and waterlogging is available from this region. This study aims to quantitatively assess the impact of flooding and waterlogging on the burden of malaria in a county of Anhui Province, China. Methods A mixed-method evaluation was conducted. A case-crossover study was first performed to evaluate the relationship between the daily number of malaria cases and flooding and waterlogging from May to October 2007 in Mengcheng County, China. Stratified Cox models were used to examine the lagged time and hazard ratios (HRs) of the risk of flooding and waterlogging on malaria. Years lived with disability (YLDs) of malaria attributable to flooding and waterlogging were then estimated based on the WHO framework for calculating the potential impact fraction in the Global Burden of Disease study. Results A total of 3683 malaria cases were notified during the study period. The strongest effect was shown with a 25-day lag for flooding and a 7-day lag for waterlogging. Multivariable analysis showed that an increased risk of malaria was significantly associated with flooding alone [adjusted hazard ratio (AHR) = 1.467, 95% CI = 1.257, 1.713], waterlogging alone (AHR = 1.879, 95% CI = 1.696, 2.121), and flooding and waterlogging together (AHR = 2.926, 95% CI = 2.576, 3.325). YLDs per 1000 of malaria attributable to flooding alone, waterlogging alone, and flooding and waterlogging together were 0.009 per day, 0.019 per day and 0.022 per day, respectively. Conclusion Flooding and waterlogging can lead to a higher burden of malaria in the study area. Public health action should be taken to avoid and control the potential risk of malaria epidemics after these two weather disasters. PMID:24830808
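
    The attribution step can be illustrated with a simplified potential-impact-fraction calculation: an attributable fraction is derived from an exposure prevalence and a relative risk and then applied to the YLDs of the notified cases. The sketch below is only schematic; it uses a hazard ratio as a stand-in for the relative risk, all numeric inputs except the case count are hypothetical, and it does not reproduce the WHO Global Burden of Disease machinery used in the study.

    ```python
    def attributable_fraction(prevalence, rr):
        """Population attributable fraction for a single dichotomous exposure:
        AF = p(RR-1) / (1 + p(RR-1)). A simplified stand-in for the potential impact
        fraction used in the study."""
        return prevalence * (rr - 1.0) / (1.0 + prevalence * (rr - 1.0))

    def yld(cases, disability_weight, duration_years):
        """Years lived with disability for an acute condition."""
        return cases * disability_weight * duration_years

    # Hypothetical inputs: 3683 notified cases, exposure prevalence 0.3, AHR 1.88 used
    # as an approximate relative risk, disability weight 0.191, mean duration 14 days.
    af = attributable_fraction(prevalence=0.3, rr=1.88)
    total_yld = yld(cases=3683, disability_weight=0.191, duration_years=14 / 365.0)
    print(round(af, 3), round(total_yld, 2), round(af * total_yld, 2))
    ```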

  8. Application of flood-intensity-duration curve, rainfall-intensity-duration curve and time of concentration to analyze the pattern of storms and their corresponding floods for the natural flood events

    NASA Astrophysics Data System (ADS)

    Kim, Nam Won; Shin, Mun-Ju; Lee, Jeong Eun

    2016-04-01

    The analysis of storm effects on floods is an essential step in designing hydraulic structures and flood plains. Previous studies have analyzed the relationship between storm patterns and peak flow, flood volume, and duration for catchments of various sizes, but they are not sufficient to analyze natural storm effects on flood responses quantitatively. This study suggests a novel method of quantitative analysis using unique factors extracted from the time series of storms and floods to investigate the relationship between natural storms and their corresponding flood responses. We used a distributed rainfall-runoff model, the Grid-based Rainfall-runoff Model (GRM), to generate simulated flow and areal rainfall for 50 catchments in the Republic of Korea ranging in size from 5.6 km2 to 1584.2 km2, including overlapping dependent catchments and non-overlapping independent catchments. The parameters of the GRM model were calibrated to obtain good model performance in terms of Nash-Sutcliffe efficiency. Flood-Intensity-Duration Curves (FIDC) and Rainfall-Intensity-Duration Curves (RIDC) were then generated by Flood-Duration-Frequency and Intensity-Duration-Frequency methods, respectively, using the time series of hydrographs and hyetographs. A time of concentration developed for Korean catchments was used as a consistent measure to extract the unique factors from the FIDC and RIDC across the different catchment sizes. These unique factors for the storms and floods were analyzed against catchment size to investigate the natural storm effects on floods. This method can readily provide insight into the effects of natural storms with various patterns on flood responses. Acknowledgement This research was supported by a grant (11-TI-C06) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
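
    The FIDC/RIDC idea, the maximum mean intensity sustained over a range of durations, can be sketched as a moving-average maximum over a hydrograph or hyetograph. The function below is a generic illustration of that construction rather than the exact FDF/IDF procedure of the paper, and the hourly hydrograph is invented.

    ```python
    import numpy as np

    def intensity_duration_curve(series, durations):
        """Maximum mean intensity over each duration (in time steps) of a time series.
        Applied to an hourly hydrograph it gives a flood-intensity-duration curve (FIDC);
        applied to a hyetograph it gives the rainfall counterpart (RIDC)."""
        x = np.asarray(series, dtype=float)
        curve = {}
        for d in durations:
            # moving average of window length d, then take its maximum
            means = np.convolve(x, np.ones(d) / d, mode="valid")
            curve[d] = means.max()
        return curve

    # Hypothetical hourly hydrograph (m3/s) for a small flood event
    q = [5, 8, 20, 55, 90, 120, 100, 70, 40, 25, 15, 10, 7, 6, 5]
    print(intensity_duration_curve(q, durations=[1, 3, 6, 12]))
    ```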

  9. Research of Water Level Prediction for a Continuous Flood due to Typhoons Based on a Machine Learning Method

    NASA Astrophysics Data System (ADS)

    Nakatsugawa, M.; Kobayashi, Y.; Okazaki, R.; Taniguchi, Y.

    2017-12-01

    This research aims to improve the accuracy of water level prediction for more effective river management. In August 2016, Hokkaido was struck by four typhoons whose heavy rainfall caused severe flooding. In the Tokoro River basin of eastern Hokkaido, the water level (WL) at the Kamikawazoe gauging station in the lower reaches exceeded the design high-water level, and the water rose to the highest level on record. To predict such flood conditions and mitigate disaster damage, it is necessary to improve the accuracy of prediction as well as to prolong the lead time (LT) available for disaster mitigation measures such as flood-fighting activities and evacuation by residents; the river water level around the peak stage needs to be predicted earlier and more accurately. Previous research on WL prediction proposed a method in which the WL at the lower reaches is estimated from its correlation with the WL at the upper reaches (hereinafter "the water level correlation method"). In addition, a runoff model-based method has generally been used, in which discharge is estimated by feeding rainfall prediction data to a runoff model such as a storage function model and the WL is then estimated from that discharge using a WL-discharge rating curve (H-Q curve). In this research, an attempt was made to predict WL by applying the Random Forest (RF) method, a machine learning method that can estimate the contribution of explanatory variables. Furthermore, from a practical point of view, we investigated WL prediction based on a multiple correlation (MC) method using the explanatory variables with high contributions in the RF method, and we examined the proper selection of explanatory variables and the extension of LT. The following results were found: 1) Based on the RF method tuned by learning from previous floods, the WL for the abnormal flood of August 2016 was properly predicted with a lead
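
    A minimal version of the RF approach, predicting the downstream water level a few steps ahead from lagged rainfall and upstream water levels, can be put together with scikit-learn. The sketch below uses synthetic data and an arbitrary choice of lags and lead time, so it only illustrates the workflow (lagged features, fitting, skill on a held-out period, feature importances), not the study's configuration.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def make_lagged_features(rain, upstream_wl, target_wl, lead, n_lags=6):
        """Build lagged rainfall and upstream water-level features to predict the
        downstream water level `lead` steps ahead."""
        X, y = [], []
        for t in range(n_lags, len(target_wl) - lead):
            X.append(np.r_[rain[t - n_lags:t], upstream_wl[t - n_lags:t]])
            y.append(target_wl[t + lead])
        return np.array(X), np.array(y)

    # Synthetic data standing in for hourly rainfall and gauge records
    rng = np.random.default_rng(0)
    rain = rng.gamma(2.0, 1.0, 500)
    up_wl = np.convolve(rain, np.ones(5) / 5, mode="same") + rng.normal(0, 0.05, 500)
    down_wl = np.roll(up_wl, 3) * 1.2 + rng.normal(0, 0.05, 500)

    X, y = make_lagged_features(rain, up_wl, down_wl, lead=3)
    split = int(0.8 * len(X))                     # train on the earlier period only
    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X[:split], y[:split])
    print("R2 on held-out period:", round(model.score(X[split:], y[split:]), 3))
    print("largest feature importance:", round(model.feature_importances_.max(), 3))
    ```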

  10. POISON SPIDER FIELD CHEMICAL FLOOD PROJECT, WYOMING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douglas Arnell; Malcolm Pitts; Jie Qi

    2004-11-01

    -rock compatibility, polymer injectivity, dynamic chemical retention by rock, and recommended injected polymer concentration. Average initial oil saturation was 0.796 Vp. Produced-water injection recovered 53% OOIP, leaving an average residual oil saturation of 0.375 Vp. Poison Spider rock was strongly water-wet, with a mobility ratio of 8.6 for produced water displacing the 280 cp crude oil. Core was not sensitive to either alkali or surfactant injection. Injectivity increased 60 to 80% with alkali plus surfactant injection. Low and medium molecular weight polyacrylamide polymers (Flopaam 3330S and Flopaam 3430S) dissolved in either an alkaline-surfactant solution or softened produced water injected into and flowed through Poison Spider rock. The recommended injected polyacrylamide concentration is 2,100 mg/L for both polymers for a unit mobility ratio. Radial corefloods were performed to evaluate the oil recovery efficiency of different chemical solutions. Waterflood oil recovery averaged 46.4% OOIP, and alkaline-surfactant-polymer flooding recovered an additional 18.1% OOIP, for a total of 64.6% OOIP. The oil cut increased from 2% to a peak of 23.5% due to injection of 1.5 wt% Na2CO3 plus 0.05 wt% Petrostep B-100 plus 0.05 wt% Stepantan AS1216 plus 2,100 mg/L Flopaam 3430S. Additional study might determine the impact on oil recovery of a lower polymer concentration. An alkaline-surfactant-polymer flood field implementation outline report was written.
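
    The unfavourable mobility ratio quoted above is the usual end-point ratio of displacing-fluid to displaced-fluid mobility. The small sketch below evaluates it; the relative-permeability end points are chosen purely for illustration so that the result lands near the reported value of 8.6.

    ```python
    def endpoint_mobility_ratio(krw, muw_cp, kro, muo_cp):
        """End-point mobility ratio M = (krw/mu_w) / (kro/mu_o) for water displacing oil.
        M >> 1 indicates an unfavourable displacement; polymer is added to the injected
        water to raise mu_w and bring M toward unity. Illustrative values only."""
        return (krw / muw_cp) / (kro / muo_cp)

    # Hypothetical end points for a water-wet rock and a 280 cp crude; the result is
    # comparable to the 8.6 reported for produced water displacing the Poison Spider oil.
    print(round(endpoint_mobility_ratio(krw=0.031, muw_cp=1.0, kro=1.0, muo_cp=280.0), 2))
    ```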

  11. Method of estimating flood-frequency parameters for streams in Idaho

    USGS Publications Warehouse

    Kjelstrom, L.C.; Moffatt, R.L.

    1981-01-01

    Skew coefficients for the log-Pearson type III distribution are generalized on the basis of some similarity of floods in the Snake River basin and other parts of Idaho. Generalized skew coefficients aid in shaping flood-frequency curves because skew coefficients computed from gaging stations having relatively short periods of peak flow records can be unreliable. Generalized skew coefficients can be obtained for a gaging station from one of three maps in this report. The map to be used depends on whether (1) snowmelt floods are dominant (generally when more than 20 percent of the drainage area is above 6,000 feet altitude), (2) rainstorm floods are dominant (generally when the mean altitude is less than 3,000 feet), or (3) either snowmelt or rainstorm floods can be the annual maximum discharge. For the latter case, frequency curves constructed using separate arrays of each type of runoff can be combined into one curve, which, for some stations, is significantly different from the frequency curve constructed using only annual maximum discharges. For 269 gaging stations, flood-frequency curves that include the generalized skew coefficients in the computation of the log-Pearson type III equation tend to fit the data better than previous analyses. Frequency curves for ungaged sites can be derived by estimating three statistics of the log-Pearson type III distribution. The mean and standard deviation of logarithms of annual maximum discharges are estimated by regression equations that use basin characteristics as independent variables. Skew coefficient estimates are the generalized skews. The log-Pearson type III equation is then applied with the three estimated statistics to compute the discharge at selected exceedance probabilities. Standard errors at the 2-percent exceedance probability range from 41 to 90 percent. (USGS)
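
    The underlying computation, a log-Pearson Type III fit by the method of moments with a blended station and generalized skew, can be sketched in a few lines with SciPy. The function below is a simplified illustration: the skew-weighting scheme and the peak series are hypothetical, and the full procedure includes additional adjustments not shown here.

    ```python
    import numpy as np
    from scipy import stats

    def lp3_quantile(peaks_cfs, aep, generalized_skew=None, skew_weight=0.5):
        """Log-Pearson Type III flood quantile by the method of moments on log10 of the
        annual peaks, optionally blending the station skew with a generalized (map) skew."""
        logq = np.log10(np.asarray(peaks_cfs, dtype=float))
        mean, std = logq.mean(), logq.std(ddof=1)
        skew = stats.skew(logq, bias=False)
        if generalized_skew is not None:
            skew = skew_weight * skew + (1.0 - skew_weight) * generalized_skew
        # Pearson III on the log values; non-exceedance probability = 1 - AEP
        log_quantile = stats.pearson3.ppf(1.0 - aep, skew, loc=mean, scale=std)
        return 10.0 ** log_quantile

    # Hypothetical annual peak series (cfs) and a 2-percent AEP (50-year) estimate
    peaks = [1200, 950, 2100, 1800, 760, 3100, 1500, 990, 2600, 1750,
             880, 1350, 2900, 1100, 1600, 2300, 700, 1950, 1400, 2050]
    print(round(lp3_quantile(peaks, aep=0.02, generalized_skew=0.1)))
    ```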

  12. Development of a national Flash flood warning system in France using the AIGA method: first results and main issues

    NASA Astrophysics Data System (ADS)

    Javelle, Pierre; Organde, Didier; Demargne, Julie; de Saint-Aubin, Céline; Garandeau, Léa; Janet, Bruno; Saint-Martin, Clotilde; Fouchier, Catherine

    2016-04-01

    Developing a national flash flood (FF) warning system is an ambitious and difficult task. On the one hand, it raises high expectations among exposed populations and authorities, since induced damages are considerable (e.g., 20 casualties in the October 2015 flood on the French Riviera). On the other hand, many practical and scientific issues have to be addressed and limitations should be clearly stated. The FF warning system to be implemented by 2016 in France by SCHAPI (the French national service in charge of flood forecasting) will be based on a discharge-threshold flood warning method called AIGA (Javelle et al., 2014). The AIGA method has been tested in real time in the south of France in the RHYTMME project (http://rhytmme.irstea.fr). It consists of comparing discharges generated by a simple conceptual hourly hydrologic model, run at a 1-km² resolution, with reference flood quantiles of different return periods, at any point along the river network. The hydrologic model ingests operational radar-gauge rainfall products from Météo-France. Model calibration was based on ~700 hydrometric stations over the 2002-2015 period, and hourly discharges were then computed at ~76 000 catchment outlets, with areas ranging from 10 to 3 500 km², over the last 19 years. This product makes it possible to calculate reference flood quantiles at each outlet. The ongoing evaluation of the FF warnings is made at two levels: in a 'classical' way, using discharges available at the hydrometric stations, but also in a more 'exploratory' way, by comparing past flood reports with warnings issued by the system over the 76 000 catchment outlets. The interest of the latter approach is that it better fits the system's objectives, since the system is designed to monitor small ungauged catchments. Javelle, P., Demargne, J., Defrance, D., Pansu, J., Arnaud, P. (2014). Evaluating flash-flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system
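
    The discharge-threshold principle behind a warning method of this kind reduces to comparing the current simulated discharge at an outlet with its pre-computed return-period quantiles. The sketch below is schematic only; the thresholds, warning colours and quantile values are invented and do not reflect AIGA's actual configuration.

    ```python
    def warning_level(q_now, q_ref):
        """Compare the current simulated discharge at an outlet with its reference flood
        quantiles (keyed by return period in years) and return a warning colour."""
        if q_now >= q_ref[50]:
            return "red"      # at least a 50-year flood
        if q_now >= q_ref[10]:
            return "orange"   # at least a 10-year flood
        if q_now >= q_ref[2]:
            return "yellow"   # at least a 2-year flood
        return "green"

    # Hypothetical reference quantiles (m3/s) for one catchment outlet
    q_ref = {2: 12.0, 10: 25.0, 50: 40.0}
    for q in (8.0, 18.0, 44.0):
        print(q, warning_level(q, q_ref))
    ```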

  13. Flood Insurance in Canada: Implications for Flood Management and Residential Vulnerability to Flood Hazards

    NASA Astrophysics Data System (ADS)

    Oulahen, Greg

    2015-03-01

    Insurance coverage of damage caused by overland flooding is currently not available to Canadian homeowners. As flood disaster losses and water damage claims both trend upward, insurers in Canada are considering offering residential flood coverage in order to properly underwrite the risk and extend their business. If private flood insurance is introduced in Canada, it will have implications for the current regime of public flood management and for residential vulnerability to flood hazards. This paper engages many of the competing issues surrounding the privatization of flood risk by addressing questions about whether flood insurance can be an effective tool in limiting exposure to the hazard and how it would exacerbate already unequal vulnerability. A case study investigates willingness to pay for flood insurance among residents in Metro Vancouver and how attitudes about insurance relate to other factors that determine residential vulnerability to flood hazards. Findings indicate that demand for flood insurance is part of a complex, dialectical set of determinants of vulnerability.

  14. Guidelines for determining flood flow frequency—Bulletin 17C

    USGS Publications Warehouse

    England, John F.; Cohn, Timothy A.; Faber, Beth A.; Stedinger, Jery R.; Thomas, Wilbert O.; Veilleux, Andrea G.; Kiang, Julie E.; Mason, Robert R.

    2018-03-29

    Accurate estimates of flood frequency and magnitude are a key component of any effective nationwide flood risk management and flood damage abatement program. In addition to accuracy, methods for estimating flood risk must be uniformly and consistently applied because management of the Nation’s water and related land resources is a collaborative effort involving multiple actors, including most levels of government and the private sector. Flood frequency guidelines have been published in the United States since 1967 and have undergone periodic revisions. In 1967, the U.S. Water Resources Council presented a coherent approach to flood frequency with Bulletin 15, “A Uniform Technique for Determining Flood Flow Frequencies.” The method it recommended involved fitting the log-Pearson Type III distribution to annual peak flow data by the method of moments. The first extension and update of Bulletin 15 was published in 1976 as Bulletin 17, “Guidelines for Determining Flood Flow Frequency” (Guidelines). It extended the Bulletin 15 procedures by introducing methods for dealing with outliers, historical flood information, and regional skew. Bulletin 17A was published the following year to clarify the computation of weighted skew. The next revision, Bulletin 17B, provided a host of improvements and new techniques designed to address situations that often arise in practice, including better methods for estimating and using regional skew, weighting station and regional skew, detection of outliers, and use of the conditional probability adjustment. The current version of these Guidelines is presented in this document, denoted Bulletin 17C. It incorporates changes motivated by four of the items listed as “Future Work” in Bulletin 17B and 30 years of post-17B research on flood processes and statistical methods. The updates include: adoption of a generalized representation of flood data that allows for interval and censored data types; a new method

  15. Estimating design flood and HEC-RAS modelling approach for flood analysis in Bojonegoro city

    NASA Astrophysics Data System (ADS)

    Prastica, R. M. S.; Maitri, C.; Hermawan, A.; Nugroho, P. C.; Sutjiningsih, D.; Anggraheni, E.

    2018-03-01

    Bojonegoro faces flooding every year, with little advanced prevention in place. The city's development cannot reach its potential because floods cause material losses. Flooding affects every sector in Bojonegoro: education, politics, economy, social life, and infrastructure development. This research aims to assess whether limited river capacity is the main factor behind flooding in Bojonegoro. Flood discharge analysis uses the Nakayasu synthetic unit hydrograph for return periods of 5, 10, 25, 50, and 100 years. The resulting discharges are compared to the maximum capacity of the downstream reach of the Bengawan Solo River in Bojonegoro. According to the analysis, this reach of the Bengawan Solo River is not able to convey the flood discharges. HEC-RAS analysis leads to the same conclusion: the flood water surface exceeds the bank-full elevation of the river. To conclude, river capacity is the main factor that the government should address to solve the flood problem.

  16. Real Time Monitoring of Flooding from Microwave Satellite Observations

    NASA Technical Reports Server (NTRS)

    Galantowicz, John F.; Frey, Herb (Technical Monitor)

    2002-01-01

    We have developed a new method for making high-resolution flood extent maps (e.g., at the 30-100 m scale of digital elevation models) in real time from low-resolution (20-70 km) passive microwave observations. The method builds a "flood-potential" database from elevations and historic flood imagery and uses it to create a flood-extent map consistent with the observed open water fraction. Microwave radiometric measurements are useful for flood monitoring because they sense surface water in clear or cloudy conditions and can provide more timely data (e.g., compared to radars) from relatively wide swath widths and an increasing number of available platforms (DMSP, ADEOS-II, Terra, NPOESS, GPM). The chief disadvantages for flood mapping are the radiometers' low resolution and the need for local calibration of the relationship between radiances and open-water fraction. We present our method for transforming microwave sensor-scale open water fraction estimates into high-resolution flood extent maps and describe 30-day flood map sequences generated during a retrospective study of the 1993 Great Midwest Flood. We discuss the method's potential improvement through as-yet-unimplemented algorithm enhancements and expected advancements in microwave radiometry (e.g., improved resolution and atmospheric correction).
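
    One simple way to realise the downscaling idea, turning a coarse open-water fraction into a high-resolution flood mask, is to flood the lowest DEM cells first until the target wet fraction is reached. The sketch below illustrates only that elevation-ranking step; the actual method also draws on historic flood imagery, and the DEM and fraction used here are made up.

    ```python
    import numpy as np

    def downscale_flood_extent(dem, water_fraction):
        """Convert a coarse open-water fraction (e.g., from a passive microwave footprint)
        into a high-resolution flood mask by wetting the lowest DEM cells first until the
        target fraction of the area is flooded."""
        dem = np.asarray(dem, dtype=float)
        n_wet = int(round(water_fraction * dem.size))
        if n_wet == 0:
            return np.zeros_like(dem, dtype=bool)
        threshold = np.sort(dem, axis=None)[n_wet - 1]   # elevation of the n-th lowest cell
        return dem <= threshold

    # Hypothetical 4x4 DEM (m) and a 25% observed open-water fraction
    dem = [[3.0, 2.5, 2.0, 4.0],
           [2.8, 1.5, 1.2, 3.5],
           [2.9, 1.8, 1.0, 3.2],
           [3.5, 2.2, 2.1, 4.1]]
    print(downscale_flood_extent(dem, 0.25).astype(int))
    ```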

  17. Methods of use of calcium hexa aluminate refractory linings and/or chemical barriers in high alkali or alkaline environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGowan, Kenneth A; Cullen, Robert M; Keiser, James R

    A method for improving the insulating character and/or penetration resistance of a liner in contact with an alkali and/or alkaline environment is provided. The method comprises lining a surface that is subject to wear by an alkali and/or alkaline environment with a refractory composition comprising a refractory aggregate consisting essentially of a calcium hexa aluminate clinker having the formula CA6, wherein C is calcium oxide and A is aluminum oxide, wherein the hexa aluminate clinker has from zero to less than about fifty weight percent C12A7, and wherein greater than 98 weight percent of the calcium hexa aluminate clinker has a particle size ranging from -20 microns to +3 millimeters, for forming a liner on the surface. This method improves the insulating character and/or penetration resistance of the liner.

  18. Field measurement of alkalinity and pH

    USGS Publications Warehouse

    Barnes, Ivan

    1964-01-01

    The behavior of electrometric pH equipment under field conditions departs from the behavior predicted from Nernst's law. The response is a linear function of pH, and hence measured pH values may be corrected to true pH if the instrument is calibrated with two reference solutions for each measurement. Alkalinity titrations may also be made in terms of true pH. Standard methods, such as colorimetric titrations, were rejected as unreliable or too cumbersome for rapid field use. The true pH of the end point of the alkalinity titration as a function of temperature, ionic strength, and total alkalinity has been calculated. Total alkalinity in potable waters is the most important factor influencing the end point pH, which varies from 5.38 (0 °C, 5 ppm (parts per million) HCO3-) to 4.32 (300 ppm HCO3-, 35 °C) for the ranges of variables considered. With proper precautions, the pH may be determined to ±0.02 pH and the alkalinity to ±0.6 ppm HCO3- for many naturally occurring bodies of fresh water.

  19. Discovering temporal patterns in water quality time series, focusing on floods with the LDA method

    NASA Astrophysics Data System (ADS)

    Hélène Aubert, Alice; Tavenard, Romain; Emonet, Rémi; Malinowski, Simon; Guyet, Thomas; Quiniou, René; Odobez, Jean-Marc; Gascuel-Odoux, Chantal

    2013-04-01

    of several flood patterns. The output of LDA is a set of patterns that can easily be represented graphically. These patterns correspond to typical reactions to rainfall events. The patterns themselves are carefully studied, as well as their distribution over the year and across the 12 years of the dataset. The novelties are fourfold. First, from a methodological point of view, we learn that hydrological data can be analyzed with this LDA model, giving a typology of multivariate chemical signatures of floods. Second, we show that chemistry parameters are sufficient to obtain meaningful patterns; there is no need to include hydro-meteorological parameters to define the patterns, although hydro-meteorological parameters are useful for understanding the processes leading to these patterns. Third, our hypothesis of seasonally specific reactions to rainfall is verified and further detailed, as is our hypothesis of different reactions to rainfall for years with different hydro-meteorological conditions. Fourth, this method allows the consideration of overlapping floods, which are usually not studied. We recommend the use of such a model to study the chemical response of streams after rainfall events, or more broadly after any hydrological event. The typology provided by this method is a kind of bar code of water chemistry during floods. It is well suited to comparing different geographical locations by using the same patterns and analysing the resulting differences in pattern distributions. (1) Aubert, A.H. et al., 2012. The chemical signature of a livestock farming catchment: synthesis from a high-frequency multi-element long term monitoring. HESSD, 9(8): 9715-9741. (2) Aubert, A.H., Gascuel-Odoux, C., Merot, P., 2013. Annual hysteresis of water quality: A method to analyse the effect of intra- and inter-annual climatic conditions. Journal of Hydrology, 478(0): 29-39. (3) Blei, D. M.; Ng, A. Y.; Jordan, M. I., 2003. Latent Dirichlet allocation. Journal of Machine Learning Research

  20. Modelling and scale-up of chemical flooding: Second annual report for the period October 1986--September 1987

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G.A.; Lake, L.W.; Sepehrnoori, K.

    1988-11-01

    The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. Development, testing, and application of the chemical flooding simulator (UTCHEM) for a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents have continued. Improvements in both the physical-chemical and numerical aspects of UTCHEM have been made which enhance its versatility, accuracy, and speed. Supporting experimental studies during the past year include relative permeability and trapping of microemulsion, tracer flow studies, oil recovery in cores using alcohol-free surfactant slugs, and microemulsion viscosity measurements. These have enabled model improvement and simulator testing. Another code, PROPACK, has also been developed and is used as a preprocessor for UTCHEM. Specifically, it is used to evaluate input to UTCHEM by computing and plotting key physical properties such as phase behavior and interfacial tension.

  1. Flood Foresight: A near-real time flood monitoring and forecasting tool for rapid and predictive flood impact assessment

    NASA Astrophysics Data System (ADS)

    Revilla-Romero, Beatriz; Shelton, Kay; Wood, Elizabeth; Berry, Robert; Bevington, John; Hankin, Barry; Lewis, Gavin; Gubbin, Andrew; Griffiths, Samuel; Barnard, Paul; Pinnell, Marc; Huyck, Charles

    2017-04-01

    -scale view of the extent and depth of possible riverine flood events several days in advance by linking forecast river flow from a hydrological model to a global flood risk map. The Monitoring component provides a similar local-scale view of a flood inundation extent but in near real time, as an event unfolds, by combining the global flood risk map with observed river gauge telemetry. Immediately following an event, the maximum extent of the flood is also generated. Users of Flood Foresight will be able to receive current and forecast flood extents and depth information via API into their own GIS or analytics software. The set of tools is currently operational for the UK and Europe; the methods presented can be applied globally, allowing provision of service to any country or region. This project was supported by InnovateUK under the Solving Business Problems with Environmental Data competition.

  2. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as an example. In this tropical environment the annual monsoon-triggered floods of the Mekong River can coincide with heavy local convective precipitation events, causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula-based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a graphics processing unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by
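
    The pluvial-hazard step described above rests on a peak-over-threshold frequency analysis. A bare-bones version, fitting a generalized Pareto distribution to rainfall exceedances and reading off a return level, can be written with SciPy as below; the threshold, synthetic rainfall record and return period are arbitrary, and the study's declustering and stochastic storm generator are not reproduced.

    ```python
    import numpy as np
    from scipy import stats

    def pot_return_level(daily_rain_mm, threshold_mm, return_period_yr, days_per_year=365.25):
        """Peak-over-threshold rainfall frequency estimate: fit a generalized Pareto
        distribution to the exceedances and return the T-year rainfall depth."""
        x = np.asarray(daily_rain_mm, dtype=float)
        excess = x[x > threshold_mm] - threshold_mm
        shape, _, scale = stats.genpareto.fit(excess, floc=0.0)   # fix GPD location at 0
        rate = len(excess) / (len(x) / days_per_year)             # exceedances per year
        # Return level = threshold + GPD quantile at probability 1 - 1/(rate * T)
        p = 1.0 - 1.0 / (rate * return_period_yr)
        return threshold_mm + stats.genpareto.ppf(p, shape, loc=0.0, scale=scale)

    # Synthetic 30-year daily rainfall record (mm) standing in for the local gauge data
    rng = np.random.default_rng(1)
    rain = rng.gamma(0.4, 12.0, int(30 * 365.25))
    print(round(pot_return_level(rain, threshold_mm=50.0, return_period_yr=100), 1))
    ```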

  3. New test method for the evaluation of the preservation efficacy of soaps at very alkaline pH made by saponification.

    PubMed

    Témoin-Fardini, S; Servant, J; Sellam, S

    2017-10-01

    The aim of this study was to develop a test method to evaluate the preservation efficacy for a specific product, a very high-alkaline liquid soap (pH around 10) made by a saponification process. Several manufacturers have experienced contamination issues with these high-pH soaps despite passing a classic preservative efficacy challenge test or even a multi-inoculation challenge test. Bacteria were isolated from contaminated soaps and were identified using 16S rRNA gene sequencing. High-alkaline-pH unpreserved soaps were tested using the Thor Personal Care internal multichallenge test method (TM206) with classical microorganisms and then with the bacterial strains isolated from various contaminated soaps (TM768). Preservatives were added to these soaps and assessed for their efficacy using the newly developed test. Four different species of bacteria (Nesterenkonia lacusekhoensis, Dermacoccus sp., Halomonas sp. and Roseomonas sp.) were identified by sequencing among the contaminants of the various soaps tested. Among these, only one bacterial species, Nesterenkonia lacusekhoensis, appeared to be responsible for the specific contamination of these high-alkaline soaps. Thus, one specific wild-type strain of Nesterenkonia lacusekhoensis, designated strain 768, was used in a new multi-inoculation test (TM768). Unlike the single-inoculation challenge test, the multi-inoculation test using the Nesterenkonia strain 768 was able to predict the sensitivity of a product towards this bacterium. Among the 27 different preservatives tested, 10 were able to protect the formula against contamination with this bacterium. This study enabled the development of a test method to evaluate the efficacy of preservation using a specific bacterium, Nesterenkonia lacusekhoensis, responsible for the contamination of very alkaline soaps made by saponification, and to identify an appropriate preservative system. © 2017 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  4. Impacts of calibration strategies and ensemble methods on ensemble flood forecasting over Lanjiang basin, Southeast China

    NASA Astrophysics Data System (ADS)

    Liu, Li; Xu, Yue-Ping

    2017-04-01

    Ensemble flood forecasting driven by numerical weather prediction products is becoming more commonly used in operational flood forecasting applications. In this study, a hydrological ensemble flood forecasting system based on the Variable Infiltration Capacity (VIC) model and quantitative precipitation forecasts from the TIGGE dataset is constructed for the Lanjiang Basin, Southeast China. The impacts of calibration strategies and ensemble methods on the performance of the system are then evaluated. The hydrological model is optimized by a parallel-programmed ɛ-NSGAII multi-objective algorithm, and two separately parameterized models are determined to simulate daily flows and peak flows, coupled through a modular approach. The results indicate that the ɛ-NSGAII algorithm permits more efficient optimization and a rational determination of parameter settings. It is demonstrated that the multimodel ensemble streamflow mean has better skill than the best single-model ensemble mean (ECMWF), and that multimodel ensembles weighted on members and skill scores outperform other multimodel ensembles. For a typical flood event, the flood can be predicted 3-4 days in advance, but flows on the rising limb can be captured only 1-2 days ahead because of their flashy nature. With respect to peak flows selected by a Peaks Over Threshold approach, the ensemble means from either single models or multimodels generally underestimate, as the extreme values are smoothed out by the ensemble process.
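
    The skill-weighted multimodel combination can be illustrated with a simple weighted mean of single-model ensemble means. The sketch below is schematic: the model names, forecast values and skill scores are hypothetical, and the paper's exact weighting on members and skill scores may differ.

    ```python
    import numpy as np

    def skill_weighted_ensemble(forecasts, skill_scores):
        """Combine several single-model ensemble-mean forecasts into a multimodel mean
        weighted by (non-negative) skill scores: higher skill -> larger weight.

        forecasts: dict model_name -> array of forecast discharges
        skill_scores: dict model_name -> skill score"""
        names = list(forecasts)
        w = np.array([skill_scores[n] for n in names], dtype=float)
        w /= w.sum()
        stacked = np.vstack([forecasts[n] for n in names])
        return w @ stacked

    # Hypothetical 3-day-ahead discharge forecasts (m3/s) from three NWP-driven chains
    forecasts = {"ECMWF": np.array([820.0, 1100.0, 950.0]),
                 "NCEP":  np.array([760.0, 1250.0, 900.0]),
                 "UKMO":  np.array([900.0, 1050.0, 1010.0])}
    skill = {"ECMWF": 0.82, "NCEP": 0.71, "UKMO": 0.76}
    print(skill_weighted_ensemble(forecasts, skill))
    ```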

  5. Flood Resilient Systems and their Application for Flood Resilient Planning

    NASA Astrophysics Data System (ADS)

    Manojlovic, N.; Gabalda, V.; Antanaskovic, D.; Gershovich, I.; Pasche, E.

    2012-04-01

    Following the paradigm shift in flood management from traditional to more integrated approaches, and considering the uncertainties of future development due to drivers such as climate change, one of the main emerging tasks of flood managers becomes the development of (flood) resilient cities. This can be achieved by application of non-structural flood resilience measures, summarised in the 4As: assistance, alleviation, awareness and avoidance (FIAC, 2007). As part of this strategy, the key aspect of developing resilient cities - a resilient built environment - can be reached by efficient application of Flood Resilience Technology (FReT) and its meaningful combination into flood resilient systems (FRS). FRS are defined as [an interconnecting network of FReT which facilitates resilience (including both restorative and adaptive capacity) to flooding, addressing physical and social systems and considering different flood typologies] (SMARTeST, http://www.floodresilience.eu/). Applying the system approach (e.g. Zevenbergen, 2008), FRS can be developed at different scales from the building to the city level. Still, a method to define and systematise different FRS across those scales remains a matter of research. Further, the decision on which resilient system is to be applied for the given conditions and given scale is a complex task, calling for the utilisation of decision support tools. This process of decision-making should follow the steps of flood risk assessment (1) and development of a flood resilience plan (2) (Manojlovic et al, 2009). The key problem in (2) is how to match the input parameters that describe the physical and social system and the flood typology to the appropriate flood resilient system. Additionally, an open issue is how to integrate the advances in FReT and findings on its efficiency into decision support tools. This paper presents a way to define, systematise and make decisions on FRS at different scales of an urban system developed within the 7th FP Project

  6. Mitigating flood exposure

    PubMed Central

    Shultz, James M; McLean, Andrew; Herberman Mash, Holly B; Rosen, Alexa; Kelly, Fiona; Solo-Gabriele, Helena M; Youngs Jr, Georgia A; Jensen, Jessica; Bernal, Oscar; Neria, Yuval

    2013-01-01

    Introduction. In 2011, following heavy winter snowfall, two cities bordering two rivers in North Dakota, USA faced major flood threats. Flooding was foreseeable and predictable although the extent of risk was uncertain. One community, Fargo, situated in a shallow river basin, successfully mitigated and prevented flooding. For the other community, Minot, located in a deep river valley, prevention was not possible and downtown businesses and one-quarter of the homes were inundated, in the city’s worst flood on record. We aimed at contrasting the respective hazards, vulnerabilities, stressors, psychological risk factors, psychosocial consequences, and disaster risk reduction strategies under conditions where flood prevention was, and was not, possible. Methods. We applied the “trauma signature analysis” (TSIG) approach to compare the hazard profiles, identify salient disaster stressors, document the key components of disaster risk reduction response, and examine indicators of community resilience. Results. Two demographically-comparable communities, Fargo and Minot, faced challenging river flood threats and exhibited effective coordination across community sectors. We examined the implementation of disaster risk reduction strategies in situations where coordinated citizen action was able to prevent disaster impact (hazard avoidance) compared to the more common scenario when unpreventable disaster strikes, causing destruction, harm, and distress. Across a range of indicators, it is clear that successful mitigation diminishes both physical and psychological impact, thereby reducing the trauma signature of the event. Conclusion. In contrast to experience of historic flooding in Minot, the city of Fargo succeeded in reducing the trauma signature by way of reducing risk through mitigation. PMID:28228985

  7. Water-surface profile and flood boundaries for the computed 100-year flood, lower Salt River, Lincoln County, Wyoming

    USGS Publications Warehouse

    Miller, Kirk A.; Mason, John P.

    2000-01-01

    The water-surface profile and flood boundaries for the computed 100-year flood were determined for a part of the lower Salt River in Lincoln County, Wyoming. Channel cross-section data were provided by Lincoln County. Cross-section data for bridges and other structures were collected and compiled by the U.S. Geological Survey. Roughness coefficients ranged from 0.034 to 0.100. The 100-year flood was computed using standard methods; it ranged from 5,170 to 4,120 cubic feet per second through the study reach and was adjusted in proportion to contributing drainage area. Water-surface elevations were determined by the standard step-backwater method. Flood boundaries were plotted on digital base maps.
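
    The standard step-backwater computation balances the energy equation between successive cross-sections, solving for the upstream water-surface elevation given the downstream one. The sketch below is a textbook-style illustration for a single rectangular reach, not the USGS step-backwater program, and all channel properties are hypothetical.

    ```python
    G = 9.81

    def section_props(depth, width, n, q):
        """Hydraulic properties of a rectangular section (SI units)."""
        area = width * depth
        velocity = q / area
        radius = area / (width + 2.0 * depth)                  # hydraulic radius
        sf = (n * velocity) ** 2 / radius ** (4.0 / 3.0)       # Manning friction slope
        return velocity, sf

    def standard_step(q, width, n, bed_down, bed_up, dx, wse_down):
        """One standard-step computation between two cross-sections: find the upstream
        water-surface elevation (subcritical profile) that balances
        WSE_up + V_up^2/2g = WSE_down + V_down^2/2g + mean(Sf) * dx."""
        y_down = wse_down - bed_down
        v_down, sf_down = section_props(y_down, width, n, q)
        e_down = wse_down + v_down ** 2 / (2.0 * G)

        def energy_residual(y_up):
            v_up, sf_up = section_props(y_up, width, n, q)
            head_loss = 0.5 * (sf_up + sf_down) * dx
            return (bed_up + y_up + v_up ** 2 / (2.0 * G)) - (e_down + head_loss)

        # Bisection on the upstream depth over a generous bracket
        lo, hi = 0.05, 20.0
        for _ in range(80):
            mid = 0.5 * (lo + hi)
            if energy_residual(mid) > 0.0:
                hi = mid
            else:
                lo = mid
        return bed_up + 0.5 * (lo + hi)

    # Hypothetical reach: 100 m3/s, 20 m wide, n = 0.035, bed rising 0.5 m over 500 m
    print(round(standard_step(q=100.0, width=20.0, n=0.035,
                              bed_down=100.0, bed_up=100.5, dx=500.0, wse_down=103.0), 3))
    ```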

  8. A Mixed-Method Study of Princeville's Rebuilding from the Flood of 1999: Lessons on the Importance of Invisible Community Assets

    ERIC Educational Resources Information Center

    Yoon, Intae

    2009-01-01

    Guided by previous studies and the community assets perspective, a concurrent mixed-method case study was conducted five years after a devastating flood to investigate how invisible community assets played a role in Princeville's rebuilding process from the flood of 1999. The independent variables in this study included retrospectively assessed…

  9. Assessment of Methods to Determine Tree Ring Response to Large Magnitude Mississippi River Floods

    NASA Astrophysics Data System (ADS)

    Therrell, M. D.; Meko, M. D.; Bialecki, M.; Remo, J. W.

    2017-12-01

    Riparian trees that experience prolonged inundation can record major flood events as inter- and intra-annual variability in the size, shape and arrangement of vessels in the annual xylem growth increment. As part of an NSF-funded project to develop tree-ring records of past flooding, we have made collections of several oak species (e.g., Quercus lyrata, Q. macrocarpa) at six sites in the Mississippi River Basin. At each of these sites, sampled trees exhibit notably anomalous anatomy in growth increments formed in years coinciding with major recorded floods. We have used these "flood rings" to develop individual site chronologies as well as a regional chronology of spring flood events in the basin for the past several hundred years. We have also analyzed earlywood vessel diameter as a proxy for flooding and find that, although this variable reflects only a fraction of the annual growth increment, it strongly reflects tree response to flooding at all the sites examined so far. Comparing these chronologies with the instrumental and historical record of flooding, we find that they capture nearly all large observed Mississippi River floods in the 20th century and provide a new record of similar events in the 18th and 19th centuries. These results suggest that tree rings can be effectively used to develop and improve pre-instrumental flood records throughout the basin and potentially in other similar systems.

  10. Flood Hazard Mapping Assessment for Lebanon

    NASA Astrophysics Data System (ADS)

    Abdallah, Chadi; Darwich, Talal; Hamze, Mouin; Zaarour, Nathalie

    2014-05-01

    Of all natural disasters, floods affect the greatest number of people worldwide and have the greatest potential to cause damage. In fact, floods are responsible for over one third of the people affected by natural disasters; almost 190 million people in more than 90 countries are exposed to catastrophic floods every year. With the emerging global warming phenomenon, this number is expected to increase; therefore, flood prediction and prevention have become a necessity in many places around the globe to decrease the damage caused by flooding. Available evidence points to an increasing frequency of flooding disasters in Lebanon over the last 25 years. The consequences of such events are tragic, including annual financial losses of around 15 million dollars. In this work, a hydrologic-hydraulic modeling framework for flood hazard mapping over Lebanon, covering 19 watersheds, is introduced. Several empirical, statistical and stochastic methods were used to calculate flood magnitudes and their return periods, because rainfall and river gauge data are neither continuous nor available on a long-term basis, and proper river cross-sections are lacking, which leads to underestimated flows during flood events. TRMM weather satellite information, automated drainage networks, curve numbers and other geometric characteristics for each basin were prepared using the WMS software and then exported into HMS files to implement the hydrologic (rainfall-runoff) modeling for a single design storm of uniformly distributed depth over each basin. The resulting flow hydrographs were used in the hydraulic model (HEC-RAS), where water surface profiles were calculated and flood plains delineated. The model was calibrated using the last flood event of January 2013, field investigation, and high-resolution satellite images. Flow results proved to have an accuracy of 83-87% when compared to the computed statistical and stochastic methods. Results included the generation of

  11. Preparing for floods: flood forecasting and early warning

    NASA Astrophysics Data System (ADS)

    Cloke, Hannah

    2016-04-01

    Flood forecasting and early warning has continued to stride ahead in strengthening the preparedness phases of disaster risk management, saving lives and property and reducing the overall impact of severe flood events. For example, continental and global scale flood forecasting systems such as the European Flood Awareness System and the Global Flood Awareness System provide early information about upcoming floods in real time to various decisionmakers. Studies have found that there are monetary benefits to implementing these early flood warning systems, and with the science also in place to provide evidence of benefit and hydrometeorological institutional outlooks warming to the use of probabilistic forecasts, the uptake over the last decade has been rapid and sustained. However, there are many further challenges that lie ahead to improve the science supporting flood early warning and to ensure that appropriate decisions are made to maximise flood preparedness.

  12. An improved method for analysis of hydroxide and carbonate in alkaline electrolytes containing zinc

    NASA Technical Reports Server (NTRS)

    Reid, M. A.

    1978-01-01

    A simplified method for titration of carbonate and hydroxide in alkaline battery electrolyte is presented, involving a saturated KSCN solution as a complexing agent for zinc. Both hydroxide and carbonate can be determined in one titration, and the complexing reagent is readily prepared. Since the pH at the end point is shifted from 8.3 to 7.9-8.0, m-cresol purple or phenol red is used as the indicator rather than phenolphthalein. Bromcresol green is recommended for determination of the second end point, at a pH of 4.3 to 4.4.

  14. Combining Neural Networks with Existing Methods to Estimate 1 in 100-Year Flood Event Magnitudes

    NASA Astrophysics Data System (ADS)

    Newson, A.; See, L.

    2005-12-01

    Over the last fifteen years artificial neural networks (ANNs) have been shown to be advantageous for the solution of many hydrological modelling problems. The use of ANNs for flood magnitude estimation in ungauged catchments, however, is a relatively new and under-researched area. In this paper ANNs are used to make estimates of the magnitude of the 100-year flood event (Q100) for a number of ungauged catchments. The data used in this study were provided by the Centre for Ecology and Hydrology's Flood Estimation Handbook (FEH), which contains information on catchments across the UK. Sixteen catchment descriptors for 719 catchments were used to train an ANN, with the data split into training, validation and test sets. The goodness-of-fit statistics on the test data set indicated good model performance, with an r-squared value of 0.8 and a coefficient of efficiency of 79 percent. Data for twelve ungauged catchments were then put through the trained ANN to produce estimates of Q100. Two other accepted methodologies were also employed: the FEH statistical method and the FSR (Flood Studies Report) design storm technique, both of which are used to produce flood frequency estimates. The advantage of developing an ANN model is that it provides a third figure to aid a hydrologist in making an accurate estimate. For six of the twelve catchments, there was a relatively low spread between estimates; in these instances, an estimate of Q100 could be made with a fair degree of certainty. Of the remaining six catchments, three had areas greater than 1000 km2, which means the FSR design storm estimate cannot be used. Armed with the ANN model and the FEH statistical method, the hydrologist still has two possible estimates to consider. For these three catchments, the estimates were also fairly similar, providing additional confidence to the estimation. In summary, the findings of this study have shown that an accurate estimation of Q100 can be made using the catchment descriptors of

  15. Development of flood index by characterisation of flood hydrographs

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Biswa; Suman, Asadusjjaman

    2015-04-01

    In recent years the world has experienced deaths, large-scale displacement of people, billions of Euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action in the face of global societal and environmental changes to strengthen resilience against flooding. Due to climatological characteristics there are catchments where flood forecasting may have a relatively limited role and flood event management may have to be relied upon. For example, in flash flood catchments, which may be tiny and ungauged, flood event management often depends on approximate prediction tools such as flash flood guidance (FFG). There are also catchments fed largely by flood waters from upstream catchments that are ungauged, or where, due to data-sharing issues in transboundary basins, the flow of information from upstream is limited. Hydrological and hydraulic modelling of these downstream catchments will never be sufficient to provide the required forecasting lead time, and alternative tools to support flood event management will be required. In FFG, or similar approaches, the primary motive is to provide guidance by synthesising historical data. We follow a similar approach and characterise past flood hydrographs to determine a flood index (FI), which varies in space and time with flood magnitude and its propagation. By studying the variation of the index, pockets of high flood risk requiring attention can be earmarked beforehand. This approach can be very useful in flood risk management of catchments where information about hydro-meteorological variables is inadequate for any forecasting system. This paper presents the development of FI and its application to several catchments including in Kentucky in the USA

  16. Real-time flood extent maps based on social media

    NASA Astrophysics Data System (ADS)

    Eilander, Dirk; van Loenen, Arnejan; Roskam, Ruud; Wagemaker, Jurjen

    2015-04-01

    During a flood event it is often difficult to get accurate information about the flood extent and the people affected. This information is very important for disaster risk reduction management and crisis relief organizations. In the post-flood phase, information about the flood extent is needed for damage estimation and for calibrating hydrodynamic models. Currently, flood extent maps are derived from a few sources such as satellite images, aerial images and post-flood flood marks. However, getting accurate real-time or maximum flood extent maps remains difficult. With the rise of social media, we now have a new source of information with large numbers of observations. In the city of Jakarta, Indonesia, the intensity of unique flood-related tweets during a flood event peaked at 8 tweets per second during floods in early 2014. A fair number of these tweets also contain observations of water depth and location. Our hypothesis is that, based on the large numbers of tweets, it is possible to generate real-time flood extent maps. In this study we use tweets from the city of Jakarta, Indonesia, to generate these flood extent maps. The data-mining procedure looks for tweets with a mention of 'banjir', the Bahasa Indonesia word for flood. It then removes modified and retweeted messages in order to keep unique tweets only. Since tweets are not always sent directly from the location of observation, the geotag in the tweets is unreliable. We therefore extract location information using mentions of names of neighborhoods and points of interest. Finally, where encountered, a mention of a length measure is extracted as water depth. Tweets containing a location reference and a water level are considered to be flood observations. The strength of this method is that it can easily be extended to other regions and languages. Based on the intensity of tweets in Jakarta during a flood event we can provide a rough estimate of the flood extent. To provide more accurate flood extent
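
    The data-mining steps described above (keyword filter, de-duplication, gazetteer-based location extraction, and depth parsing) can be sketched compactly. The snippet below is only illustrative: the tiny gazetteer and example tweets are invented, and it does not reproduce the authors' actual pipeline.

    ```python
    import re

    # Known neighbourhood names mapped to coordinates (a tiny hypothetical gazetteer)
    GAZETTEER = {"kemang": (-6.26, 106.81), "kampung melayu": (-6.22, 106.87)}
    DEPTH_RE = re.compile(r"(\d+(?:[.,]\d+)?)\s*(cm|m)\b", re.IGNORECASE)

    def extract_flood_observations(tweets):
        """Keep unique tweets mentioning 'banjir' (not retweets), then pull out a
        location via gazetteer lookup and a water depth where present."""
        observations = []
        seen = set()
        for text in tweets:
            low = text.lower()
            if "banjir" not in low or low.startswith("rt "):
                continue
            if low in seen:            # keep unique tweets only
                continue
            seen.add(low)
            place = next((name for name in GAZETTEER if name in low), None)
            depth = None
            match = DEPTH_RE.search(low)
            if match:
                value = float(match.group(1).replace(",", "."))
                depth = value / 100.0 if match.group(2).lower() == "cm" else value
            if place and depth is not None:
                observations.append({"place": place, "coords": GAZETTEER[place],
                                     "depth_m": depth})
        return observations

    print(extract_flood_observations([
        "Banjir 50 cm di Kemang pagi ini",
        "RT banjir 50 cm di kemang pagi ini",
        "Hujan deras, banjir 1 m di Kampung Melayu",
    ]))
    ```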

  17. Flood resilience and uncertainty in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Beven, K.; Leedal, D.; Neal, J.; Bates, P.; Hunter, N.; Lamb, R.; Keef, C.

    2012-04-01

    Flood risk assessments do not normally take account of the uncertainty in assessing flood risk; there is no requirement in the EU Floods Directive to do so. But given the generally short series (and potential non-stationarity) of flood discharges, the extrapolation to smaller exceedance probabilities may be highly uncertain. This means that flood risk mapping may also be highly uncertain, with additional uncertainties introduced by the representation of flood plain and channel geometry, conveyance and infrastructure. This suggests that decisions about flood plain management should be based on the exceedance probability of risk rather than the deterministic hazard maps that are common in most EU countries. Some examples are given from two case studies in the UK where a framework for good practice in assessing uncertainty in flood risk mapping has been produced as part of the Flood Risk Management Research Consortium and Catchment Change Network projects. This framework provides a structure for the communication and audit of assumptions about uncertainties.

  18. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    USGS Publications Warehouse

    Cohn, T.A.; Lane, W.L.; Baier, W.G.

    1997-01-01

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record; information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters, of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
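
    A simplified sketch of the iterative expected-moments idea, assuming a normal distribution (e.g. for log-transformed flows) rather than the log-Pearson type III used in the paper; the function name, inputs and convergence rule are illustrative only:

    ```python
    import numpy as np
    from scipy import stats, integrate

    def ema_normal(systematic, hist_peaks, n_below, threshold, tol=1e-6, max_iter=200):
        """Iterative expected-moments fit of a normal distribution using the
        systematic record, historical peaks above `threshold`, and `n_below`
        historical years known to have stayed below `threshold`."""
        x = np.concatenate([systematic, hist_peaks])
        mu, var = np.mean(systematic), np.var(systematic, ddof=1)
        n_total = len(x) + n_below
        for _ in range(max_iter):
            dist = stats.norm(mu, np.sqrt(var))
            p_below = dist.cdf(threshold)
            # Expected first and second moments of a below-threshold year,
            # given the current parameter estimates (numerical integration).
            m1 = integrate.quad(lambda v: v * dist.pdf(v), -np.inf, threshold)[0] / p_below
            m2 = integrate.quad(lambda v: v ** 2 * dist.pdf(v), -np.inf, threshold)[0] / p_below
            # Pool observed values with the expected contributions of censored years.
            mu_new = (x.sum() + n_below * m1) / n_total
            var_new = (np.sum((x - mu_new) ** 2)
                       + n_below * (m2 - 2 * mu_new * m1 + mu_new ** 2)) / (n_total - 1)
            converged = abs(mu_new - mu) < tol and abs(var_new - var) < tol
            mu, var = mu_new, var_new
            if converged:
                break
        return mu, np.sqrt(var)
    ```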

  20. Fault tree analysis for urban flooding.

    PubMed

    ten Veldhuis, J A E; Clemens, F H L R; van Gelder, P H A J M

    2009-01-01

    Traditional methods to evaluate flood risk generally focus on heavy storm events as the principal cause of flooding. Conversely, fault tree analysis is a technique that aims at modelling all potential causes of flooding. It quantifies both overall flood probability and the relative contributions of individual causes of flooding. This paper presents a fault tree model for urban flooding and an application to the case of Haarlem, a city of 147,000 inhabitants. Data from a complaint register, rainfall gauges and hydrodynamic model calculations are used to quantify the probabilities of basic events in the fault tree. This results in a flood probability of 0.78 per week for Haarlem. It is shown that gully pot blockages contribute 79% of flood incidents, whereas storm events contribute only 5%. This implies that for this case more efficient gully pot cleaning is a more effective strategy to reduce flood probability than enlarging drainage system capacity. Whether this is also the most cost-effective strategy can only be decided after the risk assessment has been complemented with a quantification of the consequences of both types of events; this will be the next step in this study.
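
    A toy illustration of the OR-gate arithmetic behind such a fault tree; the basic events and their weekly probabilities below are invented, not the Haarlem values:

    ```python
    # Illustrative weekly probabilities for independent basic events.
    basic_events = {
        "gully_pot_blockage": 0.60,
        "sewer_overloading_by_storm": 0.04,
        "pipe_blockage": 0.10,
        "pumping_station_failure": 0.02,
    }

    # OR-gate combination of independent basic events:
    # P(flood) = 1 - prod(1 - p_i)
    p_no_flood = 1.0
    for p in basic_events.values():
        p_no_flood *= (1.0 - p)
    p_flood = 1.0 - p_no_flood

    # Relative contribution of each cause (share of expected incident count).
    total_rate = sum(basic_events.values())
    contributions = {k: v / total_rate for k, v in basic_events.items()}

    print(f"overall flood probability per week: {p_flood:.2f}")
    for cause, share in sorted(contributions.items(), key=lambda kv: -kv[1]):
        print(f"{cause}: {share:.0%} of incidents")
    ```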

  1. Flood type specific construction of synthetic design hydrographs

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Viviroli, Daniel; Sikorska, Anna E.; Vannier, Olivier; Favre, Anne-Catherine; Seibert, Jan

    2017-02-01

    Accurate estimates of flood peaks, corresponding volumes, and hydrographs are required to design safe and cost-effective hydraulic structures. In this paper, we propose a statistical approach for the estimation of the design variables peak and volume by constructing synthetic design hydrographs for different flood types such as flash-floods, short-rain floods, long-rain floods, and rain-on-snow floods. Our approach relies on the fitting of probability density functions to observed flood hydrographs of a certain flood type and accounts for the dependence between peak discharge and flood volume. It makes use of the statistical information contained in the data and retains the process information of the flood type. The method was tested based on data from 39 mesoscale catchments in Switzerland and provides catchment specific and flood type specific synthetic design hydrographs for all of these catchments. We demonstrate that flood type specific synthetic design hydrographs are meaningful in flood-risk management when combined with knowledge on the seasonality and the frequency of different flood types.
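
    A hedged sketch of the probability-density-based construction, assuming a gamma density as the hydrograph shape; the shape parameter, which in the paper is fitted per flood type and catchment, is simply taken as an input here:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.special import gamma as gamma_fn

    def synthetic_design_hydrograph(q_peak, volume, shape=3.0, n=200):
        """Design hydrograph Q(t) = volume * f(t), with f a gamma density in time,
        so the area under Q equals the design volume by construction.  The gamma
        scale is chosen analytically so that max Q(t) equals the design peak."""
        k = shape
        # Height of a unit-scale gamma density at its mode:
        peak_density = (k - 1.0) ** (k - 1.0) * np.exp(-(k - 1.0)) / gamma_fn(k)
        scale = volume * peak_density / q_peak          # time scale in seconds
        dist = stats.gamma(k, scale=scale)
        t = np.linspace(0.0, dist.ppf(0.999), n)        # time axis [s]
        return t, volume * dist.pdf(t)                  # discharge [m3/s]

    # Example: 250 m3/s design peak with an 18 hm3 design volume.
    t, q = synthetic_design_hydrograph(q_peak=250.0, volume=18e6)
    ```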

  2. Methods for estimating magnitude and frequency of floods in Montana based on data through 1983

    USGS Publications Warehouse

    Omang, R.J.; Parrett, Charles; Hull, J.A.

    1986-01-01

    Equations are presented for estimating flood magnitudes for ungaged sites in Montana based on data through 1983. The State was divided into eight regions based on hydrologic conditions, and separate multiple regression equations were developed for each region. These equations relate annual flood magnitudes and frequencies to basin characteristics and are applicable only to natural flow streams. In three of the regions, equations also were developed relating flood magnitudes and frequencies to basin characteristics and channel geometry measurements. The standard errors of estimate for an exceedance probability of 1% ranged from 39% to 87%. Techniques are described for estimating annual flood magnitude and flood frequency information at ungaged sites based on data from gaged sites on the same stream. Included are curves relating flood frequency information to drainage area for eight major streams in the State. Maximum known flood magnitudes in Montana are compared with estimated 1%-chance flood magnitudes and with maximum known floods in the United States. Values of flood magnitudes for selected exceedance probabilities and values of significant basin characteristics and channel geometry measurements for all gaging stations used in the analysis are tabulated. Included are 375 stations in Montana and 28 nearby stations in Canada and adjoining States. (Author's abstract)
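
    As an illustration of how such regional regression equations are typically fitted and applied, a least-squares sketch in log space with invented calibration data (not Montana's published coefficients):

    ```python
    import numpy as np

    # Hypothetical calibration data for one hydrologic region: the 1%-chance
    # (100-year) peak flow Q100 [ft3/s], drainage area A [mi2] and mean annual
    # precipitation P [in] at gaged sites.  Values are illustrative only.
    A = np.array([12.0, 55.0, 230.0, 760.0, 1900.0])
    P = np.array([14.0, 18.0, 12.0, 16.0, 20.0])
    Q100 = np.array([900.0, 3100.0, 7200.0, 21000.0, 52000.0])

    # Regional regression equations of this type are usually fitted in log
    # space: log Q = b0 + b1*log A + b2*log P.
    X = np.column_stack([np.ones_like(A), np.log10(A), np.log10(P)])
    b, *_ = np.linalg.lstsq(X, np.log10(Q100), rcond=None)

    def q100_ungaged(area, precip):
        """Estimate the 100-year flood at an ungaged natural-flow site."""
        return 10 ** (b[0] + b[1] * np.log10(area) + b[2] * np.log10(precip))

    print(q100_ungaged(area=400.0, precip=15.0))
    ```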

  3. Floods - Multiple Languages

    MedlinePlus

    Floods and Flash Flooding fact sheets available as PDF files in multiple languages, including English, Arabic (العربية) and Bosnian (bosanski).

  4. The role of Natural Flood Management in managing floods in large scale basins during extreme events

    NASA Astrophysics Data System (ADS)

    Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David

    2016-04-01

    There is a strong evidence base showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and on flood impact downstream. However, the ability to target zones of high runoff production, and the extent to which we can manage flood risk using nature-based flood management solutions, are less well known. A move to planting more trees and having less intensely farmed landscapes is part of natural flood management (NFM) solutions, and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large-scale basin should they be deployed, and how does flow propagate to any point downstream? Generally, how much intervention is needed and will it compromise food production systems? If we are observing record levels of rainfall and flow, for example during Storm Desmond in December 2015 in the North West of England, what other flood management options are really needed to complement our traditional defences in large basins for the future? In this paper we will show examples of NFM interventions in the UK that have had an impact at local-scale sites. We will demonstrate the impact of interventions at the local, sub-catchment (meso-scale) and finally the large scale. These tools include observations, process-based models and more generalised Flood Impact Models. Issues of synchronisation and the design level of protection will be debated. By reworking observed rainfall and discharge (runoff) for observed extreme events in the River Eden and River Tyne during Storm Desmond, we will show how much flood protection is needed in large-scale basins. The research will thus pose a number of key questions as to how floods may have to be managed in large-scale basins in the future. We will seek to support a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water

  5. Flooding and Mental Health: A Systematic Mapping Review

    PubMed Central

    Fernandez, Ana; Black, John; Jones, Mairwen; Wilson, Leigh; Salvador-Carulla, Luis; Astell-Burt, Thomas; Black, Deborah

    2015-01-01

    Background: Floods are the most common type of global natural disaster. Floods have a negative impact on mental health. Comprehensive evaluation and review of the literature are lacking. Objective: To systematically map and review available scientific evidence on mental health impacts of floods caused by extended periods of heavy rain in river catchments. Methods: We performed a systematic mapping review of published scientific literature in five languages for mixed studies on floods and mental health. PubMed and Web of Science were searched to identify all relevant articles from 1994 to May 2014 (no restrictions). Results: The electronic search strategy identified 1331 potentially relevant papers. Finally, 83 papers met the inclusion criteria. Four broad areas are identified: i) the main mental health disorders—post-traumatic stress disorder, depression and anxiety; ii) the factors associated with mental health among those affected by floods; iii) the narratives associated with flooding, which focus on the long-term impacts of flooding on mental health as a consequence of the secondary stressors; and iv) the management actions identified. The quantitative and qualitative studies have consistent findings. However, very few studies have used mixed methods to quantify the size of the mental health burden as well as exploration of in-depth narratives. Methodological limitations include control of potential confounders and short-term follow up. Limitations: Floods following extreme events were excluded from our review. Conclusions: Although the level of exposure to floods has been systematically associated with mental health problems, the paucity of longitudinal studies and lack of confounding controls precludes strong conclusions. Implications: We recommend that future research in this area include mixed-method studies that are purposefully designed, using more rigorous methods. Studies should also focus on vulnerable groups and include analyses of policy and practical

  6. Climate-Related Hazards: A Method for Global Assessment of Urban and Rural Population Exposure to Cyclones, Droughts, and Floods

    PubMed Central

    Christenson, Elizabeth; Elliott, Mark; Banerjee, Ovik; Hamrick, Laura; Bartram, Jamie

    2014-01-01

    Global climate change (GCC) has led to increased focus on the occurrence of, and preparation for, climate-related extremes and hazards. Population exposure, the relative likelihood that a person in a given location was exposed to a given hazard event(s) in a given period of time, was the outcome for this analysis. Our objectives were to develop a method for estimating the population exposure at the country level to the climate-related hazards cyclone, drought, and flood; develop a method that readily allows the addition of better datasets to an automated model; differentiate population exposure of urban and rural populations; and calculate and present the results of exposure scores and ranking of countries based on the country-wide, urban, and rural population exposures to cyclone, drought, and flood. Gridded global datasets on cyclone, drought and flood occurrence as well as population density were combined and analysis was carried out using ArcGIS. Results presented include global maps of ranked country-level population exposure to cyclone, drought, flood and multiple hazards. Analyses by geography and human development index (HDI) are also included. The results and analyses of this exposure assessment have implications for country-level adaptation. It can also be used to help prioritize aid decisions and allocation of adaptation resources between countries and within a country. This model is designed to allow flexibility in applying cyclone, drought and flood exposure to a range of outcomes and adaptation measures. PMID:24566046
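
    A minimal sketch of combining a gridded hazard-frequency layer with gridded population to obtain country-level urban and rural exposure scores; the toy rasters and the urban threshold are assumptions for illustration:

    ```python
    import numpy as np

    def population_exposure(hazard_freq, pop_density, cell_area_km2, urban_mask):
        """Combine a gridded hazard frequency (events per cell in the study period)
        with population density [people/km2] into exposure scores for one country,
        split into urban and rural populations."""
        population = pop_density * cell_area_km2
        exposure = hazard_freq * population              # person-events per cell
        return {"total": float(exposure.sum()),
                "urban": float(exposure[urban_mask].sum()),
                "rural": float(exposure[~urban_mask].sum())}

    # Toy 3x3 grids standing in for global rasters clipped to one country.
    hazard = np.array([[0, 1, 2], [0, 0, 1], [3, 1, 0]], dtype=float)
    density = np.array([[10, 200, 800], [5, 50, 300], [900, 20, 1]], dtype=float)
    print(population_exposure(hazard, density, cell_area_km2=25.0,
                              urban_mask=density > 100.0))
    ```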

  7. Flood management: prediction of microbial contamination in large-scale floods in urban environments.

    PubMed

    Taylor, Jonathon; Lai, Ka Man; Davies, Mike; Clifton, David; Ridley, Ian; Biddulph, Phillip

    2011-07-01

    With a changing climate and increased urbanisation, the occurrence and the impact of flooding are expected to increase significantly. Floods can bring pathogens into homes and cause lingering damp and microbial growth in buildings, with the level of growth and persistence dependent on the volume and chemical and biological content of the flood water, the properties of the contaminating microbes, and the surrounding environmental conditions, including the restoration time and methods, the heat and moisture transport properties of the envelope design, and the ability of the construction material to sustain the microbial growth. The public health risk will depend on the interaction of these complex processes and the vulnerability and susceptibility of occupants in the affected areas. After the 2007 floods in the UK, the Pitt review noted that there is a lack of relevant scientific evidence and consistency with regard to the management and treatment of flooded homes, which not only put the local population at risk but also caused unnecessary delays in the restoration effort. Understanding the drying behaviour of flooded buildings in the UK building stock under different scenarios, and the ability of microbial contaminants to grow, persist, and produce toxins within these buildings, can help inform recovery efforts. To contribute to future flood management, this paper proposes the use of building simulations and biological models to predict the risk of microbial contamination in typical UK buildings. We review the state of the art with regard to biological contamination following flooding, relevant building simulation, simulation-linked microbial modelling, and current practical considerations in flood remediation. Using the city of London as an example, a methodology is proposed that uses GIS as a platform to integrate drying models and microbial risk models with the local building stock and flood models. The integrated tool will help local governments, health authorities

  8. Towards a Flood Severity Index

    NASA Astrophysics Data System (ADS)

    Kettner, A.; Chong, A.; Prades, L.; Brakenridge, G. R.; Muir, S.; Amparore, A.; Slayback, D. A.; Poungprom, R.

    2017-12-01

    Flooding is the most common natural hazard worldwide, affecting 21 million people every year. In the immediate moments following a flood event, humanitarian actors like the World Food Program need to make rapid decisions (within 72 hrs) on how to prioritize affected areas impacted by such an event. For other natural disasters like hurricanes/cyclones and earthquakes, there are industry-recognized standards on how the impacted areas are to be classified. Shake maps, quantifying peak ground motion, from, for example, the US Geological Survey are widely used for assessing earthquakes. Similarly, cyclones are tracked by the Joint Typhoon Warning Center (JTWC) and the Global Disaster Alert and Coordination System (GDACS), which release storm nodes and tracks (forecast and actual) with wind buffers and classify the event according to the Saffir-Simpson Hurricane Wind Scale. For floods, the community is usually able to acquire unclassified data of the flood extent as identified from satellite imagery. Most often no water discharge hydrograph is available to classify the event into recurrence intervals, simply because there is no gauging station, or the gauging station was unable to record the maximum discharge due to overtopping or flood damage. So the question remains: how do we methodically turn a flooded area into classified areas of different gradations of impact? Here, we present a first approach towards developing a globally applicable flood severity index. The flood severity index is set up such that it considers relatively easily obtainable physical parameters available in a short period of time, such as flood frequency (relating the current flood to historical events) and magnitude, as well as land cover, slope, and, where available, pre-event simulated flood depth. The scale includes categories ranging from very minor flooding to catastrophic flooding. We test and evaluate the postulated classification scheme against a set of past flood events. Once a severity category is determined, socio

  9. Evaluation of design flood estimates with respect to sample size

    NASA Astrophysics Data System (ADS)

    Kobierska, Florian; Engeland, Kolbjorn

    2016-04-01

    Estimation of design floods forms the basis for hazard management related to flood risk and is a legal obligation when building infrastructure such as dams, bridges and roads close to water bodies. Flood inundation maps used for land use planning are also produced based on design flood estimates. In Norway, the current guidelines for design flood estimates give recommendations on which data, probability distribution, and method to use depending on the length of the local record. If less than 30 years of local data are available, an index flood approach is recommended where the local observations are used for estimating the index flood and regional data are used for estimating the growth curve. For 30-50 years of data, a 2-parameter distribution is recommended, and for more than 50 years of data, a 3-parameter distribution should be used. Many countries have national guidelines for flood frequency estimation, and recommended distributions include the log-Pearson type III, generalized logistic and generalized extreme value distributions. For estimating distribution parameters, ordinary and linear moments, maximum likelihood and Bayesian methods are used. The aim of this study is to re-evaluate the guidelines for local flood frequency estimation. In particular, we wanted to answer the following questions: (i) Which distribution gives the best fit to the data? (ii) Which estimation method provides the best fit to the data? (iii) Does the answer to (i) and (ii) depend on local data availability? To answer these questions we set up a test bench for local flood frequency analysis using data-based cross-validation methods. The criteria were based on indices describing stability and reliability of design flood estimates. Stability is used as a criterion since design flood estimates should not excessively depend on the data sample. The reliability indices describe to which degree design flood predictions can be trusted.

  10. Xylan extraction from pretreated sugarcane bagasse using alkaline and enzymatic approaches.

    PubMed

    Sporck, Daniele; Reinoso, Felipe A M; Rencoret, Jorge; Gutiérrez, Ana; Del Rio, José C; Ferraz, André; Milagres, Adriane M F

    2017-01-01

    New biorefinery concepts are necessary to drive industrial use of lignocellulose biomass components. Xylan recovery before enzymatic hydrolysis of the glucan component is a way to add value to the hemicellulose fraction, which can be used in papermaking, pharmaceutical, and food industries. Hemicellulose removal can also facilitate subsequent cellulolytic glucan hydrolysis. Sugarcane bagasse was pretreated with an alkaline-sulfite chemithermomechanical process to facilitate subsequent extraction of xylan by enzymatic or alkaline procedures. Alkaline extraction methods yielded 53% (w/w) xylan recovery. The enzymatic approach provided a limited yield of 22% (w/w) but produced the xylan with the lowest contamination with lignin and glucan components. All extracted xylans presented arabinosyl side groups and absence of acetylation. 2D-NMR data suggested the presence of O-methyl-glucuronic acid and p-coumarates only in enzymatically extracted xylan. Xylans isolated using the enzymatic approach resulted in products with molecular weights (Mw) lower than 6 kDa. Higher Mw values were detected in the alkali-isolated xylans. Alkaline extraction of xylan provided a glucan-enriched solid readily hydrolysable with low cellulase loads, generating hydrolysates with a high glucose/xylose ratio. Hemicellulose removal before enzymatic hydrolysis of the cellulosic fraction proved to be an efficient manner to add value to sugarcane bagasse biorefining. Xylans with varied yield, purity, and structure can be obtained according to the extraction method. Enzymatic extraction procedures produce high-purity xylans at low yield, whereas alkaline extraction methods provided higher xylan yields with more lignin and glucan contamination. When xylan extraction is performed with alkaline methods, the residual glucan-enriched solid seems suitable for glucose production employing low cellulase loadings.

  11. Spatial Analysis of Land Subsidence and Flood Pattern Based on DInSAR Method in Sentinel Sar Imagery and Weighting Method in Geo-Hazard Parameters Combination in North Jakarta Region

    NASA Astrophysics Data System (ADS)

    Prasetyo, Y.; Yuwono, B. D.; Ramadhanis, Z.

    2018-02-01

    The reclamation program carried out in North Jakarta lies directly adjacent to Jakarta Bay. Alongside this program, the density of population and of development centers in North Jakarta has excessively increased the demand for groundwater. As a result, land subsidence in the North Jakarta area is relatively high and intense. The research methodology was developed from remote sensing and geographic information system methods and is intended to describe the spatial correlation between land subsidence and the flood phenomenon in North Jakarta. The DInSAR (Differential Interferometric Synthetic Aperture Radar) method was applied to satellite radar imagery (Sentinel-1A SAR) acquired in 2015 and 2016 to obtain the pattern of land subsidence in North Jakarta, which was then combined with flood patterns. To prepare the flood threat zoning pattern, a spatial model was built from weighted parameters of rainfall, elevation, flood zones and land use. The resulting flood hazard zonation model was then overlaid on the DInSAR processing results. The geo-hazard modelling shows that 81% of flood threat zones consist of rural areas, 12% of un-built areas and 7% of water areas. Furthermore, the correlation of land subsidence with the flood risk zones falls into three levels of suitability, with 74% in the high class, 22% in the medium class and 4% in the low class. Of the area of spatial correlation between land subsidence and the flood risk zones, 77% is detected in rural areas, 17% in un-built areas and 6% in water areas. The research product is a set of geo-hazard maps for North Jakarta that form the basis of the spatial correlation analysis between the land subsidence and flooding phenomena.
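
    A sketch of the weighted spatial combination of normalized geo-hazard parameters described above; the layers, normalization and weights are illustrative and not those used in the study:

    ```python
    import numpy as np

    def normalize(layer):
        """Rescale a raster layer to 0-1 so that different units can be combined."""
        lo, hi = np.nanmin(layer), np.nanmax(layer)
        return (layer - lo) / (hi - lo)

    def flood_threat_index(rainfall, elevation, flood_zone, land_use_score, weights):
        """Weighted linear combination of normalized geo-hazard parameters.
        Higher elevation reduces threat, so that layer is inverted."""
        layers = [normalize(rainfall),
                  1.0 - normalize(elevation),
                  normalize(flood_zone),
                  normalize(land_use_score)]
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()
        return sum(wi * li for wi, li in zip(w, layers))

    # Toy rasters; the weights below are illustrative, not the study's values.
    shape = (4, 4)
    rng = np.random.default_rng(0)
    index = flood_threat_index(rng.random(shape) * 300,      # rainfall [mm]
                               rng.random(shape) * 20,       # elevation [m]
                               rng.integers(0, 4, shape),    # flood zone class
                               rng.integers(1, 5, shape),    # land-use score
                               weights=[0.35, 0.25, 0.25, 0.15])
    ```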

  12. Evaluating sediment transport in flood-driven ephemeral tributaries using direct and acoustic methods.

    NASA Astrophysics Data System (ADS)

    Stark, K.

    2017-12-01

    One common source of uncertainty in sediment transport modeling of large semi-arid rivers is sediment influx delivered by ephemeral, flood-driven tributaries. Large variations in sediment delivery are associated with these regimes due to the highly variable nature of flows within them. While there are many sediment transport equations, they are typically developed for perennial streams and can be inaccurate for ephemeral channels. Discrete, manual sampling is labor intensive and requires personnel to be on site during flooding. In addition, flooding within these tributaries typically lasts on the order of hours, making it difficult to be present during an event. To better understand these regimes, automated systems are needed to continuously sample bedload and suspended load. In preparation for the pending installation of an automated site on the Arroyo de los Piños in New Mexico, manual sediment and flow samples have been collected over the summer monsoon season of 2017, in spite of the logistical challenges. These data include suspended and bedload sediment samples at the basin outlet, and stage and precipitation data from throughout the basin. Data indicate a complex system; flow is generated primarily in areas of exposed bedrock in the center and higher elevations of the watershed. Bedload samples show a large coarse-grained fraction, with 50% >2 mm and 25% >6 mm, which is compatible with acoustic measuring techniques. These data will be used to inform future site operations, which will combine direct sediment measurement from Reid-type slot samplers and non-invasive acoustic measuring methods. Bedload will be indirectly monitored using pipe-style microphones, plate-style geophones, channel hydrophones, and seismometers. These instruments record vibrations and acoustic signals from bedload impacts and movement. Indirect methods for measuring bedload have never been extensively evaluated in ephemeral channels in the southwest United States. Once calibrated

  13. Mineralogical, petrological and geochemical aspects of alkaline and alkaline-carbonatite associations from Brazil

    NASA Astrophysics Data System (ADS)

    Morbidelli, L.; Gomes, C. B.; Beccaluva, L.; Brotzu, P.; Conte, A. M.; Ruberti, E.; Traversa, G.

    1995-12-01

    A general description of Mesozoic and Tertiary (Fortaleza) Brazilian alkaline and alkaline-carbonatite districts is presented with reference to mineralogy, petrology, geochemistry and geochronology. It mainly refers to scientific results obtained during the last decade by an Italo-Brazilian research team. Alkaline occurrences are distributed across Brazilian territory from the southern (Piratini, Rio Grande do Sul State) to the northeastern (Fortaleza, Ceará State) regions and are mainly concentrated along the borders of the Paraná Basin generally coinciding with important tectonic lineaments. The most noteworthy characteristics of these alkaline and alkaline-carbonatite suites are: (i) prevalence of intrusive forms; (ii) abundance of cumulate assemblages (minor dunites, frequent clinopyroxenites and members of the ijolite series) and (iii) abundance of evolved rock-types. Many data demonstrate that crystal fractionation was the main process responsible for magma evolution of all Brazilian alkaline rocks. A hypothesis is proposed for the genesis of carbonatite liquids by immiscibility processes. The incidence of REE and trace elements for different major groups of lithotypes, belonging both to carbonatite-bearing and carbonatite-free districts, is documented. Sr and preliminary Nd isotopic data are indicative of a mantle origin for the least evolved magmas of all the studied occurrences. Mantle source material and melting models for the generation of the Brazilian alkaline magma types are also discussed.

  14. Evaluation of a liquid chromatography method for compound-specific δ13C analysis of plant carbohydrates in alkaline media.

    PubMed

    Rinne, Katja T; Saurer, Matthias; Streit, Kathrin; Siegwolf, Rolf T W

    2012-09-30

    Isotope analysis of carbohydrates is important for improved understanding of plant carbon metabolism and plant physiological response to the environment. High-performance liquid chromatography/isotope ratio mass spectrometry (HPLC/IRMS) for direct compound-specific δ(13)C measurements of soluble carbohydrates has recently been developed, but the still challenging sample preparation and the fact that no single method is capable of separating all compounds of interest hinder its wide-spread application. Here we tested in detail a chromatography method in alkaline media. We examined the most suitable chromatographic conditions for HPLC/IRMS analysis of carbohydrates in aqueous conifer needle extracts using a CarboPac PA20 anion-exchange column with NaOH eluent, paying specific attention to compound yields, carbon isotope fractionation processes and the reproducibility of the method. Furthermore, we adapted and calibrated sample preparation methods for HPLC/IRMS analysis. OnGuard II cartridges were used for sample purification. Good peak separation and highly linear and reproducible concentration and δ(13)C measurements were obtained. The alkaline eluent was observed to induce isomerization of hexoses, detected as reduced yields and (13)C fractionation of the affected compounds. A reproducible pre-purification method providing ~100% yield for the carbohydrate compounds of interest was calibrated. The good level of peak separation obtained in this study is reflected in the good precision and linearity of concentration and δ(13)C results. The data provided crucial information on the behaviour of sugars in LC analysis with alkaline media. The observations highlight the importance for the application of compound-matched standard solution for the detection and correction of instrumental biases in concentration and δ(13)C analysis performed under identical chromatographic conditions. The calibrated pre-purification method is well suited for studies with complex matrices

  15. Alkaline "Permanent" Paper.

    ERIC Educational Resources Information Center

    Pacey, Antony

    1991-01-01

    Discussion of paper manufacturing processes and their effects on library materials focuses on the promotion of alkaline "permanent" paper, with less acid, by Canadian library preservation specialists. Standards for paper acidity are explained; advantages of alkaline paper are described, including decreased manufacturing costs; and…

  16. Developing a Global Database of Historic Flood Events to Support Machine Learning Flood Prediction in Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Sullivan, J.; Kettner, A.; Brakenridge, G. R.; Slayback, D. A.; Kuhn, C.; Doyle, C.

    2016-12-01

    There is an increasing need to understand flood vulnerability as the societal and economic effects of flooding increases. Risk models from insurance companies and flood models from hydrologists must be calibrated based on flood observations in order to make future predictions that can improve planning and help societies reduce future disasters. Specifically, to improve these models both traditional methods of flood prediction from physically based models as well as data-driven techniques, such as machine learning, require spatial flood observation to validate model outputs and quantify uncertainty. A key dataset that is missing for flood model validation is a global historical geo-database of flood event extents. Currently, the most advanced database of historical flood extent is hosted and maintained at the Dartmouth Flood Observatory (DFO) that has catalogued 4320 floods (1985-2015) but has only mapped 5% of these floods. We are addressing this data gap by mapping the inventory of floods in the DFO database to create a first-of-its-kind, comprehensive, global and historical geospatial database of flood events. To do so, we combine water detection algorithms on MODIS and Landsat 5, 7 and 8 imagery in Google Earth Engine to map discrete flood events. The created database will be available in the Earth Engine Catalogue for download by country, region, or time period. This dataset can be leveraged for new data-driven hydrologic modeling using machine learning algorithms in Earth Engine's highly parallelized computing environment, and we will show examples for New York and Senegal.
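
    A generic, array-based sketch of the water-detection step (McFeeters NDWI thresholding plus removal of permanent water), rather than the authors' Google Earth Engine implementation; band inputs, threshold and helper names are assumptions:

    ```python
    import numpy as np

    def ndwi_water_mask(green, nir, threshold=0.0):
        """McFeeters NDWI = (green - nir) / (green + nir); values above the
        threshold are flagged as surface water.  Bands are surface-reflectance
        arrays from e.g. Landsat or MODIS; the threshold usually needs tuning
        per sensor and scene."""
        green = green.astype(float)
        nir = nir.astype(float)
        ndwi = np.where(green + nir > 0, (green - nir) / (green + nir), np.nan)
        return ndwi > threshold

    def flood_extent(event_mask, baseline_mask):
        """Flooded pixels are those wet during the event but not in the
        pre-event (permanent water) baseline."""
        return event_mask & ~baseline_mask
    ```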

  17. Quantitative Analysis of Burden of Infectious Diarrhea Associated with Floods in Northwest of Anhui Province, China: A Mixed Method Evaluation

    PubMed Central

    Ding, Guoyong; Zhang, Ying; Gao, Lu; Ma, Wei; Li, Xiujun; Liu, Jing; Liu, Qiyong; Jiang, Baofa

    2013-01-01

    Background: Persistent and heavy rainfall in the upper and middle Huaihe River of China brought about severe floods during the end of June and July 2007. However, there has been no assessment of the association between the floods and infectious diarrhea. This study aimed to quantify the impact of the floods in 2007 on the burden of disease due to infectious diarrhea in the northwest of Anhui Province. Methods: A time-stratified case-crossover analysis was first conducted to examine the relationship between daily cases of infectious diarrhea and the 2007 floods in Fuyang and Bozhou of Anhui Province. Odds ratios (ORs) of the flood risk were quantified by conditional logistic regression. The years lived with disability (YLDs) of infectious diarrhea attributable to floods were then estimated based on the WHO framework for calculating the potential impact fraction in the Burden of Disease study. Results: A total of 197 cases of infectious diarrhea were notified during the exposure and control periods in the two study areas. The strongest effect was shown with a 2-day lag in Fuyang and a 5-day lag in Bozhou. Multivariable analysis showed that floods were significantly associated with an increased risk of infectious diarrhea (OR = 3.175, 95%CI: 1.126–8.954 in Fuyang; OR = 6.754, 95%CI: 1.954–23.344 in Bozhou). The attributable YLD per 1000 of infectious diarrhea resulting from the floods was 0.0081 in Fuyang and 0.0209 in Bozhou. Conclusions: Our findings confirm that floods significantly increased the risk of infectious diarrhea in the study areas. In addition, a prolonged moderate flood may cause a greater burden of infectious diarrhea than a severe flood of shorter duration. More attention should be paid to particularly vulnerable groups, including young children and the elderly, in developing public health preparation and intervention programs. The findings have significant implications for developing strategies to prevent and reduce the health impact of floods
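
    A hedged sketch of the attributable-burden arithmetic behind such YLD estimates, using the odds ratio as an approximation of the relative risk; the disability weight, duration and counts are illustrative, not the study's values:

    ```python
    def attributable_yld_per_1000(cases, population, odds_ratio,
                                  disability_weight=0.10, duration_years=0.02):
        """YLD per 1000 population attributable to the exposure, computed from
        the attributable fraction among exposed cases, (OR - 1) / OR."""
        attributable_fraction = (odds_ratio - 1.0) / odds_ratio
        attributable_cases = cases * attributable_fraction
        yld = attributable_cases * disability_weight * duration_years
        return 1000.0 * yld / population

    # Hypothetical inputs for one study area (not the paper's data).
    print(attributable_yld_per_1000(cases=120, population=1_500_000, odds_ratio=3.175))
    ```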

  18. Quantification of Uncertainty in the Flood Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.

    2017-12-01

    Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to the variability in sample representation, selection of the distribution and estimation of distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of prediction intervals as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and Banff, Canada) are used. The major focus of the present study was to evaluate the changes in the magnitude of flood quantiles due to the extreme flood event that occurred during the year 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and the proposed method was found to be reliable in modeling extreme floods compared to the bootstrap methods.
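
    For reference, a sketch of the standard bootstrap interval that such a method would be compared against, here with a GEV distribution fitted by maximum likelihood; the distribution choice and settings are assumptions:

    ```python
    import numpy as np
    from scipy import stats

    def bootstrap_flood_quantile_interval(annual_max, T=100, n_boot=1000,
                                          level=0.90, seed=1):
        """Percentile bootstrap interval for the T-year flood quantile,
        refitting a GEV to each resample of the annual maximum series."""
        rng = np.random.default_rng(seed)
        p = 1.0 - 1.0 / T
        estimates = []
        for _ in range(n_boot):
            sample = rng.choice(annual_max, size=len(annual_max), replace=True)
            c, loc, scale = stats.genextreme.fit(sample)
            estimates.append(stats.genextreme.ppf(p, c, loc=loc, scale=scale))
        lo, hi = np.percentile(estimates, [(1 - level) / 2 * 100,
                                           (1 + level) / 2 * 100])
        return lo, hi
    ```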

  19. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, Brent M.; Karlinger, Michael R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank-transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on the average every 4.5 years.
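
    A minimal Monte Carlo sketch of the RFP calculation under an assumed Gaussian dependence model with a given correlation matrix; since only rank correlation matters in that model, the at-site marginal distributions never enter:

    ```python
    import numpy as np
    from scipy.stats import norm

    def regional_flood_probability(corr, T=100, n_sims=100_000, seed=0):
        """Monte Carlo estimate of the probability that at least one site
        experiences its T-year flood in a given year, given a spatial
        correlation matrix of annual peaks and a Gaussian dependence model."""
        rng = np.random.default_rng(seed)
        z = rng.multivariate_normal(np.zeros(corr.shape[0]), corr, size=n_sims)
        threshold = norm.ppf(1.0 - 1.0 / T)   # each site exceeds with prob 1/T
        return float((z > threshold).any(axis=1).mean())

    # Toy example: 10 sites with a common pairwise correlation of 0.6
    # (a real application would use the full estimated correlation matrix).
    corr = np.full((10, 10), 0.6)
    np.fill_diagonal(corr, 1.0)
    print(regional_flood_probability(corr, T=100))
    ```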

  20. Chemical weathering outputs from the flood plain of the Ganga

    NASA Astrophysics Data System (ADS)

    Bickle, Michael J.; Chapman, Hazel J.; Tipper, Edward; Galy, Albert; De La Rocha, Christina L.; Ahmad, Talat

    2018-03-01

    inputs were derived from the acetic-acid leach compositions and silicate Na/Ca and 87Sr/86Sr ratios derived from silicate residues from leaching. Modelling based on the 87Sr/86Sr and Sr/Ca ratios of the carbonate inputs and 87Sr/86Sr ratios of the silicates indicates that the flood plain waters have lost up to 70% of their Ca (average ∼ 50%) to precipitation of secondary calcite which is abundant as a diagenetic cement in the flood plain sediments. 31% of the Sr, 8% of the Ca and 45% of the Mg are calculated to be derived from silicate minerals. Because of significant evaporative loss of water across the flood plain, and in the absence of hydrological data for flood plain tributaries, chemical weathering fluxes from the flood plain are best calculated by mass balance of the Na, K, Ca, Mg, Sr, SO4 and 87Sr/86Sr compositions of the inputs, comprising the flood plain tributaries, Himalayan rivers and southern rivers, with the chemical discharge in the Ganga at Farakka. The calculated fluxes from the flood plain for Na, K, Ca and Mg are within error of those estimated from changes in sediment chemistry across the flood plain (Lupker et al., 2012, Geochimica et Cosmochimica Acta). Flood plain weathering supplies between 41 and 63% of the major cation and Sr fluxes and 58% of the alkalinity flux carried by the Ganga at Farakka, which compares with 24% supplied by Himalayan rivers and 18% by the southern tributaries.

  1. Effect of catchment properties and flood generation regime on copula selection for bivariate flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Filipova, Valeriya; Lawrence, Deborah; Klempe, Harald

    2018-02-01

    Applying copula-based bivariate flood frequency analysis is advantageous because the results provide information on both the flood peak and volume. More data are, however, required for such an analysis, and it is often the case that only data series with a limited record length are available. To overcome this issue of limited record length, data regarding climatic and geomorphological properties can be used to complement statistical methods. In this paper, we present a study of 27 catchments located throughout Norway, in which we assess whether catchment properties, flood generation processes and flood regime have an effect on the correlation between flood peak and volume and, in turn, on the selection of copulas. To achieve this, the annual maximum flood events were first classified into events generated primarily by rainfall, snowmelt or a combination of these. The catchments were then classified into flood regime, depending on the predominant flood generation process producing the annual maximum flood events. A contingency table and Fisher's exact test were used to determine the factors that affect the selection of copulas in the study area. The results show that the two-parameter copulas BB1 and BB7 are more commonly selected in catchments with high steepness, high mean annual runoff and rainfall flood regime. These findings suggest that in these types of catchments, the dependence structure between flood peak and volume is more complex and cannot be modeled effectively using a one-parameter copula. The results illustrate that by relating copula types to flood regime and catchment properties, additional information can be supplied for selecting copulas in catchments with limited data.
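
    A small sketch of the rank-based step that underlies such copula selection, estimating a one-parameter Gumbel copula from Kendall's tau; the two-parameter BB1 and BB7 families highlighted in the paper need likelihood-based fitting and are not covered here:

    ```python
    import numpy as np
    from scipy import stats

    def fit_gumbel_copula_tau(peaks, volumes):
        """Estimate the Gumbel copula parameter from the rank correlation
        between flood peak and volume, using the inversion of Kendall's tau:
        theta = 1 / (1 - tau)."""
        tau, _ = stats.kendalltau(peaks, volumes)
        if tau <= 0:
            raise ValueError("Gumbel copula requires positive dependence")
        return tau, 1.0 / (1.0 - tau)

    # Illustrative annual maximum flood peaks [m3/s] and volumes [hm3].
    peaks = np.array([120.0, 340.0, 95.0, 210.0, 410.0, 180.0])
    volumes = np.array([3.1, 9.8, 2.4, 5.5, 12.2, 4.0])
    print(fit_gumbel_copula_tau(peaks, volumes))
    ```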

  2. Economic impact due to Cimanuk river flood disaster in Garut district using Cobb-Douglas analysis with least square method

    NASA Astrophysics Data System (ADS)

    Bestari, T. A. S.; Supian, S.; Purwani, S.

    2018-03-01

    The Cimanuk River in Garut District, West Java, which has its upper course on Papandayan Mountain, plays an important role in the daily life of the people of Garut as a water source. In 2016, however, a flash flood struck this river, leaving 26 people dead and 23 missing. The flash flood flattened settlements and inundated schools and a hospital. BPLHD Jawa Barat regarded this event as a disaster caused by the degraded upper course of the Cimanuk River. The 2016 flash flood also paralysed the economic sector. The least squares method was selected to analyse the post-disaster economic condition of the affected residents, after the mathematical equations were specified using the Cobb-Douglas method. By estimating the proportional value of the damage, the results are expected to give stakeholders a view of which sectors were worst affected and to enable them to set priorities in development
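
    A sketch of the Cobb-Douglas/least-squares step described above, with invented sector data; taking logs makes the production function linear so ordinary least squares applies:

    ```python
    import numpy as np

    # Hypothetical post-flood data for affected economic sectors:
    # output Y, capital stock K and labour L (arbitrary units).
    Y = np.array([12.0, 30.0, 7.5, 22.0, 15.0])
    K = np.array([40.0, 110.0, 20.0, 85.0, 55.0])
    L = np.array([25.0, 60.0, 15.0, 44.0, 30.0])

    # Cobb-Douglas production function Y = A * K**alpha * L**beta becomes
    # linear after taking logs: ln Y = ln A + alpha*ln K + beta*ln L.
    X = np.column_stack([np.ones_like(K), np.log(K), np.log(L)])
    coef, *_ = np.linalg.lstsq(X, np.log(Y), rcond=None)
    lnA, alpha, beta = coef

    # One illustrative way to express damage: shortfall of observed output
    # relative to the output the fitted function would predict.
    predicted = np.exp(lnA) * K ** alpha * L ** beta
    damage_proportion = 1.0 - Y / predicted
    ```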

  3. Proposal for a quantitative index of flood disasters.

    PubMed

    Feng, Lihua; Luo, Gaoyuan

    2010-07-01

    Drawing on calculations of wind scale and earthquake magnitude, this paper develops a new quantitative method for measuring flood magnitude and disaster intensity. Flood magnitude is the quantitative index that describes the scale of a flood; the flood's disaster intensity is the quantitative index describing the losses caused. Both indices have numerous theoretical and practical advantages with definable concepts and simple applications, which lend them key practical significance.

  4. Probabilistic modelling of flood events using the entropy copula

    NASA Astrophysics Data System (ADS)

    Li, Fan; Zheng, Qian

    2016-11-01

    The estimation of flood frequency is vital for flood control strategies and hydraulic structure design. Generating synthetic flood events according to the statistical properties of observations is one plausible method to analyze flood frequency. Due to the statistical dependence among the flood event variables (i.e. the flood peak, volume and duration), a multidimensional joint probability estimation is required. Recently, the copula method has been widely used for constructing multivariate dependence structures; however, the copula family must be chosen before application, and the choice process is sometimes rather subjective. The entropy copula, a new copula family employed in this research, provides a way to avoid this relatively subjective process by combining the theories of copulas and entropy. The analysis shows the effectiveness of the entropy copula for probabilistic modelling of the flood events at two hydrological gauges, and its accuracy was compared with that of popular copulas. The Gibbs sampling technique was applied for trivariate flood event simulation in order to mitigate the computational difficulties of extending to three dimensions directly. The simulation results indicate that the entropy copula is a simple and effective copula family for trivariate flood simulation.

  5. A hybrid method for flood simulation in small catchments combining hydrodynamic and hydrological techniques

    NASA Astrophysics Data System (ADS)

    Bellos, Vasilis; Tsakiris, George

    2016-09-01

    The study presents a new hybrid method for the simulation of flood events in small catchments. It combines a physically based two-dimensional hydrodynamic model and the hydrological unit hydrograph theory. Unit hydrographs are derived using the FLOW-R2D model, which is based on the full form of the two-dimensional Shallow Water Equations, solved by a modified McCormack numerical scheme. The method is tested on a small catchment in a suburb of Athens, Greece, for a storm event which occurred in February 2013. The catchment is divided into three friction zones and unit hydrographs of 15 and 30 min are produced. The infiltration process is simulated by the empirical Kostiakov equation and the Green-Ampt model. The results from the implementation of the proposed hybrid method are compared with recorded data at the hydrometric station at the outlet of the catchment and with the results derived from the fully hydrodynamic model FLOW-R2D. It is concluded that, for the case studied, the proposed hybrid method produces results close to those of the fully hydrodynamic simulation at a substantially shorter computational time. This finding, if further verified in a variety of case studies, can be useful in devising effective hybrid tools for two-dimensional flood simulations, which lead to accurate and considerably faster results than those achieved by fully hydrodynamic simulations.
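
    A minimal sketch of the hydrological half of such a hybrid scheme: Kostiakov losses followed by convolution of effective rainfall with a unit hydrograph; the unit hydrograph here is a toy triangle rather than one derived from FLOW-R2D, and all parameter values are illustrative:

    ```python
    import numpy as np

    def kostiakov_infiltration(t_hours, a=8.0, b=0.45):
        """Cumulative infiltration F = a * t**b [mm] from the empirical
        Kostiakov equation (a, b are illustrative parameters)."""
        return a * np.power(t_hours, b)

    def simulate_runoff(effective_rain_mm, unit_hydrograph):
        """Outlet hydrograph as the discrete convolution of effective rainfall
        with a unit hydrograph [m3/s per mm of effective rain]."""
        return np.convolve(effective_rain_mm, unit_hydrograph)

    # Toy example on a 15-min time step.
    uh = np.array([0.0, 2.0, 5.0, 3.0, 1.5, 0.5, 0.0])           # m3/s per mm
    gross_rain = np.array([0.0, 2.0, 6.0, 8.0, 4.0, 1.0, 0.0])   # mm per step
    t = np.arange(1, len(gross_rain) + 1) * 0.25                 # hours
    losses = np.diff(kostiakov_infiltration(t), prepend=0.0)     # incremental losses [mm]
    effective = np.clip(gross_rain - losses, 0.0, None)
    hydrograph = simulate_runoff(effective, uh)
    ```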

  6. Assessment of flood risk in Tokyo metropolitan area

    NASA Astrophysics Data System (ADS)

    Hirano, J.; Dairaku, K.

    2013-12-01

    Flooding is one of the most significant natural hazards in Japan. The Tokyo metropolitan area has been affected by several large flood disasters. Therefore, investigating potential flood risk in the Tokyo metropolitan area is important for the development of adaptation strategies for future climate change. We aim to develop a method for evaluating flood risk in the Tokyo metropolitan area by considering the effects of historical land use and land cover change, socio-economic change, and climatic change. The Ministry of Land, Infrastructure, Transport and Tourism in Japan publishes 'Statistics of Flood', which contains data on flood causes, the number of damaged houses, the area of wetted surface, and the total amount of damage for each flood at the small-municipality level. By using these flood data, we estimated damage by inundation inside a levee for each prefecture based on a statistical method. On the basis of estimated damage, we developed flood risk curves for the Tokyo metropolitan area, representing the relationship between damage and the exceedance probability of flooding for the period 1976-2008 for each prefecture. Based on the flood risk curves, we attempted to evaluate potential flood risk in the Tokyo metropolitan area and to clarify the causes of regional differences in flood risk. By analyzing the flood risk curves, we identified regional differences in flood risk. We identified high flood risk in Tokyo and Saitama prefectures. On the other hand, flood risk was relatively low in Ibaraki and Chiba prefectures. We found that these regional differences in flood risk can be attributed to the spatial distribution of total property value and the ratio of damaged housing units in each prefecture. We also attempted to evaluate the influence of climate change on potential flood risk by considering variations in precipitation amount and intensity in the Tokyo metropolitan area. Results show that our methodology can evaluate the potential impact of precipitation change on flood risk with high accuracy.
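
    A short sketch of how a flood risk curve of this kind can be summarised as an expected annual damage by integrating damage over exceedance probability; the curve values are invented, not the study's estimates:

    ```python
    import numpy as np

    # Hypothetical flood risk curve for one prefecture: annual exceedance
    # probability of a damage level versus that damage level (billion JPY).
    exceedance_prob = np.array([0.002, 0.01, 0.02, 0.05, 0.1, 0.2])
    damage = np.array([400.0, 120.0, 60.0, 18.0, 5.0, 1.0])

    # Expected annual damage is the area under the risk curve,
    # EAD = integral of damage d(probability), here by the trapezoidal rule.
    expected_annual_damage = np.trapz(damage, exceedance_prob)
    print(f"expected annual damage: {expected_annual_damage:.2f} billion JPY")
    ```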

  7. A mixed method to evaluate burden of malaria due to flooding and waterlogging in Mengcheng County, China: a case study.

    PubMed

    Ding, Guoyong; Gao, Lu; Li, Xuewen; Zhou, Maigeng; Liu, Qiyong; Ren, Hongyan; Jiang, Baofa

    2014-01-01

    Malaria is a highly climate-sensitive vector-borne infectious disease that still represents a significant public health problem in the Huaihe River Basin. However, little comprehensive information about the burden of malaria caused by flooding and waterlogging is available from this region. This study aims to quantitatively assess the impact of flooding and waterlogging on the burden of malaria in a county of Anhui Province, China. A mixed-method evaluation was conducted. A case-crossover study was first performed to evaluate the relationship between the daily number of cases of malaria and flooding and waterlogging from May to October 2007 in Mengcheng County, China. Stratified Cox models were used to examine the lagged time and hazard ratios (HRs) of the risk of flooding and waterlogging on malaria. Years lived with disability (YLDs) of malaria attributable to flooding and waterlogging were then estimated based on the WHO framework for calculating the potential impact fraction in the Global Burden of Disease study. A total of 3683 malaria cases were notified during the study period. The strongest effect was shown with a 25-day lag for flooding and a 7-day lag for waterlogging. Multivariable analysis showed that an increased risk of malaria was significantly associated with flooding alone [adjusted hazard ratio (AHR) = 1.467, 95% CI = 1.257, 1.713], waterlogging alone (AHR = 1.879, 95% CI = 1.696, 2.121), and flooding and waterlogging together (AHR = 2.926, 95% CI = 2.576, 3.325). YLDs per 1000 of malaria attributable to flooding alone, waterlogging alone, and flooding and waterlogging together were 0.009 per day, 0.019 per day and 0.022 per day, respectively. Flooding and waterlogging can lead to a higher burden of malaria in the study area. Public health action should be taken to avoid and control the potential risk of malaria epidemics after these two weather disasters.

  8. Flood impacts on a water distribution network

    NASA Astrophysics Data System (ADS)

    Arrighi, Chiara; Tarani, Fabio; Vicario, Enrico; Castelli, Fabio

    2017-12-01

    Floods cause damage to people, buildings and infrastructures. Water distribution systems are particularly exposed, since water treatment plants are often located next to the rivers. Failure of the system leads to both direct losses, for instance damage to equipment and pipework contamination, and indirect impact, since it may lead to service disruption and thus affect populations far from the event through the functional dependencies of the network. In this work, we present an analysis of direct and indirect damages on a drinking water supply system, considering the hazard of riverine flooding as well as the exposure and vulnerability of active system components. The method is based on interweaving, through a semi-automated GIS procedure, a flood model and an EPANET-based pipe network model with a pressure-driven demand approach, which is needed when modelling water distribution networks in highly off-design conditions. Impact measures are defined and estimated so as to quantify service outage and potential pipe contamination. The method is applied to the water supply system of the city of Florence, Italy, serving approximately 380 000 inhabitants. The evaluation of flood impact on the water distribution network is carried out for different events with assigned recurrence intervals. Vulnerable elements exposed to the flood are identified and analysed in order to estimate their residual functionality and to simulate failure scenarios. Results show that in the worst failure scenario (no residual functionality of the lifting station and a 500-year flood), 420 km of pipework would require disinfection with an estimated cost of EUR 21 million, which is about 0.5 % of the direct flood losses evaluated for buildings and contents. Moreover, if flood impacts on the water distribution network are considered, the population affected by the flood is up to 3 times the population directly flooded.

  9. Discomfort from an Alkaline Formulation Delivered Subcutaneously in Humans

    PubMed Central

    Ward, W. Kenneth; Castle, Jessica R.; Branigan, Deborah L.; Massoud, Ryan G.; Youssef, Joseph El

    2013-01-01

    Background and Objective There is a paucity of data regarding tolerability of alkaline drugs administered subcutaneously. The aim of this study was to assess the tolerability of alkaline preparations of human albumin delivered subcutaneously to healthy humans. Methods We compared the tolerability of neutral versus alkaline (pH 10) formulations of human albumin in ten volunteers. With an intent to minimize the time required to reach physiological pH after injection, the alkaline formulation was buffered with a low concentration of glycine (20 mmol/L). Each formulation was given at two rates: over 5 seconds and over 60 seconds. A six-point scale was used to assess discomfort. Results For slow injections, there was a significant difference between pH 7.4 and pH 10 injections (0.4 ± 0.2 vs 1.1 ± 0.2, mean ± SEM; p = 0.025), though the degree of discomfort at pH 10 injections was only ‘mild or slight’. For fast injections, the difference between neutral and alkaline formulations was of borderline significance. Inflammation and oedema, as judged by a physician, were very minimal for all injections, irrespective of pH. Conclusion For subcutaneous drug administration (especially when delivered slowly), there was more discomfort associated with alkaline versus neutral formulations of albumin, though the discomfort was mild. This study suggests that there is little discomfort and inflammation resulting from subcutaneous administration of protein drugs formulated with weak buffers at alkaline pH. PMID:22568666

  10. Multi-temporal clustering of continental floods and associated atmospheric circulations

    NASA Astrophysics Data System (ADS)

    Liu, Jianyu; Zhang, Yongqiang

    2017-12-01

    Investigating clustering of floods has important social, economic and ecological implications. This study examines the clustering of Australian floods at different temporal scales and its possible physical mechanisms. Flood series with different severities are obtained by peaks-over-threshold (POT) sampling in four flood thresholds. At intra-annual scale, Cox regression and monthly frequency methods are used to examine whether and when the flood clustering exists, respectively. At inter-annual scale, dispersion indices with four-time variation windows are applied to investigate the inter-annual flood clustering and its variation. Furthermore, the Kernel occurrence rate estimate and bootstrap resampling methods are used to identify flood-rich/flood-poor periods. Finally, seasonal variation of horizontal wind at 850 hPa and vertical wind velocity at 500 hPa are used to investigate the possible mechanisms causing the temporal flood clustering. Our results show that: (1) flood occurrences exhibit clustering at intra-annual scale, which are regulated by climate indices representing the impacts of the Pacific and Indian Oceans; (2) the flood-rich months occur from January to March over northern Australia, and from July to September over southwestern and southeastern Australia; (3) stronger inter-annual clustering takes place across southern Australia than northern Australia; and (4) Australian floods are characterised by regional flood-rich and flood-poor periods, with 1987-1992 identified as the flood-rich period across southern Australia, but the flood-poor period across northern Australia, and 2001-2006 being the flood-poor period across most regions of Australia. The intra-annual and inter-annual clustering and temporal variation of flood occurrences are in accordance with the variation of atmospheric circulation. These results provide relevant information for flood management under the influence of climate variability, and, therefore, are helpful for developing
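
    A compact sketch of the peaks-over-threshold sampling and the dispersion index used to detect inter-annual clustering; event-independence criteria and the moving windows used in the study are omitted:

    ```python
    import numpy as np

    def pot_sample(dates, flows, threshold):
        """Peaks-over-threshold sampling: keep events whose peak flow exceeds
        the chosen threshold."""
        keep = flows > threshold
        return dates[keep], flows[keep]

    def dispersion_index(event_years, start_year, end_year):
        """Variance-to-mean ratio of annual POT event counts.  Values above 1
        indicate inter-annual clustering (overdispersion relative to a Poisson
        process); values below 1 indicate regularity."""
        years = np.arange(start_year, end_year + 1)
        counts = np.array([(event_years == y).sum() for y in years])
        return counts.var(ddof=1) / counts.mean()
    ```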

  11. Mapping flood and flooding potential indices: a methodological approach to identifying areas susceptible to flood and flooding risk. Case study: the Prahova catchment (Romania)

    NASA Astrophysics Data System (ADS)

    Zaharia, Liliana; Costache, Romulus; Prăvălie, Remus; Ioana-Toroimac, Gabriela

    2017-04-01

    Given that floods continue to cause significant human and material damage worldwide every year, flood risk mitigation is a key issue and a permanent challenge in developing policies and strategies at various spatial scales. A basic phase is therefore the elaboration of hazard and flood risk maps, documents that are an essential support for flood risk management. The aim of this paper is to develop an approach for identifying areas susceptible to flash floods and to flooding, based on computing and mapping two indices: the FFPI (Flash-Flood Potential Index) and the FPI (Flooding Potential Index). These indices are obtained by integrating in a GIS environment several geographical variables which control runoff (in the case of the FFPI) and favour flooding (in the case of the FPI). The methodology was applied in the upper (mountainous) and middle (hilly) catchment of the Prahova River, a densely populated and socioeconomically well-developed area which has been affected repeatedly by water-related hazards over the past decades. The resulting maps showing the spatial distribution of the FFPI and FPI allow for the identification of areas with high susceptibility to flash floods and flooding. This approach can provide useful mapped information, especially for (generally large) areas for which no flood/hazard risk maps exist. Moreover, the FFPI and FPI maps can constitute a preliminary step for flood risk and vulnerability assessment.
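
    Indices of the FFPI type are typically built as a weighted overlay of normalized factor rasters in a GIS. The sketch below shows that general idea with NumPy arrays standing in for rasters; the layer names, weights and data are hypothetical and are not the parameterization used in the paper.

    ```python
    import numpy as np

    def normalize(layer: np.ndarray) -> np.ndarray:
        """Rescale a raster layer to 0-1 so layers with different units are comparable."""
        lo, hi = np.nanmin(layer), np.nanmax(layer)
        return (layer - lo) / (hi - lo)

    def ffpi(layers: dict[str, np.ndarray], weights: dict[str, float]) -> np.ndarray:
        """Weighted linear combination of normalized factor rasters (higher = more runoff-prone)."""
        total = sum(weights.values())
        return sum(weights[name] * normalize(layers[name]) for name in weights) / total

    # Hypothetical 100x100 factor rasters (in practice read from GIS layers, e.g. with rasterio)
    rng = np.random.default_rng(1)
    layers = {
        "slope": rng.random((100, 100)) * 40,        # degrees
        "impermeability": rng.random((100, 100)),    # 0-1 soil/land-cover proxy
        "curve_number": rng.integers(30, 100, (100, 100)).astype(float),
    }
    weights = {"slope": 0.4, "impermeability": 0.3, "curve_number": 0.3}  # assumed weights

    index = ffpi(layers, weights)
    print(f"FFPI range: {index.min():.2f} - {index.max():.2f}")
    ```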

  12. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

    Modelling glacial lake outburst floods (GLOFs), or 'jökulhlaups', necessarily involves the propagation of large and often stochastic uncertainties throughout the source-to-impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolutions. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, results based on the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts than those based on the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those based on a finer-resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) derived DEM. Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
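
    The Monte Carlo least-cost-path idea can be sketched as: perturb the DEM by its stated vertical error, route a least-cost path per realization, and accumulate per-cell visit frequencies. The code below is a conceptual illustration only and is not the authors' distributed MC-LCP code; the DEM, its assumed RMSE, the cost definition and the start/end cells are all hypothetical, and it relies on scikit-image's route_through_array.

    ```python
    import numpy as np
    from skimage.graph import route_through_array

    def mc_lcp(dem, start, end, rmse, n_iter=100, seed=0):
        """Monte Carlo least-cost path: perturb the DEM with Gaussian noise of
        magnitude `rmse`, route a least-cost path per realization, and return
        the per-cell visit frequency as a relative routing-likelihood map."""
        rng = np.random.default_rng(seed)
        hits = np.zeros_like(dem, dtype=float)
        for _ in range(n_iter):
            noisy = dem + rng.normal(0.0, rmse, dem.shape)
            cost = noisy - noisy.min() + 1.0          # keep costs strictly positive
            path, _ = route_through_array(cost, start, end,
                                          fully_connected=True, geometric=True)
            rows, cols = zip(*path)
            hits[rows, cols] += 1
        return hits / n_iter

    # Hypothetical DEM: a tilted plane with random roughness (in practice SRTM/ASTER/ALOS)
    rng = np.random.default_rng(42)
    dem = np.add.outer(np.linspace(50, 0, 80), np.zeros(80)) + rng.normal(0, 0.5, (80, 80))
    likelihood = mc_lcp(dem, start=(0, 40), end=(79, 40), rmse=5.0)
    print("cells traversed in >50% of realizations:", int((likelihood > 0.5).sum()))
    ```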

  13. The effect of alkaline pretreatment methods on cellulose structure and accessibility

    DOE PAGES

    Bali, Garima; Meng, Xianzhi; Deneff, Jacob I.; ...

    2014-11-24

    The effects of different alkaline pretreatments on cellulose structural features and accessibility are compared and correlated with the enzymatic hydrolysis of Populus. The pretreatments are shown to modify polysaccharide and lignin content to enhance accessibility for cellulase enzymes. The highest increase in cellulose accessibility was observed with dilute sodium hydroxide, followed by methods using ammonia soaking and lime (Ca(OH)2). The biggest increase in cellulose accessibility occurs during the first 10 min of pretreatment, with further increases at a slower rate as severity increases. Low-temperature ammonia soaking at longer residence times dissolved a major portion of hemicellulose and produced higher cellulose accessibility than high-temperature soaking. Moreover, the most significant reduction in degree of polymerization (DP) occurred for dilute sodium hydroxide (NaOH) and ammonia pretreated Populus samples. The study thus identifies important cellulose structural features and relevant parameters related to biomass recalcitrance.

  14. From flood management systems to flood resilient systems: integration of flood resilient technologies

    NASA Astrophysics Data System (ADS)

    Salagnac, J.-L.; Diez, J.; Tourbier, J.

    2012-04-01

    Flooding has always been a major risk world-wide. Humans have chosen to live and develop settlements close to water (rivers, seas) because of the resources water brings, i.e. food, energy, the capacity to transport persons and goods economically, and recreation. However, the risk from flooding, including pluvial flooding, often offsets these huge advantages. Floods sometimes have terrible consequences from both a human and an economic point of view. The permanence and growth of urban areas in flood-prone zones despite these risks are a clear indication of the choices of the human groups concerned. The observed growing concentration of population along the sea shore, the increase of urban population worldwide, the exponential growth of the world population and possibly climate change are factors that confirm flooding will remain a major issue for the coming decades. Flood management systems are designed and implemented to cope with such situations. In spite of frequent events, lessons appear difficult to draw and progress is rather slow. The list of potential triggers to improve flood management systems is nevertheless well established: information, education, awareness raising, alert, prevention, protection, feedback from events, ... Many disciplines are concerned, covering a wide range of soft and hard sciences. A huge amount of both printed and electronic literature is available. Regulations are abundant. In spite of all these potentially favourable elements, similar questions spring up after each new significant event: • Was the event forecast precise enough? • Was the alert system efficient? • Why were buildings built in identified flood-prone areas? • Why did the concerned population not follow instructions? • Why did the dike break? • What should we do to prevent it from happening again? • What about damage evaluation, waste and debris evacuation, infrastructure and building repair, activity recovery, temporary relocation of inhabitants, health concerns, insurance

  15. Near Real-Time Flood Monitoring and Impact Assessment Systems. Chapter 6; [Case Study: 2011 Flooding in Southeast Asia]

    NASA Technical Reports Server (NTRS)

    Ahamed, Aakash; Bolten, John; Doyle, Colin; Fayne, Jessica

    2016-01-01

    Floods are the costliest natural disaster, causing approximately 6.8 million deaths in the twentieth century alone. Worldwide economic flood damage estimates for 2012 exceeded $19 billion USD. Extended-duration floods also pose longer-term threats to food security, water, sanitation, hygiene, and community livelihoods, particularly in developing countries. Projections by the Intergovernmental Panel on Climate Change (IPCC) suggest that precipitation extremes, rainfall intensity, storm intensity, and variability are increasing due to climate change. Increasing hydrologic uncertainty will likely lead to unprecedented extreme flood events. As such, there is a vital need to enhance and further develop traditional techniques used to rapidly assess flooding and to extend analytical methods to estimate impacted population and infrastructure. Measuring flood extent in situ is generally impractical and time consuming, and can be inaccurate. Remotely sensed imagery acquired from space-borne and airborne sensors provides a viable platform for consistent and rapid wall-to-wall monitoring of large flood events through time. Terabytes of freely available satellite imagery are made available online each day by NASA, ESA, and other international space research institutions. Advances in cloud computing and data storage technologies allow researchers to leverage these satellite data and apply analytical methods at scale. Repeat-survey earth observations help provide insight into how natural phenomena change through time, including the progression and recession of floodwaters. In recent years, cloud-penetrating radar remote sensing techniques (e.g., Synthetic Aperture Radar) and high temporal resolution imagery platforms (e.g., MODIS and its 1-day return period), along with high performance computing infrastructure, have enabled significant advances in software systems that provide flood warning, assessments, and hazard reduction potential. By incorporating social and economic data

  16. Flooding and mental health: a systematic mapping review.

    PubMed

    Fernandez, Ana; Black, John; Jones, Mairwen; Wilson, Leigh; Salvador-Carulla, Luis; Astell-Burt, Thomas; Black, Deborah

    2015-01-01

    Floods are the most common type of global natural disaster. Floods have a negative impact on mental health. Comprehensive evaluation and review of the literature are lacking. To systematically map and review available scientific evidence on mental health impacts of floods caused by extended periods of heavy rain in river catchments. We performed a systematic mapping review of published scientific literature in five languages for mixed studies on floods and mental health. PUBMED and Web of Science were searched to identify all relevant articles from 1994 to May 2014 (no restrictions). The electronic search strategy identified 1331 potentially relevant papers. Finally, 83 papers met the inclusion criteria. Four broad areas are identified: i) the main mental health disorders: post-traumatic stress disorder, depression and anxiety; ii) the factors associated with mental health among those affected by floods; iii) the narratives associated with flooding, which focus on the long-term impacts of flooding on mental health as a consequence of secondary stressors; and iv) the management actions identified. The quantitative and qualitative studies have consistent findings. However, very few studies have used mixed methods to quantify the size of the mental health burden as well as exploration of in-depth narratives. Methodological limitations include control of potential confounders and short-term follow-up. Floods following extreme events were excluded from our review. Although the level of exposure to floods has been systematically associated with mental health problems, the paucity of longitudinal studies and lack of confounding controls precludes strong conclusions. We recommend that future research in this area include mixed-method studies that are purposefully designed, using more rigorous methods. Studies should also focus on vulnerable groups and include analyses of policy and practical responses.

  17. Low-heat, mild alkaline pretreatment of switchgrass for anaerobic digestion.

    PubMed

    Jin, Guang; Bierma, Tom; Walker, Paul M

    2014-01-01

    This study examines the effectiveness of alkaline pretreatment under mild heat conditions (100°C or 212°F) on the anaerobic co-digestion of switchgrass. The effects of alkali concentration, type of alkali, heating time and rinsing were evaluated. In addition to batch studies, continuous-feed studies were performed in triplicate to identify potential digester operational problems caused by switchgrass co-digestion while accounting for uncertainty due to digester variability. Few studies have examined anaerobic digestion of switchgrass or the effects of mild heating to enhance alkaline pretreatment prior to biomass digestion. Results indicate that pretreatment can significantly enhance digestion of coarse-ground (≤ 0.78 cm particle size) switchgrass. Energy conversion efficiency as high as 63% was observed, and was comparable or superior to fine grinding as a pretreatment method. The optimal NaOH concentration was found to be 5.5% (wt/wt alkali/biomass) with a 91.7% moisture level. No evidence of operational problems such as solids build-up, poor mixing, or floating materials was observed. These results suggest the use of waste heat from a generator could reduce the concentration of alkali required to adequately pretreat lignocellulosic feedstock prior to anaerobic digestion.

  18. Flood resilience urban territories.

    NASA Astrophysics Data System (ADS)

    Beraud, Hélène; Barroca, Bruno; Hubert, Gilles

    2010-05-01

    The impact of floods on French territory during the last twenty years reveals our lack of preparation for large-scale floods, which may halt companies' activities and services or make housing unavailable for several months. The case of New Orleans is instructive: four years after the disaster, the city had still not regained its dynamism. In France, more than 300 towns are exposed to flooding. While these towns are the mainspring of territorial development, it is likely that the majority of them could not recover quickly after a large-scale flood. Understanding and improving the flood resilience of urban territories is therefore a real stake for territorial development. Urban technical networks supply, unify and irrigate all the constituents of urban territories. Characterizing their flood resilience can help us to better understand urban resilience. In this context, waste management during and after floods is crucial. During a flood, the waste management network can become dysfunctional (roads cut, waste storage installations or waste treatment plants flooded). How can the mayor fulfil his obligation to guarantee salubrity and security in his city? After the flood the question is even more problematic. The waste management network represents a real stake for restarting the territory. After a flood, building materials, lopped-off branches, furniture, business stocks, farm stocks, mud, rubble and animal carcasses are wet, mixed, and even polluted by hydrocarbons or toxic substances. The volume of waste can be significant. Sanitary and environmental risks can be crucial. In view of this situation, waste management in the post-crisis period raises a real problem. What should be done with this waste? How should it be collected? Where should it be stored? How should it be processed? Who is responsible? Answering these questions is all the more strategic since this waste is the mark of the disaster. Thus, cleaning will be the first reflex of the population and local actors in order to forget the

  19. Chapter A6. Section 6.6. Alkalinity and Acid Neutralizing Capacity

    USGS Publications Warehouse

    Rounds, Stewart A.; Wilde, Franceska D.

    2002-01-01

    Alkalinity (determined on a filtered sample) and Acid Neutralizing Capacity (ANC) (determined on a whole-water sample) are measures of the ability of a water sample to neutralize strong acid. Alkalinity and ANC provide information on the suitability of water for uses such as irrigation, determining the efficiency of wastewater processes, determining the presence of contamination by anthropogenic wastes, and maintaining ecosystem health. In addition, alkalinity is used to gain insights on the chemical evolution of an aqueous system. This section of the National Field Manual (NFM) describes the USGS field protocols for alkalinity/ANC determination using either the inflection-point or Gran function plot methods, including calculation of carbonate species, and provides guidance on equipment selection.

  20. Nonstationary decision model for flood risk decision scaling

    NASA Astrophysics Data System (ADS)

    Spence, Caitlin M.; Brown, Casey M.

    2016-11-01

    Hydroclimatic stationarity is increasingly questioned as a default assumption in flood risk management (FRM), but successor methods are not yet established. Some potential successors depend on estimates of future flood quantiles, but methods for estimating future design storms are subject to high levels of uncertainty. Here we apply a Nonstationary Decision Model (NDM) to flood risk planning within the decision scaling framework. The NDM combines a nonstationary probability distribution of annual peak flow with optimal selection of flood management alternatives using robustness measures. The NDM incorporates structural and nonstructural FRM interventions and valuation of flows supporting ecosystem services to calculate expected cost of a given FRM strategy. A search for the minimum-cost strategy under incrementally varied representative scenarios extending across the plausible range of flood trend and value of the natural flow regime discovers candidate FRM strategies that are evaluated and compared through a decision scaling analysis (DSA). The DSA selects a management strategy that is optimal or close to optimal across the broadest range of scenarios or across the set of scenarios deemed most likely to occur according to estimates of future flood hazard. We illustrate the decision framework using a stylized example flood management decision based on the Iowa City flood management system, which has experienced recent unprecedented high flow episodes. The DSA indicates a preference for combining infrastructural and nonstructural adaptation measures to manage flood risk and makes clear that options-based approaches cannot be assumed to be "no" or "low regret."

  1. A macro-enzyme cause of an isolated increase of alkaline phosphatase.

    PubMed

    Cervinski, Mark A; Lee, Hong Kee; Martin, Isabella W; Gavrilov, Dimitar K

    2015-02-02

    Macroenzyme complexes of serum enzymes and antibody can increase the circulating enzymatic activity and may lead to unnecessary additional testing and procedures. Laboratory physicians and scientists need to be aware of techniques to identify macroenzyme complexes when suspected. To investigate the possibility of a macro-alkaline phosphatase in the serum of a 74-year-old male with persistently increased alkaline phosphatase, we coupled a protein A/G agarose affinity chromatography technique with isoenzyme electrophoresis to look for the presence of macro-alkaline phosphatase. The majority of the alkaline phosphatase activity in the patient's serum sample was bound to the column and only a minor fraction (25%) of alkaline phosphatase activity was present in the column flow-through. The alkaline phosphatase activity was also found to co-elute with the immunoglobulins in the patient sample. The alkaline phosphatase activity in a control serum sample concurrently treated in the same manner did not bind to the column and was found in the column flow-through. The use of protein A/G agarose affinity chromatography is a rapid and simple method that can be applied to the investigation of other macro-enzyme complexes. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. Application of GPR Method for Detection of Loose Zones in Flood Levee

    NASA Astrophysics Data System (ADS)

    Gołębiowski, Tomisław; Małysa, Tomasz

    2018-02-01

    This paper presents the results of non-invasive georadar (GPR) surveys carried out to detect loose zones located in a flood levee. Field measurements were performed on the Vistula river flood levee in the village of Wawrzeńczyce near Cracow. At the investigation site, leakages through the levee were observed during the flood in 2010, so detection of internal water filtration paths was an important matter for the stability of the levee during the next flood. The GPR surveys had a reconnaissance character, so they were carried out with the short-offset reflection profiling (SORP) technique and the radargrams were subjected to standard signal processing. The survey results allowed the main loose zones in the levee, which were the reason for the leakages in 2010, to be outlined. Additionally, gravel interbeddings in sand were detected, which, owing to the higher porosity of such zones, had an important influence on water filtration inside the levee. Three solutions that allow the quality and resolution of radargrams to be increased are presented, i.e. changeable-polarisation surveys, advanced signal processing and the DHA procedure.

  3. Documentary evidence of past floods in Europe and their utility in flood frequency estimation

    NASA Astrophysics Data System (ADS)

    Kjeldsen, T. R.; Macdonald, N.; Lang, M.; Mediero, L.; Albuquerque, T.; Bogdanowicz, E.; Brázdil, R.; Castellarin, A.; David, V.; Fleig, A.; Gül, G. O.; Kriauciuniene, J.; Kohnová, S.; Merz, B.; Nicholson, O.; Roald, L. A.; Salinas, J. L.; Sarauskiene, D.; Šraj, M.; Strupczewski, W.; Szolgay, J.; Toumazis, A.; Vanneuville, W.; Veijalainen, N.; Wilson, D.

    2014-09-01

    This review outlines the use of documentary evidence of historical flood events in contemporary flood frequency estimation in European countries. The study shows that despite widespread consensus in the scientific literature on the utility of documentary evidence, the actual migration from academic to practical application has been limited. A detailed review of flood frequency estimation guidelines from different countries showed that the value of historical data is generally recognised, but practical methods for systematic and routine inclusion of this type of data into risk analysis are in most cases not available. Studies of historical events were identified in most countries, and good examples of national databases attempting to collate the available information were identified. The conclusion is that there is considerable potential for improving the reliability of the current flood risk assessments by harvesting the valuable information on past extreme events contained in the historical data sets.

  4. Reducing uncertainty with flood frequency analysis: The contribution of paleoflood and historical flood information

    NASA Astrophysics Data System (ADS)

    Lam, Daryl; Thompson, Chris; Croke, Jacky; Sharma, Ashneel; Macklin, Mark

    2017-03-01

    Using a combination of stream gauge, historical, and paleoflood records to extend extreme flood records has proven to be useful in improving flood frequency analysis (FFA). The approach has typically been applied in localities with long historical records and/or suitable river settings for paleoflood reconstruction from slack-water deposits (SWDs). However, many regions around the world have neither extensive historical information nor bedrock gorges suitable for SWDs preservation and paleoflood reconstruction. This study from subtropical Australia demonstrates that confined, semialluvial channels such as macrochannels provide relatively stable boundaries over the 1000-2000 year time period and the preserved SWDs enabled paleoflood reconstruction and their incorporation into FFA. FFA for three sites in subtropical Australia with the integration of historical and paleoflood data using Bayesian Inference methods showed a significant reduction in uncertainty associated with the estimated discharge of a flood quantile. Uncertainty associated with estimated discharge for the 1% Annual Exceedance Probability (AEP) flood is reduced by more than 50%. In addition, sensitivity analysis of possible within-channel boundary changes shows that FFA is not significantly affected by any associated changes in channel capacity. Therefore, a greater range of channel types may be used for reliable paleoflood reconstruction by evaluating the stability of inset alluvial units, thereby increasing the quantity of temporal data available for FFA. The reduction in uncertainty, particularly in the prediction of the ≤1% AEP design flood, will improve flood risk planning and management in regions with limited temporal flood data.

  5. Effects of Flood Control Strategies on Flood Resilience Under Sociohydrological Disturbances

    NASA Astrophysics Data System (ADS)

    Sung, Kyungmin; Jeong, Hanseok; Sangwan, Nikhil; Yu, David J.

    2018-04-01

    A community capacity to cope with flood hazards, or community flood resilience, emerges from the interplay of hydrological and social processes. This interplay can be significantly influenced by the flood control strategy adopted by a society, i.e., how a society sets its desired flood protection level and strives to achieve this goal. And this interplay can be further complicated by rising land-sea level differences, seasonal water level fluctuations, and economic change. But not much research has been done on how various forms of flood control strategies affect human-flood interactions under these disturbances and therefore flood resilience in the long run. The current study is an effort to address these issues by developing a conceptual model of human-flood interaction mediated by flood control strategies. Our model extends the existing model of Yu et al. (2017), who investigated the flood resilience of a community-based flood protection system in coastal Bangladesh. The major extensions made in this study are inclusions of various forms of flood control strategies (both adaptive and nonadaptive ones), the challenge of rising land-sea level differences, and various high tide level scenarios generated from modifying the statistical variances and averages. Our results show that adaptive forms of flood control strategies tend to outperform nonadaptive ones for maintaining the model community's flood protection system. Adaptive strategies that dynamically adjust target flood protection levels through close monitoring of flood damages and social memories of flood risk can help the model community deal with various disturbances.

  6. Multi-dimensional perspectives of flood risk - using a participatory framework to develop new approaches to flood risk communication

    NASA Astrophysics Data System (ADS)

    Rollason, Edward; Bracken, Louise; Hardy, Richard; Large, Andy

    2017-04-01

    Flooding is a major hazard across Europe which, since 1998, has caused over €52 million in damages and displaced over half a million people. Climate change is predicted to increase the risks posed by flooding in the future. The 2007 EU Flood Directive cemented the use of flood risk maps as a central tool in understanding and communicating flood risk. Following recent flooding in England, an urgent need has been acknowledged to integrate people living at risk from flooding into flood management approaches, encouraging flood resilience and the uptake of resilient activities. The effective communication of flood risk information plays a major role in allowing those at risk to make effective decisions about flood risk and increase their resilience; however, there are emerging concerns over the effectiveness of current approaches. The research presented explores current approaches to flood risk communication in England and the effectiveness of these methods in encouraging resilient actions before and during flooding events. The research also investigates how flood risk communications could be undertaken more effectively, using a novel participatory framework to integrate the perspectives of those living at risk. The research uses co-production between local communities and researchers in the environmental sciences, using a participatory framework to bring together local knowledge of flood risk and flood communications. Using a local competency group, the research explores what those living at risk from flooding want from flood communications in order to develop new approaches to help those at risk understand and respond to floods. Suggestions for practice are refined by the communities to co-produce recommendations. The research finds that current approaches to real-time flood risk communication fail to forecast the significance of predicted floods, whilst flood maps lack detailed information about how floods occur, or use scientific terminology which people at risk

  7. Rapid flood loss estimation for large scale floods in Germany

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Merz, Bruno

    2013-04-01

    Rapid evaluations of flood events are needed for efficient responses both in emergency management and financial appraisal. Beyond that, closely monitoring and documenting the formation and development of flood events and their impacts allows for an improved understanding and in-depth analyses of the interplay between meteorological, hydrological, hydraulic and societal causes leading to flood damage. This contribution focuses on the development of a methodology for the rapid assessment of flood events. In the first place, the focus is on the prediction of damage to residential buildings caused by large-scale floods in Germany. For this purpose an operational flood event analysis system is developed. This system has basic spatial thematic data available and supports data capture about the current flood situation. This includes the retrieval of online gauge data and the integration of remote sensing data. Further, it provides functionalities to evaluate the current flood situation, to assess the hazard extent and intensity and to estimate the current flood impact using the flood loss estimation model FLEMOps+r. The operation of the flood event analysis system will be demonstrated for the flood event of January 2011, with a focus on the Elbe/Saale region. On these grounds, further requirements and the potential for improving the information basis, for instance by including hydrological and/or hydraulic model results as well as information from social sensors, will be discussed.

  8. Osteoblast Differentiation on Collagen Scaffold with Immobilized Alkaline Phosphatase.

    PubMed

    Jafary, F; Hanachi, P; Gorjipour, K

    2017-01-01

    In tissue engineering, scaffold characteristics play an important role in the biological interactions between cells and the scaffold. Cell adhesion, proliferation, and activation depend on the material properties used for the fabrication of scaffolds. In the present investigation, we used collagen with suitable characteristics including mechanical stability, biodegradability and low antigenicity. Optimization of the scaffold was achieved by immobilization of alkaline phosphatase on the collagen surface via a cross-linking method, because this enzyme is one of the most important markers of osteoblasts, which increases inorganic phosphate concentration and promotes mineralization during bone formation. Alkaline phosphatase was immobilized on the collagen surface with 1-ethyl-3-(dimethylaminopropyl) carbodiimide hydrochloride as a reagent. Then, rat mesenchymal stem cells were cultured in osteogenic medium in control and treated groups. The osteogenesis-related genes were compared between the treatment group (cells differentiated on the immobilized alkaline phosphatase/collagen scaffold) and the control group (cells differentiated on the collagen surface without alkaline phosphatase) on days 3 and 7 by quantitative real-time PCR (QIAGEN software). Several genes associated with calcium binding and mineralization, including alkaline phosphatase, collagen type I and osteocalcin, showed upregulated expression during the first 3 days, whereas tumor necrosis factor-α, acting as an inhibitor of differentiation, was down-regulated during osteogenesis. Collagen scaffold with immobilized alkaline phosphatase can therefore be utilized as a good candidate for enhancing the differentiation of osteoblasts from mesenchymal stem cells.

  9. Timing of floods in southeastern China: Seasonal properties and potential causes

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Gu, Xihui; Singh, Vijay P.; Shi, Peijun; Luo, Ming

    2017-09-01

    Flood hazards and flood risks in southeastern China have been causing increasing concern due to dense population and a highly developed economy. This study attempted to address changes in seasonality, timing of peak floods and variability of the occurrence date of peak floods using circular statistical methods and the modified Mann-Kendall trend detection method. The causes of peak flood changes were also investigated. Results indicated that: (1) floods were subject to more seasonality and temporal clustering when compared to precipitation extremes. However, seasonality of floods and extreme precipitation was subject to spatial heterogeneity in northern Guangdong. Similar changing patterns of peak floods and extreme precipitation were found in coastal regions; (2) significant increasing/decreasing seasonality, but no confirmed spatial pattern, was observed for peak floods and extreme precipitation. Peak floods in northern Guangdong province had decreasing variability, but had larger variability in coastal regions; (3) tropical cyclones had remarkable impacts on changes in extreme precipitation in coastal regions of southeastern China, and on peak floods as well. The landfalling of tropical cyclones was decreasing and concentrated during June-September; this is the major reason for the earlier but enhanced seasonality of peak floods in coastal regions. This study sheds new light on flood behavior in coastal regions in a changing environment.
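
    The circular statistics referred to here typically map each flood date onto the unit circle and summarize timing by the mean direction (mean date) and the resultant length R, a 0-1 seasonality index. A minimal sketch with hypothetical flood dates follows; it is not the study's code.

    ```python
    import numpy as np
    import pandas as pd

    def flood_seasonality(dates: pd.DatetimeIndex):
        """Mean flood date (day of year) and concentration R in [0, 1].
        R near 1 means floods cluster tightly in one season; near 0, no seasonality."""
        doy = dates.dayofyear.to_numpy()
        theta = 2 * np.pi * (doy - 0.5) / 365.25
        x, y = np.cos(theta).mean(), np.sin(theta).mean()
        R = np.hypot(x, y)
        mean_doy = (np.degrees(np.arctan2(y, x)) % 360) / 360 * 365.25
        return mean_doy, R

    # Hypothetical annual-maximum flood dates concentrated in early summer
    dates = pd.to_datetime(["2001-06-12", "2002-05-30", "2003-06-25",
                            "2004-07-02", "2005-06-08", "2006-06-19"])
    mean_doy, R = flood_seasonality(dates)
    print(f"mean day of year: {mean_doy:.0f}, seasonality index R: {R:.2f}")
    ```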

  10. Evaluation of medium-range ensemble flood forecasting based on calibration strategies and ensemble methods in Lanjiang Basin, Southeast China

    NASA Astrophysics Data System (ADS)

    Liu, Li; Gao, Chao; Xuan, Weidong; Xu, Yue-Ping

    2017-11-01

    Ensemble flood forecasts by hydrological models using numerical weather prediction products as forcing data are becoming more commonly used in operational flood forecasting applications. In this study, a hydrological ensemble flood forecasting system comprising an automatically calibrated Variable Infiltration Capacity model and quantitative precipitation forecasts from the TIGGE dataset is constructed for Lanjiang Basin, Southeast China. The impacts of calibration strategies and ensemble methods on the performance of the system are then evaluated. The hydrological model is optimized by the parallel programmed ε-NSGA II multi-objective algorithm. According to the solutions by ε-NSGA II, two differently parameterized models are determined to simulate daily flows and peak flows at each of the three hydrological stations. Then a simple yet effective modular approach is proposed to combine these daily and peak flows at the same station into one composite series. Five ensemble methods and various evaluation metrics are adopted. The results show that ε-NSGA II can provide an objective determination of parameter estimation, and the parallel program permits a more efficient simulation. It is also demonstrated that the forecasts from ECMWF have more favorable skill scores than other Ensemble Prediction Systems. The multimodel ensembles have advantages over all the single-model ensembles, and the multimodel methods weighted on members and skill scores outperform other methods. Furthermore, the overall performance at the three stations can be satisfactory up to ten days; however, hydrological errors can degrade the skill score by approximately 2 days, and their influence persists until a lead time of 10 days with a weakening trend. With respect to peak flows selected by the Peaks Over Threshold approach, the ensemble means from single models or multimodels are generally underestimated, indicating that the ensemble mean can bring overall improvement in forecasting of flows. For
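
    A skill-weighted multimodel ensemble of the kind described can be illustrated by weighting each model's forecast in proportion to a skill score. The sketch below uses hypothetical forecasts, model names and skill values and is not the study's implementation.

    ```python
    import numpy as np

    def skill_weighted_ensemble(forecasts: dict[str, np.ndarray],
                                skill: dict[str, float]) -> np.ndarray:
        """Combine per-model forecasts into a multimodel forecast with weights
        proportional to each model's skill score."""
        total = sum(skill.values())
        return sum(skill[m] / total * forecasts[m] for m in forecasts)

    # Hypothetical 10-day daily-flow forecasts (m3/s) from three EPS-driven model runs
    lead = np.arange(1, 11)
    forecasts = {
        "ECMWF": 800 + 40 * lead,
        "NCEP":  760 + 55 * lead,
        "UKMO":  820 + 35 * lead,
    }
    skill = {"ECMWF": 0.62, "NCEP": 0.48, "UKMO": 0.55}   # assumed skill scores

    print(skill_weighted_ensemble(forecasts, skill).round(1))
    ```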

  11. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
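
    The technique described, fitting the logarithms of annual peak discharges to a Pearson Type III distribution by the method of moments, can be sketched as follows. The peak-discharge series is hypothetical, station skew is used without the regional weighting or outlier adjustments of full Bulletin 17B practice, and scipy.stats.pearson3 is assumed to be available.

    ```python
    import numpy as np
    from scipy.stats import pearson3, skew

    def lp3_quantile(annual_peaks, return_period):
        """Fit log-Pearson Type III by the method of moments and return the
        flood quantile for the given return period (years)."""
        logq = np.log10(np.asarray(annual_peaks, dtype=float))
        m, s = logq.mean(), logq.std(ddof=1)
        g = skew(logq, bias=False)                  # station skew only; no weighting
        p = 1.0 - 1.0 / return_period               # non-exceedance probability
        return 10 ** pearson3.ppf(p, g, loc=m, scale=s)

    # Hypothetical annual peak discharges (m3/s)
    peaks = [410, 350, 980, 620, 510, 1340, 450, 720, 390, 880,
             560, 300, 1150, 480, 670, 820, 430, 940, 610, 530]
    for T in (10, 50, 100):
        print(f"{T:>3}-yr flood: {lp3_quantile(peaks, T):,.0f} m3/s")
    ```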

  12. Flood Extent Mapping for Namibia Using Change Detection and Thresholding with SAR

    NASA Technical Reports Server (NTRS)

    Long, Stephanie; Fatoyinbo, Temilola E.; Policelli, Frederick

    2014-01-01

    A new method for flood detection, change detection and thresholding (CDAT), was used with synthetic aperture radar (SAR) imagery to delineate the extent of flooding for the Chobe floodplain in the Caprivi region of Namibia. This region experiences annual seasonal flooding and has seen a recent renewal of severe flooding after a long dry period in the 1990s. Flooding in this area has caused loss of life and livelihoods for the surrounding communities and has caught the attention of disaster relief agencies. There is a need for flood extent mapping techniques that can be used to process images quickly, providing near real-time flooding information to relief agencies. ENVISAT/ASAR and Radarsat-2 images were acquired for several flooding seasons from February 2008 to March 2013. The CDAT method was used to determine flooding from these images and includes the use of image subtraction, decision-based classification with threshold values, and segmentation of SAR images. The total extent of flooding determined for 2009, 2011 and 2012 was about 542 km2, 720 km2, and 673 km2, respectively. Pixels determined to be flooded in vegetation were typically <0.5% of the entire scene, with the exception of 2009, where the detection of flooding in vegetation was much greater (almost one third of the total flooded area). The time to maximum flooding for the 2013 flood season was determined to be about 27 days. A Landsat-based water classification was used to compare against the results from the new CDAT SAR method; the results show good spatial agreement with the Landsat scenes.
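
    The change-detection-and-thresholding idea can be illustrated by differencing a flood-date SAR backscatter image from a dry-season reference and flagging pixels whose backscatter drops by more than a threshold (open water darkens most SAR scenes; flooded vegetation, which can brighten instead, is not captured by this simple sketch). The arrays, threshold and pixel size below are hypothetical, and real processing would add calibration, speckle filtering and geocoding.

    ```python
    import numpy as np

    def cdat_flood_mask(reference_db: np.ndarray, flood_db: np.ndarray,
                        drop_threshold_db: float = 3.0) -> np.ndarray:
        """Flag pixels whose backscatter drops by more than `drop_threshold_db`
        relative to the dry-season reference image."""
        difference = reference_db - flood_db        # positive where backscatter fell
        return difference > drop_threshold_db

    # Hypothetical 5x5 sigma-nought images in dB (in practice ENVISAT/ASAR, Radarsat-2)
    reference = np.full((5, 5), -8.0)
    flood = reference.copy()
    flood[1:4, 1:4] = -15.0                         # flooded block: much darker
    mask = cdat_flood_mask(reference, flood)
    pixel_area_km2 = (30 * 30) / 1e6                # assumed 30 m pixels
    print("flooded area:", mask.sum() * pixel_area_km2, "km2")
    ```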

  13. Alkaline pH sensor molecules.

    PubMed

    Murayama, Takashi; Maruyama, Ichiro N

    2015-11-01

    Animals can survive only within a narrow pH range. This requires continual monitoring of environmental and body-fluid pH. Although a variety of acidic pH sensor molecules have been reported, alkaline pH sensor function is not well understood. This Review describes neuronal alkaline pH sensors, grouped according to whether they monitor extracellular or intracellular alkaline pH. Extracellular sensors include the receptor-type guanylyl cyclase, the insulin receptor-related receptor, ligand-gated Cl- channels, connexin hemichannels, two-pore-domain K+ channels, and transient receptor potential (TRP) channels. Intracellular sensors include TRP channels and gap junction channels. Identification of molecular mechanisms underlying alkaline pH sensing is crucial for understanding how animals respond to environmental alkaline pH and how body-fluid pH is maintained within a narrow range. © 2015 Wiley Periodicals, Inc.

  14. Re-assessing the flood risk in Scotland.

    PubMed

    Black, Andrew R; Burns, John C

    2002-07-22

    This paper presents a review of changes in flood risk estimation on Scottish rivers resulting from re-analysis of flood records or from the application of new methods. The review arises at a time when flood damages have received recent prominence through the occurrence of a number of extreme floods in Scotland, and when the possible impacts of climate change on flood risk are receiving considerable attention. An analysis of the nine longest available peaks-over-threshold (POT) flood series for Scottish rivers reveals that, for thresholds yielding two events per year on average, annual POT frequencies on western rivers have increased in the 1980s/1990s to maximum recorded values, while in the east, values were highest in the 1950s/1960s. These results support the results of flood modelling work based on rainfall and temperature records from the 1870s, which indicate that, in western catchments, annual POT frequencies in the 1980s/1990s are unprecedented. No general trends in flood magnitude series were found, but an unexpected cluster of extreme floods is identified as having occurred since 1988, resulting in eight of Scotland's 16 largest gauged rivers producing their maximum recorded flows since then. These shifts are related to recent increases in the dominance of westerly airflows, share similarities with the results of climate change modelling, and collectively point to increases in flood risk in many parts of Scotland. The paper also reviews advances in flood risk estimation arising from the publication of the UK Flood Estimation Handbook, developments in the collection and use of historical flood data, and the production of maps of 100-year flood areal extent. Finally the challenges in flood risk estimation posed by climate change are examined, particularly in relation to the assumption of stationarity.

  15. Technique for estimating depth of floods in Tennessee

    USGS Publications Warehouse

    Gamble, C.R.

    1983-01-01

    Estimates of flood depths are needed for design of roadways across flood plains and for other types of construction along streams. Equations for estimating flood depths in Tennessee were derived using data for 150 gaging stations. The equations are based on drainage basin size and can be used to estimate depths of the 10-year and 100-year floods for four hydrologic areas. A method also was developed for estimating depth of floods having recurrence intervals between 10 and 100 years. Standard errors range from 22 to 30 percent for the 10-year depth equations and from 23 to 30 percent for the 100-year depth equations. (USGS)
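
    Depth equations of this kind are usually derived by regressing log depth on log drainage area across gauged sites. The sketch below shows that fitting step with hypothetical station data; the resulting coefficients are illustrative only and are not the report's published equations.

    ```python
    import numpy as np

    def fit_power_law(area_km2, depth_m):
        """Fit depth = a * area**b by linear regression in log10 space."""
        x, y = np.log10(area_km2), np.log10(depth_m)
        b, log_a = np.polyfit(x, y, 1)
        return 10 ** log_a, b

    # Hypothetical 100-year flood depths at gauged sites
    area = np.array([12, 35, 80, 150, 420, 900, 2300], dtype=float)   # km2
    depth = np.array([1.8, 2.4, 3.1, 3.6, 4.9, 5.8, 7.4])             # m
    a, b = fit_power_law(area, depth)
    print(f"depth ~ {a:.2f} * A^{b:.2f}; predicted at 500 km2: {a * 500**b:.1f} m")
    ```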

  16. Characterization of remarkable floods in France, a transdisciplinary approach applied on generalized floods of January 1910

    NASA Astrophysics Data System (ADS)

    Boudou, Martin; Lang, Michel; Vinet, Freddy; Coeur, Denis

    2014-05-01

    The January 1910 flood is one of these remarkable floods. This event is foremost known for its aftermath in the Seine basin, where the flood remains the strongest recorded in Paris since 1658. However, its impacts also extended widely into the eastern regions of France (Martin, 2001). To demonstrate the interest of the evaluation grid, we propose an in-depth analysis of the January 1910 river flood with the integration of historical documentation. The approach focuses on eastern France, where the flood remains the highest recorded for several rivers but has often been neglected by scientists in favour of the Paris flood. Through transdisciplinary research based on the evaluation grid method, we describe the January 1910 flood event and define why it can be considered a remarkable flood for these regions.

  17. Methods of Knowledge Exchange and Learning Focused on Local Authorities' Experiences of Flood Science Communication

    ERIC Educational Resources Information Center

    Stokes, Alison; Roberts, Carolyn; Crowley, Kate; McEwen, Lindsey

    2015-01-01

    Devastating floods in 2007 across western England resulted in legislative changes which have placed increased responsibility on local government for managing and mitigating local flood risk. For these changes to be effective, professional stakeholders need to understand fundamental concepts in flood science of which they may have no prior…

  18. Flood inundation map library, Fort Kent, Maine

    USGS Publications Warehouse

    Lombard, Pamela J.

    2012-01-01

    Severe flooding occurred in northern Maine from April 28 to May 1, 2008, and damage was extensive in the town of Fort Kent (Lombard, 2010). Aroostook County was declared a Federal disaster area on May 9, 2008. The extent of flooding on both the Fish and St. John Rivers during this event showed that the current Federal Emergency Management Agency (FEMA) Flood Insurance Study (FIS) and Flood Insurance Rate Map (FIRM) (Federal Emergency Management Agency, 1979) were out of date. The U.S. Geological Survey (USGS) conducted a study to develop a flood inundation map library showing the areas and depths for a range of flood stages from bankfull to the flood of record for Fort Kent to complement an updated FIS (Federal Emergency Management Agency, in press). Hydrologic analyses that support the maps include computer models with and without the levee and with various depths of backwater on the Fish River. This fact sheet describes the methods used to develop the maps and describes how the maps can be accessed.

  19. Understanding the effects of past flood events and perceived and estimated flood risks on individuals' voluntary flood insurance purchase behavior.

    PubMed

    Shao, Wanyun; Xian, Siyuan; Lin, Ning; Kunreuther, Howard; Jackson, Nida; Goidel, Kirby

    2017-01-01

    Over the past several decades, the economic damage from flooding in coastal areas has greatly increased due to rapid coastal development coupled with possible climate change impacts. One effective way to mitigate excessive economic losses from flooding is to purchase flood insurance. Only a minority of coastal residents, however, have taken this preventive measure. Using original survey data for all coastal counties of the United States Gulf Coast merged with contextual data, this study examines the effects of external influences and perceptions of flood-related risks on individuals' voluntary behaviors to purchase flood insurance. It is found that the estimated flood hazard conveyed through the U.S. Federal Emergency Management Agency's (FEMA's) flood maps, the intensities and consequences of past storms and flooding events, and perceived flood-related risks significantly affect individuals' voluntary purchase of flood insurance. This behavior is also influenced by home ownership, trust in local government, education, and income. These findings have several important policy implications. First, FEMA's flood maps have been effective in conveying local flood risks to coastal residents, and correspondingly influencing their decisions to voluntarily seek flood insurance in the U.S. Gulf Coast. Flood maps therefore should be updated frequently to reflect timely and accurate information about flood hazards. Second, policy makers should design strategies to increase homeowners' trust in local government, to better communicate flood risks to residents, to address the affordability issue for low-income households, and to better inform less-educated homeowners through various educational programs. Future studies should examine voluntary flood insurance behavior across countries that are vulnerable to flooding. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Flood hydrology for Dry Creek, Lake County, Northwestern Montana

    USGS Publications Warehouse

    Parrett, C.; Jarrett, R.D.

    2004-01-01

    Dry Creek drains about 22.6 square kilometers of rugged mountainous terrain upstream from Tabor Dam in the Mission Range near St. Ignatius, Montana. Because of uncertainty about plausible peak discharges and concerns regarding the ability of the Tabor Dam spillway to safely convey these discharges, the flood hydrology for Dry Creek was evaluated on the basis of three hydrologic and geologic methods. The first method involved determining an envelope line relating flood discharge to drainage area on the basis of regional historical data and calculating a 500-year flood for Dry Creek using a regression equation. The second method involved paleoflood methods to estimate the maximum plausible discharge for 35 sites in the study area. The third method involved rainfall-runoff modeling for the Dry Creek basin in conjunction with regional precipitation information to determine plausible peak discharges. All of these methods resulted in estimates of plausible peak discharges that are substantially less than those predicted by the more generally applied probable maximum flood technique. Copyright ASCE 2004.

  1. Dynamic building risk assessment theoretical model for rainstorm floods utilizing ABM and ABS

    NASA Astrophysics Data System (ADS)

    Lai, Wenze; Li, Wenbo; Wang, Hailei; Huang, Yingliang; Wu, Xuelian; Sun, Bingyun

    2015-12-01

    Floods are among the natural disasters that cause the greatest losses in the world. Flood disaster risk therefore needs to be assessed so that flood losses can be reduced. Practical disaster management requires dynamic building-level risk results. A rainstorm flood disaster system is a typical complex system. From the viewpoint of complex system theory, flood disaster risk is the result of the interaction of hazard-affected objects, rainstorm-flood hazard factors, and hazard environments. Agent-based modeling (ABM) is an important tool for complex system modeling. A rainstorm-flood building risk dynamic assessment method (RFBRDAM) using ABM is proposed in this paper. The internal structures and procedures of the different agents in the proposed method were designed. The proposed method was implemented on the NetLogo platform to assess building risk changes over the whole rainstorm flood disaster process in the Huaihe River Basin using agent-based simulation (ABS). The results indicate that the proposed method can dynamically assess building risk over the entire course of a rainstorm flood disaster. The results of this paper provide a new approach for dynamic assessment of building risk in flood disasters and for flood disaster management.

  2. Flood risk (d)evolution: Disentangling key drivers of flood risk change with a retro-model experiment.

    PubMed

    Zischg, Andreas Paul; Hofer, Patrick; Mosimann, Markus; Röthlisberger, Veronika; Ramirez, Jorge A; Keiler, Margreth; Weingartner, Rolf

    2018-05-19

    Flood risks are dynamically changing over time. Over decades and centuries, the main drivers for flood risk change are influenced either by perturbations or slow alterations in the natural environment or, more importantly, by socio-economic development and human interventions. However, changes in the natural and human environment are intertwined. Thus, the analysis of the main drivers for flood risk changes requires a disentangling of the individual risk components. Here, we present a method for isolating the individual effects of selected drivers of change and selected flood risk management options based on a model experiment. In contrast to purely synthetic model experiments, we built our analyses upon a retro-model consisting of several spatio-temporal stages of river morphology and settlement structure. The main advantage of this approach is that the overall long-term dynamics are known and do not have to be assumed. We used this model setup to analyse the temporal evolution of the flood risk, for an ex-post evaluation of the key drivers of change, and for analysing possible alternative pathways for flood risk evolution under different governance settings. We showed that in the study region the construction of lateral levees and the consecutive river incision are the main drivers for decreasing flood risks over the last century. A rebound effect in flood risk can be observed following an increase in settlements since the 1960s. This effect is not as relevant as the river engineering measures, but it will become increasingly relevant in the future with continued socio-economic growth. The presented approach could provide a methodological framework for studying pathways for future flood risk evolvement and for the formulation of narratives for adapting governmental flood risk strategies to the spatio-temporal dynamics in the built environment. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Evaluation of various parameters of calcium-alginate immobilization method for enhanced alkaline protease production by Bacillus licheniformis NCIM-2042 using statistical methods.

    PubMed

    Potumarthi, Ravichandra; Subhakar, Ch; Pavani, A; Jetty, Annapurna

    2008-04-01

    The calcium-alginate immobilization method for the production of alkaline protease by Bacillus licheniformis NCIM-2042 was optimized statistically. Four variables, namely sodium alginate concentration, calcium chloride concentration, inoculum size and agitation speed, were optimized by a 2^4 full factorial central composite design, with subsequent analysis and model validation by a second-order regression equation. Eleven carbon, eleven organic nitrogen and seven inorganic nitrogen sources were screened by a two-level Plackett-Burman design for maximum alkaline protease production using the optimized immobilization conditions. The optimum levels of the four variables for maximal protease production were found to be Na-alginate, 2.78%; CaCl2, 2.15%; inoculum size, 8.10%; and agitation, 139 rpm. Glucose, soybean meal and ammonium sulfate resulted in maximum protease production of 644 U/ml, 720 U/ml, and 806 U/ml when screened as carbon, organic nitrogen and inorganic nitrogen sources, respectively, using the optimized immobilization conditions. Repeated fed-batch operation under the optimized immobilization conditions allowed continuous operation for 12 cycles without disintegration of the beads. Cross-sectional scanning electron microscope images showed the growth pattern of B. licheniformis in the Ca-alginate immobilized beads.
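
    A central composite design is normally analysed by fitting a second-order (quadratic) regression model to the observed responses. The sketch below fits such a model by least squares for two coded factors only; the runs, responses and factor labels are hypothetical and are not the study's data.

    ```python
    import numpy as np

    def quadratic_design_matrix(x1, x2):
        """Second-order model terms for two coded factors:
        intercept, linear, interaction, and squared terms."""
        return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

    # Hypothetical coded CCD runs for two factors (e.g. Na-alginate %, agitation speed)
    x1 = np.array([-1, -1,  1,  1, -1.41, 1.41,  0,     0,    0, 0, 0])
    x2 = np.array([-1,  1, -1,  1,  0,    0,    -1.41,  1.41, 0, 0, 0])
    y  = np.array([520, 560, 600, 640, 500, 610, 530, 620, 700, 695, 705])  # protease, U/ml

    X = quadratic_design_matrix(x1, x2)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)      # least-squares fit of the model
    x_new = quadratic_design_matrix(np.array([0.3]), np.array([0.2]))
    print("predicted activity at coded (0.3, 0.2):", (x_new @ coef).item())
    ```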

  4. A low-cost method to measure the timing of post-fire flash floods and debris flows relative to rainfall

    USGS Publications Warehouse

    Kean, Jason W.; Staley, Dennis M.; Leeper, Robert J.; Schmidt, Kevin Michael; Gartner, Joseph E.

    2012-01-01

    Data on the specific timing of post-fire flash floods and debris flows are very limited. We describe a method to measure the response times of small burned watersheds to rainfall using a low-cost pressure transducer, which can be installed quickly after a fire. Although the pressure transducer is not designed for sustained sampling at the fast rates (≤2 sec) used at more advanced debris-flow monitoring sites, comparisons with high-data rate stage data show that measured spikes in pressure sampled at 1-min intervals are sufficient to detect the passage of most debris flows and floods. Post-event site visits are used to measure the peak stage and identify flow type based on deposit characteristics. The basin response timescale (tb) to generate flow at each site was determined from an analysis of the cross correlation between time series of flow pressure and 5-min rainfall intensity. This timescale was found to be less than 30 minutes for 40 post-fire floods and 11 post-fire debris flows recorded in 15 southern California watersheds (≤1.4 km2). Including data from 24 other debris flows recorded at 5 more instrumentally advanced monitoring stations, we find there is not a substantial difference in the median tb for floods and debris flows (11 and 9 minutes, respectively); however, there are slight, statistically significant differences in the trends of flood and debris-flow tb with basin area, which are presumably related to differences in flow speed between floods and debris flows.
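
    The basin response timescale tb can be estimated as the lag that maximizes the cross-correlation between 5-min rainfall intensity and the flow-pressure series. A minimal sketch with hypothetical, synthetic series follows; it is not the authors' processing code.

    ```python
    import numpy as np

    def response_time_minutes(rain, pressure, step_min=5, max_lag_steps=24):
        """Return the lag (minutes) at which the flow-pressure series best
        correlates with earlier rainfall, plus the correlation at that lag."""
        lags = range(max_lag_steps + 1)
        corr = [np.corrcoef(rain[:len(rain) - k], pressure[k:])[0, 1] for k in lags]
        best = int(np.argmax(corr))
        return best * step_min, corr[best]

    # Hypothetical 5-min series: a rain burst followed ~20 minutes later by a pressure rise
    rain = np.zeros(100); rain[20:24] = [8, 15, 10, 4]                       # mm/h intensity
    pressure = np.zeros(100); pressure[22:30] = [0.1, 0.6, 1.4, 1.8, 1.2, 0.7, 0.3, 0.1]

    lag_min, r = response_time_minutes(rain, pressure)
    print(f"basin response time ~ {lag_min} min (r = {r:.2f})")
    ```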

  5. Unexpected flood loss correlations across Europe

    NASA Astrophysics Data System (ADS)

    Booth, Naomi; Boyd, Jessica

    2017-04-01

    Floods don't observe country borders, as highlighted by major events across Europe that resulted in heavy economic and insured losses in 1999, 2002, 2009 and 2013. Flood loss correlations between some countries occur along multi-country river systems or between neighbouring nations affected by the same weather systems. However, correlations are not so obvious and whilst flooding in multiple locations across Europe may appear independent, for a re/insurer providing cover across the continent, these unexpected correlations can lead to high loss accumulations. A consistent, continental-scale method that allows quantification and comparison of losses, and identifies correlations in loss between European countries is therefore essential. A probabilistic model for European river flooding was developed that allows estimation of potential losses to pan-European property portfolios. By combining flood hazard and exposure information in a catastrophe modelling platform, we can consider correlations between river basins across Europe rather than being restricted to country boundaries. A key feature of the model is its statistical event set based on extreme value theory. Using historical river flow data, the event set captures spatial and temporal patterns of flooding across Europe and simulates thousands of events representing a full range of possible scenarios. Some known correlations were identified, such as between neighbouring Belgium and Luxembourg where 28% of events that affect either country produce a loss in both. However, our model identified some unexpected correlations including between Austria and Poland, and Poland and France, which are geographically distant. These correlations in flood loss may be missed by traditional methods and are key for re/insurers with risks in multiple countries. The model also identified that 46% of European river flood events affect more than one country. For more extreme events with a return period higher than 200 years, all events

  6. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory: evaluation of alkaline persulfate digestion as an alternative to Kjeldahl digestion for determination of total and dissolved nitrogen and phosphorus in water

    USGS Publications Warehouse

    Patton, Charles J.; Kryskalla, Jennifer R.

    2003-01-01

    Alkaline persulfate digestion was evaluated and validated as a more sensitive, more accurate, and less toxic alternative to Kjeldahl digestion for routine determination of nitrogen and phosphorus in surface- and ground-water samples in a large-scale and geographically diverse study conducted by the U.S. Geological Survey (USGS) between October 1, 2001, and September 30, 2002. Data for this study were obtained from about 2,100 surface- and ground-water samples that were analyzed for Kjeldahl nitrogen and Kjeldahl phosphorus in the course of routine operations at the USGS National Water Quality Laboratory (NWQL). These samples were analyzed independently for total nitrogen and total phosphorus using an alkaline persulfate digestion method developed by the NWQL Methods Research and Development Program. About half of these samples were collected during nominally high-flow (April-June) conditions and the other half were collected during nominally low-flow (August-September) conditions. The number of filtered and whole-water samples analyzed from each flow regime was about equal. By operational definition, Kjeldahl nitrogen (ammonium + organic nitrogen) and alkaline persulfate digestion total nitrogen (ammonium + nitrite + nitrate + organic nitrogen) are not equivalent. It was necessary, therefore, to reconcile this operational difference by subtracting nitrate + nitrite concentrations from alkaline persulfate dissolved and total nitrogen concentrations prior to graphical and statistical comparisons with dissolved and total Kjeldahl nitrogen concentrations. On the basis of two-population paired t-test statistics, the mean differences between nitrate-corrected alkaline persulfate nitrogen and Kjeldahl nitrogen concentrations (2,066 paired results) were significantly different from zero at the p = 0.05 level. Statistically, the means of Kjeldahl nitrogen concentrations were greater than those of nitrate-corrected alkaline persulfate nitrogen concentrations. Experimental evidence strongly
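
    A minimal sketch of the paired comparison described above, using invented concentrations (mg/L as N): the persulfate total nitrogen is first corrected by subtracting nitrate + nitrite, then a paired t-test checks whether the mean difference from Kjeldahl nitrogen is zero.

      import numpy as np
      from scipy import stats

      # Hypothetical paired results for a handful of samples (mg/L as N)
      kjeldahl_n           = np.array([0.52, 0.61, 0.48, 0.75, 0.66, 0.58])
      persulfate_total_n   = np.array([0.95, 0.88, 0.71, 1.10, 0.99, 0.80])
      nitrate_plus_nitrite = np.array([0.45, 0.30, 0.25, 0.38, 0.36, 0.24])

      # Reconcile the operational definitions before comparing
      persulfate_corrected = persulfate_total_n - nitrate_plus_nitrite

      t_stat, p_value = stats.ttest_rel(kjeldahl_n, persulfate_corrected)
      mean_diff = np.mean(kjeldahl_n - persulfate_corrected)
      print(f"mean paired difference = {mean_diff:.3f} mg/L, p = {p_value:.3f}")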

  7. SIMULATION OF FLOOD HYDROGRAPHS FOR GEORGIA STREAMS.

    USGS Publications Warehouse

    Inman, E.J.; Armbruster, J.T.

    1986-01-01

    Flood hydrographs are needed for the design of many highway drainage structures and embankments. A method for simulating these flood hydrographs at urban and rural ungauged sites in Georgia is presented. The O'Donnell method was used to compute unit hydrographs from 355 flood events from 80 stations. An average unit hydrograph and an average lag time were computed for each station. These average unit hydrographs were transformed to unit hydrographs having durations of one-fourth, one-third, one-half, and three-fourths lag time and then reduced to dimensionless terms by dividing the time by lag time and the discharge by peak discharge. Hydrographs were simulated for these 355 flood events and their widths were compared with the widths of the observed hydrographs at 50 and 75 percent of peak flow. For simulating hydrographs at sites larger than 500 mi², the U.S. Geological Survey computer model CONROUT can be used.
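
    A minimal sketch of the dimensionless reduction described above (illustrative numbers, not the Georgia data): time is divided by the station's lag time and discharge by the peak discharge.

      import numpy as np

      def to_dimensionless(time_hr, discharge_cfs, lag_time_hr):
          """Reduce an average unit hydrograph to dimensionless form: t/lag and Q/Qpeak."""
          t_star = np.asarray(time_hr, dtype=float) / lag_time_hr
          q_star = np.asarray(discharge_cfs, dtype=float) / np.max(discharge_cfs)
          return t_star, q_star

      # Hypothetical average unit hydrograph for one station with a 3-hour lag time
      t = [0, 1, 2, 3, 4, 6, 8, 12]
      q = [0, 120, 340, 410, 300, 150, 60, 0]
      print(to_dimensionless(t, q, lag_time_hr=3.0))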

  8. POTENTIALLY PATHOGENIC FREE-LIVING AMOEBAE IN SOME FLOOD-AFFECTED AREAS DURING 2011 CHIANG MAI FLOOD

    PubMed Central

    Wannasan, Anchalee; Uparanukraw, Pichart; Songsangchun, Apichart; Morakote, Nimit

    2013-01-01

    The survey was carried out to investigate the presence of potentially pathogenic free-living amoebae (FLA) during the 2011 flood in Chiang Mai, Thailand. From different crisis flood areas, seven water samples were collected and tested for the presence of amoebae using culture and molecular methods. By monoxenic culture, FLA were detected in all samples at 37 °C incubation. The FLA growing at 37 °C were morphologically identified as Acanthamoeba spp., Naegleria spp. and some unidentified amoebae. Only three samples (42.8%), defined as thermotolerant FLA, continued to grow at 42 °C. By molecular methods, two non-thermotolerant FLA were shown to have 99% identity to Acanthamoeba sp. and 98% identity to Hartmannella vermiformis, while the two thermotolerant FLA were identified as Echinamoeba exundans (100% identity) and Hartmannella sp. (99% identity). This first report of the occurrence of FLA in water during the flood disaster provides information to make the public aware of potentially pathogenic FLA. PMID:24213194

  9. Potentially pathogenic free-living amoebae in some flood-affected areas during 2011 Chiang Mai flood.

    PubMed

    Wannasan, Anchalee; Uparanukraw, Pichart; Songsangchun, Apichart; Morakote, Nimit

    2013-01-01

    The survey was carried out to investigate the presence of potentially pathogenic free-living amoebae (FLA) during the 2011 flood in Chiang Mai, Thailand. From different crisis flood areas, seven water samples were collected and tested for the presence of amoebae using culture and molecular methods. By monoxenic culture, FLA were detected in all samples at 37 °C incubation. The FLA growing at 37 °C were morphologically identified as Acanthamoeba spp., Naegleria spp. and some unidentified amoebae. Only three samples (42.8%), defined as thermotolerant FLA, continued to grow at 42 °C. By molecular methods, two non-thermotolerant FLA were shown to have 99% identity to Acanthamoeba sp. and 98% identity to Hartmannella vermiformis, while the two thermotolerant FLA were identified as Echinamoeba exundans (100% identity) and Hartmannella sp. (99% identity). This first report of the occurrence of FLA in water during the flood disaster provides information to make the public aware of potentially pathogenic FLA.

  10. Floods in mountain environments: A synthesis

    NASA Astrophysics Data System (ADS)

    Stoffel, Markus; Wyżga, Bartłomiej; Marston, Richard A.

    2016-11-01

    of mountain rivers, but morphological changes of rivers can also affect hydrological properties of floods and the associated risk for societies. This paper provides a review of research in the field of floods in mountain environments and puts the papers of this special issue dedicated to the same topic into context. It also provides insight into innovative studies, methods, or emerging aspects of the relations between environmental changes, geomorphic processes, and the occurrence of floods in mountain rivers.

  11. Advanced inorganic separators for alkaline batteries and method of making the same

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W. (Inventor)

    1983-01-01

    A flexible, porous battery separator includes a coating applied to a porous, flexible substrate. The coating comprises: (1) a thermoplastic rubber-based resin which is insoluble and unreactive in the alkaline electrolyte, (2) a polar organic plasticizer which is reactive with the alkaline electrolyte to produce a reaction product which contains a hydroxyl group and/or a carboxylic acid group, and (3) a mixture of polar particulate filler materials which are unreactive with the electrode. The mixture comprises at least one first filler material having a surface area of greater than 25 sq meters/gram and at least one second filler material having a surface area of 10 to 25 sq meters/gram. The volume of the mixture of filler materials is less than 45% of the total volume of the fillers and the binder. The filler surface area per gram of binder is about 20 to 60 sq meters/gram, and the amount of plasticizer is sufficient to coat each filler particle.

  12. A Fresh Start for Flood Estimation in Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Woods, R. A.

    2017-12-01

    The two standard methods for flood estimation in ungauged basins, regression-based statistical models and rainfall-runoff models using a design rainfall event, have survived relatively unchanged as the methods of choice for more than 40 years. Their technical implementation has developed greatly, but the models' representation of hydrological processes has not, despite a large volume of hydrological research. I suggest it is time to introduce more hydrology into flood estimation. The reliability of the current methods can be unsatisfactory. For example, despite the UK's relatively straightforward hydrology, regression estimates of the index flood are uncertain by +/- a factor of two (for a 95% confidence interval), an impractically large uncertainty for design. The standard error of rainfall-runoff model estimates is not usually known, but available assessments indicate poorer reliability than statistical methods. There is a practical need for improved reliability in flood estimation. Two promising candidates to supersede the existing methods are (i) continuous simulation by rainfall-runoff modelling and (ii) event-based derived distribution methods. The main challenge with continuous simulation methods in ungauged basins is to specify the model structure and parameter values when calibration data are not available. This has been an active area of research for more than a decade, and this activity is likely to continue. The major challenges for the derived distribution method in ungauged catchments include not only the correct specification of model structure and parameter values, but also antecedent conditions (e.g. seasonal soil water balance). However, a much smaller community of researchers is active in developing or applying the derived distribution approach, and as a result slower progress is being made. A change is needed: surely we have learned enough about hydrology in the last 40 years that we can make a practical hydrological advance on our methods for

  13. Flood-rich and flood-poor periods in Spain in 1942-2009

    NASA Astrophysics Data System (ADS)

    Mediero, Luis; Santillán, David; Garrote, Luis

    2016-04-01

    Several studies to detect trends in flood series at either national or trans-national scales have been conducted. Mediero et al. (2015) studied flood trends by using the longest streamflow records available in Europe. They found a decreasing trend in the Atlantic, Continental and Scandinavian regions. More specifically, Mediero et al. (2014) found a general decreasing trend in flood series in Spain in the period 1959-2009. Trends in flood series are usually detected by the Mann-Kendall test applied to a given period. However, the result of the Mann-Kendall test can change depending on the starting and ending year of the series. Flood oscillations can occur, and flood-rich and flood-poor periods could condition the results, especially when they are located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods when stationarity is assumed and the observed variability of floods in a given series. The methodology is applied to the longest series of annual maximum floods, peaks over threshold and counts of annual occurrences in peaks-over-threshold series observed in Spain in the period 1942-2009. A flood-rich period in 1950-1970 and a flood-poor period in 1970-1990 are identified in most of the selected sites. The generalised decreasing trend in flood series found by Mediero et al. (2014) could be explained by a flood-rich period placed at the beginning of the series and a flood-poor period located at the end of the series. References: Mediero, L., Kjeldsen, T.R., Macdonald, N., Kohnova, S., Merz, B., Vorogushyn, S., Wilson, D., Alburquerque, T., Blöschl, G., Bogdanowicz, E., Castellarin, A., Hall, J., Kobold, M., Kriauciuniene, J., Lang, M., Madsen, H., Onuşluel Gül, G., Perdigão, R.A.P., Roald, L.A., Salinas, J.L., Toumazis, A.D., Veijalainen, N., Óðinn Þórarinsson. Identification of coherent flood
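
    A minimal sketch of the standard Mann-Kendall trend test mentioned above (without tie correction), illustrating how the verdict can depend on the window analysed; the series is invented, and this is not the flood-rich/flood-poor detection methodology the authors develop.

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(x):
          """Two-sided Mann-Kendall trend test (no tie correction) for an annual flood series.
          Returns the S statistic, the standardised Z score and the two-sided p-value."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)
          p = 2 * (1 - norm.cdf(abs(z)))
          return s, z, p

      # A flood-rich start followed by a flood-poor tail: the full record trends downward,
      # while the early window alone suggests an increase
      record = np.array([310, 280, 450, 520, 610, 590, 480, 300, 260, 240, 230, 210, 220, 200])
      print(mann_kendall(record))
      print(mann_kendall(record[:6]))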

  14. The effects of floodplain forest restoration and logjams on flood risk and flood hydrology

    NASA Astrophysics Data System (ADS)

    Dixon, Simon; Sear, David A.; Sykes, Tim; Odoni, Nicholas

    2015-04-01

    Flooding is the most common natural catastrophe, accounting for around half of all natural disaster related deaths and causing economic losses in Europe estimated at over € 2bn per year. In addition, flooding is expected to increase in magnitude and frequency with climate change, effectively shortening the return period for a given magnitude flood. Increasing the height and extent of hard engineered defences in response to increased risk is both unsustainable and undesirable. Thus alternative approaches to flood mitigation are needed, such as harnessing vegetation processes to slow the passage of flood waves and increase local flood storage. However, our understanding of these effects at the catchment scale is limited. In this presentation we demonstrate the effects of two river restoration approaches upon catchment scale flood hydrology. The addition of large wood to river channels during river restoration projects is a popular method of attempting to improve physical and biological conditions in degraded river systems. Projects utilising large wood can involve the installation of engineered logjams (ELJs), the planting and enhancement of riparian forests, or a combination of both. Altering the wood loading of a channel through installation of ELJs and increasing floodplain surface complexity through encouraging mature woodland could be expected to increase the local hydraulic resistance, increasing the timing and duration of overbank events locally and therefore increasing the travel time of a flood wave through a reach. This reach-scale effect has been documented in models and the field; however, the impacts of these local changes at a catchment scale remain to be illustrated. Furthermore, there is limited knowledge of how changing successional stages of a restored riparian forest through time may affect its influence on hydromorphic processes. We present results of a novel paired numerical modelling study. We model changes in flood hydrology based on a 98km

  15. Do Natural Disasters Affect Voting Behavior? Evidence from Croatian Floods

    PubMed Central

    Bovan, Kosta; Banai, Benjamin; Pavela Banai, Irena

    2018-01-01

    Introduction: Studies show that natural disasters influence voters’ perception of incumbent politicians. To investigate whether voters are prone to punish politicians for events that are out of their control, this study was conducted in the previously unstudied context of Croatia, and by considering some of the methodological issues of previous studies. Method: A matching technique was used, which ensures that affected and non-affected areas are matched on several control variables. The cases of natural disaster in the present study were floods that affected Croatia in 2014 and 2015. Results: Main results showed that, prior to matching, floods had an impact on voting behaviour in the 2014 and 2015 elections. Voters from flooded areas decreased their support for the incumbent government and president in the elections following the floods. However, once we accounted for differences in control variables between flooded and non-flooded areas, the flood effect disappeared. Furthermore, results showed that neither the presence nor the amount of the government’s relief spending had an impact on voting behaviour. Discussion: The presented results imply that floods did not have an impact on the election outcome. Results are interpreted in light of the retrospective voter model. PMID:29770268

  16. The impact of flood and post-flood cleaning on airborne microbiological and particle contamination in residential houses.

    PubMed

    He, Congrong; Salonen, Heidi; Ling, Xuan; Crilley, Leigh; Jayasundara, Nadeesha; Cheung, Hing Cho; Hargreaves, Megan; Huygens, Flavia; Knibbs, Luke D; Ayoko, Godwin A; Morawska, Lidia

    2014-08-01

    In January 2011, Brisbane, Australia, experienced a major river flooding event. We aimed to investigate its effects on air quality and assess the role of prompt cleaning activities in reducing the airborne exposure risk. A comprehensive, multi-parameter indoor and outdoor measurement campaign was conducted in 41 residential houses, 2 and 6 months after the flood. The median indoor air concentrations of supermicrometer particle number (PN), PM10, fungi and bacteria 2 months after the flood were comparable to those previously measured in Brisbane. These were 2.88 p cm(-3), 15 μg m(-3), 804 cfu m(-3) and 177 cfu m(-3) for flood-affected houses (AFH), and 2.74 p cm(-3), 15 μg m(-3), 547 cfu m(-3) and 167 cfu m(-3) for non-affected houses (NFH), respectively. The I/O (indoor/outdoor) ratios of these pollutants were 1.08, 1.38, 0.74 and 1.76 for AFH and 1.03, 1.32, 0.83 and 2.17 for NFH, respectively. The average of total elements (together with transition metals) in indoor dust was 2296 ± 1328 μg m(-2) for AFH and 1454 ± 678 μg m(-2) for NFH, respectively. In general, the differences between AFH and NFH were not statistically significant, implying the absence of a measurable effect on air quality from the flood. We postulate that this was due to the very swift and effective cleaning of the flooded houses by 60,000 volunteers. Among the various cleaning methods, the use of both detergent and bleach was the most efficient at controlling indoor bacteria. All cleaning methods were equally effective for indoor fungi. This study provides quantitative evidence of the significant impact of immediate post-flood cleaning on mitigating the effects of flooding on indoor bioaerosol contamination and other pollutants. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Hurricane coastal flood analysis using multispectral spectral images

    NASA Astrophysics Data System (ADS)

    Ogashawara, I.; Ferreira, C.; Curtarelli, M. P.

    2013-12-01

    Flooding is one of the main hazards caused by extreme events such as hurricanes and tropical storms. Therefore, flood maps are a crucial tool to support policy makers, environmental managers and other government agencies for emergency management, disaster recovery and risk reduction planning. However, traditional flood mapping methods rely heavily on the interpolation of hydrodynamic model results and, most recently, the extensive collection of field data. These methods are time-consuming, labor intensive, and costly. Efficient and fast-response alternative methods should be developed in order to improve flood mapping, and remote sensing has been proven a valuable tool for this application. Our goal in this paper is to introduce a novel technique based on spectral analysis in order to aggregate knowledge and information to map coastal flood areas. For this purpose we used the Normalized Difference Water Index (NDWI), which was derived from two medium-resolution LANDSAT/TM 5 surface reflectance products from the LANDSAT climate data record (CDR). This product is generated with specialized software called the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS). We used the surface reflectance products acquired before and after the passage of Hurricane Ike over East Texas in September of 2008. As an end member we used a classification of the estimated flooded area based on the United States Geological Survey (USGS) mobile storm surge network that was deployed for Hurricane Ike; the dataset consisted of 59 water-level recording stations. The estimated flooded area was delineated by interpolating the maximum surge at each location using a spline-with-barriers method with high tension and a 30 meter Digital Elevation Model (DEM) from the National Elevation Dataset (NED). Our results showed that, in the flooded area, the NDWI values decreased after the hurricane landfall on average from 0.38 to 0.18 and the median value decreased from 0.36 to 0.2. However
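
    A minimal sketch of an NDWI change comparison between a pre- and post-event scene. The abstract does not state which band combination was used, so the sketch assumes the common green/NIR (McFeeters) formulation; the reflectance values are invented.

      import numpy as np

      def ndwi(green, nir):
          """NDWI = (Green - NIR) / (Green + NIR), computed per pixel from surface reflectance."""
          green = np.asarray(green, dtype=float)
          nir = np.asarray(nir, dtype=float)
          return (green - nir) / (green + nir + 1e-12)   # epsilon guards against division by zero

      # Hypothetical 1x3 surface-reflectance tiles (TM band 2 = green, band 4 = NIR)
      green_pre,  nir_pre  = np.array([[0.09, 0.08, 0.10]]), np.array([[0.22, 0.20, 0.05]])
      green_post, nir_post = np.array([[0.08, 0.07, 0.09]]), np.array([[0.21, 0.04, 0.05]])

      delta_ndwi = ndwi(green_post, nir_post) - ndwi(green_pre, nir_pre)
      print(delta_ndwi)   # per-pixel NDWI change flags pixels whose surface-water signal shifted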

  18. Rethinking the relationship between flood risk perception and flood management.

    PubMed

    Birkholz, S; Muro, M; Jeffrey, P; Smith, H M

    2014-04-15

    Although flood risk perceptions and their concomitant motivations for behaviour have long been recognised as significant features of community resilience in the face of flooding events, there has, for some time now, been a poorly appreciated fissure in the accompanying literature. Specifically, rationalist and constructivist paradigms in the broader domain of risk perception provide different (though not always conflicting) contexts for interpreting evidence and developing theory. This contribution reviews the major constructs that have been applied to understanding flood risk perceptions and contextualises these within broader conceptual developments around risk perception theory and contemporary thinking around flood risk management. We argue that there is a need to re-examine and re-invigorate flood risk perception research, in a manner that is comprehensively underpinned by more constructivist thinking around flood risk management as well as by developments in broader risk perception research. We draw attention to an historical over-emphasis on the cognitive perceptions of those at risk, to the detriment of a richer understanding of a wider range of flood risk perceptions, such as those of policy-makers or of tax-payers who live outside flood-affected areas, as well as the linkages between these perspectives and protective measures such as state-supported flood insurance schemes. Conclusions challenge existing understandings of the relationship between risk perception and flood management, particularly where the latter relates to communication strategies and the extent to which those at risk from flooding feel responsible for taking protective actions. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Flood Risk, Flood Mitigation, and Location Choice: Evaluating the National Flood Insurance Program's Community Rating System.

    PubMed

    Fan, Qin; Davlasheridze, Meri

    2016-06-01

    Climate change is expected to worsen the negative effects of natural disasters like floods. The negative impacts, however, can be mitigated by individuals' adjustments through migration and relocation behaviors. Previous literature has identified flood risk as one significant driver in relocation decisions, but no prior study examines the effect of the National Flood Insurance Program's voluntary program-the Community Rating System (CRS)-on residential location choice. This article fills this gap and tests the hypothesis that flood risk and the CRS-creditable flood control activities affect residential location choices. We employ a two-stage sorting model to empirically estimate the effects. In the first stage, individuals' risk perception and preference heterogeneity for the CRS activities are considered, while mean effects of flood risk and the CRS activities are estimated in the second stage. We then estimate heterogeneous marginal willingness to pay (WTP) for the CRS activities by category. Results show that age, ethnicity and race, educational attainment, and prior exposure to risk explain risk perception. We find significant values for the CRS-creditable mitigation activities, which provides empirical evidence for the benefits associated with the program. The marginal WTP for an additional credit point earned for public information activities, including hazard disclosure, is found to be the highest. Results also suggest that water amenities dominate flood risk. Thus, high amenity values may increase exposure to flood risk, and flood mitigation projects should be strategized in coastal regions accordingly. © 2015 Society for Risk Analysis.

  20. Molecular epidemiology of Vibrio cholerae associated with flood in Brahamputra River valley, Assam, India.

    PubMed

    Bhuyan, Soubhagya K; Vairale, Mohan G; Arya, Neha; Yadav, Priti; Veer, Vijay; Singh, Lokendra; Yadava, Pramod K; Kumar, Pramod

    2016-06-01

    Cholera is often caused when drinking water is contaminated through environmental sources. In recent years, the drastic cholera epidemics in Odisha (2007) and Haiti (2010) were associated with natural disasters (flood and earthquake). Almost every year, the state of Assam, India, witnesses flooding in the Brahmaputra River valley during the reversal of the wind system (monsoon). This is often followed by outbreaks of diarrheal diseases including cholera. Besides the incidence of cholera outbreaks, there is a lack of experimental evidence for the prevalence of the bacterium in the aquatic environment and its association with cholera during/after floods in the state. A molecular surveillance during 2012-14 was carried out to study the prevalence, strain differentiation, and clonality of Vibrio cholerae in inland aquatic reservoirs flooded by the Brahmaputra River in Assam. Water samples were collected, filtered, and enriched in alkaline peptone water, followed by selective culturing on thiosulfate bile salt sucrose agar. Environmental isolates were identified as V. cholerae based on biochemical assays, followed by sero-grouping and detailed molecular characterization. The incidence of the presence of the bacterium in potable water sources was higher after the flood. Except for one O1 isolate, all of the strains were broadly grouped under non-O1/non-O139, whereas some of them did have cholera toxin (CT). Surprisingly, we noticed Haitian ctxB in two non-O1/non-O139 strains. MLST analyses based on the pyrH, recA and rpoA genes revealed clonality in the environmental strains. The isolates showed varying degrees of antimicrobial resistance, including to tetracycline and ciprofloxacin. The strains harbored the genetic elements SXT constins and integrons responsible for multidrug resistance. Genetic characterization is useful as phenotypic characters alone have proven to be unsatisfactory for strain discrimination. An assurance of safe drinking water, sanitation and monitoring of the aquatic reservoirs is of utmost importance for

  1. Magnitude and frequency of floods in Arkansas

    USGS Publications Warehouse

    Hodge, Scott A.; Tasker, Gary D.

    1995-01-01

    Methods are presented for estimating the magnitude and frequency of peak discharges of streams in Arkansas. Regression analyses were developed that relate a stream's flood characteristics to its physical characteristics. Four sets of regional regression equations were derived to predict peak discharges with selected recurrence intervals of 2, 5, 10, 25, 50, 100, and 500 years on streams draining less than 7,770 square kilometers. The regression analyses indicate that size of drainage area, main channel slope, mean basin elevation, and the basin shape factor were the most significant basin characteristics that affect magnitude and frequency of floods. The region of influence method is also included in this report. This method is still being improved and is to be considered only as a second alternative to the standard method of producing regional regression equations. The method estimates unique regression equations for each recurrence interval at each ungaged site. The regression analyses indicate that size of drainage area, main channel slope, mean annual precipitation, mean basin elevation, and the basin shape factor were the most significant basin and climatic characteristics that affect magnitude and frequency of floods for this method. Certain recommendations on the use of this method are provided. A method is also described for estimating the magnitude and frequency of peak discharges of streams in urban areas of Arkansas. The method is from a nationwide U.S. Geological Survey flood frequency report which uses urban basin characteristics combined with rural discharges to estimate urban discharges. Annual peak discharges from 204 gaging stations, with drainage areas less than 7,770 square kilometers and at least 10 years of unregulated record, were used in the analysis. These data provide the basis for this analysis and are published in the Appendix of this report as supplemental data. Large rivers such as the Red, Arkansas, White, Black, St. Francis, Mississippi, and
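
    A minimal sketch of a regional regression of the kind described above, fitted in log space to invented station data (metric units here; the report itself develops four regional equation sets with its own variables and units).

      import numpy as np

      # Hypothetical station data: drainage area (km2), main-channel slope (m/km),
      # mean basin elevation (m), basin shape factor, and the observed 100-yr peak (m3/s)
      area  = np.array([52., 180., 640., 1200., 95., 300., 2100., 760.])
      slope = np.array([4.1, 2.3, 1.2, 0.8, 3.5, 1.9, 0.6, 1.1])
      elev  = np.array([120., 150., 180., 210., 110., 170., 250., 190.])
      shape = np.array([2.1, 3.0, 2.6, 3.4, 1.9, 2.8, 3.8, 2.5])
      q100  = np.array([310., 720., 1650., 2400., 430., 980., 3300., 1800.])

      # Fit log10(Q100) = b0 + b1*log10(area) + b2*log10(slope) + b3*log10(elev) + b4*log10(shape)
      X = np.column_stack([np.ones_like(area)] + [np.log10(v) for v in (area, slope, elev, shape)])
      b, *_ = np.linalg.lstsq(X, np.log10(q100), rcond=None)

      def predict_q100(a, s, e, sh):
          """Apply the fitted regional equation at an ungaged site."""
          x = np.array([1.0, np.log10(a), np.log10(s), np.log10(e), np.log10(sh)])
          return 10 ** (x @ b)

      print(round(predict_q100(450., 1.5, 160., 2.7)))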

  2. Methods for estimating selected flow-duration and flood-frequency characteristics at ungaged sites in Central Idaho

    USGS Publications Warehouse

    Kjelstrom, L.C.

    1998-01-01

    Methods for estimating daily mean discharges for selected flow durations and flood discharges for selected recurrence intervals at ungaged sites in central Idaho were applied using data collected at streamflow-gaging stations in the area. The areal and seasonal variability of discharge from ungaged drainage basins may be described by estimating daily mean discharges that are exceeded 20, 50, and 80 percent of the time each month. At 73 gaging stations, mean monthly discharge was regressed against discharge at three points (the 20-, 50-, and 80-percent exceedances) on the daily mean flow-duration curve for each month. Regression results were improved by dividing the study area into six regions. Previously determined estimates of mean monthly discharge from about 1,200 ungaged drainage basins provided the basis for applying the developed techniques to the ungaged basins. Estimates of daily mean discharges that are exceeded 20, 50, and 80 percent of the time each month at ungaged drainage basins can be made by multiplying mean monthly discharges estimated at ungaged sites by a regression factor for the appropriate region. In general, the flow-duration data were less accurately estimated at discharges exceeded 80 percent of the time than at discharges exceeded 20 percent of the time. Curves drawn through the three points for each of the six regions were most similar in July and most different from December through March. Coefficients of determination of the regressions indicate that differences in mean monthly discharge largely explain differences in discharge at points on the daily mean flow-duration curve. Inherent in the method are errors in the technique used to estimate mean monthly discharge. Flood discharge estimates for selected recurrence intervals at ungaged sites upstream or downstream from gaging stations can be determined by a transfer technique. A weighted ratio of drainage area times flood discharge for selected recurrence intervals at the gaging station can be used to estimate
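
    A minimal sketch of a drainage-area-based transfer of a flood quantile from a gaging station to a nearby ungaged site on the same stream. The exact weighting used in the report is cut off above; the exponent form below is a common convention and is assumed here for illustration.

      def transfer_flood_quantile(q_gaged, area_gaged_km2, area_ungaged_km2, exponent=0.8):
          """Scale a gaged flood quantile by the drainage-area ratio raised to an exponent,
          Q_u = Q_g * (A_u / A_g) ** exponent (exponent typically taken from regional regression)."""
          return q_gaged * (area_ungaged_km2 / area_gaged_km2) ** exponent

      # 100-yr peak of 850 m3/s at a gage draining 400 km2, transferred to a site draining 520 km2
      print(round(transfer_flood_quantile(850.0, 400.0, 520.0)))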

  3. Evaluation method to floodwater amount of difficult control and utilization in flood season for hyperconcentration rivers and its application

    NASA Astrophysics Data System (ADS)

    Li, X.

    2013-05-01

    The severe soil erosion in the Chinese Loess Plateau has resulted in high sediment concentrations in runoff, which put tremendous pressure on the development and utilization of regional floodwater resources as well as on regional flood control and disaster mitigation. The floodwater amount of difficult control and utilization in flood season (FADCUFS) is an important part of the available amount of surface water resources. It also has a critical role in the sustainable development of water resources, especially for the hyperconcentration rivers (HRs) in the Loess Plateau. The evaluation of FADCUFS for HRs is an important issue in the field of hydrology and water resources; however, understanding of its connotation, evaluation method, and nature is limited. Combining engineering measures with non-engineering ones, an evaluation method for the FADCUFS of HRs is presented from the perspectives of water quantity and quality. The method divides the FADCUFS into two parts, in terms of the flood control operation characteristics of a reservoir on an HR and the relationship between water resources utilization and sediment in runoff, respectively: the amount of difficult regulation-control floodwater (DRCF) and the volume of difficult utilization floodwater (DUF). A case study of the Bajiazui Reservoir, located on the Jinghe River (a second-order tributary of the Chinese Yellow River with high sediment concentration), was performed. Three typical years were employed: a wet year (1988), an average year (1986), and dry years (1995 and 2000). According to the daily optimal operation model of the Bajiazui Reservoir, DRCF occurs only in the wet year, not in the average or dry years. Four DRCF events occurred in 1988, with flows of 26.74 m3/s (July 14), 14.58 m3/s (August 5), 10.27 m3/s (August 9), and 1.23 m3/s (August 12), and a total amount of 4.56 million m3. A certain close relationship exists between the amount of DRCF

  4. May flood-poor periods be more dangerous than flood-rich periods?

    NASA Astrophysics Data System (ADS)

    Salinas, Jose Luis; Di Baldassarre, Giuliano; Viglione, Alberto; Kuil, Linda; Bloeschl, Guenter

    2014-05-01

    River floods are among the most devastating natural hazards experienced by populations that, since the earliest recorded civilisations, have settled in floodplains because they offer favourable conditions for trade, agriculture, and economic development. The occurrence of a flood may cause loss of lives and tremendous economic damage and, therefore, is rightly seen as a very negative event by the communities involved. The occurrence of many floods in a row is, of course, even more frustrating and is rightly considered an unbearable calamity. Unfortunately, the occurrence of many floods in a limited number of consecutive years is not unusual. In many places in the world, it has been observed that extreme floods do not arrive randomly but cluster in time into flood-poor and flood-rich periods, consistent with the Hurst effect. If this is the case, when are people more in danger? When should people be more scared? In flood-poor or flood-rich periods? In this work, a Socio-Hydrology model (Di Baldassarre et al., 2013; Viglione et al., 2014) is used to show that, maybe counter-intuitively, flood-poor periods may be more dangerous than flood-rich periods. The model is a conceptualisation of a hypothetical setting of a city at a river where a community evolves, making choices between flood management options on the floodplain. The most important feedbacks between the economic, political, technological and hydrological processes of the evolution of that community are represented in the model. In particular, the model also accounts in a dynamic way for the evolution of the community's awareness of flood risk. The occurrence of floods tends to increase people's recognition that their property is in an area that is potentially at risk of flooding, both at the scale of individuals and of communities, which is one of the main reasons why flood coping actions are taken. It is shown through examples that frequent flood events may result in moderate damages because they ensure that the

  5. Swiss Re Global Flood Hazard Zones: Know your flood risk

    NASA Astrophysics Data System (ADS)

    Vinukollu, R. K.; Castaldi, A.; Mehlhorn, J.

    2012-12-01

    Floods, among all natural disasters, have a great damage potential. On a global basis, there is strong evidence of an increase in the number of people affected and in economic losses due to floods. For example, global insured flood losses have increased by 12% every year since 1970, and this is expected to further increase with growing exposure in high-risk areas close to rivers and coastlines. Recently, the insurance industry has been surprised by the large extent of losses, because most countries lack reliable hazard information. One example is the 2011 Thailand floods, where millions of people were affected and the total economic losses were 30 billion USD. In order to assess the flood risk across different regions and countries, the flood team at Swiss Re produced global maps of flood zones based on a Geomorphologic Regression approach, developed in house and patented. Input data for the study were obtained from NASA's Shuttle Radar Topography Mission (SRTM) elevation data, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) and HydroSHEDS. The underlying assumptions of the approach are that naturally flowing rivers shape their channel and flood plain according to basin-inherent forces and characteristics and that the flood water extent strongly depends on the shape of the flood plain. On the basis of the catchment characteristics, the model finally calculates the probability of a location being flooded or not for a defined return period, which in the current study was set to 100 years. The data are produced at a 90-m resolution for latitudes 60S to 60N. This global product is now used in the insurance industry to inspect, inform and/or insure the flood risk across the world.

  6. Stainless steel anodes for alkaline water electrolysis and methods of making

    DOEpatents

    Soloveichik, Grigorii Lev

    2014-01-21

    The corrosion resistance of stainless steel anodes for use in alkaline water electrolysis was increased by immersion of the stainless steel anode into a caustic solution prior to electrolysis. Also disclosed herein are electrolyzers employing the so-treated stainless steel anodes. The pre-treatment process provides a stainless steel anode that has a higher corrosion resistance than an untreated stainless steel anode of the same composition.

  7. Alkaline Comet Assay for Assessing DNA Damage in Individual Cells.

    PubMed

    Pu, Xinzhu; Wang, Zemin; Klaunig, James E

    2015-08-06

    Single-cell gel electrophoresis, commonly called the comet assay, is a simple and sensitive method for assessing DNA damage at the single-cell level. It is an important technique in genetic toxicology studies. The comet assay performed under alkaline conditions (pH >13) is considered the optimal version for identifying agents with genotoxic activity. The alkaline comet assay is capable of detecting DNA double-strand breaks, single-strand breaks, alkali-labile sites, DNA-DNA/DNA-protein cross-linking, and incomplete excision repair sites. The inclusion of digestion with lesion-specific DNA repair enzymes in the procedure allows the detection of various DNA base alterations, such as oxidative base damage. This unit describes alkaline comet assay procedures for assessing DNA strand breaks and oxidative base alterations. These methods can be applied to a variety of cells from in vitro and in vivo experiments, as well as human studies. Copyright © 2015 John Wiley & Sons, Inc.

  8. RASOR flood modelling

    NASA Astrophysics Data System (ADS)

    Beckers, Joost; Buckman, Lora; Bachmann, Daniel; Visser, Martijn; Tollenaar, Daniel; Vatvani, Deepak; Kramer, Nienke; Goorden, Neeltje

    2015-04-01

    Decision making in disaster management requires fast access to reliable and relevant information. We believe that online information and services will become increasingly important in disaster management. Within the EU FP7 project RASOR (Rapid Risk Assessment and Spatialisation of Risk), an online platform is being developed for rapid multi-hazard risk analyses to support disaster management anywhere in the world. The platform will provide access to a plethora of GIS data that are relevant to risk assessment. It will also enable the user to run numerical flood models to simulate historical and newly defined flooding scenarios. The results of these models are maps of flood extent, flood depths and flow velocities. The RASOR platform will enable users to overlay historical flood event maps with observations and Earth Observation (EO) imagery to fill in gaps and assess the accuracy of the flood models. New flooding scenarios can be defined by the user and simulated to investigate the potential impact of future floods. A series of flood models has been developed within RASOR for selected case study areas around the globe that are subject to very different flood hazards: • The city of Bandung in Indonesia, which is prone to fluvial flooding induced by heavy rainfall. The flood hazard is exacerbated by land subsidence. • The port of Cilacap on the south coast of Java, subject to tsunami hazard from submarine earthquakes in the Sunda trench. • The area south of the city of Rotterdam in the Netherlands, prone to coastal and/or riverine flooding. • The island of Santorini in Greece, which is subject to tsunamis induced by landslides. Flood models have been developed for each of these case studies using mostly EO data, augmented by local data where necessary. Particular use was made of the new TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement) product from the German Aerospace Center (DLR) and EADS Astrium. The presentation will describe the flood models and the

  9. Floods

    MedlinePlus

    Floods are common in the United States. Weather such as heavy rain, thunderstorms, hurricanes, or tsunamis can ... is breached, or when a dam breaks. Flash floods, which can develop quickly, often have a dangerous ...

  10. Flooding

    MedlinePlus

    ... flooding. For communities, companies, or water and wastewater facilities: the Flood Resilience Guide is your one-stop resource ... For water and wastewater facilities: Suggested post-hurricane ...

  11. Peak flood estimation using gene expression programming

    NASA Astrophysics Data System (ADS)

    Zorn, Conrad R.; Shamseldin, Asaad Y.

    2015-12-01

    As a case study for the Auckland Region of New Zealand, this paper investigates the potential use of gene-expression programming (GEP) in predicting specific return period events in comparison to the established and widely used Regional Flood Estimation (RFE) method. Initially calibrated to 14 gauged sites, the GEP-derived model was further validated against 10- and 100-year flood events with relative errors of 29% and 18%, respectively. This compares with errors of 48% and 44% for the RFE method for the same flood events. While the effectiveness of GEP in predicting specific return period events is made apparent, it is argued that the derived equations should be used in conjunction with existing methodologies rather than as a replacement.

  12. A New Approach to Flood Protection Design and Riparian Management

    Treesearch

    Philip B. Williams; Mitchell L. Swanson

    1989-01-01

    Conventional engineering methods of flood control design focus narrowly on the efficient conveyance of water, with little regard for environmental resource planning and natural geomorphic processes. Consequently, flood control projects are often environmentally disastrous, expensive to maintain, and even inadequate to control floods. In addition, maintenance programs...

  13. An Evaluation of Selected Extraordinary Floods in the United States Reported by the U.S. Geological Survey and Implications for Future Advancement of Flood Science

    USGS Publications Warehouse

    Costa, John E.; Jarrett, Robert D.

    2008-01-01

    Thirty flood peak discharges determine the envelope curve of maximum floods documented in the United States by the U.S. Geological Survey. These floods occurred from 1927 to 1978 and are extraordinary not just in their magnitude, but in their hydraulic and geomorphic characteristics. The reliability of the computed discharge of these extraordinary floods was reviewed and evaluated using current (2007) best practices. Of the 30 flood peak discharges investigated, only 7 were measured at daily streamflow-gaging stations that existed when the flood occurred, and 23 were measured at miscellaneous (ungaged) sites. Methods used to measure these 30 extraordinary flood peak discharges consisted of 21 slope-area measurements, 2 direct current-meter measurements, 1 culvert measurement, 1 rating-curve extension, and 1 interpolation and rating-curve extension. The remaining four peak discharges were measured using combinations of culvert, slope-area, flow-over-road, and contracted-opening measurements. The method of peak discharge determination for one flood is unknown. Changes to peak discharge or rating are recommended for 20 of the 30 flood peak discharges that were evaluated. Nine floods retained published peak discharges, but their ratings were downgraded. For two floods, both peak discharge and rating were corrected and revised. Peak discharges for five floods that are subject to significant uncertainty due to complex field and hydraulic conditions were re-rated as estimates. This study resulted in 5 of the 30 peak discharges having revised values that differ from the original published values by more than about 10 percent. Peak discharges were smaller for three floods (North Fork Hubbard Creek, Texas; El Rancho Arroyo, New Mexico; South Fork Wailua River, Hawaii), and two peak discharges were revised upward (Lahontan Reservoir tributary, Nevada; Bronco Creek, Arizona). Two peak discharges were indeterminate because they were concluded to have been debris flows with peak

  14. Alkaline degradation studies of anion exchange polymers to enable new membrane designs

    NASA Astrophysics Data System (ADS)

    Nunez, Sean Andrew

    Current performance targets for anion-exchange membrane (AEM) fuel cells call for greater than 95% alkaline stability for 5000 hours at temperatures up to 120 °C. Using this target temperature of 120 °C, an incisive 1H NMR-based alkaline degradation method was developed to identify the degradation products of n-alkyl spacer tetraalkylammonium cations in various AEM polymers and small molecule analogs. Herein, the degradation mechanisms and rates of benzyltrimethylammonium, n-alkyl interstitial spacer and n-alkyl terminal pendant cations are studied on several architectures. These findings demonstrate that benzyltrimethylammonium and n-alkyl terminal pendant cations are more labile than an n-alkyl interstitial spacer cation and conclude that Hofmann elimination is not the predominant mechanism of alkaline degradation. Additionally, the alkaline stability of an n-alkyl interstitial spacer cation is enhanced when combined with an n-alkyl terminal pendant. Interestingly, at 120 °C, an inverse trend was found in the overall alkaline stability of AEM poly(styrene) and AEM poly(phenylene oxide) samples relative to that previously shown at 80 °C. Successive small molecule studies suggest that at 120 °C, an anion-induced 1,4-elimination degradation mechanism may be activated in styrenic AEM polymers bearing an acidic alpha-hydrogen. In addition, an ATR-FTIR based method was developed to assess the alkaline stability of solid membranes and any added resistance to degradation that may be due to differential solubilities and phase separation. To increase the stability of anion exchange membranes, Oshima magnesate-halogen exchange was demonstrated as a method for the synthesis of new anion exchange membranes that typically fail in the presence of organolithium or Grignard reagents alone. This new chemistry, applied to non-resinous polymers for the first time, proved effective for the n-alkyl interstitial spacer functionalization of poly(phenylene oxide) and poly(styrene- co

  15. Geomorphic Flood Area (GFA): a DEM-based tool for flood susceptibility mapping at large scales

    NASA Astrophysics Data System (ADS)

    Manfreda, S.; Samela, C.; Albano, R.; Sole, A.

    2017-12-01

    Flood hazard and risk mapping over large areas is a critical issue. Recently, many researchers have been trying to achieve global-scale mapping, encountering several difficulties, above all the lack of data and implementation costs. In data-scarce environments, a preliminary and cost-effective floodplain delineation can be performed using geomorphic methods (e.g., Manfreda et al., 2014). We carried out several years of research on this topic, proposing a morphologic descriptor named the Geomorphic Flood Index (GFI) (Samela et al., 2017) and developing a Digital Elevation Model (DEM)-based procedure able to identify flood-susceptible areas. The procedure exhibited high accuracy in several test sites in Europe, the United States and Africa (Manfreda et al., 2015; Samela et al., 2016, 2017) and has recently been implemented in a QGIS plugin named Geomorphic Flood Area (GFA) - tool. The tool automatically computes the GFI and turns it into a linear binary classifier capable of detecting flood-prone areas. To train this classifier, an inundation map derived using hydraulic models for a small portion of the basin is required (the minimum is 2% of the river basin's area). In this way, the GFA-tool allows the classification of flood-prone areas to be extended across the entire basin. We are also defining a simplified procedure for the estimation of the river depth, which may be helpful for large-scale analyses to approximately evaluate the expected flood damages in the surrounding areas. References: Manfreda, S., Nardi, F., Samela, C., Grimaldi, S., Taramasso, A. C., Roth, G., & Sole, A. (2014). Investigation on the use of geomorphic approaches for the delineation of flood prone areas. J. Hydrol., 517, 863-876. Manfreda, S., Samela, C., Gioia, A., Consoli, G., Iacobellis, V., Giuzio, L., & Sole, A. (2016). Flood-prone areas assessment using linear binary classifiers based on flood maps obtained from 1D and 2D hydraulic models. Nat. Hazards, Vol. 79 (2), pp 735-754. Samela, C
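
    A minimal sketch of the thresholding idea behind the linear binary classifier mentioned above: a GFI cutoff is calibrated against a hydraulic inundation map covering a small training portion of the basin and then applied basin-wide. The calibration criterion and all values are illustrative; the cited papers use their own objective function.

      import numpy as np

      def calibrate_threshold(gfi_train, flooded_train):
          """Pick the GFI threshold that maximizes classification accuracy against a
          training inundation map (1 = flooded, 0 = dry); cells with GFI >= tau are flagged."""
          candidates = np.unique(gfi_train)
          accuracy = [np.mean((gfi_train >= tau) == flooded_train) for tau in candidates]
          return candidates[int(np.argmax(accuracy))]

      # Hypothetical GFI values over a training strip and the corresponding hydraulic flood map
      gfi_train     = np.array([-3.1, -2.4, -1.8, -0.9, -0.3, 0.2, 0.8, 1.5])
      flooded_train = np.array([0, 0, 0, 0, 1, 1, 1, 1])
      tau = calibrate_threshold(gfi_train, flooded_train)

      # Extend the classification to the rest of the basin
      gfi_basin = np.array([-2.0, -0.5, 0.1, 1.2])
      print(tau, gfi_basin >= tau)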

  16. Effect of alkaline addition on anaerobic sludge digestion with combined pretreatment of alkaline and high pressure homogenization.

    PubMed

    Fang, Wei; Zhang, Panyue; Zhang, Guangming; Jin, Shuguang; Li, Dongyi; Zhang, Meixia; Xu, Xiangzhe

    2014-09-01

    To improve anaerobic digestion efficiency, a combined pretreatment of alkaline addition and high-pressure homogenization was applied to sewage sludge. The effect of alkaline dosage on anaerobic sludge digestion was investigated in detail. The SCOD of the sludge supernatant increased significantly with increasing alkaline dosage after the combined pretreatment because of sludge disintegration. Organics were significantly degraded after the anaerobic digestion, and the maximal SCOD, TCOD and VS removals were 73.5%, 61.3% and 43.5%, respectively. Cumulative biogas production, methane content in the biogas and the biogas production rate increased markedly with increasing alkaline dosage. Considering both the biogas production and the alkaline dosage, the optimal alkaline dosage was selected as 0.04 mol/L. Relationships between biogas production and sludge disintegration showed that the cumulative biogas production was mainly enhanced by sludge disintegration. The methane yield increased linearly with DDCOD: methane yield (mL/g VS) = 4.66 DDCOD - 9.69. Copyright © 2014 Elsevier Ltd. All rights reserved.
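
    A one-line worked example of the reported fit; the assumption that DDCOD (degree of COD disintegration) is expressed in percent is ours, made so the yield comes out in a plausible range.

      def methane_yield_ml_per_g_vs(ddcod_percent):
          """Reported regression: methane yield (mL/g VS) = 4.66 * DDCOD - 9.69 (DDCOD assumed in %)."""
          return 4.66 * ddcod_percent - 9.69

      print(methane_yield_ml_per_g_vs(30.0))   # ~130 mL/g VS at 30% disintegration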

  17. The Alkaline Diet: Is There Evidence That an Alkaline pH Diet Benefits Health?

    PubMed Central

    Schwalfenberg, Gerry K.

    2012-01-01

    This review looks at the role of an alkaline diet in health. PubMed was searched for articles on pH, potential renal acid loads, bone health, muscle, growth hormone, back pain, vitamin D and chemotherapy. Many books written in the lay literature on the alkaline diet were also reviewed and evaluated in light of the published medical literature. There may be some value in considering an alkaline diet in reducing morbidity and mortality from chronic diseases, and further studies are warranted in this area of medicine. PMID:22013455

  18. Acidity and Alkalinity in mine drainage: Practical considerations

    USGS Publications Warehouse

    Cravotta, III, Charles A.; Kirby, Carl S.

    2004-01-01

    In this paper, we emphasize that the Standard Method hot peroxide treatment procedure for acidity determination (hot acidity) directly measures net acidity or net alkalinity, but that more than one water-quality measure can be useful as a measure of the severity of acid mine drainage. We demonstrate that the hot acidity is related to the pH, alkalinity, and dissolved concentrations of Fe, Mn, and Al in fresh mine drainage. We show that the hot acidity accurately indicates the potential for pH to decrease to acidic values after complete oxidation of Fe and Mn, and it indicates the excess alkalinity or that required for neutralization of the sample. We show that the hot acidity method gives consistent, interpretable results on fresh or aged samples. Regional data for mine-drainage quality in Pennsylvania indicated the pH of fresh samples was predominantly acidic (pH 2.5 to 4) or near neutral (pH 6 to 7); approximately 25 percent of the samples had intermediate pH values. This bimodal frequency distribution of pH was distinctive for fully oxidized samples; oxidized samples had acidic or near-neutral pH only. Samples that had near-neutral pH after oxidation had negative hot acidity; samples that had acidic pH after oxidation had positive hot acidity. Samples with comparable pH values had variable hot acidities owing to variations in their alkalinities and dissolved Fe, Mn, and Al concentrations. The hot acidity was comparable to net acidity computed on the basis of initial pH and concentrations of Fe, Mn, and Al minus the initial alkalinity. Acidity computed from the pH and dissolved metals concentrations, assuming equivalents of 2 per mole of Fe and Mn and 3 per mole of Al, was comparable to that computed on the basis of aqueous species and FeII/FeIII. Despite changes in the pH, alkalinity, and metals concentrations, the hot acidities were comparable for fresh and aged samples. Thus, meaningful “net” acidity can be determined from a measured hot acidity or by
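
    A minimal sketch of the calculated net acidity described above, with concentrations in mg/L and the result expressed in mg/L as CaCO3. The 2 eq/mol (Fe, Mn) and 3 eq/mol (Al) assumptions come from the text; the 50 mg CaCO3 per milliequivalent conversion and the atomic weights are standard values supplied here, not quoted from the paper.

      def net_acidity_caco3(ph, fe_mg_l, mn_mg_l, al_mg_l, alkalinity_mg_l_caco3):
          """Net acidity (mg/L as CaCO3) from pH, dissolved Fe, Mn and Al, minus alkalinity."""
          meq_per_l = (10 ** (3 - ph)            # free H+ in milliequivalents per litre
                       + 2 * fe_mg_l / 55.85     # Fe assumed to contribute 2 eq/mol
                       + 2 * mn_mg_l / 54.94     # Mn: 2 eq/mol
                       + 3 * al_mg_l / 26.98)    # Al: 3 eq/mol
          return 50.0 * meq_per_l - alkalinity_mg_l_caco3

      # Hypothetical fresh mine-drainage sample: pH 3.2, Fe 40, Mn 5, Al 10 mg/L, no alkalinity
      print(round(net_acidity_caco3(3.2, 40.0, 5.0, 10.0, 0.0)))   # -> 168 (net acidic)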

  19. Augmented digestion of lignocellulose by steam explosion, acid and alkaline pretreatment methods: a review.

    PubMed

    Singh, Joginder; Suhag, Meenakshi; Dhaka, Anil

    2015-03-06

    Lignocellulosic materials can be explored as one of the sustainable substrates for bioethanol production through microbial intervention, as they are abundant, cheap and renewable. At the same time, their recalcitrant structure makes the conversion process more cumbersome owing to their chemical composition, which adversely affects the efficiency of bioethanol production. Therefore, technical approaches to overcome the recalcitrance of biomass feedstock have been developed to remove the barriers with the help of pretreatment methods, which make cellulose more accessible to the hydrolytic enzymes, secreted by the microorganisms, for its conversion to glucose. Pretreatment of lignocellulosic biomass in a cost-effective manner is a major challenge to bioethanol technology research and development. Hence, in this review, we have discussed various aspects of three commonly used pretreatment methods, viz., steam explosion, acid and alkaline, applied to various lignocellulosic biomasses to augment their digestibility, along with the challenges associated with their processing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Pakistan Flooding

    Atmospheric Science Data Center

    2013-04-16

    Flooding in Pakistan. In late July 2010, flooding caused by heavy monsoon rains began in several regions of Pakistan, ... river is 23 kilometers (14 miles) wide or more in spots, and flooding in much of the surrounding region, particularly in the Larkana ...

  1. Floods and climate: emerging perspectives for flood risk assessment and management

    NASA Astrophysics Data System (ADS)

    Merz, B.; Aerts, J.; Arnbjerg-Nielsen, K.; Baldi, M.; Becker, A.; Bichet, A.; Blöschl, G.; Bouwer, L. M.; Brauer, A.; Cioffi, F.; Delgado, J. M.; Gocht, M.; Guzzetti, F.; Harrigan, S.; Hirschboeck, K.; Kilsby, C.; Kron, W.; Kwon, H.-H.; Lall, U.; Merz, R.; Nissen, K.; Salvatti, P.; Swierczynski, T.; Ulbrich, U.; Viglione, A.; Ward, P. J.; Weiler, M.; Wilhelm, B.; Nied, M.

    2014-07-01

    Flood estimation and flood management have traditionally been the domain of hydrologists, water resources engineers and statisticians, and disciplinary approaches abound. Dominant views have been shaped; one example is the catchment perspective: floods are formed and influenced by the interaction of local, catchment-specific characteristics, such as meteorology, topography and geology. These traditional views have been beneficial, but they have a narrow framing. In this paper we contrast traditional views with broader perspectives that are emerging from an improved understanding of the climatic context of floods. We come to the following conclusions: (1) extending the traditional system boundaries (local catchment, recent decades, hydrological/hydraulic processes) opens up exciting possibilities for better understanding and improved tools for flood risk assessment and management. (2) Statistical approaches in flood estimation need to be complemented by the search for the causal mechanisms and dominant processes in the atmosphere, catchment and river system that leave their fingerprints on flood characteristics. (3) Natural climate variability leads to time-varying flood characteristics, and this variation may be partially quantifiable and predictable, with the perspective of dynamic, climate-informed flood risk management. (4) Efforts are needed to fully account for factors that contribute to changes in all three risk components (hazard, exposure, vulnerability) and to better understand the interactions between society and floods. (5) Given the global scale and societal importance, we call for the organization of an international multidisciplinary collaboration and data-sharing initiative to further understand the links between climate and flooding and to advance flood research.

  2. Long-term psychological outcomes of flood survivors of hard-hit areas of the 1998 Dongting Lake flood in China: Prevalence and risk factors

    PubMed Central

    Dai, Wenjie; Kaminga, Atipatsa C.; Tan, Hongzhuan; Wang, Jieru; Lai, Zhiwei; Wu, Xin; Liu, Aizhong

    2017-01-01

    Background Although numerous studies have indicated that exposure to natural disasters may increase survivors’ risk of post-traumatic stress disorder (PTSD) and anxiety, studies focusing on the long-term psychological outcomes of flood survivors are limited. Thus, this study aimed to estimate the prevalence of PTSD and anxiety among flood survivors 17 years after the 1998 Dongting Lake flood and to identify the risk factors for PTSD and anxiety. Methods This cross-sectional study was conducted in December 2015, 17 years after the 1998 Dongting Lake flood. Survivors in hard-hit areas of the flood disaster were enrolled in this study using a stratified, systematic random sampling method. Well-qualified investigators conducted face-to-face interviews with participants using the PTSD Checklist-Civilian version, the Zung Self-Rating Anxiety Scale, the Chinese version of the Social Support Rating Scale and the Revised Eysenck Personality Questionnaire-Short Scale for Chinese to assess PTSD, anxiety, social support and personality traits, respectively. Logistic regression analyses were used to identify factors associated with PTSD and anxiety. Results A total of 325 participants were recruited into this study, and the prevalence of PTSD and anxiety was 9.5% and 9.2%, respectively. Multivariable logistic regression analyses indicated that female sex, experiencing at least three flood-related stressors, having a low level of social support, and having the trait of emotional instability were risk factors for long-term adverse psychological outcomes among flood survivors after the disaster. Conclusions PTSD and anxiety were common long-term adverse psychological outcomes among flood survivors. Early and effective psychological interventions for flood survivors are needed to prevent the development of PTSD and anxiety in the long run after a flood, especially for individuals who are female, experience at least three flood-related stressors, have a low level of social support, or have the trait of emotional instability.
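    The core statistical step described above is a multivariable logistic regression of psychological outcome on candidate risk factors. A minimal sketch follows, using statsmodels; the file name, column names and coding (e.g. a binary "ptsd" indicator) are assumptions for illustration, not the study's data.

    ```python
    # Hypothetical sketch of a multivariable logistic regression for PTSD status,
    # in the spirit of the analysis described above (variable names are assumed).
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("survivors.csv")          # assumed file: one row per participant
    predictors = ["female", "n_stressors_ge3", "low_social_support", "emotional_instability"]
    X = sm.add_constant(df[predictors])        # add intercept term
    y = df["ptsd"]                             # 1 = meets PTSD screening threshold

    model = sm.Logit(y, X).fit()
    print(model.summary())                     # exponentiate coefficients to obtain odds ratios
    ```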

  3. More frequent flooding? Changes in flood frequency in the Pearl River basin, China, since 1951 and over the past 1000 years

    NASA Astrophysics Data System (ADS)

    Zhang, Qiang; Gu, Xihui; Singh, Vijay P.; Shi, Peijun; Sun, Peng

    2018-05-01

    Flood risks across the Pearl River basin, China, were evaluated using a peak flood flow dataset covering a period of 1951-2014 from 78 stations and historical flood records of the past 1000 years. The generalized extreme value (GEV) model and the kernel estimation method were used to evaluate frequencies and risks of hazardous flood events. Results indicated that (1) no abrupt changes or significant trends could be detected in peak flood flow series at most of the stations, and only 16 out of 78 stations exhibited significant peak flood flow changes with change points around 1990. Peak flood flow in the West River basin increased and significant increasing trends were identified during 1981-2010; decreasing peak flood flow was found in coastal regions and significant trends were observed during 1951-2014 and 1966-2014. (2) The largest three flood events were found to cluster in both space and time. Generally, basin-scale flood hazards can be expected in the West and North River basins. (3) The occurrence rate of floods increased in the middle Pearl River basin but decreased in the lower Pearl River basin. However, hazardous flood events were observed in the middle and lower Pearl River basin, and this is particularly true for the past 100 years. Precipitation extremes were subject only to moderate variations, and human activities, such as the building of levees, channelization of river systems, and rapid urbanization, were the factors behind the amplification of floods in the middle and lower Pearl River basin, posing serious challenges for developing flood-hazard mitigation measures in the lower Pearl River basin, particularly the Pearl River Delta (PRD) region.
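    For readers unfamiliar with the GEV approach mentioned above, a minimal sketch of fitting a GEV distribution to an annual peak-flow series and reading off a return-period flow is shown below; it assumes a stationary series in a plain text file and is not the authors' code.

    ```python
    # Minimal sketch (not the study's implementation): fit a GEV distribution to an
    # annual peak-flow series and estimate the 100-year flood, assuming stationarity.
    import numpy as np
    from scipy.stats import genextreme

    peaks = np.loadtxt("annual_peak_flow.txt")     # assumed file of annual maxima (m3/s)
    c, loc, scale = genextreme.fit(peaks)          # maximum-likelihood parameter fit
    q100 = genextreme.isf(1.0 / 100, c, loc=loc, scale=scale)   # 1% annual exceedance flow
    print(f"Estimated 100-year peak flow: {q100:.0f} m3/s")
    ```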

  4. Evaluation of flood hazard maps in print and web mapping services as information tools in flood risk communication

    NASA Astrophysics Data System (ADS)

    Hagemeier-Klose, M.; Wagner, K.

    2009-04-01

    Flood risk communication with the general public and the population at risk is becoming increasingly important for flood risk management, especially as a precautionary measure. This is also underlined by the EU Flood Directive. The flood-related authorities therefore have to develop tailored information tools which meet the demands of different user groups. This article presents the formative evaluation of flood hazard maps and web mapping services according to the specific requirements and needs of the general public, using the dynamic-transactional approach as a theoretical framework. The evaluation was done by a mixture of methods: an analysis of existing tools, a creative workshop with experts and laymen, and an online survey. The currently existing flood hazard maps, web mapping services and web GIS still lack a good balance between simplicity and complexity, with adequate readability and usability for the public. Well-designed and associative maps (e.g. using blue colours for water depths), which can be compared with past local flood events and which can create empathy in viewers, can help to raise awareness, heighten activity and knowledge levels, or lead to further information seeking. Concerning web mapping services, a linkage between general flood information, like flood extents of different scenarios and corresponding water depths, and real-time information, like gauge levels, is an important demand by users. Gauge levels of these scenarios are easier to understand than the scientifically correct return periods or annualities. The recently developed Bavarian web mapping service tries to integrate these requirements.

  5. Flood Hazard Mapping Assessment for El-Awali River Catchment-Lebanon

    NASA Astrophysics Data System (ADS)

    Hdeib, Rouya; Abdallah, Chadi; Moussa, Roger; Hijazi, Samar

    2016-04-01

    River flood prediction and flood forecasting have become essential stages in major flood mitigation plans worldwide. Delineation of floodplains resulting from a river flooding event requires coupling a hydrological rainfall-runoff model, to calculate the resulting outflows of the catchment, with a hydraulic model, to calculate the corresponding water surface profiles along the river main course. In this study several methods were applied to predict the flood discharge of El-Awali River using the available historical data and gauging records and by conducting several site visits. The HEC-HMS rainfall-runoff model was built and applied to calculate the flood hydrographs at several outlets on El-Awali River; it was calibrated using the storm of January 2013, which caused flooding of the major Lebanese rivers, and refined through additional site visits to survey river cross sections and record witness accounts from locals. The hydraulic HEC-RAS model was then applied to calculate the corresponding water surface profiles along the El-Awali River main reach. Floodplain delineation and hazard mapping for the 10-, 50- and 100-year return periods were performed using the Watershed Modeling System (WMS). The results first show an underestimation of the flood discharge recorded by the operating gauge stations on El-Awali River: the discharge of the 100-year flood may reach up to 506 m3/s, compared with the lower values calculated using traditional discharge estimation methods. Second, any flooding of El-Awali River may be catastrophic, especially for the coastal part of the catchment, and can cause tragic losses of agricultural lands and properties. Last, a major floodplain was identified in Marj Bisri village; this floodplain can reach more than 200 meters in width. Overall, performance was good, and the rainfall-runoff model can provide valuable information about flows, especially at ungauged points, and can greatly aid floodplain delineation and flood

  6. Validation of a Global Hydrodynamic Flood Inundation Model

    NASA Astrophysics Data System (ADS)

    Bates, P. D.; Smith, A.; Sampson, C. C.; Alfieri, L.; Neal, J. C.

    2014-12-01

    In this work we present first validation results for a hyper-resolution global flood inundation model. We use a true hydrodynamic model (LISFLOOD-FP) to simulate flood inundation at 1km resolution globally and then use downscaling algorithms to determine flood extent and depth at 90m spatial resolution. Terrain data are taken from a custom version of the SRTM data set that has been processed specifically for hydrodynamic modelling. Return periods of flood flows along the entire global river network are determined using: (1) empirical relationships between catchment characteristics and index flood magnitude in different hydroclimatic zones derived from global runoff data; and (2) an index flood growth curve, also empirically derived. Bankful return period flow is then used to set channel width and depth, and flood defence impacts are modelled using empirical relationships between GDP, urbanization and defence standard of protection. The results of these simulations are global flood hazard maps for a number of different return period events from 1 in 5 to 1 in 1000 years. We compare these predictions to flood hazard maps developed by national government agencies in the UK and Germany using similar methods but employing detailed local data, and to observed flood extent at a number of sites including St. Louis, USA and Bangkok in Thailand. Results show that global flood hazard models can have considerable skill given careful treatment to overcome errors in the publicly available data that are used as their input.
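    The return-period flows described above follow the classic index-flood idea: a regional regression predicts an index flood from catchment characteristics, and an empirical growth curve scales it to other return periods. A hedged sketch of that second step is shown below; the regression coefficients and growth-curve values are placeholders, not the values derived in the study.

    ```python
    # Illustrative sketch of the index-flood construction used to set return-period flows.
    # The regression coefficients and growth factors below are assumptions for illustration.
    import numpy as np

    def index_flood(area_km2, mean_annual_rain_mm, a=0.05, b=0.85, c=1.2):
        """Hypothetical power-law regression for the index (e.g. median annual) flood, m3/s."""
        return a * (area_km2 ** b) * ((mean_annual_rain_mm / 1000.0) ** c)

    growth_curve = {5: 1.0, 20: 1.4, 100: 1.9, 1000: 2.8}   # assumed regional growth factors

    qindex = index_flood(area_km2=2500.0, mean_annual_rain_mm=900.0)
    q_by_return_period = {T: qindex * g for T, g in growth_curve.items()}
    print(q_by_return_period)
    ```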

  7. Downscaling catchment scale flood risk to contributing sub-catchments to determine the optimum location for flood management.

    NASA Astrophysics Data System (ADS)

    Pattison, Ian; Lane, Stuart; Hardy, Richard; Reaney, Sim

    2010-05-01

    The recent increase in flood frequency and magnitude has been hypothesised to have been caused by either climate change or land management. Field scale studies have found that changing land management practices does affect local runoff and streamflow, but upscaling these effects to the catchment scale continues to be problematic, both conceptually and more importantly methodologically. The impact on downstream flood risk is highly dependent upon where the changes are in the catchment, indicating that some areas of the catchment are more important in determining downstream flood risk than others. This is a major flaw in the traditional approach to studying the effect of land use on downstream flood risk: catchment scale hydrological models, which treat every cell in the model equally. We are proposing an alternative ideological approach for doing flood management research, which is underpinned by downscaling the downstream effect (problem i.e. flooding) to the upstream causes (contributing sub-catchments). It is hoped that this approach could have several benefits over the traditional upscaling approach. Firstly, it provides an efficient method to prioritise areas for land use management changes to be implemented to reduce downstream flood risk. Secondly, targets for sub-catchment hydrograph change can be determined which will deliver the required downstream effect. Thirdly, it may be possible to detect the effect of land use changes in upstream areas on downstream flood risk, by weighting the areas of most importance in hydrological models. Two methods for doing this downscaling are proposed; 1) data-based statistical analysis; and 2) hydraulic modelling-based downscaling. These will be outlined using the case study of the River Eden, Cumbria, NW England. The data-based methodology uses the timing and magnitude of floods for each sub-catchment. Principal components analysis (PCA) is used to simplify sub-catchment interactions and optimising stepwise regression is
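    The data-based downscaling route sketched above reduces many interacting sub-catchment hydrograph features to a few components before regression. A minimal, hedged sketch of that PCA step is given below; the input array layout and file name are assumptions, and the subsequent stepwise regression is omitted.

    ```python
    # Sketch of the data-based downscaling idea: summarise each flood event by the timing
    # and magnitude of sub-catchment hydrograph peaks, then use PCA to reduce the
    # sub-catchment interactions to a few components before regression.
    # The input layout is assumed: rows = flood events, columns = sub-catchment features.
    import numpy as np
    from sklearn.decomposition import PCA

    X = np.load("subcatchment_peak_features.npy")   # assumed array of shape (n_events, n_features)
    X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardise features

    pca = PCA(n_components=3)
    scores = pca.fit_transform(X)                   # event scores on the leading components
    print(pca.explained_variance_ratio_)            # variance captured by each component
    ```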

  8. Dynamic Flood Vulnerability Mapping with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Kuhn, C.; Max, S. A.; Sullivan, J.

    2015-12-01

    Satellites capture the rate and character of environmental change from local to global levels, yet integrating these changes into flood exposure models can be cost or time prohibitive. We explore an approach to global flood modeling by leveraging satellite data with computing power in Google Earth Engine to dynamically map flood hazards. Our research harnesses satellite imagery in two main ways: first to generate a globally consistent flood inundation layer and second to dynamically model flood vulnerability. Accurate and relevant hazard maps rely on high quality observation data. Advances in publicly available spatial, spectral, and radar data together with cloud computing allow us to improve existing efforts to develop a comprehensive flood extent database to support model training and calibration. This talk will demonstrate the classification results of algorithms developed in Earth Engine designed to detect flood events by combining observations from MODIS, Landsat 8, and Sentinel-1. Our method to derive flood footprints increases the number, resolution, and precision of spatial observations for flood events both in the US, recorded in the NCDC (National Climatic Data Center) storm events database, and globally, as recorded events from the Colorado Flood Observatory database. This improved dataset can then be used to train machine learning models that relate spatial temporal flood observations to satellite derived spatial temporal predictor variables such as precipitation, antecedent soil moisture, and impervious surface. This modeling approach allows us to rapidly update models with each new flood observation, providing near real time vulnerability maps. We will share the water detection algorithms used with each satellite and discuss flood detection results with examples from Bihar, India and the state of New York. We will also demonstrate how these flood observations are used to train machine learning models and estimate flood exposure. The final stage of
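    A simplified version of the water-detection step described above can be illustrated offline with NumPy rather than the Earth Engine API: threshold a normalised difference water index (NDWI) computed from green and near-infrared reflectance. The threshold value and input files below are assumptions; operational classifiers combine several indices, radar backscatter and training data.

    ```python
    # Simplified water-detection sketch: threshold an NDWI computed from green and
    # near-infrared reflectance arrays. The threshold and file names are assumed.
    import numpy as np

    def ndwi_water_mask(green, nir, threshold=0.0):
        """Return a boolean mask of likely open water from reflectance arrays."""
        ndwi = (green - nir) / np.maximum(green + nir, 1e-6)  # avoid divide-by-zero
        return ndwi > threshold

    green = np.load("green_band.npy")   # assumed pre-extracted reflectance tiles
    nir = np.load("nir_band.npy")
    flooded = ndwi_water_mask(green, nir)
    print("Flooded fraction:", flooded.mean())
    ```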

  9. Near Real-Time Flood Monitoring and Impact Assessment Systems. Chapter 6; [Case Study: 2011 Flooding in Southeast Asia

    NASA Technical Reports Server (NTRS)

    Ahamed, Aakash; Bolten, John; Doyle, C.; Fayne, Jessica

    2016-01-01

    Floods are the costliest natural disaster (United Nations 2004), causing approximately 6.8 million deaths in the twentieth century alone (Doocy et al. 2013). Worldwide economic flood damage estimates in 2012 exceed $19 Billion USD (Munich Re 2013). Extended duration floods also pose longer term threats to food security, water, sanitation, hygiene, and community livelihoods, particularly in developing countries (Davies et al. 2014). Projections by the Intergovernmental Panel on Climate Change (IPCC) suggest that precipitation extremes, rainfall intensity, storm intensity, and variability are increasing due to climate change (IPCC 2007). Increasing hydrologic uncertainty will likely lead to unprecedented extreme flood events. As such, there is a vital need to enhance and further develop traditional techniques used to rapidly assess flooding and extend analytical methods to estimate impacted population and infrastructure.

  10. Links between seawater flooding, soil ammonia oxidiser communities and their response to changes in salinity.

    PubMed

    Nacke, Heiko; Schöning, Ingo; Schindler, Malte; Schrumpf, Marion; Daniel, Rolf; Nicol, Graeme W; Prosser, James I

    2017-11-01

    Coastal areas worldwide are challenged by climate change-associated increases in sea level and storm surge quantities that potentially lead to more frequent flooding of soil ecosystems. Currently, little is known of the effects of inundation events on microorganisms controlling nitrification in these ecosystems. The goal of this study was to investigate the impact of seawater flooding on the abundance, community composition and salinity tolerance of soil ammonia oxidisers. Topsoil was sampled from three islands flooded at different frequencies by the Wadden Sea. Archaeal ammonia oxidiser amoA genes were more abundant than their betaproteobacterial counterparts, and the distribution of archaeal and bacterial ammonia oxidiser amoA and 16S rRNA gene sequences significantly differed between the islands. The findings indicate selection of ammonia oxidiser phylotypes with greater tolerance to high salinity and slightly alkaline pH (e.g. Nitrosopumilus representatives) in frequently flooded soils. A cluster phylogenetically related to gammaproteobacterial ammonia oxidisers was detected in all samples analysed in this survey. Nevertheless, no gammaproteobacterial amoA genes could be amplified via PCR and only betaproteobacterial ammonia oxidisers were detected in enrichment cultures. A slurry-based experiment demonstrated the tolerance of both bacterial and archaeal ammonia oxidisers to a wide range of salinities (e.g. Wadden Sea water salinity) in soil naturally exposed to seawater at a high frequency. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Flooding and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2011

    2011-01-01

    According to the Federal Emergency Management Agency, flooding is the nation's most common natural disaster. Some floods develop slowly during an extended period of rain or in a warming trend following a heavy snow. Flash floods can occur quickly, without any visible sign of rain. Catastrophic floods are associated with burst dams and levees,…

  12. City-scale accessibility of emergency responders operating during flood events

    NASA Astrophysics Data System (ADS)

    Green, Daniel; Yu, Dapeng; Pattison, Ian; Wilby, Robert; Bosher, Lee; Patel, Ramila; Thompson, Philip; Trowell, Keith; Draycon, Julia; Halse, Martin; Yang, Lili; Ryley, Tim

    2017-01-01

    Emergency responders often have to operate and respond to emergency situations during dynamic weather conditions, including floods. This paper demonstrates a novel method using existing tools and datasets to evaluate emergency responder accessibility during flood events within the city of Leicester, UK. Accessibility was quantified using the 8 and 10 min legislative targets for emergency provision for the ambulance and fire and rescue services respectively under "normal" no-flood conditions, as well as flood scenarios of various magnitudes (1 in 20-year, 1 in 100-year and 1 in 1000-year recurrence intervals), with both surface water and fluvial flood conditions considered. Flood restrictions were processed based on hydrodynamic inundation modelling previously undertaken and input into a Network Analysis framework as restrictions for surface water and fluvial flood events. Surface water flooding was shown to cause more disruption to emergency responders operating within the city due to its widespread and spatially distributed footprint when compared to fluvial flood events of comparable magnitude. Fire and rescue 10 min accessibility was shown to decrease from 100 % under the no-flood scenario to 66.5, 39.8 and 26.2 % under the 1 in 20-year, 1 in 100-year and 1 in 1000-year surface water flood scenarios respectively. Furthermore, total inaccessibility was shown to increase with flood magnitude from 6.0 % under the 1 in 20-year scenario to 31.0 % under the 1 in 100-year flood scenario. Additionally, the evolution of emergency service accessibility throughout a surface water flood event is outlined, demonstrating the rapid impact on emergency service accessibility within the first 15 min of the surface water flood event, with a reduction in service coverage and overlap being observed for the ambulance service during a 1 in 100-year flood event. The study provides evidence to guide strategic planning for decision makers prior to and during emergency response to flood events at the city scale.
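    The accessibility calculation described above amounts to a travel-time reachability query on a road network with flooded links removed. A generic, hedged sketch with the open-source networkx library (rather than the Network Analysis framework used in the study) is shown below; the network file, attribute names, depth cut-off and station identifier are all assumptions.

    ```python
    # Sketch of a travel-time accessibility query: on a road network whose edge weights
    # are travel times (seconds), find every node reachable from a fire station within
    # the 10-minute target, after removing edges treated as impassable due to flooding.
    import networkx as nx

    G = nx.read_graphml("leicester_roads.graphml")   # assumed network with 'travel_time' edge attribute
    flooded_edges = [(u, v) for u, v, d in G.edges(data=True) if d.get("flood_depth", 0) > 0.25]
    G.remove_edges_from(flooded_edges)               # assumed depth threshold for impassability

    station = "fire_station_central"                 # assumed node identifier
    reach = nx.single_source_dijkstra_path_length(G, station, cutoff=600, weight="travel_time")
    print(f"{len(reach)} nodes reachable within 10 minutes")
    ```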

  13. Glacier generated floods

    USGS Publications Warehouse

    Walder, J.S.; Fountain, A.G.; ,

    1997-01-01

    Destructive floods result from drainage of glacier-dammed lakes and sudden release of water stored within glaciers. There is a good basis - both empirical and theoretical - for predicting the magnitude of floods from ice-dammed lakes, although some aspects of flood initiation need to be better understood. In contrast, an understanding of floods resulting from release of internally stored water remains elusive, owing to lack of knowledge of how and where water is stored and to inadequate understanding of the complex physics of the temporally and spatially variable subglacial drainage system.

  14. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
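    One common way to turn such gel profiles into a lesion frequency is via the number-average molecular length of treated and control DNA: the mean lesion frequency per base is the difference of the reciprocals of the two lengths. The sketch below illustrates only that arithmetic, with assumed input values; it is not the authors' analysis code.

    ```python
    # Back-of-the-envelope sketch of the quantitation step: given number-average
    # molecular lengths Ln (in bases) for treated and control DNA, the mean lesion
    # frequency follows from the difference of reciprocals. Input values are assumed.
    def lesions_per_megabase(ln_treated_bases, ln_control_bases):
        """Average lesion frequency (per 10^6 bases) from number-average lengths."""
        per_base = 1.0 / ln_treated_bases - 1.0 / ln_control_bases
        return per_base * 1e6

    print(lesions_per_megabase(ln_treated_bases=2.0e6, ln_control_bases=8.0e6))  # ~0.375 lesions/Mb
    ```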

  15. Using cost-benefit concepts in design floods improves communication of uncertainty

    NASA Astrophysics Data System (ADS)

    Ganora, Daniele; Botto, Anna; Laio, Francesco; Claps, Pierluigi

    2017-04-01

    Flood frequency analysis, i.e. the study of the relationships between the magnitude and the rarity of high flows in a river, is the usual procedure adopted to assess flood hazard, preliminary to the plan/design of flood protection measures. It is grounded in fitting a probability distribution to the peak discharge values recorded at gauging stations, and the final estimates over a region are thus affected by uncertainty, due to the limited sample availability and to the possible alternatives in terms of the probabilistic model and the parameter estimation methods used. In the last decade, the scientific community dealt with this issue by developing a number of methods to quantify such uncertainty components. Usually, uncertainty is visually represented through confidence bands, which are easy to understand but have not yet been demonstrated to be useful for design purposes: they usually disorient decision makers, as the design flood is no longer univocally defined, making the decision process undetermined. These considerations motivated the development of the uncertainty-compliant design flood estimator (UNCODE) procedure (Botto et al., 2014) that allows one to select meaningful flood design values accounting for the associated uncertainty by considering additional constraints based on cost-benefit criteria. This method suggests an explicit multiplication factor that corrects the traditional (without uncertainty) design flood estimates to incorporate the effects of uncertainty in the estimate at the same safety level. Even though the UNCODE method was developed for design purposes, it can represent a powerful and robust tool to help clarify the effects of uncertainty in statistical estimation. As the process produces increased design flood estimates, this outcome demonstrates how uncertainty leads to more expensive flood protection measures, or insufficiency of current defenses. Moreover, the UNCODE approach can be used to assess the "value" of data, as the costs
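    To make the "confidence band around the design flood" concrete, the hedged sketch below bootstraps the 100-year quantile of an annual-maximum sample. It illustrates the uncertainty being discussed, not the UNCODE cost-benefit correction itself; the distribution choice and input file are assumptions.

    ```python
    # Illustration of sampling uncertainty in a design flood estimate: bootstrap the
    # 100-year quantile of an annual-maximum series (GEV assumed, not the UNCODE method).
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(0)
    peaks = np.loadtxt("annual_peak_flow.txt")        # assumed annual maxima series

    def q100(sample):
        c, loc, scale = genextreme.fit(sample)
        return genextreme.isf(0.01, c, loc=loc, scale=scale)

    boot = [q100(rng.choice(peaks, size=len(peaks), replace=True)) for _ in range(500)]
    print("point estimate:", q100(peaks))
    print("90% confidence band:", np.percentile(boot, [5, 95]))
    ```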

  16. Estimating flood discharge using witness movies in post-flood hydrological surveys

    NASA Astrophysics Data System (ADS)

    Le Coz, Jérôme; Hauet, Alexandre; Le Boursicaud, Raphaël; Pénard, Lionel; Bonnifait, Laurent; Dramais, Guillaume; Thollet, Fabien; Braud, Isabelle

    2015-04-01

    The estimation of streamflow rates based on post-flood surveys is of paramount importance for the investigation of extreme hydrological events. Major uncertainties usually arise from the absence of information on the flow velocities and from the limited spatio-temporal resolution of such surveys. Nowadays, after each flood occurring in populated areas, home movies taken from bridges, river banks or even drones are shared by witnesses through Internet platforms like YouTube. Provided that some topography data and additional information are collected, image-based velocimetry techniques can be applied to some of these movie materials, in order to estimate flood discharges. As a contribution to recent post-flood surveys conducted in France, we developed and applied a method for estimating velocities and discharges based on the Large Scale Particle Image Velocimetry (LSPIV) technique. Since the seminal work of Fujita et al. (1998), LSPIV applications to river flows have been reported by a number of authors and LSPIV can now be considered a mature technique. However, its application to non-professional movies taken by flood witnesses remains challenging and required some practical developments. The different steps to apply LSPIV analysis to a flood home movie are as follows: (i) select a video of interest; (ii) contact the author for agreement and extra information; (iii) conduct a field topography campaign to georeference Ground Control Points (GCPs), water level and cross-sectional profiles; (iv) preprocess the video before LSPIV analysis: correct lens distortion, align the images, etc.; (v) orthorectify the images to correct perspective effects and know the physical size of pixels; (vi) proceed with the LSPIV analysis to compute the surface velocity field; and (vii) compute discharge according to a user-defined velocity coefficient. Two case studies in French mountainous rivers during extreme floods are presented. The movies were collected on YouTube and field topography
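    The core of step (vi) is cross-correlating tracer patterns between consecutive orthorectified frames to obtain a surface displacement, then converting it to a velocity. A minimal sketch of that step is given below; real LSPIV tools add interrogation-window tiling, sub-pixel peak fitting and outlier filtering, and the frame data, pixel size and time step are assumptions.

    ```python
    # Minimal sketch of the LSPIV displacement step: locate the cross-correlation peak
    # between two image patches and convert the shift to a surface velocity.
    import numpy as np

    def patch_velocity(patch_t0, patch_t1, pixel_size_m, dt_s):
        """Return (vx, vy) in m/s from the cross-correlation peak of two image patches."""
        f0 = np.fft.fft2(patch_t0 - patch_t0.mean())
        f1 = np.fft.fft2(patch_t1 - patch_t1.mean())
        xcorr = np.fft.ifft2(f0.conj() * f1).real
        dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
        ny, nx = patch_t0.shape
        dy = dy - ny if dy > ny // 2 else dy      # wrap large shifts back to negative values
        dx = dx - nx if dx > nx // 2 else dx
        return dx * pixel_size_m / dt_s, dy * pixel_size_m / dt_s
    ```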

  17. Handbook for Federal Insurance Administration: Flood-insurance studies

    USGS Publications Warehouse

    Kennedy, E.J.

    1973-01-01

    A flood insurance study, made for the Federal Insurance Administration (FIA) of the Department of Housing and Urban Development (HUD), is an analysis of flood inundation frequency for all flood plains within the corporate limits of the community being studied. The study is an application of surveying, hydrology, and hydraulics to determine flood insurance premium rates. Much of the surveying needed can be done by private firms, either by ground methods or photogrammetry. Contracts are needed to let large surveys, but purchase orders can be used for small ones. Photogrammetric stereo models, digital regression models, and step-backwater models are needed for most studies. Damage survey data are not involved.

  18. 44 CFR 60.3 - Flood plain management criteria for flood-prone areas.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... minimize or eliminate infiltration of flood waters into the systems; and (6) Require within flood-prone... infiltration of flood waters into the systems and discharges from the systems into flood waters and (ii) onsite...

  19. 44 CFR 60.3 - Flood plain management criteria for flood-prone areas.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... minimize or eliminate infiltration of flood waters into the systems; and (6) Require within flood-prone... infiltration of flood waters into the systems and discharges from the systems into flood waters and (ii) onsite...

  20. 44 CFR 60.3 - Flood plain management criteria for flood-prone areas.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... minimize or eliminate infiltration of flood waters into the systems; and (6) Require within flood-prone... infiltration of flood waters into the systems and discharges from the systems into flood waters and (ii) onsite...

  1. A fast method for optical simulation of flood maps of light-sharing detector modules

    PubMed Central

    Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W.; Peng, Qiyu

    2016-01-01

    Optical simulation of the detector module level is highly desired for Position Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200–600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials. PMID:27660376
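    The "simplified specular reflectance model" referred to above relies, at its geometric core, on reflecting a photon direction about the surface normal. The sketch below shows only that generic operation (d - 2(d·n)n) and is not the authors' implementation.

    ```python
    # Generic sketch of specular reflection: a photon direction d hitting a surface
    # with unit normal n reflects to d - 2(d.n)n.
    import numpy as np

    def reflect(direction, normal):
        """Specular reflection of a direction vector about a unit surface normal."""
        d = np.asarray(direction, dtype=float)
        n = np.asarray(normal, dtype=float)
        n = n / np.linalg.norm(n)
        return d - 2.0 * np.dot(d, n) * n

    print(reflect([0.6, -0.8, 0.0], [0.0, 1.0, 0.0]))   # -> [0.6, 0.8, 0.0]
    ```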

  2. A fast method for optical simulation of flood maps of light-sharing detector modules.

    PubMed

    Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W; Peng, Qiyu

    2015-12-01

    Optical simulation of the detector module level is highly desired for Position Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200-600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials.

  3. A fast method for optical simulation of flood maps of light-sharing detector modules

    DOE PAGES

    Shi, Han; Du, Dong; Xu, JianFeng; ...

    2015-09-03

    Optical simulation of the detector module level is highly desired for Position Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. Here, we present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We also simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. And while the two approaches generated comparable flood maps, the new approach was more than 200–600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials.

  4. Social media for disaster response during floods

    NASA Astrophysics Data System (ADS)

    Eilander, D.; van de Vries, C.; Baart, F.; van Swol, R.; Wagemaker, J.; van Loenen, A.

    2015-12-01

    During floods it is difficult to obtain real-time accurate information about the extent and severity of the hazard. This information is very important for disaster risk reduction management and crisis relief organizations. Currently, real-time information is derived from few sources such as field reports, traffic cameras, satellite images and aerial images. However, getting a real-time and accurate picture of the situation on the ground remains difficult. At the same time, people affected by natural hazards increasingly share their observations and their needs through digital media. Unlike conventional monitoring systems, Twitter data contains a relatively large number of real-time ground truth observations representing both physical hazard characteristics and hazard impacts. In the city of Jakarta, Indonesia, the intensity of unique flood related tweets during a flood event peaked at almost 900 tweets per minute during floods in early 2015. Flood events around the world in 2014/2015 yielded large numbers of flood related tweets: from the Philippines (85,000) to Pakistan (82,000) to South Korea (50,000) to Detroit (20,000). The challenge here is to filter out useful content from this cloud of data, validate these observations and convert them to readily usable information. In Jakarta, flood related tweets often contain information about the flood depth. In a pilot we showed that this type of information can be used for real-time mapping of the flood extent by plotting these observations on a Digital Elevation Model. Uncertainties in the observations were taken into account by assigning a probability to each observation indicating its likelihood to be correct, based on statistical analysis of the total population of tweets. The resulting flood maps proved to be correct for about 75% of the neighborhoods in Jakarta. Further cross-validation of flood related tweets against (hydro-) meteorological data is likely to improve the skill of the method.
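    The mapping idea described above can be sketched as follows: each geolocated report gives a water depth at a point, adding it to the terrain elevation gives a local water surface elevation, and cells whose terrain lies below an interpolated water surface are flagged as flooded. File names, grid layout and the probability weighting are assumptions omitted here.

    ```python
    # Sketch of mapping crowd-sourced depth reports onto a DEM (inputs are assumed).
    import numpy as np
    from scipy.interpolate import griddata

    dem = np.load("dem.npy")                          # assumed elevation grid (m)
    rows, cols, depths = np.load("tweet_obs.npy").T   # assumed observations: cell indices + reported depth (m)

    wse_points = dem[rows.astype(int), cols.astype(int)] + depths   # water surface elevation at reports
    grid_r, grid_c = np.mgrid[0:dem.shape[0], 0:dem.shape[1]]
    wse = griddata((rows, cols), wse_points, (grid_r, grid_c), method="linear")

    flooded = wse > dem                               # cells below the interpolated water surface
    print("Mapped flooded cells:", np.nansum(flooded))
    ```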

  5. Methods for Estimating Magnitude and Frequency of Floods in Rural Basins in the Southeastern United States: South Carolina

    USGS Publications Warehouse

    Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis

    2009-01-01

    For more than 50 years, the U.S. Geological Survey (USGS) has been developing regional regression equations that can be used to estimate flood magnitude and frequency at ungaged sites. Flood magnitude relates to the volume of flow that occurs over some period of time and usually is presented in cubic feet per second. Flood frequency relates to the probability of occurrence of a flood; that is, on average, what is the likelihood that a flood with a specified magnitude will occur in any given year (1 percent chance, 10 percent chance, 50 percent chance, and so on). Such flood estimates are needed for the efficient design of bridges, highway embankments, levees, and other structures near streams. In addition, these estimates are needed for the effective planning and management of land and water resources, to protect lives and property in flood-prone areas, and to determine flood-insurance rates.

  6. Methods for determining magnitude and frequency of floods in California, based on data through water year 2006

    USGS Publications Warehouse

    Gotvald, Anthony J.; Barth, Nancy A.; Veilleux, Andrea G.; Parrett, Charles

    2012-01-01

    Methods for estimating the magnitude and frequency of floods in California that are not substantially affected by regulation or diversions have been updated. Annual peak-flow data through water year 2006 were analyzed for 771 streamflow-gaging stations (streamgages) in California having 10 or more years of data. Flood-frequency estimates were computed for the streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to logarithms of annual peak flows for each streamgage. Low-outlier and historic information were incorporated into the flood-frequency analysis, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low outliers. Special methods for fitting the distribution were developed for streamgages in the desert region in southeastern California. Additionally, basin characteristics for the streamgages were computed by using a geographical information system. Regional regression analysis, using generalized least squares regression, was used to develop a set of equations for estimating flows with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities for ungaged basins in California that are outside of the southeastern desert region. Flood-frequency estimates and basin characteristics for 630 streamgages were combined to form the final database used in the regional regression analysis. Five hydrologic regions were developed for the area of California outside of the desert region. The final regional regression equations are functions of drainage area and mean annual precipitation for four of the five regions. In one region, the Sierra Nevada region, the final equations are functions of drainage area, mean basin elevation, and mean annual precipitation. Average standard errors of prediction for the regression equations in all five regions range from 42.7 to 161.9 percent. For the desert region of California, an analysis of 33 streamgages was used to develop regional estimates
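    The distributional idea underlying the at-site analysis above is a Pearson Type III fit to the logarithms of annual peaks. The sketch below shows a plain fit with scipy as an illustration only; it is not the expected moments algorithm or the generalized Grubbs-Beck screening used in the study, and the input file is assumed.

    ```python
    # Hedged sketch: fit a Pearson Type III distribution to log10 annual peaks and read
    # off the flow with a 1-percent annual exceedance probability (plain MLE fit, not EMA).
    import numpy as np
    from scipy.stats import pearson3

    peaks = np.loadtxt("streamgage_annual_peaks.txt")   # assumed annual peak flows (cfs)
    logq = np.log10(peaks)
    skew, loc, scale = pearson3.fit(logq)
    q01 = 10 ** pearson3.isf(0.01, skew, loc=loc, scale=scale)
    print(f"1-percent AEP flow: {q01:.0f} cfs")
    ```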

  7. Are we preventing flood damage eco-efficiently? An integrated method applied to post-disaster emergency actions.

    PubMed

    Petit-Boix, Anna; Arahuetes, Ana; Josa, Alejandro; Rieradevall, Joan; Gabarrell, Xavier

    2017-02-15

    Flood damage results in economic and environmental losses for society, but flood prevention also entails an initial investment in infrastructure. This study presents an integrated eco-efficiency approach for assessing flood prevention and avoided damage. We focused on ephemeral streams in the Maresme region (Catalonia, Spain), which is an urbanized area affected by damaging torrential events. Our goal was to determine the feasibility of post-disaster emergency actions implemented after a major event through an integrated hydrologic, environmental and economic approach. Life cycle assessment (LCA) and costing (LCC) were used to determine the eco-efficiency of these actions, and their net impact and payback were calculated by integrating avoided flood damage. Results showed that the actions effectively reduced damage generation when compared to the registered water flows and rainfall intensities. The eco-efficiency of the emergency actions resulted in 1.2 kg CO2 eq. per invested euro. When integrating the avoided damage into the initial investment, negative net impacts were obtained (e.g., -5.2E+05 € and -2.9E+04 kg CO2 eq. per event), which suggests that these interventions contributed environmental and economic benefits to society. The economic investment was recovered in two years, whereas the design could be improved to reduce its environmental footprint, which is recovered in 25 years. Our method and results highlight the effects of integrating the environmental and economic consequences of decisions at an urban scale and might help the administration and insurance companies in the design of prevention plans and climate change adaptation. Copyright © 2016 Elsevier B.V. All rights reserved.
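    The eco-efficiency and payback figures above reduce to simple ratios; the sketch below makes the bookkeeping explicit with placeholder numbers that are not the study's data.

    ```python
    # Eco-efficiency and payback arithmetic with assumed placeholder values.
    investment_eur = 1.0e6                   # assumed cost of the emergency actions
    investment_co2_kg = 1.2e6                # assumed life-cycle emissions of the actions
    avoided_damage_eur_per_event = 1.5e6     # assumed avoided economic damage per event
    avoided_co2_kg_per_event = 5.0e4         # assumed avoided emissions per event
    events_per_year = 0.35                   # assumed frequency of damaging torrential events

    eco_efficiency = investment_co2_kg / investment_eur                                  # kg CO2 eq. per invested euro
    economic_payback_yr = investment_eur / (avoided_damage_eur_per_event * events_per_year)
    environmental_payback_yr = investment_co2_kg / (avoided_co2_kg_per_event * events_per_year)
    print(eco_efficiency, economic_payback_yr, environmental_payback_yr)
    ```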

  8. A Fresh Start for Flood Estimation in Ungauged UK Catchments

    NASA Astrophysics Data System (ADS)

    Giani, Giulia; Woods, Ross

    2017-04-01

    The standard regression-based method for estimating the median annual flood in ungauged UK catchments has a high standard error (95% confidence interval is +/- a factor of 2). This is also the dominant source of uncertainty in statistical estimates of the 100-year flood. Similarly large uncertainties have been reported elsewhere. These large uncertainties make it difficult to do reliable flood design estimates for ungauged catchments. If the uncertainty could be reduced, flood protection schemes could be made significantly more cost-effective. Here we report on attempts to develop a new practical method for flood estimation in ungauged UK catchments, by making more use of knowledge about rainfall-runoff processes. Building on recent research on the seasonality of flooding, we first classify more than 1000 UK catchments into groups according to the seasonality of extreme rainfall and floods, and infer possible causal mechanisms for floods (e.g. Berghuijs et al, Geophysical Research Letters, 2016). For each group we are developing simplified rainfall-runoff-routing relationships (e.g. Viglione et al, Journal of Hydrology, 2010) which can account for spatial and temporal variability in rainfall and flood processes, as well as channel network routing effects. An initial investigation by Viglione et al suggested that the relationship between rainfall amount and flood peak could be summarised through a dimensionless response number that represents the product of the event runoff coefficient and a measure of hydrograph peakedness. Our hypothesis is that this approach is widely applicable, and can be used as the basis for flood estimation. Using subdaily and daily rainfall-runoff data for more than 1000 catchments, we identify a subset of catchments in the west of the UK where floods are generated predominantly in winter through the coincidence of heavy rain and low soil moisture deficits. Floods in these catchments can reliably be simulated with simple rainfall

  9. Participatory approaches to understanding practices of flood management across borders

    NASA Astrophysics Data System (ADS)

    Bracken, L. J.; Forrester, J.; Oughton, E. A.; Cinderby, S.; Donaldson, A.; Anness, L.; Passmore, D.

    2012-04-01

    The aim of this paper is to outline and present initial results from a study designed to identify principles of and practices for adaptive co-management strategies for resilience to flooding in borderlands using participatory methods. Borderlands are the complex and sometimes undefined spaces existing at the interface of different territories; the concept draws attention towards messy connections and disconnections (Strathern 2004; Sassen 2006). For this project the borderlands concerned are those between professional and lay knowledge, between responsible agencies, and between one nation and another. Research was focused on the River Tweed catchment, located on the Scottish-English border. This catchment is subject to complex environmental designations and rural development regimes that make integrated management of the whole catchment difficult. A multi-method approach was developed using semi-structured interviews, Q methodology and participatory GIS in order to capture wide-ranging practices for managing flooding and the judgements behind these practices, and to 'scale up' participation in the study. Professionals and local experts were involved in the research. The methodology generated a useful set of options for flood management, with research outputs easily understood by key management organisations and the wider public alike. There was a wide endorsement of alternative flood management solutions from both managers and local experts. The role of location was particularly important for ensuring communication and data sharing between flood managers from different organisations and more wide-ranging stakeholders. There were complex issues around scale: both the mismatch between communities and evidence of flooding, and the mismatch between governance and the scale of intervention for natural flood management. The multi-method approach was essential in capturing practice and the complexities around the governance of flooding. The involvement of key flood management organisations was

  10. Quantifying the effect of autonomous adaptation to global river flood projections: application to future flood risk assessments

    NASA Astrophysics Data System (ADS)

    Kinoshita, Youhei; Tanoue, Masahiro; Watanabe, Satoshi; Hirabayashi, Yukiko

    2018-01-01

    This study represents the first attempt to quantify the effects of autonomous adaptation on the projection of global flood hazards and to assess future flood risk by including this effect. A vulnerability scenario, which varies according to the autonomous adaptation effect for conventional disaster mitigation efforts, was developed based on historical vulnerability values derived from flood damage records and a river inundation simulation. Coupled with general circulation model outputs and future socioeconomic scenarios, potential future flood fatalities and economic loss were estimated. By including the effect of autonomous adaptation, our multimodel ensemble estimates projected a 2.0% decrease in potential flood fatalities and an 821% increase in potential economic losses by 2100 under the highest emission scenario together with a large population increase. Vulnerability changes reduced potential flood consequences by 64%-72% in terms of potential fatalities and 28%-42% in terms of potential economic losses by 2100. Although socioeconomic changes made the greatest contribution to the potential increased consequences of future floods, about a half of the increase of potential economic losses was mitigated by autonomous adaptation. There is a clear and positive relationship between the global temperature increase from the pre-industrial level and the estimated mean potential flood economic loss, while there is a negative relationship with potential fatalities due to the autonomous adaptation effect. A bootstrapping analysis suggests a significant increase in potential flood fatalities (+5.7%) without any adaptation if the temperature increases by 1.5 °C-2.0 °C, whereas the increase in potential economic loss (+0.9%) was not significant. Our method enables the effects of autonomous adaptation and additional adaptation efforts on climate-induced hazards to be distinguished, which would be essential for the accurate estimation of the cost of adaptation to

  11. Techniques for estimating flood hydrographs for ungaged urban watersheds

    USGS Publications Warehouse

    Stricker, V.A.; Sauer, V.B.

    1984-01-01

    The Clark Method, modified slightly, was used to develop a synthetic, dimensionless hydrograph which can be used to estimate flood hydrographs for ungaged urban watersheds. Application of the technique results in a typical (average) flood hydrograph for a given peak discharge. The input necessary to apply the technique is an estimate of basin lagtime and the recurrence-interval peak discharge. Equations for this purpose were obtained from a recent nationwide study on flood frequency in urban watersheds. A regression equation was developed which relates flood volumes to drainage area size, basin lagtime, and peak discharge. This equation is useful where storage of floodwater may be a part of the design of flood prevention measures. (USGS)
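    Applying a dimensionless hydrograph of the kind described above amounts to rescaling tabulated ordinates (discharge as a fraction of the peak versus time as a fraction of basin lagtime) by a site's estimated peak flow and lagtime. The ordinates and inputs in the sketch below are placeholders, not the published dimensionless hydrograph.

    ```python
    # Sketch of scaling a dimensionless hydrograph to a site (placeholder ordinates).
    import numpy as np

    t_ratio = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0])     # t / basin lagtime
    q_ratio = np.array([0.0, 0.3, 1.0, 0.7, 0.4, 0.15, 0.0])    # Q / peak discharge

    def flood_hydrograph(peak_cfs, lagtime_hr):
        """Return (time in hours, discharge in cfs) for an ungaged urban site."""
        return t_ratio * lagtime_hr, q_ratio * peak_cfs

    t, q = flood_hydrograph(peak_cfs=4200.0, lagtime_hr=3.5)
    print(np.trapz(q, t * 3600.0))   # approximate flood volume in cubic feet
    ```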

  12. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    NASA Astrophysics Data System (ADS)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Floods are among the most hazardous disasters and cause serious damage to people and property around the world. To prevent or mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to achieve flood flow simulation with complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of full shallow water equation models have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To support these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet and dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell becomes flooded, register its surrounding cells. The cost of this additional process is kept low by checking only cells at the wet and dry interface, and the overall computation time is reduced by skipping non-flooded areas. This algorithm is easily applied to any type of 2-D flood-inundation model. The proposed ADU method is implemented in 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes in a factor of two to ten less computation time while producing the same results as the simulation without the ADU method.
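    The updating algorithm described above can be sketched as a set of "active" cells that grows with the flood front: only active cells are updated, and when an active cell becomes wet its neighbours are registered. The sketch below illustrates that bookkeeping only; the update_cell callback stands in for the actual 2-D solver, which is not shown.

    ```python
    # Minimal sketch of the Automatic Domain Updating bookkeeping (not the authors' code).
    def adu_step(active, wet, grid_shape, update_cell):
        """Advance one time step over active cells; grow the domain at the wet/dry interface."""
        rows, cols = grid_shape
        newly_wet = set()
        for cell in list(active):
            if update_cell(cell):              # assumed callback: returns True if the cell is wet after this step
                wet.add(cell)
                newly_wet.add(cell)
        for (r, c) in newly_wet:               # register neighbours of newly wet cells
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    active.add((nr, nc))
        return active, wet
    ```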

  13. An Integrated Urban Flood Analysis System in South Korea

    NASA Astrophysics Data System (ADS)

    Moon, Young-Il; Kim, Min-Seok; Yoon, Tae-Hyung; Choi, Ji-Hyeok

    2017-04-01

    Due to climate change and rapid urbanization, the increased frequency of concentrated heavy rainfall has caused urban floods. We therefore studied climate change in Korea and developed an integrated flood analysis system that systematizes the technology needed to quantify flood risk and to forecast floods in urban areas. This system supports synthetic decision-making through real-time monitoring and prediction of flash rain or short-term rainfall by using radar and satellite information. As part of the measures to deal with the increase in inland flood damage, we found it necessary to build a systematic urban flood prevention system that quantifies flood risk and produces flood forecasts while taking into consideration both inland water and river water. This combined inland-river flood analysis system predicts flash rain or short-term rainfall by using radar and satellite information and performs prompt and accurate prediction of the inland flooded area. In addition, flood forecasts should be both accurate and immediate: accuracy means that the predicted watch and warning times and water levels are precise, and immediacy refers to the forecast lead time, which is the time available to evacuate. Therefore, in this study, in order to apply the rainfall-runoff method to medium and small urban streams for flood forecasting, short-term rainfall forecasting using radar is applied to improve immediacy. Finally, the system supports synthetic decision-making for the prevention of flood disasters through real-time monitoring. Keywords: Urban Flood, Integrated flood analysis system, Rainfall forecasting, Korea Acknowledgments This research was supported by a grant (16AWMP-B066744-04) from the Advanced Water Management Research Program (AWMP) funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

  14. Compounding effects of sea level rise and fluvial flooding.

    PubMed

    Moftakhari, Hamed R; Salvadori, Gianfausto; AghaKouchak, Amir; Sanders, Brett F; Matthew, Richard A

    2017-09-12

    Sea level rise (SLR), a well-documented and urgent aspect of anthropogenic global warming, threatens population and assets located in low-lying coastal regions all around the world. Common flood hazard assessment practices typically account for one driver at a time (e.g., either fluvial flooding only or ocean flooding only), whereas coastal cities vulnerable to SLR are at risk for flooding from multiple drivers (e.g., extreme coastal high tide, storm surge, and river flow). Here, we propose a bivariate flood hazard assessment approach that accounts for compound flooding from river flow and coastal water level, and we show that a univariate approach may not appropriately characterize the flood hazard if there are compounding effects. Using copulas and bivariate dependence analysis, we also quantify the increases in failure probabilities for 2030 and 2050 caused by SLR under representative concentration pathways 4.5 and 8.5. Additionally, the increase in failure probability is shown to be strongly affected by compounding effects. The proposed failure probability method offers an innovative tool for assessing compounding flood hazards in a warming climate.
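    The compounding effect discussed above can be illustrated with a simple joint-exceedance calculation: for given marginal exceedance probabilities of river flow and coastal water level, positive dependence raises the probability that both thresholds are exceeded together relative to independence. The sketch below uses a Gaussian copula purely as a stand-in for the copula families fitted in the study.

    ```python
    # Hedged illustration of compound flooding with a Gaussian copula (not the study's fit).
    from scipy.stats import norm, multivariate_normal

    def joint_exceedance(u, v, rho):
        """P(X > x, Y > y) given marginal exceedance probabilities u, v and copula correlation rho."""
        zu, zv = norm.ppf(1 - u), norm.ppf(1 - v)
        c = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]]).cdf([zu, zv])
        return 1 - (1 - u) - (1 - v) + c

    print(joint_exceedance(0.01, 0.01, rho=0.0))   # ~1e-4 under independence
    print(joint_exceedance(0.01, 0.01, rho=0.6))   # noticeably larger with positive dependence
    ```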

  15. Field-trip guide to Columbia River flood basalts, associated rhyolites, and diverse post-plume volcanism in eastern Oregon

    USGS Publications Warehouse

    Ferns, Mark L.; Streck, Martin J.; McClaughry, Jason D.

    2017-08-09

    The Miocene Columbia River Basalt Group (CRBG) is the youngest and best preserved continental flood basalt province on Earth, linked in space and time with a compositionally diverse succession of volcanic rocks that partially record the apparent emergence and passage of the Yellowstone plume head through eastern Oregon during the late Cenozoic. This compositionally diverse suite of volcanic rocks are considered part of the La Grande-Owyhee eruptive axis (LOEA), an approximately 300-kilometer-long (185 mile), north-northwest-trending, middle Miocene to Pliocene volcanic belt located along the eastern margin of the Columbia River flood basalt province. Volcanic rocks erupted from and preserved within the LOEA form an important regional stratigraphic link between the (1) flood basalt-dominated Columbia Plateau on the north, (2) bimodal basalt-rhyolite vent complexes of the Owyhee Plateau on the south, (3) bimodal basalt-rhyolite and time-transgressive rhyolitic volcanic fields of the Snake River Plain-Yellowstone Plateau, and (4) the High Lava Plains of central Oregon.This field-trip guide describes a 4-day geologic excursion that will explore the stratigraphic and geochemical relationships among mafic rocks of the Columbia River Basalt Group and coeval and compositionally diverse volcanic rocks associated with the early “Yellowstone track” and High Lava Plains in eastern Oregon. Beginning in Portland, the Day 1 log traverses the Columbia River gorge eastward to Baker City, focusing on prominent outcrops that reveal a distal succession of laterally extensive, large-volume tholeiitic flood lavas of the Grande Ronde, Wanapum, and Saddle Mountains Basalt formations of the CRBG. These “great flows” are typical of the well-studied flood basalt-dominated Columbia Plateau, where interbedded silicic and calc-alkaline lavas are conspicuously absent. The latter part of Day 1 will highlight exposures of middle to late Miocene silicic ash-flow tuffs, rhyolite domes, and

  16. 21 CFR 864.7660 - Leukocyte alkaline phosphatase test.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Leukocyte alkaline phosphatase test. 864.7660... Leukocyte alkaline phosphatase test. (a) Identification. A leukocyte alkaline phosphatase test is a device used to identify the enzyme leukocyte alkaline phosphatase in neutrophilic granulocytes (granular...

  17. 21 CFR 864.7660 - Leukocyte alkaline phosphatase test.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Leukocyte alkaline phosphatase test. 864.7660... Leukocyte alkaline phosphatase test. (a) Identification. A leukocyte alkaline phosphatase test is a device used to identify the enzyme leukocyte alkaline phosphatase in neutrophilic granulocytes (granular...

  18. 21 CFR 864.7660 - Leukocyte alkaline phosphatase test.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Leukocyte alkaline phosphatase test. 864.7660... Leukocyte alkaline phosphatase test. (a) Identification. A leukocyte alkaline phosphatase test is a device used to identify the enzyme leukocyte alkaline phosphatase in neutrophilic granulocytes (granular...

  19. An integrated simulation method for flash-flood risk assessment: 2. Effects of changes in land-use under a historical perspective

    NASA Astrophysics Data System (ADS)

    Rosso, R.; Rulli, M. C.

    The influence of land use changes on flood occurrence and severity in the Bisagno River (Tyrrhenian Liguria, NW Italy) is investigated using a Monte Carlo simulation approach (Rulli and Rosso, 2002). High-resolution land-use maps for the area were reconstructed and scenario simulations were made for a pre-industrial (1878), an intermediate (1930) and a current (1980) year. Land-use effects were explored to assess the consequences of distributed changes in land use due to agricultural practice and urbanisation. Hydraulic conveyance effects were considered, to assess the consequences of channel modifications associated with engineering works in the lower Bisagno River network. Flood frequency analyses of the annual flood series, retrieved from the simulations, were used to examine the effect of land-use change and river conveyance on flood regime. The impact of these effects proved to be negligible in the upper Bisagno River, moderate in the downstream river and severe in the small tributaries in the lower Bisagno valley that drain densely populated urban areas. The simulation approach is shown to be capable of incorporating historical data on landscape and river patterns into quantitative methods for risk assessment.

  20. Is alkaline phosphatase the smoking gun for highly refractory primitive leukemic cells?

    PubMed Central

    Rico, Laura G.; Juncà, Jordi; Ward, Mike D.; Bradford, Jolene; Petriz, Jordi

    2016-01-01

    With the aim to detect candidate malignant primitive progenitor populations, we modified an original alkaline phosphatase (ALP) stem cell detection method based on the identification of alkaline phosphatase fluorescent cells in combination with flow cytometry immunophenotyping. Over a period of one year, we have been using this technique to study its activity in patients with leukemia and lymphoma, showing that changes in the alkaline phosphatase levels can be used to detect rare populations of highly refractory malignant cells. By screening different blood cancers, we have observed that this activity is not always restricted to CD34+ leukemic cells, and can be overexpressed in CD34 negative leukemia. We have verified that this method gives accurate and reproducible measurements and our preliminary results suggest that CD34+/ALPhigh cells appear to sustain leukemogenesis over time. PMID:27732563

  1. Flood inundation extent mapping based on block compressed tracing

    NASA Astrophysics Data System (ADS)

    Shen, Dingtao; Rui, Yikang; Wang, Jiechen; Zhang, Yu; Cheng, Liang

    2015-07-01

    Flood inundation extent, depth, and duration are important factors affecting flood hazard evaluation. At present, flood inundation analysis is based mainly on a seeded region-growing algorithm, which is an inefficient process because it requires excessive recursive computations and is incapable of processing massive datasets. To address this problem, we propose a block compressed tracing algorithm for mapping the flood inundation extent, which reads the DEM data in blocks before transferring them to raster compression storage. This allows a smaller computer memory to process a larger amount of data, which overcomes the memory limitation of the regular seeded region-growing algorithm. In addition, the use of a raster boundary tracing technique allows the algorithm to avoid the time-consuming computations required by seeded region-growing. Finally, we conduct a comparative evaluation in the Chin-sha River basin; the results show that the proposed method solves the problem of flood inundation extent mapping based on massive DEM datasets with higher computational efficiency than the original method, which makes it suitable for practical applications.
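
    For context, the baseline that the block compressed tracing algorithm improves upon can be sketched as a seeded region-growing flood fill over a DEM; the sketch below is an illustrative, fully in-memory version only and does not reproduce the paper's block compression or boundary tracing.

    ```python
    # Sketch of the baseline seeded region-growing approach (not the block
    # compressed tracing algorithm): starting from a seed cell, grow the
    # inundated region over all connected DEM cells below a given water level.
    from collections import deque
    import numpy as np

    def region_grow_inundation(dem, seed, water_level):
        rows, cols = dem.shape
        flooded = np.zeros_like(dem, dtype=bool)
        if dem[seed] > water_level:
            return flooded                      # seed cell is dry; nothing to grow
        flooded[seed] = True
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):   # 4-connectivity
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols and not flooded[rr, cc]:
                    if dem[rr, cc] <= water_level:
                        flooded[rr, cc] = True
                        queue.append((rr, cc))
        return flooded

    # Toy example: a small synthetic DEM with the seed placed in the channel.
    dem = np.array([[5, 4, 3], [4, 2, 3], [5, 3, 4]], dtype=float)
    print(region_grow_inundation(dem, seed=(1, 1), water_level=3.5))
    ```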

  2. Unstructured mesh adaptivity for urban flooding modelling

    NASA Astrophysics Data System (ADS)

    Hu, R.; Fang, F.; Salinas, P.; Pain, C. C.

    2018-05-01

    Over the past few decades, urban floods have been gaining more attention due to their increase in frequency. To provide reliable flooding predictions in urban areas, various numerical models have been developed to perform high-resolution flood simulations. However, the use of high-resolution meshes across the whole computational domain causes a high computational burden. In this paper, a 2D control-volume and finite-element flood model using adaptive unstructured mesh technology has been developed. This adaptive unstructured mesh technique enables meshes to be adapted optimally in time and space in response to the evolving flow features, thus providing sufficient mesh resolution where and when it is required. It has the advantage of capturing the details of local flows and of the wetting and drying front while reducing the computational cost. Complex topographic features are represented accurately during the flooding process. For example, high-resolution meshes are placed around buildings and steep regions when the flood water reaches these regions. In this work a flooding event that happened in 2002 in Glasgow, Scotland, United Kingdom has been simulated to demonstrate the capability of the adaptive unstructured mesh flooding model. The simulations have been performed using both fixed and adaptive unstructured meshes, and the results have been compared with previously published 2D and 3D results. The presented method shows that the 2D adaptive mesh model provides accurate results while having a low computational cost.

  3. Process-based model with flood control measures towards more realistic global flood modeling

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Zhang, X.; Wang, Y.; Mu, M.; Lv, A.; Li, Z.

    2017-12-01

    In the profoundly human-influenced era, the Anthropocene, an increasing amount of land has been developed in flood plains, and many flood control measures have been implemented to protect people and infrastructure placed in flood-prone areas. These human influences (for example, dams and dykes) have altered peak streamflow and flood risk, and are already an integral part of flooding. However, most process-based flood models have yet to take these human influences into account. In this study, we used a hydrological model together with an advanced hydrodynamic model to assess flood risk in the Baiyangdian catchment. The Baiyangdian Lake is the largest shallow freshwater lake in North China, and it was used as a flood storage area in the past. A new development hub for the Beijing-Tianjin-Hebei economic triangle, namely the Xiongan new area, was recently established in the flood-prone area around the lake. The Shuttle Radar Topography Mission (SRTM) digital elevation model (DEM) was used to parameterize the hydrodynamic model simulation, and the inundation estimates were compared with published flood maps and observed inundation areas during extreme historical flood events. A simple scheme was used to consider the impacts of flood control measures, including the reservoirs in the headwaters and the dykes to be built. By comparing model simulations with and without the influences of flood control measures, we demonstrate the importance of human influences in altering the inundated area and depth under design flood conditions. Based on the SRTM DEM and dam and reservoir data in the Global Reservoir and Dam (GRanD) database, we further discuss the potential to develop a global flood model with human influences.

  4. Hot wet spots of Swiss buildings - detecting clusters of flood exposure

    NASA Astrophysics Data System (ADS)

    Röthlisberger, Veronika; Zischg, Andreas; Keiler, Margreth

    2016-04-01

    Where are the hotspots of flood exposure in Switzerland? There is no single answer but rather a wide range of findings depending on the databases and methods used. In principle, the analysis of flood exposure is the overlay of two spatial datasets, one on flood hazard and one on assets, e.g. buildings. The presented study aims to test a newly developed approach based on publicly available Swiss data. On the hazard side, these are two different types of flood hazard maps, each representing a similar return period beyond the dimensioning of structural protection systems. On the asset side, we use nationwide harmonized data on buildings, namely a complete dataset of building polygons to which we assign features such as volume, residents and monetary value. For the latter we apply findings from multivariate analyses of insurance data. By overlaying building polygons with the flood hazard map we identify the exposed buildings. We analyse the resulting spatial distribution of flood exposure at different scales (local to regional) using administrative units (e.g. municipalities) but also artificial grids of corresponding size (e.g. 5 000 m). The presentation focuses on the identification of hotspots, highlighting the influence of the applied data and methods, e.g. local scan statistics testing intensities within and outside potential clusters, or log relative exposure surfaces based on kernel intensity estimates. We find a major difference between hotspots identified from absolute values and from normalized values of exposure. Whereas the hotspots of flood exposure in absolute figures mirror the underlying distribution of buildings, the hotspots of flood exposure ratios show very different pictures. We conclude that findings on flood exposure vary depending on the data and, moreover, on the methods used, and therefore need to be communicated carefully and appropriately to the different stakeholders who may use the information for decision making on flood risk management.
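
    One of the hotspot diagnostics mentioned above, the log relative exposure surface, can be sketched as the log ratio of two kernel intensity estimates (exposed buildings vs. all buildings); the coordinates, bandwidth handling and threshold below are synthetic assumptions, not the study's data or exact estimator.

    ```python
    # Illustrative sketch of a log relative exposure surface: the ratio of the
    # kernel intensity of exposed buildings to that of all buildings, on a grid.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    all_xy = rng.uniform(0, 10, size=(2, 2000))      # all building centroids (x, y)
    exposed_mask = rng.random(2000) < 0.1            # pretend ~10% lie in the hazard zone
    exposed_xy = all_xy[:, exposed_mask]

    kde_all = gaussian_kde(all_xy)
    kde_exposed = gaussian_kde(exposed_xy)

    # Evaluate both intensities on a regular grid and take the log ratio.
    gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
    grid = np.vstack([gx.ravel(), gy.ravel()])
    log_relative_exposure = np.log(kde_exposed(grid) + 1e-12) - np.log(kde_all(grid) + 1e-12)

    # Flag cells where the exposed-building intensity exceeds the overall intensity.
    hotspot_cells = log_relative_exposure.reshape(gx.shape) > 0
    print(hotspot_cells.sum(), "grid cells flagged as relative-exposure hotspots")
    ```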

  5. A new approach for computing a flood vulnerability index using cluster analysis

    NASA Astrophysics Data System (ADS)

    Fernandez, Paulo; Mourato, Sandra; Moreira, Madalena; Pereira, Luísa

    2016-08-01

    A Flood Vulnerability Index (FloodVI) was developed using Principal Component Analysis (PCA) and a new aggregation method based on Cluster Analysis (CA). PCA simplifies a large number of variables into a few uncorrelated factors representing the social, economic, physical and environmental dimensions of vulnerability. CA groups areas that have the same characteristics in terms of vulnerability into vulnerability classes. The grouping of the areas determines their classification, contrary to other aggregation methods in which the areas' classification determines their grouping. While other aggregation methods distribute the areas into classes in an artificial manner, by imposing a certain probability for an area to belong to a certain class, as determined by the assumption that the aggregation measure used is normally distributed, CA does not constrain the distribution of the areas by the classes. FloodVI was designed at the neighbourhood level and was applied to the Portuguese municipality of Vila Nova de Gaia where several flood events have taken place in the recent past. The FloodVI sensitivity was assessed using three different aggregation methods: the sum of component scores, the first component score and the weighted sum of component scores. The results highlight the sensitivity of the FloodVI to different aggregation methods. Both the sum of component scores and the weighted sum of component scores have shown similar results. The first component score aggregation method classifies almost all areas as having medium vulnerability, and finally the results obtained using the CA show a distinct differentiation of the vulnerability, where hot spots can be clearly identified. The information provided by records of previous flood events corroborates the results obtained with CA, because the inundated areas with greater damages are those that are identified as high and very high vulnerability areas by CA. This supports the fact that CA provides a reliable FloodVI.
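
    A minimal sketch of the FloodVI workflow described above, assuming scikit-learn is available and using standardized indicators, a three-component PCA and k-means as the cluster analysis; the data, component count and number of classes are illustrative assumptions rather than the values used in the study.

    ```python
    # Sketch (assumed workflow, not the authors' code): reduce vulnerability
    # indicators with PCA, then group neighbourhoods into vulnerability classes
    # with cluster analysis (here k-means), and rank the classes.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    # 120 neighbourhoods x 8 social/economic/physical/environmental indicators.
    indicators = rng.normal(size=(120, 8))

    scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(indicators))
    classes = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)

    # Order the clusters by their mean component score so labels read as
    # "very low" ... "very high" vulnerability classes.
    order = np.argsort([scores[classes == k].mean() for k in range(5)])
    labels = np.empty_like(classes)
    for rank, k in enumerate(order):
        labels[classes == k] = rank
    print(np.bincount(labels))   # number of neighbourhoods per vulnerability class
    ```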

  6. Struggle in the flood: tree responses to flooding stress in four tropical floodplain systems

    PubMed Central

    Parolin, Pia; Wittmann, Florian

    2010-01-01

    Background and aims In the context of the 200th anniversary of Charles Darwin's birth in 1809, this study discusses the variation in structure and adaptation associated with survival and reproductive success in the face of environmental stresses in the trees of tropical floodplains. Scope We provide a comparative review on the responses to flooding stress in the trees of freshwater wetlands in tropical environments. The four large wetlands we evaluate are: (i) Central Amazonian floodplains in South America, (ii) the Okavango Delta in Africa, (iii) the Mekong floodplains of Asia and (iv) the floodplains of Northern Australia. They each have a predictable ‘flood pulse’. Although flooding height varies between the ecosystems, the annual pulse is a major driving force influencing all living organisms and a source of stress for which specialized adaptations for survival are required. Main points The need for trees to survive an annual flood pulse has given rise to a large variety of adaptations. However, phenological responses to the flood are similar in the four ecosystems. Deciduous and evergreen species respond with leaf shedding, although sap flow remains active for most of the year. Growth depends on adequate carbohydrate supply. Physiological adaptations (anaerobic metabolism, starch accumulation) are also required. Conclusions Data concerning the ecophysiology and adaptations of trees in floodplain forests worldwide are extremely scarce. For successful floodplain conservation, more information is needed, ideally through a globally co-ordinated study using reproducible comparative methods. In the light of climatic change, with increasing drought, decreased groundwater availability and flooding periodicities, this knowledge is needed ever more urgently to facilitate fast and appropriate management responses to large-scale environmental change. PMID:22476061

  7. Revisiting regional flood frequency analysis in Slovakia: the region-of-influence method vs. traditional regional approaches

    NASA Astrophysics Data System (ADS)

    Gaál, Ladislav; Kohnová, Silvia; Szolgay, Ján.

    2010-05-01

    During the last 10-15 years, Slovak hydrologists and water resources managers have been devoting considerable efforts to developing statistical tools for modelling probabilities of flood occurrence in a regional context. Initially, these models followed concepts of regional flood frequency analysis based on fixed regions; later the Hosking and Wallis (HW; 1997) theory was adopted and modified. Nevertheless, it turned out that delineating homogeneous regions using these approaches is not a straightforward task, mostly due to the complex orography of the country. In this poster we aim at revisiting flood frequency analyses so far accomplished for Slovakia by adopting one of the pooling approaches, i.e. the region-of-influence (ROI) approach (Burn, 1990). In the ROI approach, unique pooling groups of similar sites are defined for each site under study. The similarity of sites is defined through Euclidean distance in the space of site attributes that had also proved applicable in former cluster analyses: catchment area, afforested area, hydrogeological catchment index and the mean annual precipitation. The homogeneity of the proposed pooling groups is evaluated by the built-in homogeneity test of Lu and Stedinger (1992). Two alternatives of the ROI approach are examined: in the first one, the target size of the pooling groups is adjusted to the target return period T of the estimated flood quantiles, while in the other one the target size is fixed, regardless of the target T. The statistical models of the ROI approach are compared with the conventional regionalization approach based on the HW methodology, where the parameters of flood frequency distributions were derived by means of L-moment statistics and a regional formula for the estimation of the index flood was derived by multiple regression methods using physiographic and climatic catchment characteristics. The inter-comparison of different frequency models is evaluated by means of the
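
    The core ROI step, selecting a pooling group by Euclidean distance in a standardized site-attribute space, can be sketched as follows; the attribute values, group size and standardization choice are assumptions for illustration, and the homogeneity testing and L-moment fitting are not shown.

    ```python
    # Sketch of region-of-influence pooling: for a target site, pool the gauged
    # sites closest in standardized attribute space (e.g. catchment area,
    # afforested area, hydrogeological index, mean annual precipitation).
    import numpy as np

    def roi_pooling_group(attributes, target_index, group_size):
        """Return indices of the `group_size` sites most similar to the target,
        using Euclidean distance on standardized site attributes."""
        z = (attributes - attributes.mean(axis=0)) / attributes.std(axis=0)
        dist = np.linalg.norm(z - z[target_index], axis=1)
        order = np.argsort(dist)
        return order[order != target_index][:group_size]

    rng = np.random.default_rng(1)
    site_attributes = rng.normal(size=(60, 4))   # 60 sites x 4 attributes (synthetic)
    print(roi_pooling_group(site_attributes, target_index=0, group_size=10))
    ```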

  8. Historical floods in flood frequency analysis: Is this game worth the candle?

    NASA Astrophysics Data System (ADS)

    Strupczewski, Witold G.; Kochanek, Krzysztof; Bogdanowicz, Ewa

    2017-11-01

    In flood frequency analysis (FFA) the profit from inclusion of historical information on the largest historical pre-instrumental floods depends primarily on the reliability of the information, i.e. the accuracy of the magnitude and return period of the floods. This study is focused on the possible theoretical maximum gain in accuracy of estimates of upper quantiles that can be obtained by incorporating the largest historical floods of known return periods into the FFA. We assumed a simple case: N years of systematic records of annual maximum flows and either one largest (XM1) or two largest (XM1 and XM2) flood peak flows in a historical M-year long period. The problem is explored by Monte Carlo simulations with the maximum likelihood (ML) method. Both correct and false distributional assumptions are considered. In the first case the two-parameter extreme value models (Gumbel, log-Gumbel, Weibull) with various coefficients of variation serve as parent distributions. In the case of unknown parent distribution, the Weibull distribution was assumed as the estimating model and the truncated Gumbel as the parent distribution. The return periods of XM1 and XM2 are determined from the parent distribution. The results are then compared with the case when the return periods of XM1 and XM2 are defined by their plotting positions. The results are presented in terms of bias, root mean square error and the probability of overestimation of the quantile with 100-year return period. The results of the research indicate that the maximal profit of inclusion of pre-instrumental floods in the FFA may prove smaller than the cost of reconstruction of historical hydrological information.
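
    A minimal sketch of the likelihood setup for the simplest case considered above: N systematic annual maxima plus a single largest historical flood XM1, treated as the maximum of an M-year period under a Gumbel parent. The numerical values are synthetic, and the paper's Monte Carlo experiment and alternative parent distributions are not reproduced.

    ```python
    # Sketch: maximum likelihood fit of a Gumbel distribution to systematic
    # annual maxima plus one historical flood XM1 known to be the largest in an
    # M-year period, whose contribution is m * f(x) * F(x)**(m - 1).
    import numpy as np
    from scipy.stats import gumbel_r
    from scipy.optimize import minimize

    def neg_log_likelihood(params, systematic, xm1, m_years):
        loc, log_scale = params
        scale = np.exp(log_scale)                 # keep the scale positive
        nll = -gumbel_r.logpdf(systematic, loc=loc, scale=scale).sum()
        nll -= (np.log(m_years)
                + gumbel_r.logpdf(xm1, loc=loc, scale=scale)
                + (m_years - 1) * gumbel_r.logcdf(xm1, loc=loc, scale=scale))
        return nll

    rng = np.random.default_rng(3)
    systematic = gumbel_r.rvs(loc=100.0, scale=30.0, size=40, random_state=rng)
    xm1, m_years = 260.0, 150                     # assumed historical flood and period length

    result = minimize(neg_log_likelihood,
                      x0=[systematic.mean(), np.log(systematic.std())],
                      args=(systematic, xm1, m_years), method="Nelder-Mead")
    loc_hat, scale_hat = result.x[0], np.exp(result.x[1])
    print("100-year quantile:", gumbel_r.ppf(1 - 1 / 100, loc=loc_hat, scale=scale_hat))
    ```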

  9. Nucleotide sequences encoding a thermostable alkaline protease

    DOEpatents

    Wilson, David B.; Lao, Guifang

    1998-01-01

    Nucleotide sequences, derived from a thermophilic actinomycete microorganism, which encode a thermostable alkaline protease are disclosed. Also disclosed are variants of the nucleotide sequences which encode a polypeptide having thermostable alkaline proteolytic activity. Recombinant thermostable alkaline protease or recombinant polypeptide may be obtained by culturing in a medium a host cell genetically engineered to contain and express a nucleotide sequence according to the present invention, and recovering the recombinant thermostable alkaline protease or recombinant polypeptide from the culture medium.

  10. Topography- and nightlight-based national flood risk assessment in Canada

    NASA Astrophysics Data System (ADS)

    Elshorbagy, Amin; Bharath, Raja; Lakhanpal, Anchit; Ceola, Serena; Montanari, Alberto; Lindenschmidt, Karl-Erich

    2017-04-01

    In Canada, flood analysis and water resource management, in general, are tasks conducted at the provincial level; therefore, unified national-scale approaches to water-related problems are uncommon. In this study, a national-scale flood risk assessment approach is proposed and developed. The study focuses on using global and national datasets available with various resolutions to create flood risk maps. First, a flood hazard map of Canada is developed using topography-based parameters derived from digital elevation models, namely, elevation above nearest drainage (EAND) and distance from nearest drainage (DFND). This flood hazard mapping method is tested on a smaller area around the city of Calgary, Alberta, against a flood inundation map produced by the city using hydraulic modelling. Second, a flood exposure map of Canada is developed using a land-use map and the satellite-based nightlight luminosity data as two exposure parameters. Third, an economic flood risk map is produced, and subsequently overlaid with population density information to produce a socioeconomic flood risk map for Canada. All three maps of hazard, exposure, and risk are classified into five classes, ranging from very low to severe. A simple way to include flood protection measures in hazard estimation is also demonstrated using the example of the city of Winnipeg, Manitoba. This could be done for the entire country if information on flood protection across Canada were available. The evaluation of the flood hazard map shows that the topography-based method adopted in this study is both practical and reliable for large-scale analysis. Sensitivity analysis regarding the resolution of the digital elevation model is needed to identify the resolution that is fine enough for reliable hazard mapping, but coarse enough for computational tractability. The nightlight data are found to be useful for exposure and risk mapping in Canada; however, uncertainty analysis should be conducted to investigate the
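
    A simplified sketch of the two topographic parameters named above, EAND and DFND, computed here with Euclidean nearest drainage cells rather than flow-path distances, which is a simplifying assumption; the DEM and drainage mask are toy inputs.

    ```python
    # Simplified sketch of elevation above nearest drainage (EAND) and distance
    # from nearest drainage (DFND), using Euclidean "nearest" as an approximation.
    import numpy as np
    from scipy import ndimage

    def eand_dfnd(dem, drainage_mask, cellsize=1.0):
        """dem: 2-D elevation array; drainage_mask: True where a drainage cell lies."""
        # Distance to, and index of, the nearest drainage cell for every DEM cell.
        dist, (ri, ci) = ndimage.distance_transform_edt(~drainage_mask, return_indices=True)
        dfnd = dist * cellsize
        eand = dem - dem[ri, ci]        # elevation difference to that drainage cell
        return eand, dfnd

    dem = np.array([[12., 11., 10.],
                    [11.,  9., 10.],
                    [13., 10., 11.]])
    drainage = np.zeros_like(dem, dtype=bool)
    drainage[1, 1] = True               # single channel cell in this toy example
    eand, dfnd = eand_dfnd(dem, drainage, cellsize=100.0)
    print(eand)
    print(dfnd)
    ```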

  11. Regional flood-frequency relations for streams with many years of no flow

    USGS Publications Warehouse

    Hjalmarson, Hjalmar W.; Thomas, Blakemore E.; ,

    1990-01-01

    In the southwestern United States, flood-frequency relations for streams that drain small arid basins are difficult to estimate, largely because of the extreme temporal and spatial variability of floods and the many years of no flow. A method is proposed that is based on the station-year method. The new method produces regional flood-frequency relations using all available annual peak-discharge data. The prediction errors for the relations are directly assessed using randomly selected subsamples of the annual peak discharges.

  12. An operational procedure for rapid flood risk assessment in Europe

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Kalas, Milan; Salamon, Peter; Bianchi, Alessandra; Alfieri, Lorenzo; Feyen, Luc

    2017-07-01

    The development of methods for rapid flood mapping and risk assessment is a key step to increase the usefulness of flood early warning systems and is crucial for effective emergency response and flood impact mitigation. Currently, flood early warning systems rarely include real-time components to assess potential impacts generated by forecasted flood events. To overcome this limitation, this study describes the benchmarking of an operational procedure for rapid flood risk assessment based on predictions issued by the European Flood Awareness System (EFAS). Daily streamflow forecasts produced for major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in terms of flood-prone areas, economic damage and affected population, infrastructures and cities. Extensive testing of the operational procedure has been carried out by analysing the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-based and report-based flood extent data, while modelled estimates of economic damage and affected population are compared against ground-based estimations. Finally, we evaluate the skill of risk estimates derived from EFAS flood forecasts with different lead times and combinations of probabilistic forecasts. Results highlight the potential of the real-time operational procedure in helping emergency response and management.

  13. Probabilistic Flood Mapping using Volunteered Geographical Information

    NASA Astrophysics Data System (ADS)

    Rivera, S. J.; Girons Lopez, M.; Seibert, J.; Minsker, B. S.

    2016-12-01

    Flood extent maps are widely used by decision makers and first responders to provide critical information that prevents economic impacts and the loss of human lives. These maps are usually obtained from sensory data and/or hydrologic models, which often have limited coverage in space and time. Recent developments in social media and communication technology have created a wealth of near-real-time, user-generated content during flood events in many urban areas, such as flooded locations, pictures of flooding extent and height, etc. These data could improve decision-making and response operations as events unfold. However, the integration of these data sources has been limited due to the need for methods that can extract and translate the data into useful information for decision-making. This study presents an approach that uses volunteer geographic information (VGI) and non-traditional data sources (i.e., Twitter, Flickr, YouTube, and 911 and 311 calls) to generate/update the flood extent maps in areas where no models and/or gauge data are operational. The approach combines Web-crawling and computer vision techniques to gather information about the location, extent, and water height of the flood from unstructured textual data, images, and videos. These estimates are then used to provide an updated flood extent map for areas surrounding the geo-coordinate of the VGI through the application of a Hydro Growing Region Algorithm (HGRA). HGRA combines hydrologic and image segmentation concepts to estimate a probabilistic flooding extent along the corresponding creeks. Results obtained for a case study in Austin, TX (i.e., 2015 Memorial Day flood) were comparable to those obtained by a calibrated hydrologic model and had good spatial correlation with flooding extents estimated by the Federal Emergency Management Agency (FEMA).

  14. Past and present floods in South Moravia

    NASA Astrophysics Data System (ADS)

    Brázdil, Rudolf; Chromá, Kateřina; Řezníčková, Ladislava; Valášek, Hubert; Dolák, Lukáš; Stachoň, Zdeněk; Soukalová, Eva; Dobrovolný, Petr

    2015-04-01

    Floods represent the most destructive natural phenomena in the Czech Republic, often causing great material damage or loss of human life. Systematic instrumental measurements of water levels in Moravia (the eastern part of the Czech Republic) started mainly in the 1880s-1890s, while systematic discharge measurements began in the 1910s-1920s. Different documentary evidence allows extension of our knowledge about floods prior to the instrumental period. The paper presents long-term flood chronologies for four South Moravian rivers: the Jihlava, the Svratka, the Dyje and the Morava. Different documentary data are used to extract floods, and taxation records are of particular importance among them. Since the mid-17th century, damage to property and land (fields, meadows, pastures or gardens) entitled farmers and landowners to request tax relief. Related documents of this administrative process, kept mainly in the Moravian Land Archives in Brno, make it possible to obtain detailed information about floods and their impacts. Selection of floods in the instrumental period is based on calculation of the N-year return period of peak water levels and/or peak discharges for selected hydrological stations of the corresponding rivers (with a return period of two years and more). Final flood chronologies combine floods derived from both documentary data and hydrological measurements. Despite greater inter-decadal variability, periods of higher flood frequency are c. 1821-1850 and 1921-1950 for all four rivers, and also 1891-1900 for the Dyje and Morava rivers. Flood frequency fluctuations are further compared with other Central European rivers. Uncertainties in the created chronologies, with respect to the data and methods used for compiling the long-term series and to anthropogenic changes in river catchments, are discussed. The study is a part of the research project "Hydrometeorological extremes in Southern Moravia derived from documentary evidence" supported by the Grant Agency of the Czech Republic, reg. no. 13-19831S.

  15. Copula-based assessment of the relationship between flood peaks and flood volumes using information on historical floods by Bayesian Monte Carlo Markov Chain simulations

    NASA Astrophysics Data System (ADS)

    Gaál, Ladislav; Szolgay, Ján; Bacigál, Tomáš; Kohnová, Silvia

    2010-05-01

    Copula-based estimation methods for hydro-climatological extremes have been gaining increasing attention from researchers and practitioners in the last couple of years. Unlike the traditional estimation methods, which are based on bivariate cumulative distribution functions (CDFs), copulas are a relatively flexible statistical tool that allows for modelling dependencies between two or more variables, such as flood peaks and flood volumes, without making strict assumptions on the marginal distributions. The dependence structure and the reliability of the joint estimates of hydro-climatological extremes, mainly in the right tail of the joint CDF, depend not only on the particular copula adopted but also on the data available for the estimation of the marginal distributions of the individual variables. Generally, data samples for frequency modelling have limited temporal extent, which is a considerable drawback of frequency analyses in practice. Therefore, it is advisable to use statistical methods that improve any part of the process of copula construction and result in more reliable design values of hydrological variables. The scarcity of the data sample, mostly in the extreme tail of the joint CDF, can be bypassed, e.g., by using a considerably larger amount of data simulated by rainfall-runoff analysis or by including historical information on the variables under study. The latter approach of data extension is used here to make the quantile estimates of the individual marginals of the copula more reliable. In the presented paper it is proposed to use historical information in the frequency analysis of the marginal distributions in the framework of Bayesian Monte Carlo Markov Chain (MCMC) simulations. Generally, a Bayesian approach allows for a straightforward combination of different sources of information on floods (e.g. flood data from systematic measurements and historical flood records, respectively) in terms of a product of the corresponding likelihood

  16. Urban Flood Prevention and Early Warning System in Jinan City

    NASA Astrophysics Data System (ADS)

    Feng, Shiyuan; Li, Qingguo

    2018-06-01

    The construction of urban flood control and disaster reduction systems in China is facing pressure and challenges from new types of urban water disaster. Given that it is difficult to build high-standard structural flood protection measures in urban areas, urban flood early warning is particularly important. For Jinan City, a representative inland area, an index system for early warning of floods in the Jinan urban area was established, and the fuzzy comprehensive evaluation method was adopted to evaluate the warning level. Three-hour cumulative rainfall and the results of a cellular-automaton-based urban flooding simulation (CAflood) were used as evaluation indexes, improving the accuracy and integration of urban flood control early warning.
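
    A minimal sketch of a fuzzy comprehensive evaluation of a warning level, assuming illustrative membership values and indicator weights rather than the index system actually used for Jinan.

    ```python
    # Sketch of fuzzy comprehensive evaluation: indicator memberships are
    # combined with a weighted-average fuzzy operator and the warning level with
    # the largest resulting membership is selected.
    import numpy as np

    # Rows: indicators (e.g. 3-h cumulative rainfall, simulated inundation depth,
    # drainage capacity); columns: warning levels (low, moderate, high, severe).
    membership = np.array([
        [0.0, 0.2, 0.5, 0.3],   # rainfall indicator
        [0.1, 0.4, 0.4, 0.1],   # inundation-depth indicator
        [0.3, 0.5, 0.2, 0.0],   # drainage-capacity indicator
    ])
    weights = np.array([0.5, 0.3, 0.2])      # assumed indicator weights (sum to 1)

    evaluation = weights @ membership        # weighted-average composition
    warning_levels = ["low", "moderate", "high", "severe"]
    print(evaluation, "->", warning_levels[int(np.argmax(evaluation))])
    ```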

  17. Estimation of flood environmental effects using flood zone mapping techniques in Halilrood Kerman, Iran.

    PubMed

    Boudaghpour, Siamak; Bagheri, Majid; Bagheri, Zahra

    2014-01-01

    High flood occurrences with large environmental damages have a growing trend in Iran. The dynamic movement of water during a flood causes different environmental damages in geographical areas with different characteristics, such as topographic conditions. In general, the environmental effects and damages caused by a flood in an area can be investigated from different points of view. The current study aims at detecting the environmental effects of flood occurrences in the Halilrood catchment area of Kerman province in Iran using flood zone mapping techniques. The flood zone map was produced in four steps. Steps 1 to 3 serve to calculate and estimate the flood zone map of the study area, while step 4 estimates the environmental effects of flood occurrence. Based on our studies, a wide range of accuracy for estimating the environmental effects of flood occurrence was obtained using flood zone mapping techniques. Moreover, it was identified that the existing Jiroft dam in the study area can decrease the flood zone from 260 hectares to 225 hectares and can also decrease flood peak intensity by 20%. As a result, 14% of the flood zone in the study area can be saved environmentally.

  18. Flood Risk Assessments of Architectural Heritage - Case of Changgyeonggung Palace

    NASA Astrophysics Data System (ADS)

    Lee, Hyosang; Kim, Ji-sung; Lee, Ho-jin

    2014-05-01

    The risk of natural disasters such as floods and earthquakes has increased due to recent extreme weather events. Therefore, the necessity of a risk management system to protect architectural properties, a cultural heritage of humanity, from natural disasters has been consistently felt. Solutions for managing flood risk focusing on architectural heritage are suggested and applied to protect Changgyeonggung Palace, a major palace heritage site in Seoul. Probable rainfall scenarios for risk assessment (frequency: 100 years, 200 years, and 500 years), a probable maximum precipitation (PMP) scenario, and a previous rainfall event (from July 26th to 28th, 2011) are used in the models (HEC-HMS, SWMM) to estimate the flood amount for the area covering Changgyeonggung Palace. This flood amount makes it possible to identify inundation risks based on GIS models and to assess the flood risk of individual architectural heritage. The results of the risk assessment are used to establish a disaster risk management system that managers of architectural properties can utilize. According to the results of assessing the flood risk of Changgyeonggung Palace, inundation occurs near the outlets of Changgyeonggung Palace and along sections of the river channel for all flood risk scenarios, but the inundation risk of major architectural properties was estimated to be low. The methods for assessing the flood risk of architectural heritage proposed in this study, and the risk management system for Changgyeonggung Palace using these methods, offer thorough solutions for flood risk management, and the possibility of using these solutions appears high. A comprehensive management system for architectural heritage will be established in the future through a review of diverse disaster factors.

  19. Flood Risk Management in Iowa through an Integrated Flood Information System

    NASA Astrophysics Data System (ADS)

    Demir, Ibrahim; Krajewski, Witold

    2013-04-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts both short-term and seasonal, flood-related data, information and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and flooding scenarios with contributions from multiple rivers. Real-time and historical data of water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. Simple 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information on IFIS is also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms including tablets and mobile devices. The IFIS includes a rainfall-runoff forecast model to provide a five-day flood risk estimate for around 1100 communities in Iowa. Multiple view modes in the IFIS accommodate different user types from general public to researchers and decision makers by providing different levels of tools and details. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert

  20. Development of evaluation method of flood risk in Tokyo metropolitan area

    NASA Astrophysics Data System (ADS)

    Hirano, J.; Dairaku, K.

    2012-12-01

    Flood is one of the most significant natural hazards in Japan. In particular, the Tokyo metropolitan area has been affected by several large flood disasters. Investigating potential flood risk in the Tokyo metropolitan area is therefore important for the development of climate change adaptation strategies. We aim to develop a method for evaluating flood risk in the Tokyo metropolitan area by considering the effects of historical land use and land cover change, socio-economic change, and climatic change. The Ministry of Land, Infrastructure, Transport and Tourism in Japan published "Statistics of flood", which contains data on flood causes, number of damaged houses, area of wetted surface, and total amount of damage for each flood at the small-municipality level. Based on these flood data, we constructed a flood database system for the Tokyo metropolitan area for the period from 1961 to 2008 using ArcGIS software. From these data we created a flood risk curve, representing the relationship between damage and the exceedance probability of floods, for the period 1976-2008. Based on the flood risk curve, we aim to evaluate potential flood risk in the Tokyo metropolitan area and to clarify the causes of regional differences in flood risk by considering the effects of socio-economic change and climate change.
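
    The flood risk curve idea described above can be sketched as an empirical damage-exceedance curve built from annual damage totals; the damage values and the plotting position below are illustrative assumptions.

    ```python
    # Sketch: rank annual flood damages, estimate their exceedance probabilities
    # with a plotting position, and integrate the curve for an expected annual damage.
    import numpy as np

    annual_damage = np.array([0.0, 1.2, 0.0, 15.0, 3.4, 0.0, 48.0, 7.9, 0.5, 2.1])  # synthetic units

    damage_sorted = np.sort(annual_damage)[::-1]            # largest damage first
    n = damage_sorted.size
    exceedance_prob = np.arange(1, n + 1) / (n + 1)         # Weibull plotting position

    # Risk curve: pairs (exceedance probability, damage). The area under the
    # damage-vs-probability curve approximates the expected annual damage.
    expected_annual_damage = np.trapz(damage_sorted, exceedance_prob)
    print(list(zip(exceedance_prob.round(2), damage_sorted)))
    print("expected annual damage ~", round(expected_annual_damage, 2))
    ```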

  1. Nucleotide sequences encoding a thermostable alkaline protease

    DOEpatents

    Wilson, D.B.; Lao, G.

    1998-01-06

    Nucleotide sequences, derived from a thermophilic actinomycete microorganism, which encode a thermostable alkaline protease are disclosed. Also disclosed are variants of the nucleotide sequences which encode a polypeptide having thermostable alkaline proteolytic activity. Recombinant thermostable alkaline protease or recombinant polypeptide may be obtained by culturing in a medium a host cell genetically engineered to contain and express a nucleotide sequence according to the present invention, and recovering the recombinant thermostable alkaline protease or recombinant polypeptide from the culture medium. 3 figs.

  2. Frequency analyses for recent regional floods in the United States

    USGS Publications Warehouse

    Melcher, Nick B.; Martinez, Patsy G.; ,

    1996-01-01

    During 1993-95, significant floods that resulted in record-high river stages, loss of life, and significant property damage occurred in the United States. The floods were caused by unique global weather patterns that produced large amounts of rain over large areas. Standard methods for flood-frequency analyses may not adequately consider the probability of recurrence of these global weather patterns.

  3. Development of Integrated Flood Analysis System for Improving Flood Mitigation Capabilities in Korea

    NASA Astrophysics Data System (ADS)

    Moon, Young-Il; Kim, Jong-suk

    2016-04-01

    Recently, people's needs are growing for a safer life and a homeland secure from unexpected natural disasters. Flood damages have been recorded every year, and since 2000 those damages have exceeded an annual average of 2 trillion won in Korea. Casualties and property damage due to flooding caused by hydrometeorological extremes have increased with climate change. Although the importance of the flooding situation is emerging rapidly, studies related to the development of an integrated management system for reducing floods are insufficient in Korea. In addition, it is difficult to effectively reduce floods without developing an integrated operation system that takes into account the sewage pipe network configuration together with river levels. Since floods result in increasing damage to infrastructure, as well as to life and property, structural and non-structural measures should be urgently established in order to effectively reduce flood damage. Therefore, in this study, we developed an integrated flood analysis system that systematizes technologies for quantifying flood risk and for flood forecasting, supporting synthetic decision-making through real-time monitoring and prediction of flash rain or short-term rainfall using radar and satellite information in Korea. Keywords: Flooding, Integrated flood analysis system, Rainfall forecasting, Korea. Acknowledgments: This work was carried out with the support of the "Cooperative Research Program for Agriculture Science & Technology Development (Project No. PJ011686022015)", Rural Development Administration, Republic of Korea.

  4. The application of the Contingent Valuation method towards the assessment of the impacts emerged from the March 2006 floods in the Evros River. An experts-based survey.

    NASA Astrophysics Data System (ADS)

    Markantonis, V.; Bithas, K.

    2009-04-01

    In March 2006 Greece was struck by severe flooding, which caused significant damage in the Prefecture of Evros, on the eastern border of Greece. 250 million m² of farmland was flooded, causing severe damage to agriculture, transport and water supply networks. Total direct damages are estimated at € 372 million. The negative effect on economic activity caused by the floods, considered the worst of the last 50 years, took place in an area that had already been severely affected by floods in 2005. Apart from the direct damages, the indirect environmental and social impacts were also critical. The need for economic analysis in the design and implementation of efficient flood management policies is well emphasized in natural hazards policies. Within this framework, the present paper analyses the application of stated-preference valuation techniques for the assessment of the damages caused in the Prefecture of Evros by the severe floods of March 2006. The objective of this paper is to define the role of economic valuation techniques in assisting the design of efficient and sustainable policies for flood management. More specifically, the Contingent Valuation (CV) method is applied in order to value the impacts of the March 2006 floods, including the environmental impacts on the soil, the biodiversity and the aesthetic environment of the flooded areas. The paper begins with a discussion of the theoretical economic framework, and particularly the contingent valuation method framework, that can be used to evaluate flood impacts. Understanding public preferences for complex environmental policy changes, such as flood impacts, is a preeminent challenge for environmental economists and other social scientists. Information issues are central to the design and application of the survey-based contingent valuation (CV) method for valuing environmental goods. While content is under the control of the analyst, how this

  5. Linking flood peak, flood volume and inundation extent: a DEM-based approach

    NASA Astrophysics Data System (ADS)

    Rebolho, Cédric; Furusho-Percot, Carina; Blaquière, Simon; Brettschneider, Marco; Andréassian, Vazken

    2017-04-01

    Traditionally, flood inundation maps are computed based on the Shallow Water Equations (SWE) in one or two dimensions, with various simplifications that have proved to give good results. However, the complexity of the SWEs often requires a numerical resolution that can demand long computing times, as well as detailed cross-section data: this often restricts these models to rather small areas with abundant high-quality data. This, along with the need for fast inundation mapping, is the reason why rapid inundation models are being designed, working for (almost) any river with a minimum amount of data and, above all, easily available data. Our model tries to follow this path by using a 100 m DEM over France from which a drainage network and the associated drainage areas are extracted. It is based on two pre-existing methods: (1) SHYREG (Arnaud et al., 2013), a regionalized approach used to calculate the 2-year and 10-year flood quantiles (used as approximated bankfull flow and maximum discharge, respectively) for each river pixel of the DEM (below a 10 000 km2 drainage area), and (2) SOCOSE (Mailhol, 1980), which gives, amongst other things, an empirical formula for a characteristic flood duration (for each pixel) based on catchment area, average precipitation and temperature. An overflow volume for each river pixel is extracted from a triangular synthetic hydrograph designed with the SHYREG quantiles and the SOCOSE flood duration. The volume is then spread from downstream to upstream, one river pixel at a time. When the entire hydrographic network has been processed, the model stops and generates a map of the potential inundation area associated with the 10-year flood quantile. Our model can also be calibrated using past-event inundation maps by adjusting two parameters, one which modifies the overflow duration, and the other, equivalent to a minimum drainage area for river pixels to be flooded. Thus, in calibration on a sample of 42 basins, the first draft of the
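
    The overflow-volume step described above can be sketched with simple triangle geometry, assuming the 2-year quantile as bankfull flow, the 10-year quantile as the hydrograph peak and the SOCOSE characteristic duration as the hydrograph base; the numbers used are illustrative, not SHYREG outputs.

    ```python
    # Sketch of the overflow volume for a triangular synthetic hydrograph:
    # the part of the hydrograph above the bankfull flow Q2 is a similar
    # triangle whose base scales with (Qp - Qb) / Qp and whose height is Qp - Qb.
    def overflow_volume(q2_bankfull, q10_peak, duration_s):
        """Volume (m^3) above bankfull for a triangular hydrograph with peak
        q10_peak (m^3/s) and base duration duration_s (s)."""
        if q10_peak <= q2_bankfull:
            return 0.0
        excess = q10_peak - q2_bankfull
        return 0.5 * duration_s * excess * (excess / q10_peak)

    # Example with assumed quantiles and a 24-hour characteristic flood duration.
    print(overflow_volume(q2_bankfull=150.0, q10_peak=400.0, duration_s=24 * 3600))
    ```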

  6. Information Communication using Knowledge Engine on Flood Issues

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2012-04-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, flood forecasts both short-term and seasonal, and other flood-related data for communities in Iowa. The system is designed for use by the general public, often people with no domain knowledge and little general science background. To communicate more effectively with such an audience, we have introduced a new way in IFIS to get information on flood-related issues: instead of navigating through hundreds of features and interfaces of the information system and web-based sources, users obtain answers through dynamic computations based on a collection of built-in data, analysis, and methods. The IFIS Knowledge Engine connects to distributed sources of real-time stream gauges, and to in-house data sources, analysis and visualization tools, to answer questions grouped into several categories. Users are able to provide input for queries within the categories of rainfall, flood conditions, forecast, inundation maps, flood risk and data sensors. Our goal is the systematization of knowledge on flood-related issues and the provision of a single source for definitive answers to factual queries. The long-term goal of this knowledge engine is to make all flood-related knowledge easily accessible to everyone, and to provide an educational geoinformatics tool. A future implementation of the system will be able to accept free-form input and will offer voice recognition capabilities within browser and mobile applications. We intend to deliver increasing capabilities for the system over the coming releases of IFIS. This presentation provides an overview of our Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses future plans for providing knowledge on flood-related issues and resources.

  7. Large Scale Processes and Extreme Floods in Brazil

    NASA Astrophysics Data System (ADS)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach in flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).

  8. Improving enzymatic hydrolysis efficiency of wheat straw through sequential autohydrolysis and alkaline post-extraction.

    PubMed

    Wu, Xinxing; Huang, Chen; Zhai, Shengcheng; Liang, Chen; Huang, Caoxing; Lai, Chenhuan; Yong, Qiang

    2018-03-01

    In this work, a two-step pretreatment process for wheat straw was established by combining autohydrolysis pretreatment and alkaline post-extraction. The results showed that applying alkaline post-extraction to autohydrolyzed wheat straw could significantly improve its enzymatic hydrolysis efficiency, from 36.0% to 83.7%. Alkaline post-extraction led to changes in the structural characteristics of the autohydrolyzed wheat straw. Associations between enzymatic hydrolysis efficiency and structural characteristics were also studied. The results showed that structural factors such as delignification, xylan removal yield, crystallinity, accessibility and hydrophobicity are positively related to enzymatic hydrolysis efficiency within a certain range for alkaline post-extracted wheat straw. The results demonstrated that autohydrolysis coupled with alkaline post-extraction is an effective and promising method to obtain fermentable sugars from biomass. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Electrocatalysts for Hydrogen Evolution in Alkaline Electrolytes: Mechanisms, Challenges, and Prospective Solutions.

    PubMed

    Mahmood, Nasir; Yao, Yunduo; Zhang, Jing-Wen; Pan, Lun; Zhang, Xiangwen; Zou, Ji-Jun

    2018-02-01

    Hydrogen evolution reaction (HER) in alkaline medium is currently a point of focus for the sustainable development of hydrogen as an alternative clean fuel for various energy systems, but it suffers from sluggish reaction kinetics due to the additional water dissociation step. As a result, the state-of-the-art catalysts that perform well in acidic media lose considerable catalytic performance in alkaline media. This review summarizes the recent developments to overcome the kinetic issues of alkaline HER, the synthesis of materials with modified morphologies and electronic structures to tune the active sites, and their applications as efficient catalysts for HER. It first explains the fundamentals and electrochemistry of HER and then outlines the requirements for an efficient and stable catalyst in alkaline medium. The challenges with alkaline HER and the limitations of the electrocatalysts, along with prospective solutions, are then highlighted. It further describes the synthesis methods of advanced nanostructures based on carbon, noble, and inexpensive metals and their heterogeneous structures. These heterogeneous structures provide some ideal systems for analyzing the role of structure and synergy in alkaline HER catalysis. At the end, it provides concluding remarks and future perspectives that can be helpful for tuning catalyst active sites with improved electrochemical efficiencies in the future.

  10. Electrocatalysts for Hydrogen Evolution in Alkaline Electrolytes: Mechanisms, Challenges, and Prospective Solutions

    PubMed Central

    Mahmood, Nasir; Yao, Yunduo; Zhang, Jing‐Wen; Pan, Lun; Zhang, Xiangwen

    2017-01-01

    Abstract Hydrogen evolution reaction (HER) in alkaline medium is currently a point of focus for the sustainable development of hydrogen as an alternative clean fuel for various energy systems, but it suffers from sluggish reaction kinetics due to the additional water dissociation step. As a result, the state-of-the-art catalysts that perform well in acidic media lose considerable catalytic performance in alkaline media. This review summarizes the recent developments to overcome the kinetic issues of alkaline HER, the synthesis of materials with modified morphologies and electronic structures to tune the active sites, and their applications as efficient catalysts for HER. It first explains the fundamentals and electrochemistry of HER and then outlines the requirements for an efficient and stable catalyst in alkaline medium. The challenges with alkaline HER and the limitations of the electrocatalysts, along with prospective solutions, are then highlighted. It further describes the synthesis methods of advanced nanostructures based on carbon, noble, and inexpensive metals and their heterogeneous structures. These heterogeneous structures provide some ideal systems for analyzing the role of structure and synergy in alkaline HER catalysis. At the end, it provides concluding remarks and future perspectives that can be helpful for tuning catalyst active sites with improved electrochemical efficiencies in the future. PMID:29610722

  11. A hierarchical Bayesian GEV model for improving local and regional flood quantile estimates

    NASA Astrophysics Data System (ADS)

    Lima, Carlos H. R.; Lall, Upmanu; Troy, Tara; Devineni, Naresh

    2016-10-01

    We estimate local and regional Generalized Extreme Value (GEV) distribution parameters for flood frequency analysis in a multilevel, hierarchical Bayesian framework, to explicitly model and reduce uncertainties. As prior information for the model, we assume that the GEV location and scale parameters for each site come from independent log-normal distributions, whose mean parameter scales with the drainage area. From empirical and theoretical arguments, the shape parameter for each site is shrunk towards a common mean. Non-informative prior distributions are assumed for the hyperparameters, and the MCMC method is used to sample from the joint posterior distribution. The model is tested using annual maximum series from 20 streamflow gauges located in an 83,000 km2 flood-prone basin in Southeast Brazil. The results show a significant reduction in the uncertainty of flood quantile estimates relative to the traditional GEV model, particularly for sites with shorter records. For return periods within the range of the data (around 50 years), the Bayesian credible intervals for the flood quantiles tend to be narrower than the classical confidence limits based on the delta method. As the return period increases beyond the range of the data, the confidence limits from the delta method become unreliable and the Bayesian credible intervals provide a way to estimate satisfactory confidence bands for the flood quantiles, considering parameter uncertainties and regional information. In order to evaluate the applicability of the proposed hierarchical Bayesian model for regional flood frequency analysis, we estimate flood quantiles for three randomly chosen out-of-sample sites and compare them with classical estimates using the index flood method. The posterior distributions of the scaling law coefficients are used to define the predictive distributions of the GEV location and scale parameters for the out-of-sample sites given only their drainage areas and the posterior distribution of the
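
    The flavour of the Bayesian estimation can be sketched with a single-site random-walk Metropolis sampler in which the GEV shape parameter is shrunk towards a regional mean through its prior; this is only a sketch under assumed priors and synthetic data, not the paper's multilevel model.

    ```python
    # Sketch: Metropolis sampling of single-site GEV parameters with a normal
    # prior shrinking the shape towards an assumed "regional" mean, and the
    # resulting posterior credible interval for the 100-year flood quantile.
    import numpy as np
    from scipy.stats import genextreme, norm

    rng = np.random.default_rng(7)
    data = genextreme.rvs(c=-0.1, loc=500.0, scale=150.0, size=30, random_state=rng)  # synthetic annual maxima

    def log_posterior(theta):
        loc, log_scale, shape = theta
        loglik = genextreme.logpdf(data, c=shape, loc=loc, scale=np.exp(log_scale)).sum()
        if not np.isfinite(loglik):
            return -np.inf
        # Weakly informative priors; the shape is shrunk towards a regional mean.
        logprior = (norm.logpdf(loc, 500.0, 500.0)
                    + norm.logpdf(log_scale, np.log(100.0), 2.0)
                    + norm.logpdf(shape, -0.1, 0.1))
        return loglik + logprior

    theta = np.array([data.mean(), np.log(data.std()), -0.1])
    step = np.array([20.0, 0.1, 0.05])           # random-walk proposal scales
    current_lp, samples = log_posterior(theta), []
    for i in range(20000):
        proposal = theta + rng.normal(0, step)
        lp = log_posterior(proposal)
        if np.log(rng.random()) < lp - current_lp:
            theta, current_lp = proposal, lp
        if i >= 10000 and i % 10 == 0:           # thinned samples after burn-in
            samples.append(theta.copy())

    samples = np.array(samples)
    q100 = genextreme.ppf(0.99, c=samples[:, 2], loc=samples[:, 0], scale=np.exp(samples[:, 1]))
    print("posterior median and 90% credible interval for the 100-yr flood:",
          np.percentile(q100, [50, 5, 95]).round(1))
    ```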

  12. Going beyond the flood insurance rate map: insights from flood hazard map co-production

    NASA Astrophysics Data System (ADS)

    Luke, Adam; Sanders, Brett F.; Goodrich, Kristen A.; Feldman, David L.; Boudreau, Danielle; Eguiarte, Ana; Serrano, Kimberly; Reyes, Abigail; Schubert, Jochen E.; AghaKouchak, Amir; Basolo, Victoria; Matthew, Richard A.

    2018-04-01

    Flood hazard mapping in the United States (US) is deeply tied to the National Flood Insurance Program (NFIP). Consequently, publicly available flood maps provide essential information for insurance purposes, but they do not necessarily provide relevant information for non-insurance aspects of flood risk management (FRM) such as public education and emergency planning. Recent calls for flood hazard maps that support a wider variety of FRM tasks highlight the need to deepen our understanding of the factors that make flood maps useful and understandable for local end users. In this study, social scientists and engineers explore opportunities for improving the utility and relevance of flood hazard maps through the co-production of maps responsive to end users' FRM needs. Specifically, two-dimensional flood modeling produced a set of baseline hazard maps for stakeholders of the Tijuana River valley, US, and Los Laureles Canyon in Tijuana, Mexico. Focus groups with natural resource managers, city planners, emergency managers, academics, non-profit organizations, and community leaders refined the baseline hazard maps by triggering additional modeling scenarios and map revisions. Several important end-user preferences emerged, such as (1) legends that frame flood intensity both qualitatively and quantitatively, and (2) flood scenario descriptions that report flood magnitude in terms of rainfall, streamflow, and its relation to a historic event. Regarding desired hazard map content, end users' requests revealed general consistency with mapping needs reported in European studies and guidelines published in Australia. However, requested map content that is not commonly produced included (1) standing water depths following the flood, (2) the erosive potential of flowing water, and (3) pluvial flood hazards, or flooding caused directly by rainfall. We conclude that the relevance and utility of commonly produced flood hazard maps can be most improved by illustrating pluvial flood hazards

  13. Healthcare-associated infections and their prevention after extensive flooding.

    PubMed

    Apisarnthanarak, Anucha; Warren, David K; Mayhall, Clovus Glen

    2013-08-01

    This review will focus on the epidemiology of healthcare-associated infections (HAIs) after extensive blackwater flooding as well as preventive measures. There is evidence suggesting an increased incidence of HAIs and pseudo-outbreaks due to molds after extensive flooding in healthcare facilities. However, there is no strong evidence of an increased incidence of typical nosocomial infections (i.e., ventilator-associated pneumonia, healthcare-associated pneumonia, central line-associated bloodstream infection and catheter-associated urinary tract infections). The prevalence of multidrug-resistant organisms may decrease after extensive flooding, due to repeated and thorough environmental cleaning prior to re-opening hospitals. Contamination of hospital water sources by enteric Gram-negative bacteria (e.g., Aeromonas species), Legionella species and nontuberculous Mycobacterium species in flood-affected hospitals has been reported. Surveillance is an important initial step to detect potential outbreaks/pseudo-outbreaks of HAIs. Hospital preparedness policies before extensive flooding, particularly with environmental cleaning and mold remediation, are key to reducing the risk of flood-related HAIs. These policies are still lacking in most hospitals in countries that have experienced or are at risk for extensive flooding, which argues for nationwide policies to strengthen preparedness planning. Additional studies are needed to evaluate the epidemiology of flood-related HAIs and the optimal surveillance and control methods following extensive flooding.

  14. Improving flood risk mapping in Italy: the FloodRisk open-source software

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Mancusi, Leonardo; Craciun, Iulia; Sole, Aurelia; Ozunu, Alexandru

    2017-04-01

    Time and again, floods around the world illustrate the devastating impact they can have on societies. Furthermore, the expectation that flood damages will increase over time with climate change, land-use change and social growth in flood-prone areas has raised the awareness of the public and of other stakeholders (governments, international organizations, re-insurance companies and emergency responders) of the need to manage risks in order to mitigate their causes and consequences. In this light, the choice of appropriate measures, the assessment of the costs and effects of such measures, and their prioritization are crucial for decision makers. As a result, a priori flood risk assessment has become a key part of flood management practices, with the aim of minimizing the total costs related to the risk management cycle. In this context, the EU Flood Directive 2007/60 requires the delineation of flood risk maps on the basis of the most appropriate and advanced tools, with particular attention to limiting the required economic effort. The main aim of these risk maps is to provide the knowledge required for the development of flood risk management plans (FRMPs) by considering both costs and benefits of alternatives and results from consultation with all interested parties. In this context, this research project developed a free and open-source (FOSS) GIS software package, called FloodRisk, to operatively support stakeholders in their compliance with the FRMPs. FloodRisk aims to facilitate the development of risk maps and the evaluation and management of current and future flood risk for multi-purpose applications. This new approach overcomes the limits of the expert-driven qualitative (EDQ) approach currently adopted in several European countries, such as Italy, which does not permit a suitable evaluation of the effectiveness of risk mitigation strategies, because the vulnerability component cannot be properly assessed. Moreover, FloodRisk is also able to involve the citizens in the flood

  15. Variation in flooding-induced morphological traits in natural populations of white clover (Trifolium repens) and their effects on plant performance during soil flooding

    PubMed Central

    Huber, Heidrun; Jacobs, Elke; Visser, Eric J. W.

    2009-01-01

    Background and Aims Soil flooding leads to low soil oxygen concentrations and thereby negatively affects plant growth. Differences in flooding tolerance have been explained by the variation among species in the extent to which traits related to acclimation were expressed. However, our knowledge of variation within natural species (i.e. among individual genotypes) in traits related to flooding tolerance is very limited. Such data could tell us on which traits selection might have taken place, and will take place in future. The aim of the present study was to show that variation in flooding-tolerance-related traits is present among genotypes of the same species, and that both the constitutive variation and the plastic variation in flooding-induced changes in trait expression affect the performance of genotypes during soil flooding. Methods Clones of Trifolium repens originating from a river foreland were subjected to either drained, control conditions or to soil flooding. Constitutive expression of morphological traits was recorded on control plants, and flooding-induced changes in expression were compared with these constitutive expression levels. Moreover, the effect of both constitutive and flooding-induced trait expression on plant performance was determined. Key Results Constitutive and plastic variation of several morphological traits significantly affected plant performance. Even relatively small increases in root porosity and petiole length contributed to better performance during soil flooding. High specific leaf area, by contrast, was negatively correlated with performance during flooding. Conclusions The data show that different genotypes responded differently to soil flooding, which could be linked to variation in morphological trait expression. As flooded and drained conditions exerted different selection pressures on trait expression, the optimal value for constitutive and plastic traits will depend on the frequency and duration of flooding. These data

  16. Data assimilation of citizen collected information for real-time flood hazard mapping

    NASA Astrophysics Data System (ADS)

    Sayama, T.; Takara, K. T.

    2017-12-01

    Many studies in data assimilation in hydrology have focused on the integration of satellite remote sensing and in-situ monitoring data into hydrologic or land surface models. For flood prediction, recent studies have also demonstrated the assimilation of remotely sensed inundation information into flood inundation models. In actual flood disaster situations, citizen-collected information, including local reports by residents and rescue teams and, more recently, tweets via social media, also contains valuable information. The main interest of this study is how to effectively use such citizen-collected information for real-time flood hazard mapping. Here we propose a new data assimilation technique based on pre-conducted ensemble inundation simulations that updates inundation depth distributions sequentially as local data become available. The proposed method is composed of the following two steps. The first step is based on a weighted average of preliminary ensemble simulations, whose weights are updated by a Bayesian approach. The second step is based on an optimal interpolation, where the covariance matrix is calculated from the ensemble simulations. The proposed method was applied to case studies including an actual flood event. Two situations are considered: a more idealized one, which assumes that continuous flood inundation depth information is available at multiple locations, and a more realistic one for such a severe flood disaster, which assumes that only uncertain and non-continuous information is available for assimilation. The results show that, in the first, idealized situation, the large-scale inundation during the flooding was estimated reasonably well, with RMSE < 0.4 m on average. For the second, more realistic situation, the error becomes larger (RMSE 0.5 m) and the impact of the optimal interpolation becomes comparatively less effective. Nevertheless, the applications of the proposed data assimilation method demonstrated a high potential of this method for
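    A minimal sketch of the two-step update described above, with invented ensemble members and citizen reports (this is not the authors' code, and the Gaussian likelihood and error values are assumptions): step 1 re-weights pre-computed ensemble inundation maps against point observations, and step 2 applies an optimal-interpolation correction using the ensemble covariance.

```python
import numpy as np

# Hypothetical setup: N pre-computed ensemble inundation maps over M grid cells.
rng = np.random.default_rng(0)
N, M = 50, 200
ensemble = rng.gamma(shape=2.0, scale=0.5, size=(N, M))   # inundation depths (m)

obs_idx = np.array([10, 75, 150])        # grid cells with citizen reports
obs = np.array([1.2, 0.4, 0.9])          # reported depths (m)
obs_err = 0.3                            # assumed observation error std (m)

# Step 1: Bayesian update of ensemble weights from a Gaussian likelihood.
w = np.full(N, 1.0 / N)                  # prior weights
resid = ensemble[:, obs_idx] - obs       # (N, n_obs)
loglik = -0.5 * np.sum((resid / obs_err) ** 2, axis=1)
w *= np.exp(loglik - loglik.max())       # subtract max to avoid underflow
w /= w.sum()
x_b = w @ ensemble                       # weighted-average background field

# Step 2: optimal interpolation using the ensemble covariance.
anom = ensemble - ensemble.mean(axis=0)
B = anom.T @ anom / (N - 1)              # background error covariance (M, M)
H = np.zeros((len(obs_idx), M)); H[np.arange(len(obs_idx)), obs_idx] = 1.0
R = (obs_err ** 2) * np.eye(len(obs_idx))
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
x_a = x_b + K @ (obs - H @ x_b)          # analysis (updated depth field)

print("RMSE at observed cells:", np.sqrt(np.mean((x_a[obs_idx] - obs) ** 2)))
```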

  17. Development of flood profiles and flood-inundation maps for the Village of Killbuck, Ohio

    USGS Publications Warehouse

    Ostheimer, Chad J.

    2013-01-01

    Digital flood-inundation maps for a reach of Killbuck Creek near the Village of Killbuck, Ohio, were created by the U.S. Geological Survey (USGS), in cooperation with Holmes County, Ohio. The inundation maps depict estimates of the areal extent of flooding corresponding to water levels (stages) at the USGS streamgage Killbuck Creek near Killbuck (03139000) and were completed as part of an update to a Federal Emergency Management Agency flood-insurance study. The maps were provided to the National Weather Service (NWS) for incorporation into a Web-based flood-warning system that can be used in conjunction with NWS flood-forecast data to show areas of predicted flood inundation associated with forecasted flood-peak stages. The digital maps also have been submitted for inclusion in the data libraries of the USGS interactive Flood Inundation Mapper. Data from the streamgage can be used by emergency-management personnel, in conjunction with the flood-inundation maps, to help determine a course of action when flooding is imminent. Flood profiles for selected reaches were prepared by calibrating a steady-state step-backwater model to an established streamgage rating curve. The step-backwater model then was used to determine water-surface-elevation profiles for 10 flood stages at the streamgage with corresponding streamflows ranging from approximately the 50- to 0.2-percent annual exceedance probabilities. The computed flood profiles were used in combination with digital elevation data to delineate flood-inundation areas.

  18. Characterisation of seasonal flood types according to timescales in mixed probability distributions

    NASA Astrophysics Data System (ADS)

    Fischer, Svenja; Schumann, Andreas; Schulte, Markus

    2016-08-01

    When flood statistics are based on annual maximum series (AMS), the sample often contains flood peaks that differ in their genesis. If the ratios among event types change over the range of observations, the extrapolation of a probability distribution function (pdf) can be dominated by a majority of events that belong to a certain flood type. If this type is not typical for extraordinarily large extremes, such an extrapolation of the pdf is misleading. To avoid this breach of the assumption of homogeneity, seasonal models were developed that distinguish between winter and summer floods. We show that a distinction between summer and winter floods is not always sufficient if seasonal series include events with different geneses. Here, we differentiate floods by their timescales into groups of long and short events. A statistical method for such a distinction of events is presented. To demonstrate its applicability, timescales for winter and summer floods in a German river basin were estimated. It is shown that summer floods can be separated into two main groups, but in our study region the sample of winter floods consists of at least three different flood types. The pdfs of the two groups of summer floods are combined via a new mixing model. This model accounts for the fact that information about parallel events is incomplete when only their maximum values are used, because some of the realisations are overlaid. A statistical method resulting in an amendment of statistical parameters is proposed. The application in a German case study demonstrates the advantages of the new model, with specific emphasis on flood types.
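    When two flood types occur independently within a year, a common way to mix their distributions for the annual maximum is the product of the type-specific distribution functions, P(max <= x) = F1(x)*F2(x). The sketch below uses this standard form with invented Gumbel parameters; it does not reproduce the authors' amended mixing model for overlaid parallel events.

```python
import numpy as np
from scipy.stats import gumbel_r
from scipy.optimize import brentq

# Hypothetical Gumbel fits for two flood types (e.g., short and long summer events).
F1 = gumbel_r(loc=100.0, scale=30.0)   # type 1 peaks (m^3/s)
F2 = gumbel_r(loc=80.0, scale=50.0)    # type 2 peaks (m^3/s)

def mixed_cdf(x):
    """P(annual maximum <= x) when the two types occur independently."""
    return F1.cdf(x) * F2.cdf(x)

# Quantile of the mixed model for a 100-year return period (non-exceedance 0.99).
q100 = brentq(lambda x: mixed_cdf(x) - 0.99, 1.0, 5000.0)
print(f"100-year flood (mixed model): {q100:.0f} m^3/s")
print(f"100-year flood (type 1 only): {F1.ppf(0.99):.0f} m^3/s")
```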

  19. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict the inflows directly, capturing not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages, the forecast lead time and the unpredicted time. The risk within the forecast lead time is computed by counting the number of forecast scenarios that fail, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are used to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the operation as actually implemented and a scenario-based optimization, are evaluated in terms of flood risk and hydropower profit. For the 2010 flood, it is found that improving hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most of the risk comes from the forecast lead time. It is therefore valuable, for reservoir operational purposes, to reduce the spread of ensemble-based hydrologic forecasts while keeping their bias small.
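    The core bookkeeping of the two-stage risk estimate, the fraction of forecast scenarios that exceed a critical value within the lead time plus a term for the unpredicted time beyond the forecast horizon, can be sketched as below. The scenario levels, the critical level, and the stand-in for the reservoir routing of design floods are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble of forecast reservoir water levels within the lead time (m).
n_scenarios, lead_steps = 500, 24
levels = 160.0 + np.cumsum(rng.normal(0.05, 0.3, size=(n_scenarios, lead_steps)), axis=1)

critical_level = 166.0   # assumed flood-control limit

# Stage 1: risk within the forecast lead time = fraction of scenarios that
# exceed the critical level at any time step.
exceeds = levels.max(axis=1) > critical_level
risk_lead = exceeds.mean()

# Stage 2 (schematic): each scenario's level at the forecast horizon would be
# routed against the design flood; a simple threshold stands in for that routing here.
horizon_level = levels[:, -1]
risk_unpredicted = (horizon_level > critical_level - 1.0).mean()

# Entire flood risk combines the two stages (union of failure events per scenario).
fails = exceeds | (horizon_level > critical_level - 1.0)
print(f"lead-time risk: {risk_lead:.3f}, unpredicted-time proxy: {risk_unpredicted:.3f}, "
      f"total: {fails.mean():.3f}")
```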

  20. Voice-enabled Knowledge Engine using Flood Ontology and Natural Language Processing

    NASA Astrophysics Data System (ADS)

    Sermet, M. Y.; Demir, I.; Krajewski, W. F.

    2015-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts, flood-related data, information, and interactive visualizations for communities in Iowa. The IFIS is designed for use by the general public, often people with no domain knowledge and a limited general science background. To improve effective communication with such an audience, we have introduced a voice-enabled knowledge engine on flood-related issues in IFIS. Instead of requiring users to navigate the many features and interfaces of the information system and web-based sources, the system provides dynamic computations based on a collection of built-in data, analyses, and methods. The IFIS Knowledge Engine connects to real-time stream gauges, in-house data sources, and analysis and visualization tools to answer natural language questions. Our goal is the systematization of data and modeling results on flood-related issues in Iowa, and to provide an interface for definitive answers to factual queries. The goal of the knowledge engine is to make all flood-related knowledge in Iowa easily accessible to everyone, and to support voice-enabled natural language input. We aim to integrate and curate all flood-related data, implement analytical and visualization tools, and make it possible to compute answers from questions. The IFIS explicitly implements analytical methods and models as algorithms, and curates all flood-related data and resources so that all these resources are computable. The IFIS Knowledge Engine computes the answer by deriving it from its computational knowledge base. The knowledge engine processes the statement, accesses the data warehouse, runs complex database queries on the server side, and returns outputs in various formats. This presentation provides an overview of the IFIS Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses the future plans

  1. Assessment of parameter regionalization methods for modeling flash floods in China

    NASA Astrophysics Data System (ADS)

    Ragettli, Silvan; Zhou, Jian; Wang, Haijing

    2017-04-01

    Rainstorm flash floods are a common and serious phenomenon during the summer months in many hilly and mountainous regions of China. For this study, we develop a modeling strategy for simulating flood events in small river basins of four Chinese provinces (Shanxi, Henan, Beijing, Fujian). The presented research is part of preliminary investigations for the development of a national operational model for predicting and forecasting hydrological extremes in basins of size 10-2000 km2, most of which are ungauged or poorly gauged. The project is supported by the China Institute of Water Resources and Hydropower Research within the framework of the national initiative for a flood prediction and early warning system for mountainous regions in China (research project SHZH-IWHR-73). We use the USGS Precipitation-Runoff Modeling System (PRMS) as implemented in the Java modeling framework Object Modeling System (OMS). PRMS can operate at both daily and storm timescales, switching between the two using a precipitation threshold. This functionality allows the model to perform continuous simulations over several years and to switch to the storm mode to simulate storm response in greater detail. The model was set up for fifteen watersheds for which hourly precipitation and runoff data were available. First, automatic calibration based on the Shuffled Complex Evolution method was applied to different hydrological response unit (HRU) configurations. The Nash-Sutcliffe efficiency (NSE) was used as the assessment criterion, with only runoff data from storm events considered. HRU configurations reflect the drainage-basin characteristics and depend on assumptions regarding drainage density and minimum HRU size. We then assessed the sensitivity of optimal parameters to different HRU configurations. Finally, the transferability to other watersheds of optimal model parameters that were not sensitive to HRU configurations was evaluated. Model calibration for the 15
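    The calibration objective named above, the Nash-Sutcliffe efficiency restricted to storm events, is simple to compute; a sketch with made-up hourly runoff values and an assumed event threshold:

```python
import numpy as np

def nse(simulated, observed):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations about their mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((simulated - observed) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Restrict the objective to storm events only, as in the study (hypothetical data).
obs = np.array([5.0, 6.0, 40.0, 120.0, 80.0, 20.0, 7.0])   # hourly runoff (m^3/s)
sim = np.array([4.0, 7.0, 35.0, 110.0, 90.0, 25.0, 6.0])
storm_mask = obs > 10.0                                     # assumed event threshold
print(f"NSE over storm events: {nse(sim[storm_mask], obs[storm_mask]):.2f}")
```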

  2. High resolution mapping of flood hazard for South Korea

    NASA Astrophysics Data System (ADS)

    Ghosh, Sourima; Nzerem, Kechi; Zovi, Francesco; Li, Shuangcai; Mei, Yi; Assteerawatt, Anongnart; Hilberts, Arno; Tillmanns, Stephan; Mitas, Christos

    2015-04-01

    Floods are one of the primary natural hazards affecting South Korea. During the past 15 years, catastrophic flood events, which have mainly occurred during the rainy and typhoon seasons (especially when soils are already saturated), have triggered substantial property damage, with an average annual loss of around US$1.2 billion (determined from the Water Management Information System's flood damage database for the years 2002-2011) in South Korea. According to the Seoul Metropolitan Government, over 16,000 households in the capital city Seoul were inundated during the 2010 flood events. More than 10,000 households in Seoul were reportedly flooded during one major flood event due to torrential rain in July 2011. More recently, in August 2014, a serious flood event due to heavy rainfall hit the Busan region in the southeast of South Korea. Addressing these growing needs, RMS has recently released country-wide high-resolution combined flood return-period maps for post-drainage local "pluvial" inundation and undefended large-scale "fluvial" inundation to aid the government and the insurance industry in the evaluation of comprehensive flood risk. RMS has developed a flood hazard model for South Korea to generate inundation depths and extents for a range of flood return periods. The model is initiated with 30 years of historical meteorological forcing data and calibrated to daily observations at over 100 river gauges across the country. Simulations of hydrologic processes are subsequently performed based on a 2000-year set of stochastic forcing. Floodplain inundation processes are modelled by numerically solving the shallow water equations using a finite-volume method on GPUs. Taking into account the existing stormwater drainage standards, economic exposure densities, etc., reasonable flood maps are created from the inundation model output. The final hazard maps at one-arcsecond grid resolution can be the basis for both evaluating and managing flood risk, its economic impacts, and insured flood
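    Once a long stochastic simulation set exists, a return-period depth map can be read off as a per-cell quantile of the simulated annual maxima. The sketch below shows only that final step, with synthetic depths standing in for the hydraulic model output (the empirical-quantile approach is an assumption, not necessarily the vendor's method):

```python
import numpy as np

# Hypothetical annual maximum inundation depths per grid cell from a 2000-year
# stochastic simulation set (years x cells).
rng = np.random.default_rng(2)
years, cells = 2000, 1000
depths = rng.gamma(1.5, 0.3, size=(years, cells))

def return_period_map(depth_matrix, T):
    """Empirical T-year depth per cell from simulated annual maxima."""
    p = 1.0 - 1.0 / T                      # non-exceedance probability
    return np.quantile(depth_matrix, p, axis=0)

map_100yr = return_period_map(depths, 100)
print("median 100-year depth over cells:", round(float(np.median(map_100yr)), 2), "m")
```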

  3. Field-scale simulation of chemical flooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saad, N.

    1989-01-01

    A three-dimensional compositional chemical flooding simulator (UTCHEM) has been improved. The new mathematical formulation, boundary conditions, and a description of the physicochemical models of the simulator are presented. This improved simulator has been used for the study of the low-tension pilot project at the Big Muddy field near Casper, Wyoming. Both the tracer injection conducted prior to the injection of the chemical slug and the chemical flooding stages of the pilot project have been analyzed. Not only the oil recovery but also the tracer, polymer, alcohol, and chloride histories have been successfully matched with field results. Simulation results indicate that, for this fresh-water reservoir, the salinity gradient during the preflush and the resulting calcium pickup by the surfactant slug played a major role in the success of the project. In addition, analysis of the effects of crossflow on the performance of the pilot project indicates that, for the well spacing of the pilot, crossflow does not play as important a role as it might for a large-scale project. To improve the numerical efficiency of the simulator, a third-order convective differencing scheme has been applied to the simulator. This method can be used with non-uniform meshes and is therefore suited for simulation studies of large-scale multiwell heterogeneous reservoirs. Comparison of the results with one- and two-dimensional analytical solutions shows that this method is effective in eliminating numerical dispersion using relatively large grid blocks. Results of one-, two- and three-dimensional miscible water/tracer flow, water flooding, polymer flooding, and micellar-polymer flooding test problems, and results of grid orientation studies, are presented.
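    The abstract does not specify which third-order convective scheme was adopted, but a common third-order upwind-biased stencil for the convective derivative (flow in the +x direction) is du/dx ~ (2u_{i+1} + 3u_i - 6u_{i-1} + u_{i-2}) / (6*dx). The sketch below verifies its order of accuracy numerically; it is illustrative only and is not UTCHEM code.

```python
import numpy as np

def third_order_upwind_dudx(u, dx):
    """Third-order upwind-biased first derivative for flow in the +x direction,
    evaluated at interior points i = 2 .. n-2."""
    return (2 * u[3:] + 3 * u[2:-1] - 6 * u[1:-2] + u[:-3]) / (6.0 * dx)

def max_error(n):
    x = np.linspace(0.0, 2 * np.pi, n)
    dx = x[1] - x[0]
    approx = third_order_upwind_dudx(np.sin(x), dx)
    exact = np.cos(x[2:-1])
    return np.abs(approx - exact).max()

# Halving dx should reduce the error by about a factor of 8 for a third-order scheme.
e1, e2 = max_error(101), max_error(201)
print(f"error ratio when halving dx: {e1 / e2:.1f}")
```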

  4. New methods in hydrologic modeling and decision support for culvert flood risk under climate change

    NASA Astrophysics Data System (ADS)

    Rosner, A.; Letcher, B. H.; Vogel, R. M.; Rees, P. S.

    2015-12-01

    Assessing culvert flood vulnerability under climate change poses an unusual combination of challenges. We seek a robust method of planning for an uncertain future, and therefore must consider a wide range of plausible future conditions. Culverts in our case study area, northwestern Massachusetts, USA, are predominantly found in small, ungaged basins. The need to predict flows both at numerous sites and under numerous plausible climate conditions requires a statistical model with low data and computational requirements. We present a statistical streamflow model that is driven by precipitation and temperature, allowing us to predict flows without reliance on reference gages of observed flows. The hydrological analysis is used to determine each culvert's risk of failure under current conditions. We also explore the hydrological response to a range of plausible future climate conditions. These results are used to determine the tolerance of each culvert to future increases in precipitation. In a decision support context, current flood risk as well as tolerance to potential climate changes are used to provide a robust assessment and prioritization for culvert replacements.

  5. Flood-Ring Formation and Root Development in Response to Experimental Flooding of Young Quercus robur Trees

    PubMed Central

    Copini, Paul; den Ouden, Jan; Robert, Elisabeth M. R.; Tardif, Jacques C.; Loesberg, Walter A.; Goudzwaard, Leo; Sass-Klaassen, Ute

    2016-01-01

    Spring flooding in riparian forests can cause significant reductions in earlywood-vessel size in submerged stem parts of ring-porous tree species, leading to the presence of ‘flood rings’ that can be used as a proxy to reconstruct past flooding events, potentially over millennia. The mechanism of flood-ring formation and the relation with timing and duration of flooding are still to be elucidated. In this study, we experimentally flooded 4-year-old Quercus robur trees at three spring phenophases (late bud dormancy, budswell, and internode expansion) and over different flooding durations (2, 4, and 6 weeks) to a stem height of 50 cm. The effect of flooding on root and vessel development was assessed immediately after the flooding treatment and at the end of the growing season. Ring width and earlywood-vessel size and density were measured at 25- and 75-cm stem height and collapsed vessels were recorded. Stem flooding inhibited earlywood-vessel development in flooded stem parts. In addition, flooding upon budswell and internode expansion led to collapsed earlywood vessels below the water level. At the end of the growing season, mean earlywood-vessel size in the flooded stem parts (upon budswell and internode expansion) was always reduced by approximately 50% compared to non-flooded stem parts and 55% compared to control trees. This reduction was already present 2 weeks after flooding and occurred independent of flooding duration. Stem and root flooding were associated with significant root dieback after 4 and 6 weeks and mean radial growth was always reduced with increasing flooding duration. By comparing stem and root flooding, we conclude that flood rings only occur after stem flooding. As earlywood-vessel development was hampered during flooding, a considerable number of narrow earlywood vessels present later in the season, must have been formed after the actual flooding events. Our study indicates that root dieback, together with strongly reduced hydraulic

  6. After the flood is before the next flood - post event review of the Central European Floods of June 2013. Insights, recommendations and next steps for future flood prevention

    NASA Astrophysics Data System (ADS)

    Szoenyi, Michael; Mechler, Reinhard; McCallum, Ian

    2015-04-01

    In early June 2013, severe flooding hit Central and Eastern Europe, causing extensive damage, in particular along the Danube and Elbe main watersheds. The situation was particularly severe in Eastern Germany, Austria, Hungary and the Czech Republic. Based on the Post Event Review Capability (PERC) approach, developed by Zurich Insurance's Flood Resilience Program to provide independent reviews of large flood events, we examine what worked well (best practice) and where there are opportunities for further improvement. Overall, the PERC aims to thoroughly examine aspects of flood resilience, flood risk management and catastrophe intervention in order to help build back better after events and to learn for future events. As our research from post-event analyses shows, a lot of losses are in fact avoidable by taking the right measures pre-event, and these measures are economically efficient, with an average return of 4 Euros in avoided losses for every Euro invested in prevention (Wharton/IIASA flood resilience alliance paper on cost-benefit analysis, Mechler et al. 2014) and up to 10 Euros for certain countries. For the 2013 flood events we provide analysis on the following aspects and, in general, identify a number of factors that worked in terms of reducing the loss and risk burden. 1. Understanding risk factors of the Central European Floods 2013. We review the precursors leading up to the floods in June, with an extremely wet May 2013 and an atypical V-b weather pattern that brought immense precipitation in a very short period to the watersheds of the Elbe, the Danube and partially the Rhine in the D-A-CH countries, and researched what happened during the flood and why. Key questions we asked revolve around which protection and risk reduction approaches worked well, which did not, and why. 2. Insights and recommendations from the post-event review. The PERC identified a number of risk factors which need attention if risk is to be reduced over time. • Yet another "100-year flood" - risk

  7. Advancing flood risk analysis by integrating adaptive behaviour in large-scale flood risk assessments

    NASA Astrophysics Data System (ADS)

    Haer, T.; Botzen, W.; Aerts, J.

    2016-12-01

    In the last four decades the global population living in the 1/100-year flood zone has doubled from approximately 500 million to a little less than 1 billion people. Urbanization in low-lying, flood-prone cities further increases the exposed assets, such as buildings and infrastructure. Moreover, climate change will further exacerbate flood risk in the future. Accurate flood risk assessments are important to inform policy-makers and society about current and future flood risk levels. However, these assessments suffer from a major flaw in the way they estimate the flood vulnerability and adaptive behaviour of individuals and governments. Current flood risk projections commonly either assume that vulnerability remains constant or try to mimic it through an external scenario. Such a static approach leads to a misrepresentation of future flood risk, as humans respond adaptively to flood events, flood risk communication, and incentives to reduce risk. In our study, we integrate adaptive behaviour in a large-scale European flood risk framework through an agent-based modelling approach. This allows for the inclusion of heterogeneous agents, which dynamically respond to each other and to a changing environment. We integrate state-of-the-art flood risk maps based on climate scenarios (RCPs) and socio-economic scenarios (SSPs) with government and household agents, which behave autonomously based on (micro-)economic behaviour rules. We show for the first time that excluding adaptive behaviour leads to a major misrepresentation of future flood risk. The methodology is applied to flood risk, but has similar implications for other research in the field of natural hazards. While more research is needed, this multi-disciplinary study advances our understanding of how future flood risk will develop.
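    A toy version of the argument, with invented parameters and decision rules (not the study's agents, scenarios, or damage figures): households update their perceived flood probability after events and invest in mitigation when the expected avoided damage outweighs the cost, which lowers realized damages relative to a static-vulnerability projection.

```python
import numpy as np

rng = np.random.default_rng(3)

n_households, years = 1000, 50
p_flood = 0.02                                   # assumed baseline annual flood probability
damage, mitigation_cost, reduction = 50_000.0, 5_000.0, 0.7

protected = np.zeros(n_households, dtype=bool)
perceived_p = np.full(n_households, 0.01)        # households start out underestimating risk

total_damage_static, total_damage_adaptive = 0.0, 0.0
for year in range(years):
    p_year = p_flood * (1 + 0.02 * year)         # climate-driven trend (illustrative)
    flooded = rng.random() < p_year              # one regional flood event per year at most

    # Adopt mitigation when expected avoided damage exceeds an annualized cost.
    adopt = (perceived_p * damage * reduction > mitigation_cost / 10.0) & ~protected
    protected |= adopt

    if flooded:
        loss = np.where(protected, damage * (1 - reduction), damage)
        total_damage_adaptive += loss.sum()
        total_damage_static += damage * n_households
        perceived_p = np.minimum(perceived_p * 3.0, 0.5)   # risk perception jumps after a flood
    else:
        perceived_p *= 0.9                                  # and decays in dry years

print(f"static-vulnerability damage: {total_damage_static:,.0f}")
print(f"adaptive-behaviour damage:   {total_damage_adaptive:,.0f}")
```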

  8. Alkaline Phosphatase, Soluble Extracellular Adenine Nucleotides, and Adenosine Production after Infant Cardiopulmonary Bypass

    PubMed Central

    Davidson, Jesse A.; Urban, Tracy; Tong, Suhong; Twite, Mark; Woodruff, Alan

    2016-01-01

    Rationale Decreased alkaline phosphatase activity after infant cardiac surgery is associated with increased post-operative cardiovascular support requirements. In adults undergoing coronary artery bypass grafting, alkaline phosphatase infusion may reduce inflammation. Mechanisms underlying these effects have not been explored but may include decreased conversion of extracellular adenine nucleotides to adenosine. Objectives 1) Evaluate the association between alkaline phosphatase activity and serum conversion of adenosine monophosphate to adenosine after infant cardiac surgery; 2) assess if inhibition/supplementation of serum alkaline phosphatase modulates this conversion. Methods and Research Pre/post-bypass serum samples were obtained from 75 infants <4 months of age. Serum conversion of 13C5-adenosine monophosphate to 13C5-adenosine was assessed with/without selective inhibition of alkaline phosphatase and CD73. Low and high concentration 13C5-adenosine monophosphate (simulating normal/stress concentrations) were used. Effects of alkaline phosphatase supplementation on adenosine monophosphate clearance were also assessed. Changes in serum alkaline phosphatase activity were strongly correlated with changes in 13C5-adenosine production with or without CD73 inhibition (r = 0.83; p<0.0001). Serum with low alkaline phosphatase activity (≤80 U/L) generated significantly less 13C5-adenosine, particularly in the presence of high concentration 13C5-adenosine monophosphate (10.4μmol/L vs 12.9μmol/L; p = 0.0004). Inhibition of alkaline phosphatase led to a marked decrease in 13C5-adenosine production (11.9μmol/L vs 2.7μmol/L; p<0.0001). Supplementation with physiologic dose human tissue non-specific alkaline phosphatase or high dose bovine intestinal alkaline phosphatase doubled 13C5-adenosine monophosphate conversion to 13C5-adenosine (p<0.0001). Conclusions Alkaline phosphatase represents the primary serum ectonucleotidase after infant cardiac surgery and low post

  9. A method to calibrate channel friction and bathymetry parameters of a Sub-Grid hydraulic model using SAR flood images

    NASA Astrophysics Data System (ADS)

    Wood, M.; Neal, J. C.; Hostache, R.; Corato, G.; Chini, M.; Giustarini, L.; Matgen, P.; Wagener, T.; Bates, P. D.

    2015-12-01

    Synthetic Aperture Radar (SAR) satellites are capable of all-weather, day-and-night observations that can discriminate between land and smooth open-water surfaces over large scales. Because of this, there has been much interest in the use of SAR satellite data to improve our understanding of water processes, in particular fluvial flood inundation mechanisms. Past studies show that integrating SAR-derived data with hydraulic models can improve simulations of flooding. However, while much of this work focuses on improving model channel roughness values or inflows in ungauged catchments, improvement of model bathymetry is often overlooked. The provision of good bathymetric data is critical to the performance of hydraulic models, but there are only a small number of ways to obtain bathymetry information where no direct measurements exist. Spatially distributed river depths are also rarely available. We present a methodology for calibrating model average channel depth and roughness parameters concurrently, using SAR images of flood extent and a Sub-Grid model utilising hydraulic geometry concepts. The methodology uses real data from the European Space Agency's archive of ENVISAT[1] Wide Swath Mode images of the River Severn between Worcester and Tewkesbury during flood peaks between 2007 and 2010. Historic ENVISAT WSM images are currently free and easy to access from the archive, but the methodology can be applied with any available SAR data. The approach makes use of the SAR image processing algorithm of Giustarini[2] et al. (2013) to generate binary flood maps. A unique feature of the calibration methodology is the additional use of parameter 'identifiability' to locate the parameters with higher accuracy within a pre-assigned range (adopting the DYNIA method proposed by Wagener[3] et al., 2003). [1] https://gpod.eo.esa.int/services/ [2] Giustarini. 2013. 'A Change Detection Approach to Flood Mapping in Urban Areas Using TerraSAR-X'. IEEE Transactions on Geoscience and Remote
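    The shape of such a calibration is a search over (average channel depth, roughness) pairs for the combination whose simulated flood extent best matches the SAR-derived binary map, scored here with the critical success index. The 'model' below is a deliberately trivial stand-in, and the identifiability analysis (DYNIA) is not reproduced; only the scoring-and-search skeleton is shown.

```python
import numpy as np
from itertools import product

def critical_success_index(sim_wet, obs_wet):
    """CSI = hits / (hits + misses + false alarms) between two binary flood maps."""
    hits = np.sum(sim_wet & obs_wet)
    misses = np.sum(~sim_wet & obs_wet)
    false_alarms = np.sum(sim_wet & ~obs_wet)
    return hits / float(hits + misses + false_alarms)

# Hypothetical terrain; wet wherever a planar water surface exceeds the ground.
rng = np.random.default_rng(4)
terrain = np.cumsum(rng.normal(0.0, 0.05, size=(50, 50)), axis=1) + 1.0

def simulate_flood(depth, roughness):
    water_surface = depth * (1.0 - 0.3 * roughness)   # toy response, not a hydraulic model
    return water_surface > terrain

obs_map = simulate_flood(1.6, 0.035)                  # stand-in for a SAR-derived flood extent

best = max(
    (critical_success_index(simulate_flood(d, n), obs_map), d, n)
    for d, n in product(np.linspace(0.5, 3.0, 26), np.linspace(0.02, 0.06, 9))
)
print(f"best CSI={best[0]:.2f} at depth={best[1]:.2f} m, n={best[2]:.3f}")
```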

  10. Uncertainty and Sensitivity of Direct Economic Flood Damages: the FloodRisk Free and Open-Source Software

    NASA Astrophysics Data System (ADS)

    Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.

    2017-12-01

    The considerable increase in flood damages over the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, the assessment of expected damages represents crucial information within the overall flood risk management process. The present paper proposes an open-source software package, called FloodRisk, that is able to operatively support stakeholders in their decision-making processes with a what-if approach by carrying out rapid assessments of flood consequences in terms of direct economic damage and loss of human lives. The evaluation of damage scenarios, through the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analyses of risk mitigation alternatives. However, considering that the quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model applications in urban areas with mild terrain and complex topography. Using the concept of parallel models, the contribution of different modules and input parameters to the total uncertainty is quantified. The results of the present case study exhibit a high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because there are no depth-damage functions specifically developed for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In the light of the obtained results, the need to produce and disseminate (open) data to develop micro-scale vulnerability curves is evident. Moreover, the urgent need to push
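    The role of the damage-function choice can be illustrated by running the same depth field through alternative depth-damage curves and comparing the totals, in the spirit of the parallel-models scheme (the curve shapes, depths, and exposure values below are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
depths = rng.gamma(2.0, 0.6, size=5_000)        # hypothetical inundation depths (m)
exposure = 150_000.0                            # assumed value per building (EUR)

# Alternative depth-damage functions transferred from different contexts (illustrative shapes).
damage_curves = {
    "linear-capped": lambda d: np.clip(d / 3.0, 0, 1),
    "square-root":   lambda d: np.clip(np.sqrt(d) / 2.0, 0, 1),
    "logistic":      lambda d: 1.0 / (1.0 + np.exp(-(d - 1.5) * 2.0)),
}

losses = {name: (f(depths) * exposure).sum() for name, f in damage_curves.items()}
vals = np.array(list(losses.values()))
for name, total in losses.items():
    print(f"{name:14s} total loss: {total / 1e6:8.1f} M EUR")
print(f"spread due to damage-function choice: {(vals.max() - vals.min()) / vals.mean():.0%} of mean")
```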

  11. Floods in Colorado

    USGS Publications Warehouse

    Follansbee, Robert; Sawyer, Leon R.

    1948-01-01

    The first records of floods in Colorado antedated the settlement of the State by about 30 years. These were records of floods on the Arkansas and Republican Rivers in 1826. Other floods noted by traders, hunters and emigrants, some of whom were on their way to the Far West, occurred in 1844 on the Arkansas River, and by inference on the South Platte River. Other early floods were those on the Purgatoire, the Lower Arkansas, and the San Juan Rivers about 1859. The most serious flood since settlement began was that on the Arkansas River during June 1921, which caused the loss of about 100 lives and an estimated property loss of $19,000,000. Many floods of lesser magnitude have occurred, and some of these have caused loss of life and very considerable property damage. Topography is the chief factor in determining the location of storms and resulting floods. These occur most frequently on the eastern slope of the Front Range. In the mountains farther west precipitation is insufficient to cause floods except during periods of melting snow, in June. In the southwestern part of the State, where precipitation during periods of melting snow is insufficient to cause floods, the severest floods yet experienced resulted from heavy rains in September 1909 and October 1911. In the eastern foothills region, usually below an altitude of about 7,500 feet and extending for a distance of about 50 miles east of the mountains, is a zone subject to rainfalls of great intensity known as cloudbursts. These cloudbursts are of short duration and are confined to very small areas. At times the intensity is so great as to make breathing difficult for those exposed to a storm. The areas of intense rainfall are so small that Weather Bureau precipitation stations have not been located in them. Local residents, being cloudburst conscious, frequently measure the rainfall in receptacles in their yards, and such records constitute the only source of information regarding the intensity. A flood

  12. The August 2002 flood in Salzburg / Austria experience gained and lessons learned from the ``Flood of the century''?

    NASA Astrophysics Data System (ADS)

    Wiesenegger, H.

    2003-04-01

    longer lead times in Salzburg's flood forecasts. Methods to incorporate precipitation forecasts, provided by the Met Office, as well as observations of actual soil conditions therefore have to be developed, and these should enable hydrologists to predict possible scenarios and impacts of floods forecasted for the next 24 hours. As a further consequence of the August 2002 flood, building regulations, e.g. those governing the use of oil tanks in flood-prone areas, have to be checked and, where necessary, adapted. It is also necessary to make people who already live in flood-prone areas aware of the dangers of floods. They also need to know about the limits of flood protection measures and about what happens if flood protection design values are exceeded. Alarm plans, the dissemination of information using modern communication systems (the Internet), communication failures at peak times, and the co-ordination of rescue units are also subjects to be looked at carefully. The above-mentioned measures are among others in a 10-point program developed by the Government of the Province of Salzburg and at present being checked with regard to feasibility. As it is to be expected that the August 2002 flood was not the last rare event of this century, the experience gained should be valuable for the next event.

  13. Floods and droughts: friends or foes?

    NASA Astrophysics Data System (ADS)

    Prudhomme, Christel

    2017-04-01

    Water hazards are some of the biggest threats to lives and livelihoods globally, causing serious damage to society and infrastructure. But floods and droughts are an essential part of the hydrological regime that ensures fundamental ecosystem functions, providing natural ways to bring in nutrients, flush out pollutants and enable the natural biodiversity of soils, rivers and lakes to thrive. Traditionally, floods and droughts are too often considered separately, with scientific advances in process understanding, modelling, statistical characterisation and impact assessment often made independently, possibly delaying the development of innovative methods that could be applied to both. This talk will review some of the key characteristics of floods and droughts, highlighting differences and commonalities, losses and benefits, with the aim of identifying future key research challenges faced by both the current and the next generation of hydrologists.

  14. Regional L-Moment-Based Flood Frequency Analysis in the Upper Vistula River Basin, Poland

    NASA Astrophysics Data System (ADS)

    Rutkowska, A.; Żelazny, M.; Kohnová, S.; Łyp, M.; Banasik, K.

    2017-02-01

    The Upper Vistula River basin was divided into pooling groups with similar dimensionless frequency distributions of annual maximum river discharge. Cluster analysis and the Hosking and Wallis (HW) L-moment-based method were used to divide the set of 52 mid-sized catchments into disjoint clusters with similar morphometric, land use, and rainfall variables, and to test the homogeneity within clusters. Two alternative groupings, with three and four pooling groups, were finally obtained. Two methods for identification of the regional distribution function were used: the HW method and the method of Kjeldsen and Prosdocimi based on a bivariate extension of the HW measure. Subsequently, the flood quantile estimates were calculated using the index flood method. Ordinary least squares (OLS) and generalised least squares (GLS) regression techniques were used to relate the index flood to catchment characteristics. The predictive performance of the regression scheme for the southern part of the Upper Vistula River basin was improved by using GLS instead of OLS. The results of the study can be recommended for the estimation of flood quantiles at ungauged sites, in flood risk mapping applications, and in engineering hydrology to help design flood protection structures.
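    A sketch of the basic building blocks named above: sample L-moments computed from probability weighted moments, and the index flood method in which a site quantile is the regional dimensionless growth factor times the site's index flood. The annual maxima and the growth factor value are invented; the HW homogeneity test and the regional growth-curve fitting are not shown.

```python
import numpy as np

def sample_lmoments(x):
    """First three sample L-moments via unbiased probability weighted moments."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    return b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0

# Hypothetical annual maxima for one catchment in a pooling group.
rng = np.random.default_rng(6)
ams = rng.gumbel(loc=120.0, scale=40.0, size=45)

l1, l2, l3 = sample_lmoments(ams)
print(f"index flood (L-1, i.e. mean annual flood): {l1:.1f} m^3/s")
print(f"L-CV: {l2 / l1:.3f}, L-skewness: {l3 / l2:.3f}")

# Index flood method: site quantile = regional dimensionless growth factor x index flood
# (the growth factor value here is purely illustrative).
regional_growth_100yr = 2.3
print(f"100-year flood estimate: {regional_growth_100yr * l1:.0f} m^3/s")
```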

  15. Flood-frequency analyses, Manual of Hydrology: Part 3

    USGS Publications Warehouse

    Dalrymple, Tate

    1960-01-01

    This report describes the method used by the U.S. Geological Survey to determine the magnitude and frequency of momentary peak discharges at any place on a stream, whether a gaging-station record is available or not. The method is applicable to a region of any size, such as a river basin or a State, so long as the region is hydrologically homogeneous. The analysis provides two curves. The first expresses the flood discharge-time relation, showing the variation of peak discharge, expressed as a ratio to the mean annual flood, with recurrence interval. The second relates the mean annual flood to the size of the drainage area alone, or to the size of the drainage area and other significant basin characteristics. A frequency curve may be defined for any place in the region by use of these two curves. The procedure is: (a) measure the drainage area and other appropriate basin characteristics from maps; (b) from the second curve, select the mean annual flood corresponding to the proper drainage-area factors; (c) from the first curve, select ratios of peak discharge to mean annual flood for selected recurrence intervals, such as 2, 10, 25, and 50 years; and (d) multiply these ratios by the mean annual flood and plot the resulting discharges of known frequency to define the frequency curve. Two reports not previously given general circulation are included as sections of this report. These are 'Plotting Positions in Frequency Analysis' by W. B. Langbein, and 'Characteristics of Frequency Curves Based on a Theoretical 1,000-Year Record' by M. A. Benson.
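    Steps (a)-(d) translate directly into a small computation once the two regional curves are in hand. The power-law form of curve 2 and the ratio values for curve 1 below are placeholders, not the report's curves:

```python
# Illustrative regional curves (shapes only; the real curves are read from the report's figures).
def mean_annual_flood(drainage_area_km2):
    """Curve 2: mean annual flood vs drainage area (a power law is a common assumed form)."""
    return 8.0 * drainage_area_km2 ** 0.75           # m^3/s, hypothetical coefficients

# Curve 1: ratio of peak discharge to mean annual flood vs recurrence interval.
ratio_curve = {2: 1.0, 10: 1.6, 25: 2.0, 50: 2.3}    # hypothetical regional ratios

# Steps (a)-(d) for an ungauged site:
area = 350.0                                         # (a) drainage area from maps, km^2
maf = mean_annual_flood(area)                        # (b) mean annual flood from curve 2
frequency_curve = {T: r * maf for T, r in ratio_curve.items()}   # (c) + (d)

for T, q in frequency_curve.items():
    print(f"{T:3d}-year flood: {q:7.0f} m^3/s")
```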

  16. Dissociation Energies of the Alkaline Earth Monofluorides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BLUE, GARY D.; GREEN, JOHN W.; EHLERT, THOMAS C.

    1963-08-24

    New results and theoretical calculations are presented that indicate consistently high dissociation energies for all the alkaline earth monofluorides. Experimental results were obtained by using a mass spectrometer to analyze the vapors from a heated Ta Knudsen cell containing an alkaline earth fluoride salt with Al present as a reducing agent. Ionization efficiency curves were obtained and temperature-dependence investigations were made to determine the molecular precursor of the ions observed. Values of the equilibrium constants at different temperatures were used together with the free-energy functions to calculate the third-law heats of reaction at 298 deg K. Data are tabulated for the heats of various reactions for Al--MF2 systems with M = Mg, Ca, Sr, and Ba, and dissociation energies of MF molecules obtained by various methods for Be, Mg, Ca, Sr, and Ba. (C.H.)

  17. The complexities of urban flood response: Flood frequency analyses for the Charlotte metropolitan region

    NASA Astrophysics Data System (ADS)

    Zhou, Zhengzheng; Smith, James A.; Yang, Long; Baeck, Mary Lynn; Chaney, Molly; Ten Veldhuis, Marie-Claire; Deng, Huiping; Liu, Shuguang

    2017-08-01

    We examine urban flood response through data-driven analyses for a diverse sample of "small" watersheds (basin scale ranging from 7.0 to 111.1 km2) in the Charlotte Metropolitan region. These watersheds have experienced extensive urbanization and suburban development since the 1960s. The objective of this study is to develop a broad characterization of land surface and hydrometeorological controls of urban flood hydrology. Our analyses are based on peaks-over-threshold flood data developed from USGS streamflow observations and are motivated by problems of flood hazard characterization for urban regions. We examine flood-producing rainfall using high-resolution (1 km2 spatial resolution and 15 min time resolution), bias-corrected radar rainfall fields that are developed through the Hydro-NEXRAD system. The analyses focus on the 2001-2015 period. The results highlight the complexities of urban flood response. There are striking spatial heterogeneities in flood peak magnitudes, response times, and runoff ratios across the study region. These spatial heterogeneities are mainly linked to watershed scale, the distribution of impervious cover, and storm water management. Contrasting land surface properties also determine the mixture of flood-generating mechanisms for a particular watershed. Warm-season thunderstorm systems and tropical cyclones are main flood agents in Charlotte, with winter/spring storms playing a role in less-urbanized watersheds. The mixture of flood agents exerts a strong impact on the upper tail of flood frequency distributions. Antecedent watershed wetness plays a minor role in urban flood response, compared with less-urbanized watersheds. Implications for flood hazard characterization in urban watersheds and for advances in flood science are discussed.
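    The flood data underlying such analyses are peaks-over-threshold series, which require declustering so that each retained peak represents an independent event. A minimal sketch with a synthetic discharge record, an assumed threshold, and a simple minimum-separation rule (the study's actual extraction criteria are not given in the abstract):

```python
import numpy as np

def peaks_over_threshold(q, threshold, min_separation):
    """Extract independent flood peaks above a threshold from a discharge series,
    keeping only the largest exceedance within any window of `min_separation` steps."""
    idx = np.where(q > threshold)[0]
    peaks = []
    for i in idx:
        if peaks and i - peaks[-1] < min_separation:
            if q[i] > q[peaks[-1]]:
                peaks[-1] = i            # same event: keep the larger peak
        else:
            peaks.append(i)
    return np.array(peaks)

# Hypothetical 15-min discharge record for one small urban watershed (m^3/s).
rng = np.random.default_rng(8)
q = rng.gamma(1.2, 2.0, size=4 * 24 * 365)

peak_idx = peaks_over_threshold(q, threshold=np.quantile(q, 0.999), min_separation=4 * 24)
print(f"{len(peak_idx)} independent peaks; largest = {q[peak_idx].max():.1f} m^3/s")
```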

  18. Electrochemical Behavior of Sulfur in Aqueous Alkaline Solutions

    NASA Astrophysics Data System (ADS)

    Mamyrbekova, Aigul; Mamitova, A. D.; Mamyrbekova, Aizhan

    2018-03-01

    The kinetics and mechanism of the electrode oxidation-reduction of sulfur on an electrically conductive sulfur-graphite electrode in an alkaline solution were studied by the potentiodynamic method. To examine the mechanism of the electrode processes occurring during AC polarization on a sulfur-graphite electrode, cyclic polarization curves in both directions and anodic polarization curves were recorded. The kinetic parameters: charge transfer coefficients (α), diffusion coefficients (D), heterogeneous rate constants of the electrode process (k_s), and effective activation energies of the process (E_a), were calculated from the results of the polarization measurements. An analysis of the results and the calculated kinetic parameters of the electrode processes showed that the discharge ionization of sulfur in alkaline solutions occurs as a sequence of two stages and is a quasireversible process.

  19. Flood frequency analysis - the challenge of using historical data

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjorn

    2015-04-01

    Estimates of high flood quantiles are needed for many applications: e.g. dam safety assessments are based on the 1000-year flood, whereas the dimensioning of important infrastructure requires estimates of the 200-year flood. The flood quantiles are estimated by fitting a parametric distribution to a dataset of high flows comprising either annual maximum values or peaks over a selected threshold. Since the record length of the data is limited compared to the desired flood quantile, the estimated flood magnitudes rely on a high degree of extrapolation. For example, the longest time series available in Norway are around 120 years, and as a result any estimation of a 1000-year flood will require extrapolation. One solution is to extend the temporal dimension of a data series by including information about historical floods that occurred before streamflow was systematically gauged. Such information could be flood marks or written documentation about flood events. The aim of this study was to evaluate the added value of using historical flood data for at-site flood frequency estimation. The historical floods were included in two ways, by assuming that: (1) the size of (all) floods above a high threshold within a time interval is known; and (2) the number of floods above a high threshold for a time interval is known. We used a Bayesian model formulation, with MCMC used for model estimation. This estimation procedure allowed us to estimate the predictive uncertainty of flood quantiles (i.e. both sampling and parameter uncertainty are accounted for). We tested the methods using 123 years of systematic data from Bulken in western Norway. In 2014 the largest flood in the systematic record was observed. From written documentation and flood marks we had information on three severe floods in the 18th century, which were likely to exceed the 2014 flood. We evaluated the added value in two ways. First we used the 123-year-long streamflow time series and investigated the effect of having several
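    The second way of including historical information, where only the number of threshold exceedances in the historical period is known, adds a binomial term to the likelihood. The sketch below shows that likelihood structure for a Gumbel distribution and maximizes it rather than sampling it; the paper instead uses a Bayesian formulation with MCMC, and all numbers here are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gumbel_r

# Hypothetical data: 123 years of systematic annual maxima plus a 200-year historical
# period in which 3 floods are known to have exceeded a perception threshold.
rng = np.random.default_rng(9)
systematic = rng.gumbel(500.0, 150.0, size=123)
hist_years, hist_exceedances, threshold = 200, 3, 1300.0

def neg_log_lik(params):
    loc, scale = params
    if scale <= 0:
        return np.inf
    dist = gumbel_r(loc, scale)
    p_exc = dist.sf(threshold)
    if p_exc <= 0.0 or p_exc >= 1.0:
        return np.inf
    ll = dist.logpdf(systematic).sum()                     # systematic record
    ll += (hist_exceedances * np.log(p_exc)                # historical (binomial) term
           + (hist_years - hist_exceedances) * np.log(1 - p_exc))
    return -ll

fit = minimize(neg_log_lik, x0=[systematic.mean(), systematic.std()], method="Nelder-Mead")
loc, scale = fit.x
print(f"1000-year flood with historical information: {gumbel_r(loc, scale).ppf(0.999):.0f} m^3/s")
```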

  20. Prehistoric floods on the Tennessee River—Assessing the use of stratigraphic records of past floods for improved flood-frequency analysis

    USGS Publications Warehouse

    Harden, Tessa M.; O'Connor, Jim E.

    2017-06-14

    Stratigraphic analysis, coupled with geochronologic techniques, indicates that a rich history of large Tennessee River floods is preserved in the Tennessee River Gorge area. Deposits of flood sediment from the 1867 peak discharge of record (460,000 cubic feet per second at Chattanooga, Tennessee) are preserved at many locations throughout the study area at sites with flood-sediment accumulation. Small exposures at two boulder overhangs reveal evidence of three to four other floods similar in size, or larger, than the 1867 flood in the last 3,000 years—one possibly as much or more than 50 percent larger. Records of floods also are preserved in stratigraphic sections at the mouth of the gorge at Williams Island and near Eaves Ferry, about 70 river miles upstream of the gorge. These stratigraphic records may extend as far back as about 9,000 years ago, giving a long history of Tennessee River floods. Although more evidence is needed to confirm these findings, a more in-depth comprehensive paleoflood study is feasible for the Tennessee River.

  1. Proposal for management and alkalinity transformation of bauxite residue in China.

    PubMed

    Xue, Shengguo; Kong, Xiangfeng; Zhu, Feng; Hartley, William; Li, Xiaofei; Li, Yiwei

    2016-07-01

    Bauxite residue is a hazardous solid waste produced during the production of alumina. Its high alkalinity is a potential threat to the environment which may disrupt the surrounding ecological balance of its disposal areas. China is one of the major global producers of alumina and bauxite residue, but differences in alkalinity and associated chemistry exist between residues from China and those from other countries. A detailed understanding of the chemistry of bauxite residue remains the key to improving its management, both in terms of minimizing environmental impacts and reducing its alkaline properties. The nature of bauxite residue and the chemistry required for its transformation are still poorly understood. This review focuses on various transformation processes generated from the Bayer process, sintering process, and combined Bayer-sintering process in China. Problems associated with transformation mechanisms, technical methods, and relative merits of these technologies are reviewed, while current knowledge gaps and research priorities are recommended. Future research should focus on transformation chemistry and its associated mechanisms and for the development of a clear and economic process to reduce alkalinity and soda in bauxite residue.

  2. The world's largest floods, past and present: Their causes and magnitudes

    USGS Publications Warehouse

    O'Connor, Jim E.; Costa, John E.

    2004-01-01

    Floods are among the most powerful forces on earth. Human societies worldwide have lived and died with floods from the very beginning, spawning a prominent role for floods within legends, religions, and history. Inspired by such accounts, geologists, hydrologists, and historians have studied the role of floods on humanity and its supporting ecosystems, resulting in new appreciation for the many-faceted role of floods in shaping our world. Part of this appreciation stems from ongoing analysis of long-term streamflow measurements, such as those recorded by the U.S. Geological Survey's (USGS) streamflow gaging network. But the recognition of the important role of flooding in shaping our cultural and physical landscape also owes to increased understanding of the variety of mechanisms that cause floods and how the types and magnitudes of floods can vary with time and space. The USGS has contributed to this understanding through more than a century of diverse research activities on many aspects of floods, including their causes, effects, and hazards. This Circular summarizes a facet of this research by describing the causes and magnitudes of the world's largest floods, including those measured and described by modern methods in historic times, as well as floods of prehistoric times, for which the only records are those left by the floods themselves.

  3. Improving Flood Predictions in Data-Scarce Basins

    NASA Astrophysics Data System (ADS)

    Vimal, Solomon; Zanardo, Stefano; Rafique, Farhat; Hilberts, Arno

    2017-04-01

    Flood modeling methodology at Risk Management Solutions Ltd. has evolved over several years with the development of continental scale flood risk models spanning most of Europe, the United States and Japan. Pluvial (rain fed) and fluvial (river fed) flood maps represent the basis for the assessment of regional flood risk. These maps are derived by solving the 1D energy balance equation for river routing and 2D shallow water equation (SWE) for overland flow. The models are run with high performance computing and GPU based solvers as the time taken for simulation is large in such continental scale modeling. These results are validated with data from authorities and business partners, and have been used in the insurance industry for many years. While this methodology has been proven extremely effective in regions where the quality and availability of data are high, its application is very challenging in other regions where data are scarce. This is generally the case for low and middle income countries, where simpler approaches are needed for flood risk modeling and assessment. In this study we explore new methods to make use of modeling results obtained in data-rich contexts to improve predictive ability in data-scarce contexts. As an example, based on our modeled flood maps in data-rich countries, we identify statistical relationships between flood characteristics and topographic and climatic indicators, and test their generalization across physical domains. Moreover, we apply the Height Above Nearest Drainage (HAND) approach to estimate "probable" saturated areas for different return period flood events as functions of basin characteristics. This work falls into the well-established research field of Predictions in Ungauged Basins.

  4. Early warning method of Glacial Lake Outburst Floods based on temperature and rainfall

    NASA Astrophysics Data System (ADS)

    Liu, Jingjing; Su, Pengcheng; Cheng, Zunlan

    2017-04-01

    Glacial lake outburst floods (GLOFs) are serious disasters in glacial areas. At present, glaciers are retreating while glacial lake areas and outburst risk increase due to global warming. Research on early warning methods for GLOFs is therefore important for preventing and reducing these disasters. This paper provides an early warning method using temperature and rainfall as indices. The daily growth rate of positive antecedent accumulated temperature and the antecedent thirty-day accumulated precipitation are calculated for 21 GLOF events before 2010, based on data from the 21 nearby meteorological stations. The results show that all the events plot above the curve TV = -0.0193 RDC + 3.0018, which can be taken as the early warning threshold curve. This was verified by the GLOF event at the Ranzeaco glacial lake on 2013-07-05.
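
    Read as a warning rule, the threshold curve could be applied as in the minimal sketch below; the variable names (tv for the daily growth rate of positive accumulated temperature, rdc for the 30-day antecedent accumulated precipitation) are my assumptions, not the authors' notation.

```python
# Minimal sketch of a warning check based on the threshold curve quoted above.
def glof_warning(tv: float, rdc: float) -> bool:
    """Return True when the point (rdc, tv) lies above the curve
    TV = -0.0193 * RDC + 3.0018, the region in which the 21 historical
    GLOF events plotted."""
    return tv > -0.0193 * rdc + 3.0018
```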

  5. Flood Frequency Analysis With Historical and Paleoflood Information

    NASA Astrophysics Data System (ADS)

    Stedinger, Jery R.; Cohn, Timothy A.

    1986-05-01

    An investigation is made of flood quantile estimators which can employ "historical" and paleoflood information in flood frequency analyses. Two categories of historical information are considered: "censored" data, where the magnitudes of historical flood peaks are known; and "binomial" data, where only threshold exceedance information is available. A Monte Carlo study employing the two-parameter lognormal distribution shows that maximum likelihood estimators (MLEs) can extract the equivalent of an additional 10-30 years of gage record from a 50-year period of historical observation. The MLE routines are shown to be substantially better than an adjusted-moment estimator similar to the one recommended in Bulletin 17B of the United States Water Resources Council Hydrology Committee (1982). The MLE methods performed well even when floods were drawn from other than the assumed lognormal distribution.

  6. Impact of social preparedness on flood early warning systems

    NASA Astrophysics Data System (ADS)

    Girons Lopez, M.; Di Baldassarre, G.; Seibert, J.

    2017-01-01

    Flood early warning systems play a major role in the disaster risk reduction paradigm as cost-effective methods to mitigate flood disaster damage. The connections and feedbacks between the hydrological and social spheres of early warning systems are increasingly being considered as key aspects for successful flood mitigation. The behavior of the public and first responders during flood situations, determined by their preparedness, is heavily influenced by many behavioral traits such as perceived benefits, risk awareness, or even denial. In this study, we use the recency of flood experiences as a proxy for social preparedness and assess its impact on the efficiency of flood early warning systems through a stylized model implemented with a simple mathematical description. The main findings, which are based on synthetic data, point to the importance of social preparedness for flood loss mitigation, especially in circumstances where the technical forecasting and warning capabilities are limited. Furthermore, we found that efforts to promote and preserve social preparedness may help to reduce disaster-induced losses by almost one half. The findings provide important insights into the role of social preparedness that may help guide decision-making in the field of flood early warning systems.

  7. The credibility challenge for global fluvial flood risk analysis

    NASA Astrophysics Data System (ADS)

    Trigg, M. A.; Birch, C. E.; Neal, J. C.; Bates, P. D.; Smith, A.; Sampson, C. C.; Yamazaki, D.; Hirabayashi, Y.; Pappenberger, F.; Dutra, E.; Ward, P. J.; Winsemius, H. C.; Salamon, P.; Dottori, F.; Rudari, R.; Kappes, M. S.; Simpson, A. L.; Hadzilacos, G.; Fewtrell, T. J.

    2016-09-01

    Quantifying flood hazard is an essential component of resilience planning, emergency response, and mitigation, including insurance. Traditionally undertaken at catchment and national scales, recently, efforts have intensified to estimate flood risk globally to better allow consistent and equitable decision making. Global flood hazard models are now a practical reality, thanks to improvements in numerical algorithms, global datasets, computing power, and coupled modelling frameworks. Outputs of these models are vital for consistent quantification of global flood risk and in projecting the impacts of climate change. However, the urgency of these tasks means that outputs are being used as soon as they are made available and before such methods have been adequately tested. To address this, we compare multi-probability flood hazard maps for Africa from six global models and show wide variation in their flood hazard, economic loss and exposed population estimates, which has serious implications for model credibility. While there is around 30%-40% agreement in flood extent, our results show that even at continental scales, there are significant differences in hazard magnitude and spatial pattern between models, notably in deltas, arid/semi-arid zones and wetlands. This study is an important step towards a better understanding of modelling global flood hazard, which is urgently required for both current risk and climate change projections.

  8. Assessment of the spatial scaling behaviour of floods in the United Kingdom

    NASA Astrophysics Data System (ADS)

    Formetta, Giuseppe; Stewart, Elizabeth; Bell, Victoria

    2017-04-01

    Floods are among the most dangerous natural hazards, causing loss of life and significant damage to private and public property. Regional flood-frequency analysis (FFA) methods are essential tools to assess the flood hazard and plan interventions for its mitigation. FFA methods are often based on the well-known index flood method that assumes the invariance of the coefficient of variation of floods with drainage area. This assumption is equivalent to the simple scaling or self-similarity assumption for peak floods, i.e. their spatial structure remains similar in a particular, relatively simple, way to itself over a range of scales. Spatial scaling of floods has been evaluated at national scale for different countries such as Canada, USA, and Australia. To our knowledge, such a study has not been conducted for the United Kingdom even though the standard FFA method there is based on the index flood assumption. In this work we present an integrated approach to assess the spatial scaling behaviour of floods in the United Kingdom using three different methods: product moments (PM), probability weighted moments (PWM), and quantile analysis (QA). We analysed both instantaneous and daily annual observed maximum floods and performed our analysis both across the entire country and in its sub-climatic regions as defined in the Flood Studies Report (NERC, 1975). To evaluate the relationship between the k-th moments or quantiles and the drainage area we used both regression with area alone and multiple regression considering other explanatory variables to account for the geomorphology, amount of rainfall, and soil type of the catchments. The latter multiple regression approach was only recently demonstrated to be more robust than the traditional regression with area alone, which can lead to biased estimates of scaling exponents and misinterpretation of spatial scaling behaviour. We tested our framework on almost 600 rural catchments in the UK considered as an entire region and
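
    For the product-moment (PM) variant, a bare-bones version of the scaling analysis could look like the sketch below: regress the log of the k-th sample moment of annual maximum floods on log drainage area and read off the scaling exponents. The variable names and the restriction to area-only regression are my simplifications, not the authors' implementation.

```python
# Hedged sketch of a product-moment scaling analysis: simple scaling implies
# the exponents theta_k grow linearly with moment order k.
import numpy as np

def scaling_exponents(areas_km2, ams_by_site, k_max=3):
    """areas_km2: drainage area per site; ams_by_site: list of arrays of
    annual maximum floods, one per site. Returns [theta_1, ..., theta_k_max]
    from OLS fits of log E[Q^k] on log A."""
    log_area = np.log(areas_km2)
    exponents = []
    for k in range(1, k_max + 1):
        log_moment = np.log([np.mean(np.asarray(q, dtype=float) ** k)
                             for q in ams_by_site])
        slope, _intercept = np.polyfit(log_area, log_moment, 1)
        exponents.append(slope)
    return exponents
```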

  9. Phosphatidylinositol anchor of HeLa cell alkaline phosphatase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jemmerson, R.; Low, M.G.

    1987-09-08

    Alkaline phosphatase from cancer cells, HeLa TCRC-1, was biosynthetically labeled with either ³H-fatty acids or (³H)ethanolamine as analyzed by sodium dodecyl sulfate-polyacrylamide gel electrophoresis and fluorography of immunoprecipitated material. Phosphatidylinositol-specific phospholipase C (PI-PLC) released a substantial proportion of the ³H-fatty acid label from immunoaffinity-purified alkaline phosphatase but had no effect on the radioactivity of (³H)ethanolamine-labeled material. PI-PLC also liberated catalytically active alkaline phosphatase from viable cells, and this could be selectively blocked by monoclonal antibodies to alkaline phosphatase. However, the alkaline phosphatase released from ³H-fatty acid labeled cells by PI-PLC was not radioactive. By contrast, treatment with bromelain removed both the ³H-fatty acid and the (³H)ethanolamine label from purified alkaline phosphatase. Subtilisin was also able to remove the (³H)ethanolamine label from the purified alkaline phosphatase. The ³H radioactivity in alkaline phosphatase purified from (³H)ethanolamine-labeled cells comigrated with authentic (³H)ethanolamine by anion-exchange chromatography after acid hydrolysis. The data suggest that the ³H-fatty acid and (³H)ethanolamine are covalently attached to the carboxyl-terminal segment since bromelain and subtilisin both release alkaline phosphatase from the membrane by cleavage at that end of the polypeptide chain. The data are consistent with findings for other proteins recently shown to be anchored in the membrane through a glycosylphosphatidylinositol structure and indicate that a similar structure contributes to the membrane anchoring of alkaline phosphatase.

  10. Increased river alkalinization in the Eastern U.S.

    PubMed

    Kaushal, Sujay S; Likens, Gene E; Utz, Ryan M; Pace, Michael L; Grese, Melissa; Yepsen, Metthea

    2013-09-17

    The interaction between human activities and watershed geology is accelerating long-term changes in the carbon cycle of rivers. We evaluated changes in bicarbonate alkalinity, a product of chemical weathering, and tested for long-term trends at 97 sites in the eastern United States draining over 260,000 km². We observed statistically significant increasing trends in alkalinity at 62 of the 97 sites, while remaining sites exhibited no significant decreasing trends. Over 50% of study sites also had statistically significant increasing trends in concentrations of calcium (another product of chemical weathering) where data were available. River alkalinization rates were significantly related to watershed carbonate lithology, acid deposition, and topography. These three variables explained ~40% of variation in river alkalinization rates. The strongest predictor of river alkalinization rates was carbonate lithology. The most rapid rates of river alkalinization occurred at sites with highest inputs of acid deposition and highest elevation. The rise of alkalinity in many rivers throughout the Eastern U.S. suggests human-accelerated chemical weathering, in addition to previously documented impacts of mining and land use. Increased river alkalinization has major environmental implications including impacts on water hardness and salinization of drinking water, alterations of air-water exchange of CO2, coastal ocean acidification, and the influence of bicarbonate availability on primary production.

  11. Annual timing of river floods in the Northeast United States: seasonal characterization and temporal trends

    NASA Astrophysics Data System (ADS)

    Collins, M. J.

    2016-12-01

    Increases in flood magnitude and frequency have been documented in climate-sensitive watersheds in the Northeast United States. Associated changes in inundation frequency and/or magnitude, or changes in stream channel form and function, can affect human uses of floodplain environments (e.g., dwellings or transportation infrastructure) as well as aquatic and riparian habitats. Historical changes in flood magnitude and frequency also have important implications for designing floodplain infrastructure and channel modifications because well-accepted statistical methods for design-flood prediction require flood records with stationary means and variances. Changes in flood timing during the year may also be impactful, but have not been studied in detail for the Northeast United States. For example, relatively modest shifts in the timing of winter/spring floods can affect the incidence of ice jam complications. Or, changes in spring or fall flood timing may positively or negatively affect a vulnerable life stage for a migratory fish (e.g., egg setting) depending on whether floods occur more frequently before or after the life history event. With this study I apply an objective, probabilistic method for identifying flood seasonality in climate-sensitive watersheds of the Mid-Atlantic and New England regions (Hydrologic Unit Codes 01 and 02). Temporal trends in the timing of floods within significant flood seasons at a site are then analyzed using a method that employs directional statistics. The analyses are based on partial duration flood series that are an average of 85 years long. Documented changes in flood timing during the year are considered in the context of both potential historical impacts and expectations for future flood timing given regional climate change projections.

  12. Spatial coherence of flood-rich and flood-poor periods across Germany

    NASA Astrophysics Data System (ADS)

    Merz, Bruno; Dung, Nguyen Viet; Apel, Heiko; Gerlitz, Lars; Schröter, Kai; Steirou, Eva; Vorogushyn, Sergiy

    2018-04-01

    Despite its societal relevance, the question whether fluctuations in flood occurrence or magnitude are coherent in space has hardly been addressed in quantitative terms. We investigate this question for Germany by analysing fluctuations in annual maximum series (AMS) values at 68 discharge gauges for the common time period 1932-2005. We find remarkable spatial coherence across Germany given its different flood regimes. For example, there is a tendency that flood-rich/-poor years in sub-catchments of the Rhine basin, which are dominated by winter floods, coincide with flood-rich/-poor years in the southern sub-catchments of the Danube basin, which have their dominant flood season in summer. Our findings indicate that coherence is caused rather by persistence in catchment wetness than by persistent periods of higher/lower event precipitation. Further, we propose to differentiate between event-type and non-event-type coherence. There are quite a number of hydrological years with considerable non-event-type coherence, i.e. AMS values of the 68 gauges are spread out through the year but in the same magnitude range. Years with extreme flooding tend to be of event-type and non-coherent, i.e. there is at least one precipitation event that affects many catchments to various degree. Although spatial coherence is a remarkable phenomenon, and large-scale flooding across Germany can lead to severe situations, extreme magnitudes across the whole country within one event or within one year were not observed in the investigated period.

  13. The link between land use and flood risk assessment in urban areas

    NASA Astrophysics Data System (ADS)

    Sörensen, Johanna; Kalantari, Zahra

    2017-04-01

    Densification of urban areas raises concern about increased pluvial flooding. Flood risk in urban areas may rise under the impact of land use changes. Urbanisation involves conversion of natural areas to impermeable areas, giving lower infiltration rates and increased runoff. When high-intensity rainfall exceeds the capacity of a city's drainage system, high runoff causes pluvial flooding in low-lying areas. In the present study, a long time series (20 years) of geo-referenced flood claims from property owners has been collected and analysed in detail to assess flood risk under the impact of land use changes in urban areas. The flood claim data come from property owners with flood insurance that covers property loss from overland flooding, groundwater intrusion through basement walls, and flooding from the drainage system, and are used as a proxy for flood severity. The spatial relationships between land use change and flood occurrences in different urban areas were analysed. Special emphasis was placed on how nature-based solutions and blue-green infrastructure relate to flood risk. The relationships were defined by a statistical method explaining where land use change contributes to changes in flood risk, together with other contributing factors.

  14. Parsimonious nonstationary flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Serago, Jake M.; Vogel, Richard M.

    2018-02-01

    There is now widespread awareness of the impact of anthropogenic influences on extreme floods (and droughts) and thus an increasing need for methods to account for such influences when estimating a frequency distribution. We introduce a parsimonious approach to nonstationary flood frequency analysis (NFFA) based on a bivariate regression equation which describes the relationship between annual maximum floods, x, and an exogenous variable which may explain the nonstationary behavior of x. The conditional mean, variance and skewness of both x and y = ln (x) are derived, and combined with numerous common probability distributions including the lognormal, generalized extreme value and log Pearson type III models, resulting in a very simple and general approach to NFFA. Our approach offers several advantages over existing approaches including: parsimony, ease of use, graphical display, prediction intervals, and opportunities for uncertainty analysis. We introduce nonstationary probability plots and document how such plots can be used to assess the improved goodness of fit associated with a NFFA.
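
    A stripped-down version of the idea, assuming a conditional lognormal model (one of the distributions named above) and ordinary least squares for the y = ln(x) regression, might look like the sketch below; it is my illustration, not the authors' code.

```python
# Hedged sketch of a parsimonious nonstationary quantile: regress y = ln(x)
# on an exogenous covariate z, then evaluate the conditional lognormal
# quantile at a chosen covariate value z0.
import numpy as np
from scipy.stats import norm

def nonstationary_quantile(x, z, z0, aep):
    """x: annual maximum floods; z: exogenous covariate (same length);
    z0: covariate value for which the quantile is wanted;
    aep: annual exceedance probability (e.g. 0.01 for the 100-year flood)."""
    y = np.log(x)
    b, a = np.polyfit(z, y, 1)                       # conditional mean of y: a + b*z
    s = (y - (a + b * np.asarray(z))).std(ddof=2)    # conditional std of y
    return np.exp(a + b * z0 + norm.ppf(1.0 - aep) * s)
```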

  15. Impact of rainfall spatial variability on Flash Flood Forecasting

    NASA Astrophysics Data System (ADS)

    Douinot, Audrey; Roux, Hélène; Garambois, Pierre-André; Larnier, Kevin

    2014-05-01

    According to the United States National Hazard Statistics database, flooding and flash flooding have caused the largest number of deaths of any weather-related phenomenon over the last 30 years (Flash Flood Guidance Improvement Team, 2003). Like the storms that cause them, flash floods are very variable and non-linear phenomena in time and space, with the result that understanding and anticipating flash flood genesis is far from straightforward. In the U.S., the Flash Flood Guidance (FFG) estimates the average number of inches of rainfall for given durations required to produce flash flooding in the indicated county. In Europe, flash floods often occur in small catchments (approximately 100 km²), and it has been shown that the spatial variability of rainfall has a great impact on the catchment response (Le Lay and Saulnier, 2007). Therefore, in this study, based on the Flash Flood Guidance method, rainfall spatial variability information is introduced into the threshold estimation. As for FFG, the threshold is the number of millimeters of rainfall required to produce a discharge higher than the discharge corresponding to the first level (yellow) warning of the French flood warning service (SCHAPI: Service Central d'Hydrométéorologie et d'Appui à la Prévision des Inondations). The indexes δ1 and δ2 of Zoccatelli et al. (2010), based on the spatial moments of catchment rainfall, are used to characterize the rainfall spatial distribution. The impacts of rainfall spatial variability on the warning threshold and on hydrological processes are then studied. The spatially distributed hydrological model MARINE (Roux et al., 2011), dedicated to flash flood prediction, is forced with synthetic rainfall patterns of different spatial distributions. This allows the determination of a warning threshold diagram: knowing the spatial distribution of the rainfall forecast and therefore the two indexes δ1 and δ2, the threshold value is read on the diagram. A warning threshold diagram is

  16. Millimeter-scale alkalinity measurement in marine sediment using DET probes and colorimetric determination.

    PubMed

    Metzger, E; Viollier, E; Simonucci, C; Prévot, F; Langlet, D; Jézéquel, D

    2013-10-01

    Constrained DET (Diffusive Equilibration in Thin films) probes equipped with 75 sampling layers of agarose gel (DGT Research©) were used to sample bottom and pore waters in marine sediment with a 2 mm vertical resolution. After retrieval, each piece of hydrogel, corresponding to 25 μL, was introduced into 1 mL of colorimetric reagent (CR) solution consisting of formic acid and bromophenol blue. After the elution/reaction time, absorbance of the latter mixture was read at 590 nm and compared to a calibration curve obtained with the same protocol applied to mini DET probes soaked in sodium hydrogen carbonate standard solutions. This method allows rapid alkalinity determinations for the small volumes of anoxic pore water entrapped into the gel. The method was assessed on organic-rich coastal marine sediments from Thau lagoon (France). Alkalinity values in the overlying waters were in agreement with data obtained by classical sampling techniques. Pore water data showed a progressive increase of alkalinity in the sediment from 2 to 10 mmol kg⁻¹, corresponding to anaerobic respiration in organic-rich sediments. Moreover, replicates of high-resolution DET profiles showed important lateral heterogeneity at a decimeter scale. This underlines the importance of high-resolution spatial methods for alkalinity profiling in coastal marine systems. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Reduction of nitrobenzene with alkaline ascorbic acid: Kinetics and pathways.

    PubMed

    Liang, Chenju; Lin, Ya-Ting; Shiu, Jia-Wei

    2016-01-25

    Alkaline ascorbic acid (AA) exhibits the potential to reductively degrade nitrobenzene (NB), which is the simplest of the nitroaromatic compounds. The nitro group (NO2⁻) of NB has a +III oxidation state of the N atom and tends to gain electrons. The effect of alkaline pH ranging from 9 to 13 was initially assessed and the results demonstrated that the solution pH, when approaching or above the pKa2 of AA (11.79), would increase reductive electron transfer to NB. The rate equation for the reactions between NB and AA at pH 12 can be described as r = (0.89±0.11)×10⁻⁴ mM^(1−(a+b)) h⁻¹ × [NB]^a [AA]^b, with a = 1.35±0.10 and b = 0.89±0.01. The GC/MS analytical method identified nitrosobenzene, azoxybenzene, and azobenzene as NB reduction intermediates, and aniline (AN) as a final product. These experimental results indicate that the alkaline AA reduction of NB to AN mainly proceeds via the direct route, consisting of a series of two-electron or four-electron transfers, and that the condensation reaction plays a minor role. Preliminary evaluation of the remediation of spiked NB-contaminated soils revealed that maintenance of alkaline pH and a higher water to soil ratio are essential for a successful alkaline AA application. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Mobility of As, Cu, Cr, and Zn from tailings covered with sealing materials using alkaline industrial residues: a comparison between two leaching methods.

    PubMed

    Jia, Yu; Maurice, Christian; Öhlander, Björn

    2016-01-01

    Different alkaline residue materials (fly ash, green liquor dregs, and lime mud) generated from the pulp and paper industry were evaluated as sealing materials to cover aged mine waste tailings (<1% sulfur content, primarily pyrite). The mobility of four selected trace elements (Cr, Cu, Zn, and As) was compared based on batch and column leaching studies to assess the effectiveness of these alkaline materials as sealing agents. Based on the leaching results, Cr, Cu, and Zn were immobilized by the alkaline amendments. In the amended tailings in the batch system, only As dramatically exceeded the limit values at L/S 10 L/kg. The column leaching results showed patterns similar to the batch results, though leached Cr, Cu, and Zn levels were higher in the column tests than in the batch tests. However, when the columns were compared with the batches, the trend for Cu was opposite for the unamended tailings. By contrast, both batch and column results for the ash-amended tailings showed that the amendment caused mobilization of As compared with the unamended tailings. The amount of As released was greatest in the ash column and decreased from the dregs to the lime columns. The leaching of As at high levels can be a potential problem whenever alkaline materials (especially fly ash) are used as sealing materials over tailings. The column test was considered by the authors to be a more informative method for remediation of aged tailings with low sulfur content, since it better mimics the actual situation in the field.

  19. Flooded Place

    NASA Image and Video Library

    2006-07-26

    This MOC image shows gullies in a portion of a flood-carved canyon within the larger Kasei Valles system on Mars. This canyon is the result of the very last flood event that poured through the Kasei valleys, long ago.

  20. Flooded Crater

    NASA Image and Video Library

    2003-04-04

    This image from NASA's Mars Odyssey spacecraft shows a flooded crater in Amazonis Planitia. The crater has been flooded with either mud or lava. The fluid then ponded, dried, and formed the surface textures we see today.

  1. Alkaline phosphatase as a screening test for osteomalacia.

    PubMed

    Chinoy, Muhammad Amin; Javed, Muhammad Imran; Khan, Alamzeb; Sadruddin, Nooruddin

    2011-01-01

    Vitamin D deficiency remains common in children and adults in Pakistan despite adequate sunlight exposure. Diagnosis in adults is usually delayed and is made following pathological fractures that result in significant morbidity. The objective of this study was to see whether serum alkaline phosphatase levels could be used as a screening test for osteomalacia. The study was conducted at Fatima Hospital, Baqai Medical University, Gadap, Karachi, between July 2002 and June 2005. Serum calcium levels are commonly used to screen patients suspected of osteomalacia, and raised serum alkaline phosphatase (SALP) is considered a diagnostic finding. We used SALP to screen patients who presented with back or non-specific aches and pain of more than six months' duration. Three hundred and thirty-four (334) patients were screened, of which 116 (35%) had raised SALP. Osteomalacia was diagnosed in 92 (79.3%) of these 116 either by plain radiographs, bone biopsy or isotope bone scan. Fifty-four (53.4%) of the 101 cases had a normal level of serum calcium. Osteomalacia is likely to be missed if only serum calcium is used to screen patients. Serum alkaline phosphatase should be used as the preferred method for screening these patients.

  2. Improving flash flood frequency analyses by using non-systematic dendrogeomorphic data

    NASA Astrophysics Data System (ADS)

    Mediero, Luis; María Bodoque, Jose; Garrote, Julio; Ballesteros-Cánovas, Juan Antonio; Aroca-Jimenez, Estefania

    2017-04-01

    Flash floods have a rapid hydrological response in catchments with short lag times, characterized by 'peaky' hydrographs. Peak flows are reached within a few hours, giving little or no advance warning to prevent and mitigate flood damage. As a result, flash floods may pose a high social risk, as shown for instance by the 1997 Biescas disaster in Spain. The analysis and management of flood risk are clearly conditioned by data availability, especially in mountain areas, where flash floods usually occur. In mountain basins, however, the available data series are often short and lack statistical significance. In addition, even when flow data are available, annual maximum values are generally less reliable than average flow values, since conventional stream gauge stations may not record the extreme floods, leading to gaps in the time series. Dendrogeomorphology has been shown to be especially useful for improving flood frequency analyses in catchments where short flood series limit the use of conventional hydrological methods. This study presents the pros and cons of using a given probability distribution function, such as the Generalized Extreme Value (GEV), and Bayesian Markov Chain Monte Carlo (MCMC) methods to account for non-systematic data provided by dendrogeomorphic techniques, in order to assess the accuracy of flood quantile estimates. To this end, we considered a set of locations in Central Spain where the systematic flow record available at a gauging site can be extended with non-systematic data obtained from dendrogeomorphic techniques.

  3. Coupling alkaline pre-extraction with alkaline-oxidative post-treatment of corn stover to enhance enzymatic hydrolysis and fermentability.

    PubMed

    Liu, Tongjun; Williams, Daniel L; Pattathil, Sivakumar; Li, Muyang; Hahn, Michael G; Hodge, David B

    2014-04-03

    A two-stage chemical pretreatment of corn stover is investigated comprising an NaOH pre-extraction followed by an alkaline hydrogen peroxide (AHP) post-treatment. We propose that conventional one-stage AHP pretreatment can be improved using alkaline pre-extraction, which requires significantly less H2O2 and NaOH. To better understand the potential of this approach, this study investigates several components of this process including alkaline pre-extraction, alkaline and alkaline-oxidative post-treatment, fermentation, and the composition of alkali extracts. Mild NaOH pre-extraction of corn stover uses less than 0.1 g NaOH per g corn stover at 80°C. The resulting substrates were highly digestible by cellulolytic enzymes at relatively low enzyme loadings and had a strong susceptibility to drying-induced hydrolysis yield losses. Alkaline pre-extraction was highly selective for lignin removal over xylan removal; xylan removal was relatively minimal (~20%). During alkaline pre-extraction, up to 0.10 g of alkali was consumed per g of corn stover. AHP post-treatment at low oxidant loading (25 mg H2O2 per g pre-extracted biomass) increased glucose hydrolysis yields by 5%, which approached near-theoretical yields. ELISA screening of alkali pre-extraction liquors and the AHP post-treatment liquors demonstrated that xyloglucan and β-glucans likely remained tightly bound in the biomass whereas the majority of the soluble polymeric xylans were glucurono (arabino) xylans and potentially homoxylans. Pectic polysaccharides were depleted in the AHP post-treatment liquor relative to the alkaline pre-extraction liquor. Because the already-low inhibitor content was further decreased in the alkaline pre-extraction, the hydrolysates generated by this two-stage pretreatment were highly fermentable by Saccharomyces cerevisiae strains that were metabolically engineered and evolved for xylose fermentation. This work demonstrates that this two-stage pretreatment process is well suited for

  4. Coupling alkaline pre-extraction with alkaline-oxidative post-treatment of corn stover to enhance enzymatic hydrolysis and fermentability

    PubMed Central

    2014-01-01

    Background A two-stage chemical pretreatment of corn stover is investigated comprising an NaOH pre-extraction followed by an alkaline hydrogen peroxide (AHP) post-treatment. We propose that conventional one-stage AHP pretreatment can be improved using alkaline pre-extraction, which requires significantly less H2O2 and NaOH. To better understand the potential of this approach, this study investigates several components of this process including alkaline pre-extraction, alkaline and alkaline-oxidative post-treatment, fermentation, and the composition of alkali extracts. Results Mild NaOH pre-extraction of corn stover uses less than 0.1 g NaOH per g corn stover at 80°C. The resulting substrates were highly digestible by cellulolytic enzymes at relatively low enzyme loadings and had a strong susceptibility to drying-induced hydrolysis yield losses. Alkaline pre-extraction was highly selective for lignin removal over xylan removal; xylan removal was relatively minimal (~20%). During alkaline pre-extraction, up to 0.10 g of alkali was consumed per g of corn stover. AHP post-treatment at low oxidant loading (25 mg H2O2 per g pre-extracted biomass) increased glucose hydrolysis yields by 5%, which approached near-theoretical yields. ELISA screening of alkali pre-extraction liquors and the AHP post-treatment liquors demonstrated that xyloglucan and β-glucans likely remained tightly bound in the biomass whereas the majority of the soluble polymeric xylans were glucurono (arabino) xylans and potentially homoxylans. Pectic polysaccharides were depleted in the AHP post-treatment liquor relative to the alkaline pre-extraction liquor. Because the already-low inhibitor content was further decreased in the alkaline pre-extraction, the hydrolysates generated by this two-stage pretreatment were highly fermentable by Saccharomyces cerevisiae strains that were metabolically engineered and evolved for xylose fermentation. Conclusions This work demonstrates that this two

  5. Quantification of increased flood risk due to global climate change for urban river management planning.

    PubMed

    Morita, M

    2011-01-01

    Global climate change is expected to affect future rainfall patterns. These changes should be taken into account when assessing future flooding risks. This study presents a method for quantifying the increase in flood risk caused by global climate change for use in urban flood risk management. Flood risk in this context is defined as the product of flood damage potential and the probability of its occurrence. The study uses a geographic information system-based flood damage prediction model to calculate the flood damage caused by design storms with different return periods. Estimation of the monetary damages these storms produce and their return periods are precursors to flood risk calculations. The design storms are developed from modified intensity-duration-frequency relationships generated by simulations of global climate change scenarios (e.g. CGCM2A2). The risk assessment method is applied to the Kanda River basin in Tokyo, Japan. The assessment provides insights not only into the increase in flood risk cost due to global warming, but also into the impact that increase may have on flood control infrastructure planning.
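
    Since flood risk is defined here as the product of flood damage potential and its probability of occurrence, one simple way to aggregate design-storm damages into an expected annual damage is sketched below. This is my illustration; the placeholder return periods and damage values in the usage note are not from the study.

```python
# Hedged sketch: expected annual damage as the integral of damage over annual
# exceedance probability, approximated with the trapezoidal rule.
import numpy as np

def expected_annual_damage(return_periods_years, damages):
    """return_periods_years, damages: matching sequences, e.g. damages computed
    by a GIS-based flood damage model for several design storms."""
    aep = 1.0 / np.asarray(return_periods_years, dtype=float)
    order = np.argsort(aep)
    p, d = aep[order], np.asarray(damages, dtype=float)[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# Illustrative use (numbers are placeholders, not results from the study):
# expected_annual_damage([5, 10, 30, 50, 100, 200],
#                        [0.2e9, 0.6e9, 1.5e9, 2.1e9, 3.0e9, 3.8e9])
```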

  6. Amplification of flood frequencies with local sea level rise and emerging flood regimes

    NASA Astrophysics Data System (ADS)

    Buchanan, Maya K.; Oppenheimer, Michael; Kopp, Robert E.

    2017-06-01

    The amplification of flood frequencies by sea level rise (SLR) is expected to become one of the most economically damaging impacts of climate change for many coastal locations. Understanding the magnitude and pattern by which the frequency of current flood levels increase is important for developing more resilient coastal settlements, particularly since flood risk management (e.g. infrastructure, insurance, communications) is often tied to estimates of flood return periods. The Intergovernmental Panel on Climate Change’s Fifth Assessment Report characterized the multiplication factor by which the frequency of flooding of a given height increases (referred to here as an amplification factor; AF). However, this characterization neither rigorously considered uncertainty in SLR nor distinguished between the amplification of different flooding levels (such as the 10% versus 0.2% annual chance floods); therefore, it may be seriously misleading. Because both historical flood frequency and projected SLR are uncertain, we combine joint probability distributions of the two to calculate AFs and their uncertainties over time. Under probabilistic relative sea level projections, while maintaining storm frequency fixed, we estimate a median 40-fold increase (ranging from 1- to 1314-fold) in the expected annual number of local 100-year floods for tide-gauge locations along the contiguous US coastline by 2050. While some places can expect disproportionate amplification of higher frequency events and thus primarily a greater number of historically precedented floods, others face amplification of lower frequency events and thus a particularly fast growing risk of historically unprecedented flooding. For example, with 50 cm of SLR, the 10%, 1%, and 0.2% annual chance floods are expected respectively to recur 108, 335, and 814 times as often in Seattle, but 148, 16, and 4 times as often in Charleston, SC.
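
    The core arithmetic of an amplification factor can be illustrated as below for a single, fixed SLR value and a Gumbel model of annual flood maxima; this deliberately ignores the SLR uncertainty that the study propagates through joint probability distributions, so it is a simplified sketch rather than the authors' method.

```python
# Hedged sketch of an amplification factor: the ratio by which the annual
# exceedance probability of today's N-year flood level increases after a
# fixed amount of local sea level rise (Gumbel flood-height model assumed).
import numpy as np

def amplification_factor(loc, scale, return_period, slr):
    """loc, scale: Gumbel parameters of annual maximum flood height (m);
    return_period: e.g. 100 for the 100-year flood; slr: sea level rise (m)."""
    z_n = loc - scale * np.log(-np.log(1.0 - 1.0 / return_period))  # current N-yr level
    p_now = 1.0 / return_period
    p_future = 1.0 - np.exp(-np.exp(-(z_n - slr - loc) / scale))    # exceedance after SLR
    return p_future / p_now
```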

  7. Design flood estimation in ungauged basins: probabilistic extension of the design-storm concept

    NASA Astrophysics Data System (ADS)

    Berk, Mario; Špačková, Olga; Straub, Daniel

    2016-04-01

    Design flood estimation in ungauged basins is an important hydrological task, which is in engineering practice typically solved with the design storm concept. However, neglecting the uncertainty in the hydrological response of the catchment through the assumption of average-recurrence-interval (ARI) neutrality between rainfall and runoff can lead to flawed design flood estimates. Additionally, selecting a single critical rainfall duration neglects the contribution of other rainfall durations to the probability of extreme flood events. In this study, the design flood problem is approached with concepts from structural reliability that enable a consistent treatment of multiple uncertainties in estimating the design flood. The uncertainties of key model parameters are represented probabilistically and the First-Order Reliability Method (FORM) is used to compute the flood exceedance probability. As an important by-product, the FORM analysis provides the most likely parameter combination to lead to a flood with a certain exceedance probability; i.e. it enables one to find representative scenarios for, e.g., a 100-year or a 1000-year flood. Possible different rainfall durations are incorporated by formulating the event of a given design flood as a series system. The method is directly applicable in practice, since for the description of the rainfall depth-duration characteristics, the same inputs as for the classical design storm methods are needed, which are commonly provided by meteorological services. The proposed methodology is applied to a case study of the Trauchgauer Ach catchment in Bavaria; SCS Curve Number (CN) and unit hydrograph models are used to model the hydrological processes. The results indicate, in accordance with past experience, that the traditional design storm concept underestimates design floods.
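
    For readers unfamiliar with FORM, the sketch below shows the generic Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space; the limit-state function g(u) (for instance, "design discharge minus the peak simulated from rainfall and runoff parameters mapped to u") is problem-specific and assumed to be supplied by the user. This is a textbook-style sketch, not the authors' implementation.

```python
# Hedged sketch of the First-Order Reliability Method (FORM): find the design
# point u* (the most likely combination leading to the flood of interest) and
# approximate the exceedance probability as Phi(-beta), beta = ||u*||.
import numpy as np
from scipy.stats import norm

def form(g, n_dim, tol=1e-6, max_iter=100, eps=1e-6):
    u = np.zeros(n_dim)                              # start at the origin
    for _ in range(max_iter):
        g0 = g(u)
        grad = np.array([(g(u + eps * e) - g0) / eps for e in np.eye(n_dim)])
        u_new = (grad @ u - g0) / (grad @ grad) * grad   # HL-RF update
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    beta = np.linalg.norm(u)
    return beta, norm.cdf(-beta), u                  # reliability index, Pf, design point
```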

  8. Flood loss modelling with FLF-IT: a new flood loss function for Italian residential structures

    NASA Astrophysics Data System (ADS)

    Hasanzadeh Nafari, Roozbeh; Amadio, Mattia; Ngo, Tuan; Mysiak, Jaroslav

    2017-07-01

    The damage triggered by different flood events costs the Italian economy millions of euros each year. This cost is likely to increase in the future due to climate variability and economic development. In order to avoid or reduce such significant financial losses, risk management requires tools which can provide a reliable estimate of potential flood impacts across the country. Flood loss functions are an internationally accepted method for estimating physical flood damage in urban areas. In this study, we derived a new flood loss function for Italian residential structures (FLF-IT), on the basis of empirical damage data collected from a recent flood event in the region of Emilia-Romagna. The function was developed based on a new Australian approach (FLFA), which represents the confidence limits that exist around the parameterized functional depth-damage relationship. After model calibration, the performance of the model was validated for the prediction of loss ratios and absolute damage values. It was also contrasted with an uncalibrated relative model frequently used in Europe. In this regard, a three-fold cross-validation procedure was carried out over the empirical sample to measure the range of uncertainty from the actual damage data. The predictive capability has also been studied for some sub-classes of water depth. The validation procedure shows that the newly derived function performs well (no bias and only 10 % mean absolute error), especially when the water depth is high. Results of these validation tests illustrate the importance of model calibration. The advantages of the FLF-IT model over other Italian models include calibration with empirical data, consideration of the epistemic uncertainty of data, and the ability to change parameters based on building practices across Italy.
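
    A generic depth-damage (flood loss) function of the kind discussed above can be sketched as follows; the saturating functional form and the parameter values are placeholders for illustration, not the calibrated FLF-IT or FLFA parameterization.

```python
# Hedged sketch of a depth-damage curve: the loss ratio rises with water depth
# and saturates at 1 (total loss). Parameters a and b are illustrative only.
import numpy as np

def loss_ratio(depth_m, a=0.4, b=1.0):
    depth = np.maximum(np.asarray(depth_m, dtype=float), 0.0)
    return np.clip(1.0 - np.exp(-a * depth ** b), 0.0, 1.0)

def absolute_damage(depth_m, building_value):
    """Monetary damage for a structure of given value at a given water depth."""
    return loss_ratio(depth_m) * building_value
```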

  9. Distillation Column Flooding Predictor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George E. Dzyacky

    2010-11-23

    The Flooding Predictor™ is a patented advanced control technology proven in research at the Separations Research Program, University of Texas at Austin, to increase distillation column throughput by over 6%, while also increasing energy efficiency by 10%. The research was conducted under a U. S. Department of Energy Cooperative Agreement awarded to George Dzyacky of 2ndpoint, LLC. The Flooding Predictor™ works by detecting the incipient flood point and controlling the column closer to its actual hydraulic limit than historical practices have allowed. Further, the technology uses existing column instrumentation, meaning no additional refining infrastructure is required. Refiners often push distillation columns to maximize throughput, improve separation, or simply to achieve day-to-day optimization. Attempting to achieve such operating objectives is a tricky undertaking that can result in flooding. Operators and advanced control strategies alike rely on the conventional use of delta-pressure instrumentation to approximate the column’s approach to flood. But column delta-pressure is more an inference of the column’s approach to flood than it is an actual measurement of it. As a consequence, delta pressure limits are established conservatively in order to operate in a regime where the column is never expected to flood. As a result, there is much “left on the table” when operating in such a regime, i.e. the capacity difference between controlling the column to an upper delta-pressure limit and controlling it to the actual hydraulic limit. The Flooding Predictor™, an innovative pattern recognition technology, controls columns at their actual hydraulic limit, which research shows leads to a throughput increase of over 6%. Controlling closer to the hydraulic limit also permits operation in a sweet spot of increased energy-efficiency. In this region of increased column loading, the Flooding Predictor is able to exploit the benefits of higher liquid

  10. Flood-frequency characteristics of Wisconsin streams

    USGS Publications Warehouse

    Walker, John F.; Peppler, Marie C.; Danz, Mari E.; Hubbard, Laura E.

    2017-05-22

    Flood-frequency characteristics for 360 gaged sites on unregulated rural streams in Wisconsin are presented for percent annual exceedance probabilities ranging from 0.2 to 50 using a statewide skewness map developed for this report. Equations of the relations between flood-frequency and drainage-basin characteristics were developed by multiple-regression analyses. Flood-frequency characteristics for ungaged sites on unregulated, rural streams can be estimated by use of the equations presented in this report. The State was divided into eight areas of similar physiographic characteristics. The most significant basin characteristics are drainage area, soil saturated hydraulic conductivity, main-channel slope, and several land-use variables. The standard error of prediction for the equation for the 1-percent annual exceedance probability flood ranges from 56 to 70 percent for Wisconsin Streams; these values are larger than results presented in previous reports. The increase in the standard error of prediction is likely due to increased variability of the annual-peak discharges, resulting in increased variability in the magnitude of flood peaks at higher frequencies. For each of the unregulated rural streamflow-gaging stations, a weighted estimate based on the at-site log Pearson type III analysis and the multiple regression results was determined. The weighted estimate generally has a lower uncertainty than either the Log Pearson type III or multiple regression estimates. For regulated streams, a graphical method for estimating flood-frequency characteristics was developed from the relations of discharge and drainage area for selected annual exceedance probabilities. Graphs for the major regulated streams in Wisconsin are presented in the report.
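
    One common way to form the weighted estimate mentioned above is to weight the at-site and regression quantiles in log space by the inverse of their error variances; the sketch below assumes that convention and is my illustration, not code from the report.

```python
# Hedged sketch of an inverse-variance weighted flood-quantile estimate
# combining an at-site (log-Pearson type III) estimate with a regional
# regression estimate.
import numpy as np

def weighted_quantile(q_at_site, var_at_site, q_regression, var_regression):
    """Quantiles in any consistent unit; the variances are the error variances
    of the corresponding log10 quantile estimates."""
    y1, y2 = np.log10(q_at_site), np.log10(q_regression)
    w1, w2 = 1.0 / var_at_site, 1.0 / var_regression
    return 10.0 ** ((w1 * y1 + w2 * y2) / (w1 + w2))
```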

  11. Flood Mapping in the Lower Mekong River Basin Using Daily MODIS Observations

    NASA Technical Reports Server (NTRS)

    Fayne, Jessica V.; Bolten, John D.; Doyle, Colin S.; Fuhrmann, Sven; Rice, Matthew T.; Houser, Paul R.; Lakshmi, Venkat

    2017-01-01

    In flat homogeneous terrain such as in Cambodia and Vietnam, the monsoon season brings significant and consistent flooding between May and November. To monitor flooding in the Lower Mekong region, the near real-time NASA Flood Extent Product (NASA-FEP) was developed using seasonal normalized difference vegetation index (NDVI) differences from the 250 m resolution Moderate Resolution Imaging Spectroradiometer (MODIS) sensor compared to daily observations. The use of a percentage change interval classification relating to various stages of flooding might be confusing to viewers or potential users, thereby reducing product usage. To increase the product usability through simplification, the classification intervals were compared with other commonly used change detection schemes to identify the change classification scheme that best delineates flooded areas. The percentage change method used in the NASA-FEP proved to be helpful in delineating flood boundaries compared to other change detection methods. The results of the accuracy assessments indicate that the -75% NDVI change interval can be reclassified to a descriptive 'flood' classification. A binary system was used to simplify the interpretation of the NASA-FEP by removing extraneous information from lower interval change classes.
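
    The binary reclassification described above reduces, in essence, to thresholding a percentage NDVI change; a minimal sketch is given below (array names and no-data handling are my assumptions, not details from the product).

```python
# Hedged sketch: flag cells whose NDVI has dropped by 75% or more relative to
# the seasonal reference as 'flood'.
import numpy as np

def flood_mask(ndvi_daily, ndvi_reference, threshold_pct=-75.0):
    """ndvi_daily, ndvi_reference: arrays of equal shape; returns a boolean mask."""
    with np.errstate(divide="ignore", invalid="ignore"):
        pct_change = 100.0 * (ndvi_daily - ndvi_reference) / np.abs(ndvi_reference)
    return np.where(np.isfinite(pct_change), pct_change <= threshold_pct, False)
```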

  12. Implementing the EU Floods Directive (2007/60/EC) in Austria: Flood Risk Management Plans

    NASA Astrophysics Data System (ADS)

    Neuhold, Clemens

    2013-04-01

    The Directive 2007/60/EC of the European Parliament and of the Council of 23 October 2007 on the assessment and management of flood risks (EFD) aims at the reduction of the adverse consequences for human health, the environment, cultural heritage and economic activity associated with floods in the Community. This task is to be achieved based on three process steps: (1) preliminary flood risk assessment (finalised by the end of 2011), (2) flood hazard maps and flood risk maps (due 2013) and (3) flood risk management plans (due 2015). Currently, an interdisciplinary national working group is defining the methodological framework for flood risk management plans in Austria supported by a constant exchange with international bodies and experts. Referring to the EFD the components of the flood risk management plan are (excerpt): 1. conclusions of the preliminary flood risk assessment 2. flood hazard maps and flood risk maps and the conclusions that can be drawn from those maps 3. a description of the appropriate objectives of flood risk management 4. a summary of measures and their prioritisation aiming to achieve the appropriate objectives of flood risk management The poster refers to some of the major challenges in this process, such as the legal provisions, coordination of administrative units, definition of public relations, etc. The implementation of the EFD requires the harmonisation of legal instruments of various disciplines (e.g. water management, spatial planning, civil protection) enabling a coordinated - and ideally binding - practice of flood risk management. This process is highly influenced by the administrative organisation in Austria - federal, provincial and municipality level. The Austrian approach meets this organisational framework by structuring the development of the flood risk management plan into 3 time-steps: (a) federal blueprint, (b) provincial editing and (c) federal finishing as well as reporting to the European Commission. Each time

  13. Uncertainty in surface water flood risk modelling

    NASA Astrophysics Data System (ADS)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    uniform flow formulae (Manning's Equation) to direct flow over the model domain, sourcing water from the channel or sea so as to provide a detailed representation of river and coastal flood risk. The initial development step was to include spatially-distributed rainfall as a new source term within the model domain. This required optimisation to improve computational efficiency, given the ubiquity of ‘wet' cells early on in the simulation. Collaboration with UK water companies has provided detailed drainage information, and from this a simplified representation of the drainage system has been included in the model via the inclusion of sinks and sources of water from the drainage network. This approach has clear advantages relative to a fully coupled method both in terms of reduced input data requirements and computational overhead. Further, given the difficulties associated with obtaining drainage information over large areas, tests were conducted to evaluate uncertainties associated with excluding drainage information and the impact that this has upon flood model predictions. This information can be used, for example, to inform insurance underwriting strategies and loss estimation as well as for emergency response and planning purposes. The Flowroute surface-water flood risk platform enables efficient mapping of areas sensitive to flooding from high-intensity rainfall events due to topography and drainage infrastructure. As such, the technology has widespread potential for use as a risk mapping tool by the UK Environment Agency, European Member States, water authorities, local governments and the insurance industry. Keywords: Surface water flooding, Model Uncertainty, Insurance Underwriting, Flood inundation modelling, Risk mapping.
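
    For reference, the uniform-flow relation named above (Manning's equation, SI units) is simply v = (1/n) R^(2/3) S^(1/2); a one-line sketch follows. This is the textbook formula, not the vendor's implementation.

```python
# Textbook Manning relation: flow velocity (m/s) from hydraulic radius (m),
# water-surface slope (-) and Manning roughness n.
def manning_velocity(hydraulic_radius_m, slope, n_roughness):
    return (hydraulic_radius_m ** (2.0 / 3.0)) * (slope ** 0.5) / n_roughness
```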

  14. Identification and delineation of areas flood hazard using high accuracy of DEM data

    NASA Astrophysics Data System (ADS)

    Riadi, B.; Barus, B.; Widiatmaka; Yanuar, M. J. P.; Pramudya, B.

    2018-05-01

    Flood incidents that often occur in Karawang regency need to be mitigated. Expectations therefore rest on technologies that can predict, anticipate and reduce disaster risks. Flood modeling techniques using Digital Elevation Model (DEM) data can be applied in mitigation activities. High-accuracy DEM data used in modeling will result in better flood models. Processing of high-accuracy DEM data yields information about surface morphology which can be used to identify indications of flood hazard areas. The purpose of this study was to identify and delineate flood hazard areas by identifying wetland areas using DEM data and Landsat-8 images. TerraSAR-X high-resolution data are used to detect wetlands from the landscape, while land cover is identified from Landsat image data. The Topographic Wetness Index (TWI) method is used to detect and identify wetland areas from the DEM data, while the Tasseled Cap Transformation (TCT) method is used for land cover analysis. The TWI modeling yields information about potentially flood-prone land. Overlaying the TWI map with the land cover map shows that in Karawang regency the areas most vulnerable to flooding are rice fields. The spatial accuracy of the flood hazard area in this study was 87%.
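
    The Topographic Wetness Index used above is conventionally defined as TWI = ln(a / tan β), with a the specific catchment area and β the local slope, both derived from the DEM; a minimal sketch is given below (the flow-accumulation step that yields a is assumed to be computed elsewhere).

```python
# Hedged sketch of the TWI computation on gridded inputs; slopes are clipped
# to avoid division by zero on flat cells.
import numpy as np

def topographic_wetness_index(specific_catchment_area, slope_rad, min_tan=1e-3):
    """specific_catchment_area: upslope area per unit contour width (m);
    slope_rad: local slope in radians; returns the TWI grid."""
    tan_beta = np.maximum(np.tan(slope_rad), min_tan)
    return np.log(np.maximum(specific_catchment_area, 1e-6) / tan_beta)
```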

  15. New developments at the Flood Forecasting Centre: operational flood risk assessment and guidance

    NASA Astrophysics Data System (ADS)

    Pilling, Charlie

    2017-04-01

    The Flood Forecasting Centre (FFC) is a partnership between the UK Met Office, the Environment Agency and Natural Resources Wales. The FFC was established in 2009 to provide an overview of flood risk across England and Wales and to provide flood guidance services primarily for the emergency response community. The FFC provides forecasts for all natural sources of flooding, these being fluvial, surface water, coastal and groundwater. This involves an assessment of possible hydrometeorological events and their impacts over the next five days. During times of heightened flood risk, the close communication between the FFC, the Environment Agency and Natural Resources Wales allows mobilization and deployment of staff and flood defences. Following a number of severe flood events during winters 2013-14 and 2015-16, coupled with a drive from the changing landscape in national incident response, there is a desire to identify flood events at even longer lead time. This earlier assessment and mobilization is becoming increasingly important and high profile within Government. For example, following the exceptional flooding across the north of England in December 2015 the Environment Agency have invested in 40 km of temporary barriers that will be moved around the country to help mitigate against the impacts of large flood events. Efficient and effective use of these barriers depends on identifying the broad regions at risk well in advance of the flood, as well as scaling the magnitude and duration of large events. Partly in response to this, the FFC now produce a flood risk assessment for a month ahead. In addition, since January 2017, the 'new generation' daily flood guidance statement includes an assessment of flood risk for the 6 to 10 day period. Examples of both these new products will be introduced, as will some of the new developments in science and technical capability that underpin these assessments. Examples include improvements to fluvial forecasting from 'fluvial

  16. Use of Flood Seasonality in Pooling-Group Formation and Quantile Estimation: An Application in Great Britain

    NASA Astrophysics Data System (ADS)

    Formetta, Giuseppe; Bell, Victoria; Stewart, Elizabeth

    2018-02-01

    Regional flood frequency analysis is one of the most commonly applied methods for estimating extreme flood events at ungauged sites or locations with short measurement records. It is based on: (i) the definition of a homogeneous group (pooling-group) of catchments, and (ii) the use of the pooling-group data to estimate flood quantiles. Although many methods to define a pooling-group (pooling schemes, PS) are based on catchment physiographic similarity measures, in the last decade methods based on flood seasonality similarity have been proposed. In this paper, two seasonality-based PS are proposed and tested both in terms of the homogeneity of the pooling-groups they generate and in terms of the accuracy of the resulting extreme flood estimates. The method has been applied to 420 catchments in Great Britain (considered as both gauged and ungauged) and compared against the current Flood Estimation Handbook (FEH) PS. Results for gauged sites show that, compared to the current PS, the seasonality-based PS perform better both in terms of the homogeneity of the pooling-group and in terms of the accuracy of flood quantile estimates. For ungauged locations, a national-scale hydrological model has been used for the first time to quantify flood seasonality. Results show that in 75% of the tested locations the seasonality-based PS provide an improvement in the accuracy of the flood quantile estimates. The remaining 25% were located in highly urbanized, groundwater-dependent catchments. The promising results support the use of large-scale hydrological models to complement traditional methods for estimating design floods.
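
    Flood seasonality for a catchment is usually summarized with circular statistics on the dates of its annual maximum floods, and pooling-group candidates can then be ranked by distance in that seasonality space. The sketch below illustrates the idea under simple assumptions (Euclidean distance between mean-date vectors, invented day-of-year samples); it is not the FEH or the paper's exact similarity measure.

```python
import numpy as np

def seasonality_vector(days_of_year):
    """Mean (x, y) vector of flood dates mapped onto the unit circle; its length
    measures how concentrated the flood season is."""
    theta = 2 * np.pi * np.asarray(days_of_year, float) / 365.25
    return np.array([np.cos(theta).mean(), np.sin(theta).mean()])

def seasonality_distance(dates_a, dates_b):
    """Distance between two catchments in seasonality space (smaller = more similar)."""
    return np.linalg.norm(seasonality_vector(dates_a) - seasonality_vector(dates_b))

# Invented annual-maximum dates: a spring (snowmelt) catchment vs. a winter-rain catchment.
spring_floods = [95, 102, 88, 110, 99]
winter_floods = [350, 10, 355, 20, 5]
print(seasonality_distance(spring_floods, winter_floods))   # large distance -> poor pooling partners
```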

  17. Paleohydrology of flash floods in small desert watersheds in western Arizona

    NASA Astrophysics Data System (ADS)

    House, P. Kyle; Baker, Victor R.

    2001-06-01

    In this study, geological, historical, and meteorological data were combined to produce a regional chronology of flood magnitude and frequency in nine small basins (7-70 km2). The chronology spans more than 1000 years and demonstrates that detailed records of flood magnitude and frequency can be compiled in arid regions with little to no conventional hydrologic information. The recent (i.e., post-1950) flood history was evaluated by comparing a 50-year series of aerial photographs with precipitation data, ages of flood-transported beer cans, anthropogenic horizons in flood sediments, postbomb 14C dates on flotsam, and anecdotal accounts. Stratigraphic analysis of paleoflood deposits extended the regional flood record in time, and associated flood magnitudes were determined by incorporating relict high-water evidence into a hydraulic model. The results reveal a general consistency among the magnitudes of the largest floods in the historical and the paleoflood records and indicate that the magnitudes and relative frequencies of actual large floods are at variance with "100-year" flood magnitudes predicted by regional flood frequency models. This suggests that the predictive equations may not be appropriate for regulatory, management, or design purposes in the absence of additional, real data on flooding. Augmenting conventional approaches to regional flood magnitude and frequency analysis with real information derived from the alternative methods described here is a viable approach to improving assessments of regional flood characteristics in sparsely gaged desert areas.

  18. One year water chemistry monitoring of the flooding of the Meirama open pit (NW Spain)

    NASA Astrophysics Data System (ADS)

    Delgado, J.; Juncosa, R.; Vázquez, A.; Fernández-Bogo, S.

    2009-04-01

    In December 2007, after 30 years of operations, the Meirama mine finished the extraction of brown lignite. The flooding of the open pit started in April 2008 and is leading to the formation of a large mining lake (~2 km2 in surface area and up to 180 m deep) fed by surface water (river and rain water) and groundwater. Since the beginning of the flooding, lake waters have been sampled weekly and analyzed for temperature, pH, redox potential, EC, TDS, TSS, DO, DIC, DOC, turbidity and alkalinity/acidity, as well as nearly 40 inorganic chemical components. Stable water isotopes (deuterium and oxygen) are also being recorded. In order to better understand the dynamic chemical evolution of the lake waters, the chemical characteristics of rain water, a series of lake tributaries and groundwaters are also being measured. Since the beginning of the flooding process, the chemical quality of the lake water has undergone an interesting evolution that responds to a variety of circumstances. The silicic geologic substratum of the catchment means that both ground and surface waters have rather low alkalinity. Moreover, the presence of disseminated sulfides (mainly pyrite) within the schistose materials of the mine slopes and internal rock dumps contributes a significant acidic load. From April to October 2008, the lake received only rain and ground waters. Since the beginning of October, a significant volume of surface water has been diverted into the mine void. Taking pH as an indicator, the initial water body was rather acidic (pH ~3) and was progressively amended with additions of lime to reach an upper value of ~8 by late August. The reduction of the lime addition, up to its elimination in December, has led to progressive acidification of the lake. At present, an instrumented floating deck is being deployed in the lake. This device will serve as a base point where it is planned to locate a series of

  19. Urban flood risk mitigation: from vulnerability assessment to resilient city

    NASA Astrophysics Data System (ADS)

    Serre, D.; Barroca, B.

    2009-04-01

    some research activities have been undertaken, there are no specific methods and tools to assess flood vulnerability at the scale of the city. Indeed, a review of the literature yields some vulnerability indicators and a few Geographic Information System (GIS) tools, but indicators and GIS tools are generally not developed specifically at the city scale: a regional scale is often used. Analyzing vulnerability at the city scale requires more accurate and formalized indicators and GIS tools. The second limit of existing GIS tools is temporal: even if vulnerability can be assessed and localized through GIS, such tools cannot assist city managers in their decisions to recover efficiently after a severe flood event. Because of these scale and temporal limits, the methods and tools available to assess urban vulnerability need substantial improvement. Taking all these considerations and limits into account, our research focuses on: • vulnerability indicator design; • recovery scenario design; • GIS for city vulnerability assessment and recovery scenarios. Regarding vulnerability indicators, the goal is to design a set of indicators for city sub-systems. Sub-systems are understood as high-value assets and as complex, interdependent infrastructure networks (i.e. power supplies, communications, water, transport, etc.). The infrastructure networks are critical for the continuity of economic activities as well as for people's basic living needs. Their availability is also required for fast and effective recovery after flood disasters. The severity of flood damage therefore largely depends on the degree to which both high-value assets and critical urban infrastructure are affected, either directly or indirectly. To address the challenge of designing indicators, a functional model of the city system (and sub-systems) has to be built to analyze the system response to flood solicitation. A coherent and efficient set of vulnerability indicators can then be built up. With such methods city stakeholders

  20. Flood Extent Mapping Using Dual-Polarimetric SENTINEL-1 Synthetic Aperture Radar Imagery

    NASA Astrophysics Data System (ADS)

    Jo, M.-J.; Osmanoglu, B.; Zhang, B.; Wdowinski, S.

    2018-04-01

    Rapid generation of synthetic aperture radar (SAR) based flood extent maps provides valuable data for disaster response efforts thanks to the cloud-penetrating ability of microwaves. We present a method using dual-polarimetric SAR imagery acquired by the Sentinel-1a/b satellites. A false-colour map is generated using pre- and post-disaster imagery, allowing operators to distinguish between standing water that existed before the flood and recently flooded areas. The method works best in areas of standing water and provides mixed results in urban areas. A flood depth map is also estimated using an external DEM. We will present the methodology and its estimated accuracy, as well as investigations into improving the response in urban areas.
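
    One common way to build such a pre-/post-event false-colour composite is to place the pre-event backscatter in the red channel and the post-event backscatter in the green and blue channels, so that newly flooded pixels (bright before, dark after) show up in red while permanent water stays dark. The sketch below follows that convention under stated assumptions (co-registered backscatter arrays in dB, an illustrative stretch range and water threshold); it is not the authors' exact recipe.

```python
import numpy as np

def false_colour_composite(pre_db, post_db, low=-25.0, high=0.0):
    """RGB composite: R = pre-event, G = B = post-event backscatter (dB), linearly stretched."""
    def stretch(band):
        return np.clip((band - low) / (high - low), 0.0, 1.0)
    return np.dstack([stretch(pre_db), stretch(post_db), stretch(post_db)])

# Toy 2x2 scene: permanent water (top-left), dry land, one newly flooded pixel (bottom-left).
pre  = np.array([[-24.0, -8.0], [-7.0, -9.0]])
post = np.array([[-23.0, -8.0], [-22.0, -9.0]])
rgb = false_colour_composite(pre, post)
newly_flooded = (pre > -15.0) & (post < -15.0)   # hypothetical open-water threshold in dB
print(rgb.shape)
print(newly_flooded)
```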

  1. Identification and characterization of miRNAs and targets in flax (Linum usitatissimum) under saline, alkaline, and saline-alkaline stresses.

    PubMed

    Yu, Ying; Wu, Guangwen; Yuan, Hongmei; Cheng, Lili; Zhao, Dongsheng; Huang, Wengong; Zhang, Shuquan; Zhang, Liguo; Chen, Hongyu; Zhang, Jian; Guan, Fengzhi

    2016-05-27

    MicroRNAs (miRNAs) play a critical role in responses to biotic and abiotic stress and have been characterized in a large number of plant species. Although flax (Linum usitatissimum L.) is one of the most important fiber and oil crops worldwide, no reports have been published describing flax miRNAs (Lus-miRNAs) induced in response to saline, alkaline, and saline-alkaline stresses. In this work, combined small RNA and degradome deep sequencing was used to analyze flax libraries constructed after alkaline-salt stress (AS2), neutral salt stress (NSS), alkaline stress (AS), and the non-stressed control (CK). From the CK, AS, AS2, and NSS libraries, a total of 118, 119, 122, and 120 known Lus-miRNAs and 233, 213, 211, and 212 novel Lus-miRNAs were isolated, respectively. After assessment of differential expression profiles, 17 known Lus-miRNAs and 36 novel Lus-miRNAs were selected and used to predict putative target genes. Gene ontology term enrichment analysis revealed target genes that were involved in responses to stimuli, including signaling and catalytic activity. Eight Lus-miRNAs were selected for analysis using qRT-PCR to confirm the accuracy and reliability of the miRNA-seq results. The qRT-PCR results showed that changes in stress-induced expression profiles of these miRNAs mirrored expression trends observed using miRNA-seq. Degradome sequencing and transcriptome profiling showed that expression of 29 miRNA-target pairs displayed inverse expression patterns under saline, alkaline, and saline-alkaline stresses. From the target prediction analysis, the gene targeted by miR398a codes for a copper/zinc superoxide dismutase, and miR530 has been shown to explicitly target WRKY family transcription factors, suggesting that these two miRNAs and their targets may be significantly involved in the saline, alkaline, and saline-alkaline stress response in flax. Identification and characterization of flax miRNAs, their target genes, functional annotations, and gene

  2. Posttranslational heterogeneity of bone alkaline phosphatase in metabolic bone disease.

    PubMed

    Langlois, M R; Delanghe, J R; Kaufman, J M; De Buyzere, M L; Van Hoecke, M J; Leroux-Roels, G G

    1994-09-01

    Bone alkaline phosphatase is a marker of osteoblast activity. In order to study the posttranslational modification (glycosylation) of bone alkaline phosphatase in bone disease, we investigated the relationship between the mass and catalytic activity of bone alkaline phosphatase in patients with osteoporosis and hyperthyroidism. Serum bone alkaline phosphatase activity was measured after lectin precipitation using the Iso-ALP test kit. The mass concentration of bone alkaline phosphatase was determined with an immunoradiometric assay (Tandem-R Ostase). In general, serum bone alkaline phosphatase mass and activity concentrations correlated well. The activity:mass ratio of bone alkaline phosphatase was low in hyperthyroidism. The activation energy of the reaction catalysed by bone alkaline phosphatase was high in osteoporosis and in hyperthyroidism. Experiments with neuraminidase digestion further demonstrated that the thermodynamic heterogeneity of bone alkaline phosphatase can be explained by different glycosylation of the enzyme.

  3. Urban flood return period assessment through rainfall-flood response modelling

    NASA Astrophysics Data System (ADS)

    Murla Tuyls, Damian; Thorndahl, Søren

    2017-04-01

    Intense rainfall can often cause severe floods, especially in urbanized areas with high population density or large impermeable areas. In this context, floods can have direct social, environmental and economic impacts. Traditionally, in the design of Urban Drainage Systems (UDS), the correlation between the return period (RP) of a given rainfall and the RP of its consequent flood has been assumed to be linear (e.g. DS/EN752 (2008)). However, this is not always the case. Complex UDS, where diverse hydraulic infrastructures are often found, increase the heterogeneity of the system response, which may alter this correlation. Consequently, the reliability of future urban planning, design and resilience against floods may also be affected by this assumption. In this study, an assessment of surface flood RP across rainfall RP has been carried out at Lystrup, an urbanized catchment area of 440 ha with 10,400 inhabitants located in Jutland (Denmark), which has been affected by several pluvial floods in recent years. A historical rainfall dataset covering the last 35 years, from two rain gauges located 2 and 10 km from the study area, was provided by the Danish Wastewater Pollution Committee and the Danish Meteorological Institute (DMI). The 25 most extreme rainfall events were selected through a two-step multi-criteria procedure, ensuring an adequate variability of rainfall, from extreme short-duration high-peak storms to moderate rainfall of longer duration. In addition, a coupled 1D/2D surface and network UDS model of the catchment area, developed in an integrated MIKE URBAN and MIKE Flood model (DHI 2014) considering both permeable and impermeable areas, in combination with a DTM (2x2 m resolution), has been used to study and assess flood RP in detail. Results show an ambiguous relation between rainfall RP and flood response. Local flood level, flood area and volume RP estimates should therefore not be neglected in

  4. The Incidence of Posttraumatic Stress Disorder After Floods: A Meta-Analysis.

    PubMed

    Chen, Long; Liu, Aizhong

    2015-06-01

    This study analyzes the incidence of posttraumatic stress disorder (PTSD) among flood victims, between different flood intensities, and between different time points after a flood. A search of several electronic literature databases was conducted to collect data on the incidence of PTSD after a flood. The Loney criteria for research quality were used to evaluate the quality of the selected studies. The combined incidence of PTSD was estimated using the Freeman-Tukey double arcsine transformation method. Subgroup analyses were conducted on different trauma intensities and different time points after a flood. Sensitivity analysis was performed to evaluate the impact of research quality. Fourteen articles were included in this meta-analysis, comprising a total of 40 600 flood victims, of whom 3862 were diagnosed with PTSD. The combined incidence of PTSD was 15.74%. The subgroup analyses showed that the incidence of PTSD in victims who experienced severe and moderate flood intensity was higher than that in victims who experienced mild flood intensity. The incidence of PTSD was lower at 6 or more months after a flood (11.45%) than within 6 months (16.01%) of a flood. In conclusion, the difference in the incidence of PTSD between floods of different trauma intensities was statistically significant.
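
    The Freeman-Tukey double arcsine transform stabilizes the variance of each study's proportion before pooling. The sketch below shows a fixed-effect pooling under that transform with Miller's harmonic-mean back-transform; the study counts are invented, and a real meta-analysis of this kind would typically also fit a random-effects model and report confidence intervals.

```python
import numpy as np

def ft_transform(events, n):
    """Freeman-Tukey double arcsine of events/n; approximate variance 1/(n + 0.5)."""
    events, n = np.asarray(events, float), np.asarray(n, float)
    t = np.arcsin(np.sqrt(events / (n + 1))) + np.arcsin(np.sqrt((events + 1) / (n + 1)))
    return t, 1.0 / (n + 0.5)

def ft_back_transform(t, n_harmonic):
    """Miller (1978) back-transform to a proportion, using the harmonic mean sample size."""
    s = np.sin(t)
    return 0.5 * (1 - np.sign(np.cos(t)) * np.sqrt(1 - (s + (s - 1 / s) / n_harmonic) ** 2))

# Invented studies: (PTSD cases, flood victims surveyed).
events = np.array([120, 310, 45])
n = np.array([800, 2100, 350])
t, var = ft_transform(events, n)
weights = 1 / var
t_pooled = np.sum(weights * t) / np.sum(weights)     # inverse-variance (fixed-effect) pooling
n_harmonic = len(n) / np.sum(1 / n)
print(ft_back_transform(t_pooled, n_harmonic))       # pooled incidence on the proportion scale
```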

  5. Floods in 2002 and 2013: comparing flood warnings and emergency measures from the perspective of affected parties

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Pech, Ina; Schröter, Kai; Müller, Meike; Thieken, Annegret

    2016-04-01

    Early warning is essential for protecting people and mitigating damage in case of flood events. However, early warning is only helpful if the flood-endangered parties are reached by the warning and if they know how to react effectively. Finding suitable methods for communicating helpful warnings to the "last mile" remains a challenge, and not much information is available. Surveys were undertaken after the August 2002 and the June 2013 floods in Germany, asking affected private households and companies about the warnings they received and the emergency measures they undertook. Results show that in 2002 early warning did not work well: in too many areas warnings came too late or were too imprecise, and many people (27%) and companies (45%) did not receive a flood warning. Afterwards, the warning systems were significantly improved, so that in 2013 only a small share of the affected people (7%) and companies (7%) was not reached by any warning. Additionally, private households and companies were hardly aware of the flood risk in the Elbe catchment before 2002, mainly due to a lack of flood experience. For instance, in 2002 only 14% of private households clearly knew how to protect themselves and their assets when the warning reached them; in 2013 this fraction was 46%. Although the share of companies which had an emergency plan in place had increased from 10% in 2002 to 26% in 2013, and the share of those conducting regular emergency exercises had increased from 4% to 13%, there is still plenty of room for improvement. Therefore, integrated early warning systems, from monitoring through to the reaction of the affected parties, as well as effective risk and emergency communication need continuous further improvement to protect people and mitigate residual risks in case of floods.

  6. Confidence intervals for expected moments algorithm flood quantile estimates

    USGS Publications Warehouse

    Cohn, Timothy A.; Lane, William L.; Stedinger, Jery R.

    2001-01-01

    Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient “weighting” procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed‐form method has been available for quantifying the uncertainty of EMA‐based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood‐quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25‐ to 100‐year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.
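
    The Bulletin 17B structure that EMA retains is a method-of-moments fit of the Pearson Type 3 distribution to the base-10 logarithms of the annual peaks. The sketch below shows only that baseline fit for a purely systematic record (invented data, station skew only, no regional skew weighting, no historical-period censoring and no EMA iteration), to make the quantile calculation concrete.

```python
import numpy as np
from scipy import stats

def lp3_quantile(annual_peaks, aep):
    """Log-Pearson Type 3 quantile for a given annual exceedance probability
    (e.g. aep=0.01 for the '100-year' flood), via moments of the log10 peaks."""
    logs = np.log10(np.asarray(annual_peaks, float))
    mean, std = logs.mean(), logs.std(ddof=1)
    skew = stats.skew(logs, bias=False)            # station skew only
    k = stats.pearson3.ppf(1 - aep, skew)          # standardized Pearson III frequency factor
    return 10 ** (mean + k * std)

# Invented 30-year record of annual peak flows.
rng = np.random.default_rng(0)
peaks = 10 ** rng.normal(3.5, 0.25, size=30)
print(lp3_quantile(peaks, aep=0.01))
```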

  7. Pleistocene glaciers, lakes, and floods in north-central Washington State

    USGS Publications Warehouse

    Waitt, Richard B.; Haugerud, Ralph A.; Kelsey, Harvey M.

    2017-01-01

    The Methow, Chelan, Wenatchee, and other terrane blocks accreted in late Mesozoic to Eocene times. Methow valley is excavated in an exotic terrane of folded Mesozoic sedimentary and volcanic rocks faulted between crystalline blocks. Repeated floods of Columbia River Basalt about 16 Ma drowned a backarc basin to the southeast. Cirques, aretes, and U-shaped hanging troughs brand the Methow, Skagit, and Chelan headwaters. The Late Wisconsin Cordilleran icesheet beveled the alpine topography and deposited drift. Cordilleran ice flowed into the heads of Methow tributaries and overflowed from Skagit tributaries to greatly augment Chelan trough's glacier. Joined Okanogan and Methow ice flowed down Columbia valley and up lower Chelan trough. This tongue met the icesheet tongue flowing southeast down Chelan valley. Successively lower ice-marginal channels and kame terraces show that the icesheet withered away largely by downwasting. Immense late Wisconsin floods from glacial Lake Missoula occasionally swept the Chelan-Vantage reach of Columbia valley by different routes. The earliest debacles, nearly 19,000 cal yr BP (by radiocarbon methods), raged 335 m deep down the Columbia and built high Pangborn bar at Wenatchee. As Cordilleran ice blocked the northwest of Columbia valley, several giant floods descended Moses Coulee and backflooded up the Columbia. As advancing ice then blocked Moses Coulee, Grand Coulee to Quincy basin became the westmost floodway. From Quincy basin many Missoula floods backflowed 50 km upvalley past Wenatchee 18,000 to 15,500 years ago. Receding ice dammed glacial Lake Columbia centuries more--till it burst about 15,000 years ago. After Glacier Peak ashfall about 13,600 years ago, smaller great flood(s) swept down the Columbia from glacial Lake Kootenay in British Columbia. A cache of huge fluted Clovis points had been laid atop Pangborn bar (East Wenatchee) after the Glacier Peak ashfall. Clovis people came two and a half millennia after the last

  8. Mapping Coastal Flood Zones for the National Flood Insurance Program

    NASA Astrophysics Data System (ADS)

    Carlton, D.; Cook, C. L.; Weber, J.

    2004-12-01

    The National Flood Insurance Program (NFIP) was created by Congress in 1968 and significantly amended in 1973 to reduce loss of life and property caused by flooding, reduce disaster relief costs caused by flooding, and make Federally backed flood insurance available to property owners. These goals were to be achieved by requiring buildings to be built to resist flood damage, guiding construction away from flood hazards, and transferring the cost of flood losses from taxpayers to policyholders. Areas subject to flood hazards were defined as those areas that have a probability greater than 1 percent of being inundated in any given year. Currently over 19,000 communities participate in the NFIP, many of them coastal communities subject to flooding from tides, storm surge, waves, or tsunamis. The mapping of coastal hazard areas began in the early 1970s and has been evolving ever since. At first only high tides and storm surge were considered in determining the hazardous areas. Then, after waves caused significant storm damage to structures outside of the mapped hazard areas, wave hazards were also considered. For many years FEMA has had Guidelines and Specifications for mapping coastal hazards for the East Coast and the Gulf Coast. In September 2003 a study was begun to develop similar Guidelines and Specifications for the Pacific Coast. Draft Guidelines and Specifications will be delivered to FEMA by September 30, 2004. During the study, tsunamis were identified as a potential source of a 1 percent flood event on the West Coast. To better understand the analytical results and develop adequate techniques to estimate the magnitude of a tsunami with a 1 percent probability of being equaled or exceeded in any year, a pilot study has begun at Seaside, Oregon. Both the onshore velocity and the resulting wave runup are critical functions for FEMA to understand and potentially map. The pilot study is a cooperative venture between NOAA and USGS that is partially funded by both

  9. Field Testing of Energy-Efficient Flood-Damage-Resistant Residential Envelope Systems Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aglan, H.

    2005-08-04

    The primary purpose of the project was to identify materials and methods that will make the envelope of a house flood damage resistant. Flood damage resistant materials and systems are intended to be used to repair houses subsequent to flooding. This project was also intended to develop methods of restoring the envelopes of houses that have been flooded but are repairable and may be subject to future flooding. Then if the house floods again, damage will not be as extensive as in previous flood events and restoration costs and efforts will be minimized. The purpose of the first pair of field tests was to establish a baseline for typical current residential construction practice. The first test modules used materials and systems that were commonly found in residential envelopes throughout the U.S. The purpose of the second pair of field tests was to begin evaluating potential residential envelope materials and systems that were projected to be more flood-damage resistant and restorable than the conventional materials and systems tested in the first pair of tests. The purpose of testing the third slab-on-grade module was to attempt to dry flood proof the module (no floodwater within the structure). If the module could be sealed well enough to prevent water from entering, then this would be an effective method of making the interior materials and systems flood damage resistant. The third crawl space module was tested in the same manner as the previous modules and provided an opportunity to do flood tests of additional residential materials and systems. Another purpose of the project was to develop the methodology to collect representative, measured, reproducible (i.e. scientific) data on how various residential materials and systems respond to flooding conditions so that future recommendations for repairing flood damaged houses could be based on scientific data. An additional benefit of collecting this data is that it will be used in the development of a

  10. Analyzing Future Flooding under Climate Change Scenario using CMIP5 Streamflow Data

    NASA Astrophysics Data System (ADS)

    Nyaupane, Narayan; Parajuli, Ranjan; Kalra, Ajay

    2017-12-01

    Flooding is the most severe and costliest natural hazard in the US, and the effects of climate change have intensified it in recent years. Flood prevention practices, along with a proper understanding of flooding events, can mitigate the risk of such hazards. Floodplain mapping is one technique for quantifying the severity of flooding. Carson City, an agricultural area in the Nevada desert, has experienced peak floods in recent years. To find the underlying probability distribution for the area, the latest Coupled Model Intercomparison Project (CMIP5) streamflow data for the Carson River were analyzed against 27 different statistical distributions. The best-fitted distribution was used to forecast the 100-year flood (design flood). Data from 1950-2099, derived from 31 models and a total of 97 projections, were used to predict future streamflow. The delta change method was adopted to quantify the magnitude of future (2050-2099) floods. To determine the extent of flooding, three scenarios, (i) the historic design flood, (ii) the 500-year flood, and (iii) the future 100-year flood, were routed in a HEC-RAS model prepared using available terrain data. Some of the climate projections show an extreme increase in the future design flood: the future design flood could exceed the historic 500-year flood, and the extent of flooding could go beyond that of the historic flood with a 0.2% annual probability. This study suggests an approach to quantify future floods and floodplains using climate model projections, and would provide helpful information to facility managers, design engineers and stakeholders.
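
    The delta change method scales an observed design flood by the relative change between a projection's future and historical simulations. A minimal sketch under illustrative assumptions (a single projection, a simple high quantile as the change signal, invented numbers) follows; the study itself works with 97 projections and full frequency fits.

```python
import numpy as np

def delta_change_factor(hist_sim, fut_sim, q=0.99):
    """Ratio of a high quantile of simulated future peaks to the same quantile of
    simulated historical peaks (one factor per climate projection)."""
    return np.quantile(fut_sim, q) / np.quantile(hist_sim, q)

# Invented data: an observed design flood plus one projection's historical and future peaks.
observed_design_flood = 850.0                      # e.g. fitted 100-year flood, m^3/s
rng = np.random.default_rng(1)
hist_peaks = rng.gumbel(300, 80, size=50)          # simulated historical annual peaks
fut_peaks = rng.gumbel(360, 110, size=50)          # simulated 2050-2099 annual peaks
factor = delta_change_factor(hist_peaks, fut_peaks)
print(factor, observed_design_flood * factor)      # scaled future design flood estimate
```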

  11. Alkaline polymer electrolyte membranes for fuel cell applications.

    PubMed

    Wang, Yan-Jie; Qiao, Jinli; Baker, Ryan; Zhang, Jiujun

    2013-07-07

    In this review, we examine the most recent progress and research trends in the area of alkaline polymer electrolyte membrane (PEM) development in terms of material selection, synthesis, characterization, and theoretical approach, as well as their fabrication into alkaline PEM-based membrane electrode assemblies (MEAs) and the corresponding performance/durability in alkaline polymer electrolyte membrane fuel cells (PEMFCs). Respective advantages and challenges are also reviewed. To overcome challenges hindering alkaline PEM technology advancement and commercialization, several research directions are then proposed.

  12. Spectroflourometric and spectrophotometric methods for the determination of sitagliptin in binary mixture with metformin and ternary mixture with metformin and sitagliptin alkaline degradation product.

    PubMed

    El-Bagary, Ramzia I; Elkady, Ehab F; Ayoub, Bassam M

    2011-03-01

    Simple, accurate and precise spectroflourometric and spectrophotometric methods have been developed and validated for the determination of sitagliptin phosphate monohydrate (STG) and metformin HCl (MET). Zero-order, first-derivative and ratio-derivative spectrophotometric methods as well as flourometric methods have been developed. The zero-order spectrophotometric method was used for the determination of STG in the range of 50-300 μg mL(-1). The first-derivative spectrophotometric method was used for the determination of MET in the range of 2-12 μg mL(-1) and STG in the range of 50-300 μg mL(-1) by measuring the peak amplitude at 246.5 nm and 275 nm, respectively. The first derivative of ratio spectra spectrophotometric method used the peak amplitudes at 232 nm and 239 nm for the determination of MET in the range of 2-12 μg mL(-1). The flourometric method was used for the determination of STG in the range of 0.25-110 μg mL(-1). The proposed methods were used to determine each drug in a binary mixture with metformin and in a ternary mixture with metformin and the sitagliptin alkaline degradation product obtained after alkaline hydrolysis of sitagliptin. The results were statistically compared using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to the analysis of pharmaceutical formulations and proved to be specific and accurate for the quality control of the cited drugs in pharmaceutical dosage forms.
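
    First-derivative spectrophotometry reads the amplitude of dA/dλ at a wavelength where the interfering component contributes no derivative signal, and converts that amplitude to concentration through a linear calibration. The sketch below illustrates only that generic computation on synthetic Gaussian-shaped spectra; the wavelengths, band shape and concentrations are invented and are not the paper's calibration data.

```python
import numpy as np

def first_derivative_amplitude(wavelengths_nm, absorbance, read_at_nm):
    """Numerical first derivative of an absorbance spectrum, read at one wavelength."""
    dA = np.gradient(absorbance, wavelengths_nm)
    return np.interp(read_at_nm, wavelengths_nm, dA)

# Synthetic calibration: Gaussian band whose height scales linearly with concentration.
wl = np.linspace(220, 320, 501)
conc = np.array([2, 4, 6, 8, 10, 12], dtype=float)                  # ug/mL (illustrative)
spectra = [c * 0.05 * np.exp(-((wl - 246.5) / 12) ** 2) for c in conc]
amps = [first_derivative_amplitude(wl, a, read_at_nm=252.0) for a in spectra]

slope, intercept = np.polyfit(amps, conc, 1)                        # calibration line
unknown = 0.35 * np.exp(-((wl - 246.5) / 12) ** 2)                  # synthetic "unknown" spectrum
print(slope * first_derivative_amplitude(wl, unknown, 252.0) + intercept)   # ~7 ug/mL
```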

  13. Spectroflourometric and Spectrophotometric Methods for the Determination of Sitagliptin in Binary Mixture with Metformin and Ternary Mixture with Metformin and Sitagliptin Alkaline Degradation Product

    PubMed Central

    El-Bagary, Ramzia I.; Elkady, Ehab F.; Ayoub, Bassam M.

    2011-01-01

    Simple, accurate and precise spectroflourometric and spectrophotometric methods have been developed and validated for the determination of sitagliptin phosphate monohydrate (STG) and metformin HCl (MET). Zero-order, first-derivative and ratio-derivative spectrophotometric methods as well as flourometric methods have been developed. The zero-order spectrophotometric method was used for the determination of STG in the range of 50-300 μg mL-1. The first-derivative spectrophotometric method was used for the determination of MET in the range of 2–12 μg mL-1 and STG in the range of 50-300 μg mL-1 by measuring the peak amplitude at 246.5 nm and 275 nm, respectively. The first derivative of ratio spectra spectrophotometric method used the peak amplitudes at 232 nm and 239 nm for the determination of MET in the range of 2–12 μg mL-1. The flourometric method was used for the determination of STG in the range of 0.25-110 μg mL-1. The proposed methods were used to determine each drug in a binary mixture with metformin and in a ternary mixture with metformin and the sitagliptin alkaline degradation product obtained after alkaline hydrolysis of sitagliptin. The results were statistically compared using one-way analysis of variance (ANOVA). The developed methods were satisfactorily applied to the analysis of pharmaceutical formulations and proved to be specific and accurate for the quality control of the cited drugs in pharmaceutical dosage forms. PMID:23675222

  14. Evidence of floods on the Potomac River from anatomical abnormalities in the wood of flood-plain trees

    USGS Publications Warehouse

    Yanosky, Thomas M.

    1983-01-01

    Ash trees along the Potomac River flood plain near Washington, D.C., were studied to determine changes in wood anatomy related to flood damage, and anomalous growth was compared to flood records for April 15 to August 31, 1930-79. Collectively, anatomical evidence was detected for 33 of the 34 growing-season floods during the study period. Evidence of 12 floods prior to 1930 was also noted, including catastrophic ones in 1889 and 1924. Trees damaged after the transition from earlywood to latewood growth typically formed 'flood rings' of enlarged vessels within the latewood zone. Trees damaged near the beginning of the growth year developed flood rings within, or contiguous with, the earlywood. Both patterns are assumed to have developed when flood-damaged trees produced a second crop of leaves. Trees damaged by high-magnitude floods developed well-formed flood rings along the entire height and around the entire circumference of the stem. Small floods were generally associated with diffuse or discontinuous anomalies restricted to stem apices. Frequency of flood rings was positively related to flood magnitude, and the time of flood generation during the tree-growth season was estimated from the radial position of anomalous growth relative to annual ring width. Reconstructing tree heights in a year of flood-ring formation gives a minimum stage estimate along local stream reaches. Some trees provided evidence of numerous floods. Those with the greatest number of flood rings grew on frequently flooded surfaces subject to flood-flow velocities of at least 1 m/s, and more typically greater than 2 m/s. Tree size, more than age, was related to flood-ring formation. Trees kept small by frequent flood damage had more flood rings than taller trees of comparable age. (USGS)

  15. Iowa Flood Information System

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.; Goska, R.; Mantilla, R.; Weber, L. J.; Young, N.

    2011-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts (both short-term and seasonal), flood-related data, information, and interactive visualizations for communities in Iowa. The key element of the system's architecture is the notion of community. Locations of the communities, those near streams and rivers, define basin boundaries. The IFIS provides community-centric watershed and river characteristics, weather (rainfall) conditions, and streamflow data and visualization tools. Interactive interfaces allow access to inundation maps for different stage and return period values, and flooding scenarios with contributions from multiple rivers. Real-time and historical data of water levels, gauge heights, and rainfall conditions are available in the IFIS by streaming data from automated IFC bridge sensors, USGS stream gauges, NEXRAD radars, and NWS forecasts. Simple 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information in the IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms, including tablets and mobile devices. The IFIS includes a rainfall-runoff forecast model that provides a five-day flood risk estimate for around 500 communities in Iowa. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tools and detail. The river view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities

  16. Lessons From the Largest Historic Floods Documented by the U.S. Geological Survey

    NASA Astrophysics Data System (ADS)

    Costa, J. E.

    2003-12-01

    A recent controversy over the flood risk downstream from a USGS streamgaging station in southern California that recorded a large debris flow led to the decision to closely examine a sample of the largest floods documented in the US. Twenty-nine floods that define the envelope curve of the largest rainfall-runoff floods were examined in detail, including field visits. These floods have a profound impact on local, regional, and national interpretations of potential peak discharges and flood risk. These 29 floods occurred throughout the US, from the northern Chesapeake Bay in Maryland to Kauai, Hawaii, and over time from 1935 to 1978. Methods used to compute peak discharges were slope-area (21/29), culvert computations (2/29), measurements lost or not available for study (2/29), bridge contraction, culvert flow, and flow over road (1/29), rating curve extension (1/29), current meter measurement (1/29), and rating curve and current meter measurement (1/29). While field methods and tools have improved significantly over the last 70 years (e.g. total stations, GPS, GIS, hydroacoustics, digital plotters, and computer programs like SAC and CAP), the primary methods of hydraulic analysis for indirect measurements of outstanding floods have not changed: today flow is still assumed to be 1-D and gradually varied. Unsteady or multi-dimensional flow models are rarely if ever used to determine peak discharges. Problems identified in this sample of 29 floods include debris flows misidentified as water floods, small drainage areas determined from small-scale maps and mislocated sites, high-water marks set by transient hydraulic phenomena, the possibility of disconnected flow surfaces, scour assumptions in sand channels, poor site selection, incorrect approach angle for road overflow, and missing or lost records. Each published flood magnitude was checked by applying modern computer models with original field data, or by re-calculating computations. Four of 29 floods in this sample were

  17. Contribution of an exposure indicator to better anticipate damages with the AIGA flood warning method: a case study in the South of France

    NASA Astrophysics Data System (ADS)

    Saint-Martin, Clotilde; Fouchier, Catherine; Douvinet, Johnny; Javelle, Pierre; Vinet, Freddy

    2016-04-01

    On 3 October 2015, heavy localized precipitation occurred in southeastern France, leading to major flash floods on the Mediterranean coast. The severity of those floods caused 20 fatalities and important damage in almost 50 municipalities in the French administrative area of Alpes-Maritimes. The local recording rain gauges showed how fast the event developed: 156 mm of rain were recorded in Mandelieu-la-Napoule and 145 mm in Cannes within 2 hours. As the affected rivers are not monitored, no anticipation was possible for the authorities in charge of risk management. Forecasting floods in such cases is indeed complex because of the small size of the watersheds, which implies a short catchment response time. In order to cope with the need to issue flood warnings on unmonitored small catchments, Irstea and Météo-France have developed an alternative warning system for ungauged basins called the AIGA method. AIGA is a flood warning system based on a simple distributed hydrological model run at a 1 km² resolution using real-time radar rainfall information (Javelle, Demargne, Defrance, Pansu, & Arnaud, 2014). The flood warnings, produced every 15 minutes, result from the comparison of the real-time runoff data produced by the model with statistical runoff values. AIGA runs in real time in the South of France within the RHYTMME project (https://rhytmme.irstea.fr/), and work is ongoing to offer a similar service for the whole French territory. More than 200 impacts of the 3 October floods have been located using media, social networks and fieldwork. The first comparisons between these impacts and the AIGA warning levels computed for this event show several discrepancies. However, these discrepancies appear to be explained by land use. An indicator of the exposure of territories to flooding has thus been created to weight the levels of the AIGA hydrological warnings with the land use of the area surrounding the streams

  18. Computation of backwater and discharge at width constrictions of heavily vegetated flood plains

    USGS Publications Warehouse

    Schneider, V.R.; Board, J.W.; Colson, B.E.; Lee, F.N.; Druffel, Leroy

    1977-01-01

    The U.S. Geological Survey cooperated with the Federal Highway Administration and the State Highway Departments of Mississippi, Alabama, and Louisiana to develop a proposed method for computing backwater and discharge at width constrictions of heavily vegetated flood plains. Data were collected at 20 single-opening sites for 31 floods. Flood-plain width varied from 4 to 14 times the bridge-opening width. The recurrence intervals of peak discharge ranged from a 2-year flood to greater than a 100-year flood, with a median interval of 6 years. Measured backwater ranged from 0.39 to 3.16 feet. Backwater computed by the present standard Geological Survey method averaged 29 percent less than the measured values, and that computed by the currently used Federal Highway Administration method averaged 47 percent less than the measured values. Discharge computed by the Survey method averaged 21 percent more than the measured values. Analysis of the data showed that the flood-plain widths and the Manning's roughness coefficients are larger than those used to develop the standard methods. A method to compute backwater and discharge more accurately was developed. The difference between the contracted and natural water-surface profiles computed using standard step-backwater procedures is defined as backwater. The energy loss term in the step-backwater procedure is computed as the product of the geometric mean of the energy slopes and a flow distance in the reach derived from potential flow theory. The mean error was 1 percent when using the proposed method for computing backwater and 3 percent for computing discharge. (Woodard-USGS)
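
    The friction-loss rule described here is straightforward to state in code: the loss over a reach is the geometric mean of the friction slopes at its two ends times the flow distance. The sketch below uses Manning's equation (US customary units) for the friction slopes; the cross-section numbers are invented, and a full standard-step computation would iterate on the unknown water-surface elevation rather than stop at this single term.

```python
import math

def friction_slope(Q, n, area, hyd_radius):
    """Manning friction slope, S_f = (Q n / (1.486 A R^(2/3)))^2, US customary units."""
    return (Q * n / (1.486 * area * hyd_radius ** (2.0 / 3.0))) ** 2

def reach_friction_loss(sf_upstream, sf_downstream, flow_length_ft):
    """Head loss over the reach: geometric mean of the two friction slopes times flow distance."""
    return math.sqrt(sf_upstream * sf_downstream) * flow_length_ft

# Invented sections: vegetated flood plain (high n) upstream, bridge opening downstream.
sf1 = friction_slope(Q=12000.0, n=0.12, area=5200.0, hyd_radius=6.5)
sf2 = friction_slope(Q=12000.0, n=0.045, area=1800.0, hyd_radius=9.0)
print(reach_friction_loss(sf1, sf2, flow_length_ft=450.0))   # friction head loss, feet
```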

  19. Epic Flooding in Georgia, 2009

    USGS Publications Warehouse

    Gotvald, Anthony J.; McCallum, Brian E.

    2010-01-01

    Metropolitan Atlanta, September 2009 floods: The epic floods experienced in the Atlanta area in September 2009 were extremely rare. Eighteen streamgages in the Metropolitan Atlanta area had flood magnitudes much greater than the estimated 0.2-percent (500-year) annual exceedance probability. The Federal Emergency Management Agency (FEMA) reported that 23 counties in Georgia were declared disaster areas due to this flood and that 16,981 homes and 3,482 businesses were affected by floodwaters. Ten lives were lost in the flood. The total estimated damages exceed $193 million (H.E. Longenecker, Federal Emergency Management Agency, written commun., November 2009). On Sweetwater Creek near Austell, Ga., just north of Interstate 20, the peak stage was more than 6 feet higher than the estimated peak stage of the 0.2-percent (500-year) flood. Flood magnitudes in Cobb County on Sweetwater, Butler, and Powder Springs Creeks greatly exceeded the estimated 0.2-percent (500-year) floods for these streams. In Douglas County, the Dog River at Ga. Highway 5 near Fairplay had a peak stage nearly 20 feet higher than the estimated peak stage of the 0.2-percent (500-year) flood. On the Chattahoochee River, the U.S. Geological Survey (USGS) gage at Vinings reached the highest level recorded in the past 81 years. Gwinnett, De Kalb, Fulton, and Rockdale Counties also had record flooding. South Georgia, March and April 2009 floods: The March and April 2009 floods in South Georgia were smaller in magnitude than the September floods but still caused significant damage. No lives were lost in this flood. Approximately $60 million in public infrastructure damage occurred to roads, culverts, bridges, and a water treatment facility (Joseph T. McKinney, Federal Emergency Management Agency, written commun., July 2009). Flow at the Satilla River near Waycross exceeded the 0.5-percent (200-year) flood. Flows at seven other stations in South Georgia exceeded the 1-percent (100-year) flood.

  20. A methodology for urban flood resilience assessment

    NASA Astrophysics Data System (ADS)

    Lhomme, Serge; Serre, Damien; Diab, Youssef; Laganier, Richard

    2010-05-01

    , multiple networks that innervate the city are particularly sensitive to flooding, through their structures and geographic constraints. Because societal functions are highly dependent on networked systems and the operability of these systems can be vulnerable to disasters, there is a need to understand how networked systems are resilient. That is why, considering that networks can be regarded as the "flood gateway" [Lhomme et al., 2009], we focus on the resilience assessment of these critical networks before assessing urban resilience as a whole. The first part of this paper introduces the resilience concept in order to clarify its importance for managing flood risk and the need to assess this resilience. The second part presents the use of safety methods to model network system dysfunctions during floods and then to produce resilience indicators. Finally, it presents the use of graph theory to assess the adaptive capacity of these networks. This research is a first step toward the development of a GIS tool to optimize preparedness and recovery after a flood event.

  1. Applications of Experimental Suomi-NPP VIIRS Flood Inundation Maps in Operational Flood Forecasting

    NASA Astrophysics Data System (ADS)

    Deweese, M. M.

    2017-12-01

    Flooding is the most costly natural disaster across the globe. In 2016 flooding caused more fatalities than any other natural disaster in the United States. The U.S. National Weather Service (NWS) is mandated to forecast rivers for the protection of life and property and the enhancement of the national economy. Since 2014, the NWS North Central River Forecast Center has utilized experimental near-real-time flood mapping products from the JPSS Suomi-NPP VIIRS satellite. These products have been demonstrated to provide reliable and high-value information for forecasters during ice-jam and snowmelt flooding in data-sparse regions of the northern plains. In addition, they have proved valuable for rainfall-induced flooding within the upper Mississippi River basin. Aerial photography and ground observations have validated the accuracy of the products. Examples are provided from numerous flooding events to demonstrate the operational application of this satellite-derived information as a remotely sensed observational data source and its utility in real-time flood forecasting.

  2. Methods for estimating the magnitude and frequency of floods for urban and small, rural streams in Georgia, South Carolina, and North Carolina, 2011

    USGS Publications Warehouse

    Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis

    2014-01-01

    Reliable estimates of the magnitude and frequency of floods are essential for the design of transportation and water-conveyance structures, flood-insurance studies, and flood-plain management. Such estimates are particularly important in densely populated urban areas. In order to increase the number of streamflow-gaging stations (streamgages) available for analysis, expand the geographical coverage that would allow for application of regional regression equations across State boundaries, and build on a previous flood-frequency investigation of rural U.S. Geological Survey streamgages in the Southeast United States, a multistate approach was used to update methods for determining the magnitude and frequency of floods in urban and small, rural streams that are not substantially affected by regulation or tidal fluctuations in Georgia, South Carolina, and North Carolina. The at-site flood-frequency analysis of annual peak-flow data for urban and small, rural streams (through September 30, 2011) included 116 urban streamgages and 32 small, rural streamgages, defined in this report as basins draining less than 1 square mile. The regional regression analysis included annual peak-flow data from an additional 338 rural streamgages previously included in U.S. Geological Survey flood-frequency reports and 2 additional rural streamgages in North Carolina that were not included in the previous Southeast rural flood-frequency investigation, for a total of 488 streamgages included in the urban and small, rural regression analysis. The at-site flood-frequency analyses for the urban and small, rural streamgages included the expected moments algorithm, which is a modification of the Bulletin 17B log-Pearson type III method for fitting the statistical distribution to the logarithms of the annual peak flows. Where applicable, the flood-frequency analysis also included low-outlier and historic information. Additionally, the application of a generalized Grubbs-Becks test allowed for the

  3. Alkaline flocculation of Phaeodactylum tricornutum induced by brucite and calcite

    DOE PAGES

    Vandamme, Dries; Pohl, Philip I.; Beuckels, Annelies; ...

    2015-08-20

    Alkaline flocculation holds great potential as a low-cost harvesting method for marine microalgae biomass production. Alkaline flocculation is induced by an increase in pH and is related to precipitation of calcium and magnesium salts. In this study, we used the diatom Phaeodactylum tricornutum as a model organism to study alkaline flocculation of marine microalgae cultured in seawater medium. Flocculation started when pH was increased to 10, and flocculation efficiency reached 90% when pH was 10.5, which was consistent with precipitation modeling for brucite or Mg(OH)2. Compared to freshwater species, more magnesium is needed to achieve flocculation (>7.5 mM). Zeta potential measurements suggest that brucite precipitation caused flocculation by charge neutralization. When calcium concentration was 12.5 mM, flocculation was also observed at a pH of 10. Furthermore, zeta potential remained negative up to pH 11.5, suggesting that precipitated calcite caused flocculation by a sweeping coagulation mechanism.
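
    The link between pH and brucite-driven flocculation can be illustrated with a simple saturation check on the ion product [Mg2+][OH-]^2. The sketch below ignores seawater activity corrections and uses an approximate solubility product; the Ksp value, magnesium concentration and temperature are illustrative assumptions rather than values from the study.

```python
import numpy as np

KSP_BRUCITE = 5.6e-12   # approximate Mg(OH)2 solubility product near 25 C (assumed value)

def brucite_saturation_index(mg_molar, pH):
    """log10 of the ion product over Ksp; values > 0 suggest Mg(OH)2 can precipitate.
    Concentrations stand in for activities, a crude simplification in seawater."""
    oh = 10.0 ** (pH - 14.0)
    return np.log10(mg_molar * oh ** 2 / KSP_BRUCITE)

for ph in (9.0, 9.5, 10.0, 10.5):
    # ~0.05 M Mg2+ is roughly the magnesium level of seawater
    print(ph, round(brucite_saturation_index(0.05, ph), 2))
```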

  4. Alkalinity and hardness: Critical but elusive concepts in aquaculture

    USDA-ARS?s Scientific Manuscript database

    Total alkalinity and total hardness are familiar variables to those involved in aquatic animal production. Aquaculturists – both scientists and practitioners alike – tend to have some understanding of the two variables and of methods for adjusting their concentrations. The chemistry and the biolog...

  5. DEM-based Approaches for the Identification of Flood Prone Areas

    NASA Astrophysics Data System (ADS)

    Samela, Caterina; Manfreda, Salvatore; Nardi, Fernando; Grimaldi, Salvatore; Roth, Giorgio; Sole, Aurelia

    2013-04-01

    The remarkable number of inundations that have caused, in recent decades, thousands of deaths and huge economic losses testifies to the extreme vulnerability of many countries to flood hazard. As a matter of fact, human activities are often developed in floodplains, creating conditions of extremely high risk. Terrain morphology plays an important role in understanding, modelling and analyzing the hydraulic behaviour of flood waves. Research during the last 10 years has shown that the delineation of flood-prone areas can be carried out using fast methods that rely on basin geomorphologic features. In fact, the availability of new technologies to measure surface elevation (e.g., GPS, SAR, SAR interferometry, RADAR and LASER altimetry) has given a strong impulse to the development of Digital Elevation Model (DEM) based approaches. The identification of the dominant topographic controls on the flood inundation process is a critical research question that we try to tackle with a comparative analysis of several techniques. We reviewed four different approaches for the morphological characterization of a river basin with the aim of describing their performance and identifying their ranges of applicability. In particular, we explored the potential of the following tools. 1) The hydrogeomorphic method proposed by Nardi et al. (2006), which defines the flood-prone areas according to the water level in the river network through the hydrogeomorphic theory. 2) The linear binary classifier proposed by Degiorgis et al. (2012), which distinguishes flood-prone areas using two features related to the location of the site under exam with respect to the nearest hazard source; the two features are the length of the path that hydrologically connects the location under exam to the nearest element of the drainage network, and the difference in elevation between the cell under exam and the final point of the same path. 3) The method by
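
    The second approach reduces each DEM cell to two features, the length of the flow path to the nearest stream and the elevation drop along that path, and separates flood-prone from non-flood-prone cells with a linear decision rule. The sketch below illustrates that idea with a logistic-regression classifier on a tiny invented calibration set; the real study calibrates and validates the classifier against reference flood maps.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features per cell: [path length to nearest stream (m), elevation difference along the path (m)].
X_train = np.array([[150.0, 1.0], [300.0, 2.5], [80.0, 0.5],         # cells inside a reference flood map
                    [2500.0, 40.0], [1800.0, 25.0], [900.0, 15.0]])   # cells outside it
y_train = np.array([1, 1, 1, 0, 0, 0])                                # 1 = flood-prone, 0 = not

clf = LogisticRegression().fit(X_train, y_train)                      # linear boundary in feature space

X_new = np.array([[200.0, 1.5], [1200.0, 20.0]])
print(clf.predict(X_new))                                             # expected: [1 0]
```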

  6. An Application of a Stochastic Semi-Continuous Simulation Method for Flood Frequency Analysis: A Case Study in Slovakia

    NASA Astrophysics Data System (ADS)

    Valent, Peter; Paquet, Emmanuel

    2017-09-01

    Reliable estimation of extreme flood characteristics has always been an active topic in hydrological research. Over the decades a large number of approaches and their modifications have been proposed and used, with methods utilizing continuous simulation of catchment runoff being the subject of the most intensive research in the last decade. In this paper a new and promising stochastic semi-continuous method is used to estimate extreme discharges in two mountainous Slovak catchments of the rivers Váh and Hron, in which snow-melt processes need to be taken into account. The SCHADEX method used here couples a precipitation probabilistic model with a rainfall-runoff model that both continuously simulates catchment hydrological conditions and transforms generated synthetic rainfall events into corresponding discharges. The stochastic nature of the method means that a wide range of synthetic rainfall events were simulated on various historical catchment conditions, taking into account not only the saturation of the soil but also the amount of snow accumulated in the catchment. The results showed that the SCHADEX extreme discharge estimates with return periods of up to 100 years were comparable to those estimated by statistical approaches. In addition, two reconstructed historical floods with corresponding return periods of 100 and 1000 years were compared to the SCHADEX estimates. The results confirmed the usability of the method for estimating design discharges with a recurrence interval of more than 100 years and its applicability in Slovak conditions.

  7. Chemically durable polymer electrolytes for solid-state alkaline water electrolysis

    NASA Astrophysics Data System (ADS)

    Park, Eun Joo; Capuano, Christopher B.; Ayers, Katherine E.; Bae, Chulsung

    2018-01-01

    Generation of high-purity hydrogen by electrochemical splitting of water is one of the most promising methods for sustainable fuel production. Materials to be used as solid-state electrolytes for alkaline water electrolyzers require high thermochemical stability against hydroxide ion attack in the alkaline environment during electrolysis. In this study, two quaternary ammonium-tethered aromatic polymers were synthesized and investigated for anion exchange membrane (AEM)-based alkaline water electrolyzers. Membrane properties including ion exchange capacity (IEC), water uptake, swelling degree, and anion conductivity were studied. The membranes, composed of all C-C bond polymer backbones and flexible side chains terminated by cationic head groups, exhibited remarkably good chemical stability, maintaining structural integrity in 1 M NaOH solution at 95 °C for 60 days. Initial electrochemical performance and steady-state operation were evaluated, and both membranes showed good stabilization of the cell voltage during steady-state operation at a constant current density of 200 mA/cm2. Although both membranes in their current form require improved mechanical stability to afford better durability in electrolysis operation, next-generation AEMs based on this report could lead to viable AEM candidates that provide high electrolysis performance under alkaline operating conditions.

  8. The development of flood map in Malaysia

    NASA Astrophysics Data System (ADS)

    Zakaria, Siti Fairus; Zin, Rosli Mohamad; Mohamad, Ismail; Balubaid, Saeed; Mydin, Shaik Hussein; MDR, E. M. Roodienyanto

    2017-11-01

    In Malaysia, flash floods are common occurrences throughout the year in flood-prone areas. In terms of flood extent, flash floods affect smaller areas, but because of their tendency to occur in densely urbanized areas, the value of damaged property is high and the disruption to traffic flow and businesses is substantial. In river floods, however, especially the river floods of Kelantan and Pahang, the flood extent is widespread and can extend over 1,000 square kilometers. Although the value of property and the density of the affected population are lower, the damage inflicted by these floods can also be high because the area affected is large. In order to combat these floods, various flood mitigation measures have been carried out. Structural flood mitigation alone can only provide protection levels from 10- to 100-year Average Recurrence Intervals (ARI). One of the economically effective non-structural approaches to flood mitigation and flood management is the use of geospatial technology, which provides flood forecasting and warning services to flood-prone areas. This approach, which involves the use of a Geographical Information Flood Forecasting system, also includes the generation of a series of flood maps. There are three types of flood maps, namely the Flood Hazard Map, the Flood Risk Map and the Flood Evacuation Map. The Flood Hazard Map is used to determine areas susceptible to flooding when discharge from a stream exceeds the bank-full stage. Early warnings of incoming flood events will enable flood victims to prepare themselves before flooding occurs. Property and lives can be saved by keeping movable property above the flood levels and, if necessary, evacuating the area early. With respect to flood fighting, an early warning supported by a series of flood maps, including the flood hazard map, flood risk map and flood evacuation map of the approaching flood, should be able to alert the organization in charge of the flood fighting actions and the authority to

  9. Local Flood Proofing Programs

    DTIC Science & Technology

    2005-02-01

    Carolina, funded its flood audits and other flood protection projects with stormwater utility income. Impact fees: Impact fees are contributions...determining appropriate projects. Bolingbrook's Flood Audit: Bolingbrook, Illinois, has used different...

  10. Floods at Mount Clemens, Michigan

    USGS Publications Warehouse

    Wiitala, S.W.; Ash, Arlington D.

    1962-01-01

    The approximate areas inundated during the flood of April 5-6, 1947, by Clinton River, North Branch and Middle Branch of Clinton River, and Harrington Drain, in Clinton Township, Macomb County, Mich., are shown on a topographic map base to record the flood hazard in graphical form. The flood of April 1947 is the highest known since 1934 and probably since 1902. Greater floods are possible, but no attempt was made to define their probable overflow limits. The Clinton River Cut-Off Canal, a flood-relief channel which diverts flow directly into Lake St. Clair from a point about 1500 feet downstream from Gratiot Avenue (about 9 miles upstream from the mouth), has been in operation since October 1951. The approximate limits of overflow that would result from a flood equivalent in discharge to that of April 1947, and occurring with the Cut-Off Canal in operation, are also shown. Although the Cut-Off Canal may reduce the frequency and depth of flooding, it will not necessarily eliminate future flooding in the area. Improvements and additions to the drainage systems in the basin, expanding urbanization, new highways, and other cultural changes may influence the inundation pattern of future floods. The preparation of this flood inundation map was financed through a cooperative agreement between Clinton Township, Macomb County, Mich., and the U.S. Geological Survey. Backwater curves used to define the profile for a hypothetical flood on the Clinton River downstream from Moravian Drive, equivalent in discharge to the 1947 flood but occurring with the present Cut-Off Canal in operation; flood stage established at the gaging station on Clinton River at Mount Clemens; and supplementary floodmark elevations were furnished by the Corps of Engineers. Bench-mark elevations and field survey data, used in the analysis of floods on Harrington Drain, were furnished by the Macomb County Drain Commission.

  11. Wellbeing in the aftermath of floods.

    PubMed

    Walker-Springett, Kate; Butler, Catherine; Adger, W Neil

    2017-01-01

    The interactions between flood events, their aftermath, and recovery leading to health and wellbeing outcomes for individuals are complex, and the pathways and mechanisms through which wellbeing is affected are often hidden and remain under-researched. This study analyses the diverse processes that explain changes in wellbeing for those experiencing flooding. It identifies key pathways to wellbeing outcomes that concern perceptions of lack of agency, dislocation from home, and disrupted futures inducing negative impacts, with offsetting positive effects through community networks and interactions. The mixed-method study is based on data from repeated qualitative semi-structured interviews (n=60) and a structured survey (n=1000) with individuals who experienced flooding directly during winter 2013/14 in two UK regions. The results show for the first time the diversity and intersection of pathways to wellbeing outcomes in the aftermath of floods. The findings suggest that enhanced public health planning and interventions could focus on the precise practices and mechanisms that intersect to produce anxiety, stress, and their amelioration at individual and community levels. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Response and Recovery of Streams From an Extreme Flood

    NASA Astrophysics Data System (ADS)

    Kantack, K. M.; Renshaw, C. E.; Magilligan, F. J.; Dethier, E.

    2015-12-01

    In temperate regions, channels are expected to recover from intense floods in a matter of months to years, but quantitative empirical support for this idea remains limited. Moreover, existing literature fails to address the spatial variability of the recovery process. Using an emerging technology, we investigate the immediate response to and progressive recovery of channels in the Northeastern United States from an extreme flood. We seek to determine what factors, including the nature and extent of the immediate response of the channel to the flood and post-flood availability of sediment, contribute to the spatial variability of the rate of recovery. Taking advantage of the 2011 flooding from Tropical Storm Irene, for which pre- and post-flood aerial lidar exist, along with a third set of terrestrial lidar collected in 2015, we assess channel response and recovery with multi-temporal lidar comparison. This method, with kilometers of continuous data, allows for analysis beyond traditional cross-section and reach-scale studies. Results indicate that landscape-scale factors, such as valley morphology and gradients in unit stream power, are controls on channel response to the flood, producing spatially variable impacts. Along a 16.4-km section (drainage area = 82 km2) of the Deerfield River in Vermont, over 148,000 m3 of erosion occurred during the flood. The spatial variation of impacts was correlated (R2 = 0.476) with the ratio of channel width to valley width. We expect the recovery process will similarly exhibit spatial variation in rate and magnitude, possibly being governed by gradients in unit stream power and sediment availability. We test the idea that channel widening during the flood reduces post-flood unit stream power, creating a pathway for deposition and recovery to pre-flood width. Flood-widened reaches downstream of point sources of sediment, such as landslides, will recover more quickly than those without a consistent sediment supply. Results of this

  13. Alkaline earth cation extraction from acid solution

    DOEpatents

    Dietz, Mark; Horwitz, E. Philip

    2003-01-01

    An extractant medium for extracting alkaline earth cations from an aqueous acidic sample solution is described as are a method and apparatus for using the same. The separation medium is free of diluent, free-flowing and particulate, and comprises a Crown ether that is a 4,4'(5')[C4-C8-alkylcyclohexano]18-Crown-6 dispersed on an inert substrate material.

  14. Use of documentary sources on past flood events for flood risk management and land planning

    NASA Astrophysics Data System (ADS)

    Cœur, Denis; Lang, Michel

    2008-09-01

    The knowledge of past catastrophic events can improve flood risk mitigation policy, with a better awareness against risk. As such historical information is usually available in Europe for the past five centuries, historians are able to understand how past society dealt with flood risk, and hydrologists can include information on past floods into an adapted probabilistic framework. In France, Flood Risk Mitigation Maps are based either on the largest historical known flood event or on the 100-year flood event if it is greater. Two actions can be suggested in terms of promoting the use of historical information for flood risk management: (1) the development of a regional flood data base, with both historical and current data, in order to get a good feedback on recent events and to improve the flood risk education and awareness; (2) the commitment to keep a persistent/perennial management of a reference network of hydrometeorological observations for climate change studies.

  15. Analyzing Future Flooding under Climate Change Scenario using CMIP5 Streamflow Data

    NASA Astrophysics Data System (ADS)

    Parajuli, Ranjan; Nyaupane, Narayan; Kalra, Ajay

    2017-12-01

    Flooding is a severe and costly natural hazard, and the effects of climate change have intensified flood scenarios in recent years. Flood prevention practice, along with a proper understanding of flooding events, can mitigate the risk of such hazards. Floodplain mapping is one of the techniques used to quantify the severity of flooding. Carson City, one of the agricultural areas in the desert of Nevada, has experienced peak floods in recent years. To identify the underlying probability distribution for the area, the latest Coupled Model Intercomparison Project (CMIP5) streamflow data for the Carson River were analyzed against 27 different statistical distributions. The best-fitted distribution was used to forecast the 100-year flood (design flood). Data from 1950-2099, derived from 31 models and a total of 97 projections, were used to predict future streamflow. The delta change method was adopted to quantify the future (2050-2099) flood. To determine the extent of flooding, three scenarios, (i) the historic design flood, (ii) the 500-year flood and (iii) the future 100-year flood, were routed in an HEC-RAS model prepared using available terrain data. Some of the climate projections show an extreme increase in the future design flood. This study suggests an approach to quantify the future flood and floodplain using climate model projections. The study would provide helpful information to facility managers, design engineers, and stakeholders.
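
    A minimal sketch of the general workflow described above (fit candidate distributions to annual peaks, select the best fit, estimate the 100-year flood, and apply a delta-change factor) is given below. The synthetic peak series, the reduced set of three candidate distributions and the delta factor are placeholders, not the study's CMIP5 inputs or its 27 distributions.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      # Placeholder annual peak flows (m^3/s); the study used CMIP5-driven
      # Carson River streamflow, not synthetic data.
      peaks_hist = rng.gumbel(loc=300.0, scale=80.0, size=60)

      # Fit a few candidate distributions and keep the best according to a
      # simple goodness-of-fit criterion (Kolmogorov-Smirnov statistic here).
      candidates = {"gumbel_r": stats.gumbel_r,
                    "genextreme": stats.genextreme,
                    "lognorm": stats.lognorm}
      name, dist, params = min(
          ((nm, d, d.fit(peaks_hist)) for nm, d in candidates.items()),
          key=lambda t: stats.kstest(peaks_hist, t[0], args=t[2]).statistic)

      # 100-year flood = discharge with a 1% annual exceedance probability.
      q100_hist = dist.isf(0.01, *params)

      # Delta-change scaling: multiply the historical design flood by the
      # ratio of future-model to baseline-model quantiles (placeholder value).
      delta = 1.15
      print(name, round(q100_hist, 1), round(q100_hist * delta, 1))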

  16. Evaluation of Flooding Risk and Engineering Protection Against Floods for Ulan-Ude

    NASA Astrophysics Data System (ADS)

    Borisova, T. A.

    2017-11-01

    The report presents the results of a study on flood risk analysis and assessment for Ulan-Ude and provides recommendations for the engineering protection of the population and economic installations. The current situation is reviewed, and the results of a site survey are shown, identifying the challenges and the areas of negative water influence along with the existing protection system. The report presents a summary of floods and an index-based risk assessment. It describes the extent of potential flooding and underflooding and enumerates the economic installations located within the urban areas' modelled flood zones at design water levels of given exceedance probability. An assessment of the damage from the 1% exceedance-probability flood is presented.

  17. Spatio-temporal characteristics of the extreme precipitation by L-moment-based index-flood method in the Yangtze River Delta region, China

    NASA Astrophysics Data System (ADS)

    Yin, Yixing; Chen, Haishan; Xu, Chong-Yu; Xu, Wucheng; Chen, Changchun; Sun, Shanlei

    2016-05-01

    The regionalization methods, which "trade space for time" by pooling information from different locations in the frequency analysis, are efficient tools to enhance the reliability of extreme quantile estimates. This paper aims at improving the understanding of the regional frequency of extreme precipitation by using regionalization methods, and at providing scientific background and practical assistance in formulating regional development strategies for water resources management in one of the most developed and flood-prone regions in China, the Yangtze River Delta (YRD) region. To achieve the main goals, the L-moment-based index-flood (LMIF) method, one of the most popular regionalization methods, is used in the regional frequency analysis of extreme precipitation, with special attention paid to inter-site dependence and its influence on the accuracy of quantile estimates, which has not been considered by most of the studies using the LMIF method. Extensive data screening for stationarity, serial dependence, and inter-site dependence was carried out first. The entire YRD region was then categorized into four homogeneous regions through cluster analysis and homogeneity analysis. Based on the goodness-of-fit statistic and L-moment ratio diagrams, the generalized extreme-value (GEV) and generalized normal (GNO) distributions were identified as the best-fitted distributions for most of the sub-regions, and estimated quantiles for each region were obtained. Monte Carlo simulation was used to evaluate the accuracy of the quantile estimates taking inter-site dependence into consideration. The results showed that the root-mean-square errors (RMSEs) were larger and the 90 % error bounds were wider with inter-site dependence than without it, for both the regional growth curve and the quantile curve. The spatial patterns of extreme precipitation with a return period of 100 years were finally obtained, which indicated that there are two regions with highest precipitation
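
    For illustration, the sketch below computes sample L-moments via probability-weighted moments, derives GEV parameters from them using Hosking's well-known approximation, and applies the index-flood scaling (site mean times a dimensionless growth factor). The synthetic series is a placeholder and, unlike the study, the growth curve here is fitted to a single rescaled site rather than to pooled regional records.

      import numpy as np
      from math import gamma, log

      def sample_l_moments(x):
          # First three sample L-moments via unbiased probability-weighted moments.
          x = np.sort(np.asarray(x, dtype=float))
          n = len(x)
          i = np.arange(1, n + 1)
          b0 = x.mean()
          b1 = np.sum((i - 1) / (n - 1) * x) / n
          b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
          l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
          return l1, l2, l3 / l2            # mean, L-scale, L-skewness (t3)

      def gev_from_l_moments(l1, l2, t3):
          # GEV parameters from L-moments (Hosking's approximation; k -> 0 limit not handled).
          c = 2.0 / (3.0 + t3) - log(2.0) / log(3.0)
          k = 7.8590 * c + 2.9554 * c ** 2
          alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * gamma(1.0 + k))
          xi = l1 - alpha * (1.0 - gamma(1.0 + k)) / k
          return xi, alpha, k

      def gev_quantile(F, xi, alpha, k):
          return xi + alpha / k * (1.0 - (-log(F)) ** k)

      # Placeholder annual-maximum precipitation series for a single site (mm).
      rng = np.random.default_rng(1)
      site = 60.0 + 25.0 * rng.gumbel(size=50)

      # Index-flood idea: a dimensionless growth curve (here fitted to the rescaled
      # site record; in a regional study it comes from pooled records) scaled by
      # the site mean, i.e. the "index flood".
      l1, l2, t3 = sample_l_moments(site / site.mean())
      xi, alpha, k = gev_from_l_moments(l1, l2, t3)
      growth_100 = gev_quantile(1.0 - 1.0 / 100.0, xi, alpha, k)
      print("100-year estimate:", site.mean() * growth_100)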

  18. Urbanization and climate change implications in flood risk management: Developing an efficient decision support system for flood susceptibility mapping.

    PubMed

    Mahmoud, Shereif H; Gan, Thian Yew

    2018-04-26

    The effects of urbanization and climate change on the flood risk of two governorates in Egypt were analyzed. Non-parametric change-point and trend-detection algorithms were applied to the annual rainfall, rainfall anomaly, and temperature anomaly of both study sites. Next, change points and trends in the annual and monthly surface runoff generated by the Curve Number method over 1948-2014 were also analyzed to detect the effects of urbanization on surface runoff. Lastly, a GIS decision support system was developed to delineate flood susceptibility zones for the two governorates. The significant decline in annual rainfall and rainfall anomaly after 1994, at 8.96 and 15.3 mm/decade respectively, was likely due to climate change impact, especially the significant warming trend since 1976 of 0.16 °C/decade, though it could partly be attributed to rapid urbanization. Since 1970, the effects of urbanization on flood risk are clear, because despite a decline in rainfall, the annual surface runoff and runoff anomaly show positive trends of 12.7 and 14.39 mm/decade, respectively. Eleven flood-contributing factors have been identified and used in mapping flood susceptibility zones of both sites. In the El-Beheira governorate, 9.2%, 17.9%, 32.3%, 28.3% and 12.3% of its area are categorized as having very high, high, moderate, low and very low susceptibility to flooding, respectively. Similarly, in the Alexandria governorate, 15.9%, 33.5%, 41%, 8.8% and 0.8% of its area are categorized as having very high, high, moderate, low and very low susceptibility to flooding, respectively. Very high and high susceptibility zones are located in the northern, northwestern and northeastern parts of the Beheira governorate, and in the northeastern and northwestern parts of Alexandria. The flood-related information obtained in this study will be useful in mitigating potential flood damages and in future land use planning of both governorates of Egypt. Copyright © 2018 Elsevier B.V. All
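
    The runoff series analysed above rests on the standard SCS Curve Number relationship; a minimal sketch of that relationship (not the study's implementation, and with an illustrative CN value) is given below.

      def scs_cn_runoff(p_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
          # SCS Curve Number direct runoff (mm) for a storm depth p_mm (mm).
          s = 25400.0 / cn - 254.0      # potential maximum retention (mm)
          ia = ia_ratio * s             # initial abstraction
          if p_mm <= ia:
              return 0.0
          return (p_mm - ia) ** 2 / (p_mm - ia + s)

      # Example: a 40 mm storm on a largely urbanised surface (CN = 85 is illustrative).
      print(scs_cn_runoff(40.0, cn=85.0))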

  19. Improving methane production from digested manure biofibers by mechanical and thermal alkaline pretreatment.

    PubMed

    Tsapekos, P; Kougias, Panagiotis G; Frison, A; Raga, R; Angelidaki, I

    2016-09-01

    Animal manure digestion is associated with limited methane production, due to the high content in fibers, which are hardly degradable lignocellulosic compounds. In this study, different mechanical and thermal alkaline pretreatment methods were applied to partially degradable fibers, separated from the effluent stream of biogas reactors. Batch and continuous experiments were conducted to evaluate the efficiency of these pretreatments. In batch experiments, the mechanical pretreatment improved the degradability up to 45%. Even higher efficiency was shown by applying thermal alkaline pretreatments, enhancing fibers degradability by more than 4-fold. In continuous experiments, the thermal alkaline pretreatment, using 6% NaOH at 55°C was proven to be the most efficient pretreatment method as the methane production was increased by 26%. The findings demonstrated that the methane production of the biogas plants can be increased by further exploiting the fraction of the digested manure fibers which are discarded in the post-storage tank. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Mesh versus bathtub - effects of flood models on exposure analysis in Switzerland

    NASA Astrophysics Data System (ADS)

    Röthlisberger, Veronika; Zischg, Andreas; Keiler, Margreth

    2016-04-01

    In Switzerland, mainly two types of maps that indicate potential flood zones are available for flood exposure analyses: 1) Aquaprotect, a nationwide overview provided by the Federal Office for the Environment and 2) communal flood hazard maps available from the 26 cantons. The model used to produce Aquaprotect can be described as a bathtub approach or linear superposition method with three main parameters, namely the horizontal and vertical distance of a point to water features and the size of the river sub-basin. Whereas the determination of flood zones in Aquaprotect is based on a uniform, nationwide model, the communal flood hazard maps are less homogenous, as they have been elaborated either at communal or cantonal levels. Yet their basic content (i.e. indication of potential flood zones for three recurrence periods, with differentiation of at least three inundation depths) is described in national directives and the vast majority of communal flood hazard maps are based on 2D inundation simulations using meshes. Apart from the methodical differences between Aquaprotect and the communal flood hazard maps (and among different communal flood hazard maps), all of these maps include a layer with a similar recurrence period (i.e. Aquaprotect 250 years, flood hazard maps 300 years) beyond the intended protection level of installed structural systems. In our study, we compare the resulting exposure by overlaying the two types of flood maps with a complete, harmonized, and nationwide dataset of building polygons. We assess the different exposure at the national level, and also consider differences among the 26 cantons and the six biogeographically unique regions, respectively. It was observed that while the nationwide exposure rates for both types of flood maps are similar, the differences within certain cantons and biogeographical regions are remarkable. We conclude that flood maps based on bathtub models are appropriate for assessments at national levels, while maps
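
    The bathtub / linear superposition idea described above can be caricatured with a toy criterion combining the three parameters mentioned (horizontal distance, vertical distance, sub-basin size). All thresholds below are invented for illustration and are not Aquaprotect's actual parameters.

      import numpy as np

      def bathtub_flood_zone(h_dist_m, v_dist_m, subbasin_km2,
                             max_h=500.0, base_v=2.0, v_per_1000km2=1.0):
          # Toy bathtub criterion: a cell is flagged as flood-prone when it lies
          # close enough to a water feature and low enough above it, with the
          # tolerated vertical distance growing with sub-basin size.
          # All thresholds are illustrative, not Aquaprotect's parameters.
          v_limit = base_v + v_per_1000km2 * subbasin_km2 / 1000.0
          return (np.asarray(h_dist_m) <= max_h) & (np.asarray(v_dist_m) <= v_limit)

      print(bathtub_flood_zone([120.0, 800.0], [1.5, 0.5], 2500.0))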

  1. Advanced inorganic separators for alkaline batteries

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W. (Inventor)

    1982-01-01

    A flexible, porous battery separator comprising a coating applied to a porous, flexible substrate is described. The coating comprises: (1) a thermoplastic rubber-based resin which is insoluble and unreactive in the alkaline electrolyte; (2) a polar organic plasticizer which is reactive with the alkaline electrolyte to produce a reaction product which contains a hydroxyl group and/or a carboxylic acid group; and (3) a mixture of polar particulate filler materials which are unreactive with the electrolyte, the mixture comprising at least one first filler material having a surface area of greater than 25 sq meters/gram, at least one second filler material having a surface area of 10 to 25 sq meters/gram, wherein the volume of the mixture of filler materials is less than 45% of the total volume of the fillers and the binder, the filler surface area per gram of binder is about 20 to 60 sq meters/gram, and the amount of plasticizer is sufficient to coat each filler particle. A method of forming the battery separator is also described.

  2. Real-Time Optimal Flood Control Decision Making and Risk Propagation Under Multiple Uncertainties

    NASA Astrophysics Data System (ADS)

    Zhu, Feilin; Zhong, Ping-An; Sun, Yimeng; Yeh, William W.-G.

    2017-12-01

    Multiple uncertainties exist in the optimal flood control decision-making process, presenting risks for flood control decisions. This paper defines the main steps in optimal flood control decision making that constitute the Forecast-Optimization-Decision Making (FODM) chain. We propose a framework for supporting optimal flood control decision making under multiple uncertainties and evaluate risk propagation along the FODM chain from a holistic perspective. To deal with uncertainties, we employ stochastic models at each link of the FODM chain. We generate synthetic ensemble flood forecasts via the martingale model of forecast evolution. We then establish a multiobjective stochastic programming with recourse model for optimal flood control operation. The Pareto front under uncertainty is derived via the constraint method coupled with a two-step process. We propose a novel SMAA-TOPSIS model for stochastic multicriteria decision making. We then propose a risk assessment model, together with measures of the risk of decision-making errors and the degree of rank uncertainty, to quantify the risk propagation process along the FODM chain. We conduct numerical experiments to investigate the effects of flood forecast uncertainty on optimal flood control decision making and risk propagation. We apply the proposed methodology to a flood control system in the Daduhe River basin in China. The results indicate that the proposed method can provide valuable risk information in each link of the FODM chain and enable risk-informed decisions with higher reliability.
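
    The SMAA-TOPSIS model builds on the classical TOPSIS ranking step; a minimal sketch of that deterministic core (not the authors' stochastic SMAA-TOPSIS) is shown below, with a placeholder decision matrix, weights and criterion directions.

      import numpy as np

      def topsis(matrix, weights, benefit):
          # Classical TOPSIS closeness scores (higher = better alternative).
          m = np.asarray(matrix, dtype=float)
          w = np.asarray(weights, dtype=float)
          norm = m / np.linalg.norm(m, axis=0)          # vector-normalise each criterion
          v = norm * w                                  # weighted normalised matrix
          ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
          worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
          d_plus = np.linalg.norm(v - ideal, axis=1)
          d_minus = np.linalg.norm(v - worst, axis=1)
          return d_minus / (d_plus + d_minus)

      # Placeholder decision matrix: 3 candidate release policies x 3 criteria
      # (flood peak reduction, water-supply storage, downstream damage risk).
      scores = topsis([[0.8, 0.6, 0.3],
                       [0.6, 0.8, 0.4],
                       [0.9, 0.4, 0.2]],
                      weights=[0.5, 0.3, 0.2],
                      benefit=[True, True, False])
      print(np.argsort(scores)[::-1])   # ranking, best alternative first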

  3. Learning about Flood Risk: Comparing the Web-Based and Physical Flood-Walk Learning Environments

    ERIC Educational Resources Information Center

    Chang Rundgren, Shu-Nu; Nyberg, Lars; Evers, Mariele; Alexandersson, Jan

    2015-01-01

    Numerous sustainable-development-related challenges are emerging today, e.g. flooding problems. Our group has been developing "the flood walk" project since 2010 to convey flood risk knowledge in an authentic context. Considering the limitations of time and space in educating people about flood risk, we tried to transform the physical…

  4. Flood warnings, flood disaster assessments, and flood hazard reduction: the roles of orbital remote sensing

    NASA Technical Reports Server (NTRS)

    Brakenridge, G. R.; Anderson, E.; Nghiem, S. V.; Caquard, S.; Shabaneh, T. B.

    2003-01-01

    Orbital remote sensing of the Earth is now poised to make three fundamental contributions towards reducing the detrimental effects of extreme floods. Effective flood warning requires frequent radar observation of the Earth's surface through cloud cover. In contrast, both optical and radar wavelengths will increasingly be used for disaster assessment and hazard reduction.

  5. Accuracy of the evaluation method for alkaline agents’ bactericidal efficacies in solid, and the required time of bacterial inactivation

    PubMed Central

    HAKIM, Hakimullah; TOYOFUKU, Chiharu; OTA, Mari; SUZUKI, Mayuko; KOMURA, Miyuki; YAMADA, Masashi; ALAM, Md. Shahin; SANGSRIRATANAKUL, Natthanan; SHOHAM, Dany; TAKEHARA, Kazuaki

    2016-01-01

    An alkaline agent, namely food additive grade calcium hydroxide (FdCa(OH)2) in powder form, was evaluated for its bactericidal efficacy in chicken feces at pH 13. The key point of this evaluation was neutralization of the alkaline agent's pH at the time of bacterial recovery, since otherwise the results are substantially misleading. Without neutralization of the FdCa(OH)2 pH, the spiked bacteria were killed within minutes at the time of recovery in the aqueous phase, but not in the solid form in feces; hence, it has been demonstrated that when bacteria are in a solid, it takes a longer time than in liquid for the alkaline agent to inactivate them down to the acceptable level (≥3 log10 CFU/ml). PMID:27890906

  6. Application research for 4D technology in flood forecasting and evaluation

    NASA Astrophysics Data System (ADS)

    Li, Ziwei; Liu, Yutong; Cao, Hongjie

    1998-08-01

    In order to monitor regions of China where disastrous floods happen frequently, to satisfy the great need of provincial governments for high-accuracy monitoring and evaluation data for disasters, and to improve the efficiency of combating disasters, a method for flood forecasting and evaluation using satellite and aerial remotely sensed imagery and ground monitoring data was researched under the Ninth Five-year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, and DongTing Lake was selected as the test site. Modern digital photogrammetry, remote sensing and GIS technology are used in this system; disastrous floods can be forecast and losses evaluated based on a '4D' (DEM -- Digital Elevation Model, DOQ -- Digital OrthophotoQuads, DRG -- Digital Raster Graph, DTI -- Digital Thematic Information) disaster background database. The technology for gathering and establishing the '4D' disaster environment background database, the application technology for flood forecasting and evaluation based on '4D' background data, and experimental results for the DongTing Lake test site are introduced in detail in this paper.

  7. Development of a flood-warning system and flood-inundation mapping in Licking County, Ohio

    USGS Publications Warehouse

    Ostheimer, Chad J.

    2012-01-01

    Digital flood-inundation maps for selected reaches of South Fork Licking River, Raccoon Creek, North Fork Licking River, and the Licking River in Licking County, Ohio, were created by the U.S. Geological Survey (USGS), in cooperation with the Ohio Department of Transportation; U.S. Department of Transportation, Federal Highway Administration; Muskingum Watershed Conservancy District; U.S. Department of Agriculture, Natural Resources Conservation Service; and the City of Newark and Village of Granville, Ohio. The inundation maps depict estimates of the areal extent of flooding corresponding to water levels (stages) at the following USGS streamgages: South Fork Licking River at Heath, Ohio (03145173); Raccoon Creek below Wilson Street at Newark, Ohio (03145534); North Fork Licking River at East Main Street at Newark, Ohio (03146402); and Licking River near Newark, Ohio (03146500). The maps were provided to the National Weather Service (NWS) for incorporation into a Web-based flood-warning system that can be used in conjunction with NWS flood-forecast data to show areas of predicted flood inundation associated with forecasted flood-peak stages. As part of the flood-warning streamflow network, the USGS re-installed one streamgage on North Fork Licking River, and added three new streamgages, one each on North Fork Licking River, South Fork Licking River, and Raccoon Creek. Additionally, the USGS upgraded a lake-level gage on Buckeye Lake. Data from the streamgages and lake-level gage can be used by emergency-management personnel, in conjunction with the flood-inundation maps, to help determine a course of action when flooding is imminent. Flood profiles for selected reaches were prepared by calibrating steady-state step-backwater models to selected, established streamgage rating curves. The step-backwater models then were used to determine water-surface-elevation profiles for up to 10 flood stages at a streamgage with corresponding streamflows ranging from approximately

  8. Francisella DnaK Inhibits Tissue-nonspecific Alkaline Phosphatase*

    PubMed Central

    Arulanandam, Bernard P.; Chetty, Senthilnath Lakshmana; Yu, Jieh-Juen; Leonard, Sean; Klose, Karl; Seshu, Janakiram; Cap, Andrew; Valdes, James J.; Chambers, James P.

    2012-01-01

    Following pulmonary infection with Francisella tularensis, we observed an unexpected but significant reduction of alkaline phosphatase, an enzyme normally up-regulated following inflammation. However, no reduction was observed in mice infected with a closely related Gram-negative pneumonic organism (Klebsiella pneumoniae) suggesting the inhibition may be Francisella-specific. In similar fashion to in vivo observations, addition of Francisella lysate to exogenous alkaline phosphatase (tissue-nonspecific isozyme) was inhibitory. Partial purification and subsequent proteomic analysis indicated the inhibitory factor to be the heat shock protein DnaK. Incubation with increasing amounts of anti-DnaK antibody reduced the inhibitory effect in a dose-dependent manner. Furthermore, DnaK contains an adenosine triphosphate binding domain at its N terminus, and addition of adenosine triphosphate enhances dissociation of DnaK with its target protein, e.g. alkaline phosphatase. Addition of adenosine triphosphate resulted in decreased DnaK co-immunoprecipitated with alkaline phosphatase as well as reduction of Francisella-mediated alkaline phosphatase inhibition further supporting the binding of Francisella DnaK to alkaline phosphatase. Release of DnaK via secretion and/or bacterial cell lysis into the extracellular milieu and inhibition of plasma alkaline phosphatase could promote an orchestrated, inflammatory response advantageous to Francisella. PMID:22923614

  9. Francisella DnaK inhibits tissue-nonspecific alkaline phosphatase.

    PubMed

    Arulanandam, Bernard P; Chetty, Senthilnath Lakshmana; Yu, Jieh-Juen; Leonard, Sean; Klose, Karl; Seshu, Janakiram; Cap, Andrew; Valdes, James J; Chambers, James P

    2012-10-26

    Following pulmonary infection with Francisella tularensis, we observed an unexpected but significant reduction of alkaline phosphatase, an enzyme normally up-regulated following inflammation. However, no reduction was observed in mice infected with a closely related gram-negative pneumonic organism (Klebsiella pneumoniae) suggesting the inhibition may be Francisella-specific. In similar fashion to in vivo observations, addition of Francisella lysate to exogenous alkaline phosphatase (tissue-nonspecific isozyme) was inhibitory. Partial purification and subsequent proteomic analysis indicated the inhibitory factor to be the heat shock protein DnaK. Incubation with increasing amounts of anti-DnaK antibody reduced the inhibitory effect in a dose-dependent manner. Furthermore, DnaK contains an adenosine triphosphate binding domain at its N terminus, and addition of adenosine triphosphate enhances dissociation of DnaK with its target protein, e.g. alkaline phosphatase. Addition of adenosine triphosphate resulted in decreased DnaK co-immunoprecipitated with alkaline phosphatase as well as reduction of Francisella-mediated alkaline phosphatase inhibition further supporting the binding of Francisella DnaK to alkaline phosphatase. Release of DnaK via secretion and/or bacterial cell lysis into the extracellular milieu and inhibition of plasma alkaline phosphatase could promote an orchestrated, inflammatory response advantageous to Francisella.

  10. Enhancing boron rejection in FO using alkaline draw solutions.

    PubMed

    Wang, Yi-Ning; Li, Weiyi; Wang, Rong; Tang, Chuyang Y

    2017-07-01

    This study provides a novel method to enhance boron removal in a forward osmosis (FO) process. It utilizes the reverse solute diffusion (RSD) of ions from alkaline draw solutions (DSs) and the concentration polarization of hydroxyl ions to create a highly alkaline environment near the membrane active surface. The results show that boron rejection can be significantly enhanced by increasing the pH of the NaCl DS to 12.5 in the active-layer-facing-feed-solution (AL-FS) orientation. The RSD-enhanced boron rejection was further promoted in the presence of concentration polarization (e.g., in the active-layer-facing-draw-solution (AL-DS) orientation). The current study opens a new dimension for controlling contaminant removal by FO using tailored DS chemistry, in which the RSD-induced localized change in water chemistry is taken advantage of, in contrast to the conventional method of chemical dosing of the bulk feed water. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Regionalisation of a distributed method for flood quantiles estimation: Revaluation of local calibration hypothesis to enhance the spatial structure of the optimised parameter

    NASA Astrophysics Data System (ADS)

    Odry, Jean; Arnaud, Patrick

    2016-04-01

    The SHYREG method (Aubert et al., 2014) associates a stochastic rainfall generator with a rainfall-runoff model to produce rainfall and flood quantiles on a 1 km2 mesh covering the whole French territory. The rainfall generator is based on the description of rainy events by descriptive variables following probability distributions and is characterised by high stability. This stochastic generator is fully regionalised, and the rainfall-runoff transformation is calibrated with a single parameter. Thanks to the stability of the approach, calibration can be performed against only those flood quantiles associated with observed frequencies, which can be extracted from relatively short time series. The aggregation of SHYREG flood quantiles to the catchment scale is performed using an areal reduction factor technique that is uniform over the whole territory. Past studies demonstrated the accuracy of SHYREG flood quantile estimation for catchments where flow data are available (Arnaud et al., 2015). Nevertheless, the parameter of the rainfall-runoff model is calibrated independently for each target catchment. As a consequence, this parameter plays a corrective role and compensates for approximations and modelling errors, which makes it difficult to identify its proper spatial pattern. It is an inherent objective of the SHYREG approach to be completely regionalised in order to provide a complete and accurate flood quantile database throughout France. Consequently, it appears necessary to identify the model configuration in which the calibrated parameter could be regionalised with acceptable performance. The re-evaluation of some of the method's hypotheses is a necessary step before regionalisation. In particular, the inclusion or modification of the spatial variability of imposed parameters (such as production and transfer reservoir size, base flow addition and the quantile aggregation function) should lead to more realistic values of the single calibrated parameter. The objective of the work presented

  12. Let's think in alkaline phosphatase at heart function.

    PubMed

    Martins, Maria João; Azevedo, Isabel

    2010-10-08

    In their recent paper, Cheung et al [B.M. Cheung, K.L. Ong, L.Y. Wong, Elevated serum alkaline phosphatase and peripheral arterial disease in the United States National Health and Nutrition Examination Survey 1999-2004. Int J Cardiol 2008 (Electronic publication ahead of print)] described a significant association between serum alkaline phosphatase levels and a low ankle-brachial blood pressure index, a risk factor for cardiovascular pathology. We had verified that alkaline phosphatase is present in the rat heart, showing a distribution compatible with the cardiomyocyte sarcoplasmic reticulum. Moreover, several drugs with cardiac effects were shown to interfere with heart alkaline phosphatase activity. We therefore propose that alkaline phosphatase may be a local regulator of heart function and a putative target for therapeutic interventions. Copyright © 2009 Elsevier Ireland Ltd. All rights reserved.

  13. Characterization of alkaline hydroxide-preserved whole poultry as a dry byproduct meal.

    PubMed

    Shafer, D J; Burgess, R P; Conrad, K A; Prochaska, J F; Carey, J B

    2001-11-01

    Studies were conducted to examine the chemical preservation of whole broiler carcasses by using aqueous alkaline hydroxide solutions. Conversion of the preserved carcasses and solutions into an acceptable poultry byproduct meal was examined. Carcasses and alkaline solutions at a 1:1 ratio were blended and freeze-dried to produce a high fat whole poultry byproduct meal. The dry meal was analyzed for nutrient composition, true metabolizable energy, and amino acid content. Viable bacteria were not recovered after inoculation of the experimental meal with Salmonella enteritidis. The meal was incorporated at 5 and 10% of chick starter diets. Chicks found the meal-containing diets acceptable. Feed consumption, water consumption, BW, and mortality were not significantly different among the dietary treatments in either of the two feeding trials. Necropsy samples revealed no pathological or histological differences attributable to consumption of the alkaline poultry byproduct and blood serum evaluation found no variation in blood chemistry. Alkaline treatment of whole broiler carcasses was an effective preservation method and acceptable as a dry poultry byproduct meal.

  14. Low cost, multiscale and multi-sensor application for flooded area mapping

    NASA Astrophysics Data System (ADS)

    Giordan, Daniele; Notti, Davide; Villa, Alfredo; Zucca, Francesco; Calò, Fabiana; Pepe, Antonio; Dutto, Furio; Pari, Paolo; Baldo, Marco; Allasia, Paolo

    2018-05-01

    Flood mapping and estimation of the maximum water depth are essential elements for first damage evaluation, civil protection intervention planning and detection of areas where remediation is needed. In this work, we present and discuss a methodology for mapping and quantifying flood severity over floodplains. The proposed methodology considers a multiscale and multi-sensor approach using free or low-cost data and sensors. We applied this method to the November 2016 Piedmont (northwestern Italy) flood. We first mapped the flooded areas at the basin scale using free satellite data of low to medium-high resolution from both SAR (Sentinel-1, COSMO-Skymed) and multispectral sensors (MODIS, Sentinel-2). Using very- and ultra-high-resolution images from a low-cost aerial platform and a remotely piloted aerial system, we refined the flooded zone and detected the most damaged sector. The presented method considers both urbanised and non-urbanised areas. Nadiral images have several limitations, in particular in urbanised areas, where the use of terrestrial images overcame them. Very- and ultra-high-resolution images were processed with structure from motion (SfM) for the realisation of 3-D models. These data, combined with an available digital terrain model, allowed us to obtain maps of the flooded area, maximum high water area and damaged infrastructures.

  15. Alkaline sorbent injection for mercury control

    DOEpatents

    Madden, Deborah A.; Holmes, Michael J.

    2003-01-01

    A mercury removal system for removing mercury from combustion flue gases is provided in which alkaline sorbents at generally extremely low stoichiometric molar ratios of alkaline earth or an alkali metal to sulfur of less than 1.0 are injected into a power plant system at one or more locations to remove at least between about 40% and 60% of the mercury content from combustion flue gases. Small amounts of alkaline sorbents are injected into the flue gas stream at a relatively low rate. A particulate filter is used to remove mercury-containing particles downstream of each injection point used in the power plant system.

  16. Alkaline sorbent injection for mercury control

    DOEpatents

    Madden, Deborah A.; Holmes, Michael J.

    2002-01-01

    A mercury removal system for removing mercury from combustion flue gases is provided in which alkaline sorbents at generally extremely low stoichiometric molar ratios of alkaline earth or an alkali metal to sulfur of less than 1.0 are injected into a power plant system at one or more locations to remove at least between about 40% and 60% of the mercury content from combustion flue gases. Small amounts of alkaline sorbents are injected into the flue gas stream at a relatively low rate. A particulate filter is used to remove mercury-containing particles downstream of each injection point used in the power plant system.

  17. Inorganic-organic separators for alkaline batteries

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W. (Inventor)

    1978-01-01

    A flexible separator is reported for use between the electrodes of Ni-Cd and Ni-Zn batteries using alkaline electrolytes. The separator was made by coating a porous substrate with a battery separator composition. The coating material included a rubber-based resin copolymer, a plasticizer and inorganic and organic fillers which comprised 55% by volume or less of the coating as finally dried. One or more of the filler materials, whether organic or inorganic, is preferably active with the alkaline electrolyte to produce pores in the separator coating. The plasticizer was an organic material which is hydrolyzed by the alkaline electrolyte to improve conductivity of the separator coating.

  18. How useful are Swiss flood insurance data for flood vulnerability assessments?

    NASA Astrophysics Data System (ADS)

    Röthlisberger, Veronika; Bernet, Daniel; Zischg, Andreas; Keiler, Margreth

    2015-04-01

    The databases of Swiss flood insurance companies are a valuable but to date rarely used source of information on physical flood vulnerability. Detailed insights into the Swiss flood insurance system are crucial for using the full potential of the different databases for research on flood vulnerability. Insurance against floods in Switzerland is a federal system; the modalities are mainly regulated at the cantonal level. However, there are some common principles that apply throughout Switzerland. First, coverage against floods (and other particular natural hazards) is an integral part of every fire insurance policy for buildings or contents. This coupling of insurance, as well as the statutory obligation to insure buildings in most of the cantons and movables in some of the cantons, leads to very high penetration. Second, in case of damage, the reinstatement costs (value as new) are compensated, and third, there is little or no deductible and co-pay. High penetration and the fact that the compensations represent a large share of the direct, tangible losses of the individual policy holders make the databases of the flood insurance companies a comprehensive and therefore valuable data source for flood vulnerability research. Insurance companies not only store data electronically about losses (typically date, amount of claims payment, cause of damage, identity of the insured object or policyholder) but also about insured objects. For insured objects, the (insured) value and the details of the policy and its holder are the main features recorded. For buildings, the insurance companies usually computerize additional information such as location, volume, year of construction or purpose of use. For the 19 (of a total of 26) cantons with a cantonal monopoly insurer, the data of these insurance establishments have the additional value of representing (almost) the entire building stock of the respective canton. Spatially referenced insurance data can be used for many aspects of

  19. Projections of Flood Risk using Credible Climate Signals in the Ohio River Basin

    NASA Astrophysics Data System (ADS)

    Schlef, K.; Robertson, A. W.; Brown, C.

    2017-12-01

    Estimating future hydrologic flood risk under non-stationary climate is a key challenge to the design of long-term water resources infrastructure and flood management strategies. In this work, we demonstrate how projections of large-scale climate patterns can be credibly used to create projections of long-term flood risk. Our study area is the northwest region of the Ohio River Basin in the United States Midwest. In the region, three major teleconnections have been previously demonstrated to affect synoptic patterns that influence extreme precipitation and streamflow: the El Nino Southern Oscillation, the Pacific North American pattern, and the Pacific Decadal Oscillation. These teleconnections are strongest during the winter season (January-March), which also experiences the greatest number of peak flow events. For this reason, flood events are defined as the maximum daily streamflow to occur in the winter season. For each gage in the region, the location parameter of a log Pearson type 3 distribution is conditioned on the first principal component of the three teleconnections to create a statistical model of flood events. Future projections of flood risk are created by forcing the statistical model with projections of the teleconnections from general circulation models selected for skill. We compare the results of our method to the results of two other methods: the traditional model chain (i.e., general circulation model projections to downscaling method to hydrologic model to flood frequency analysis) and that of using the historic trend. We also discuss the potential for developing credible projections of flood events for the continental United States.
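
    A generic illustration of the statistical model described above (the location parameter of a log-Pearson type 3 distribution conditioned on a climate covariate, fitted by maximum likelihood) is sketched below. The synthetic flows, the covariate and the optimizer settings are placeholders and do not reproduce the study's data or teleconnection index.

      import numpy as np
      from scipy import stats, optimize

      rng = np.random.default_rng(2)
      n = 60
      pc1 = rng.normal(size=n)                     # placeholder climate index (e.g., PC1)
      # Placeholder log-flows with a location shift that depends on the index.
      log_q = 5.0 + 0.3 * pc1 + stats.pearson3.rvs(0.5, scale=0.4, size=n, random_state=3)

      def neg_log_lik(theta):
          b0, b1, log_scale, skew = theta
          loc = b0 + b1 * pc1                      # location conditioned on the covariate
          return -np.sum(stats.pearson3.logpdf(log_q, skew, loc=loc,
                                               scale=np.exp(log_scale)))

      res = optimize.minimize(neg_log_lik,
                              x0=[log_q.mean(), 0.0, np.log(log_q.std()), 0.1],
                              method="Nelder-Mead")
      b0, b1, log_scale, skew = res.x

      # 100-year winter flood for a chosen (placeholder) projected index value.
      pc1_future = 1.0
      q100 = np.exp(stats.pearson3.isf(0.01, skew, loc=b0 + b1 * pc1_future,
                                       scale=np.exp(log_scale)))
      print(round(float(q100), 1))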

  20. Increasing resilience through participative flood risk map design

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Spira, Yvonne; Stickler, Therese

    2013-04-01

    regarding participation was not the methods used for participation but the involvement of concerned lay persons not only in the design of the hazard and risk maps or the risk assessments itself but the cooperative elaboration of the risk assessment approach especially for the harbour area. Following these principles, flood risk maps were created in the underlying EU-project DANUBE FLOODRISK. In this ETC SEE project "DANUBE FLOODRISK - Stakeholder Oriented Assessment of the Danube Floodplains" (2009-2012), hazard and risk maps harmonized across borders for the Danube main stream were produced. This way the overall DANUBE FLOODRISK project contributed to Article 6 of the EU Floods Directive, the hazard and risk maps for international river basins, and provides with the involvement of the national and regional stakeholders the first step to the implementation of Article 7, the Flood Risk Management Plans. By testing the involvement of the broad public and local stakeholders, first exemplary steps were taken for local flood risk management planning. A first set of maps was created for an underlying hazard scenario of a 1-in-100 year flood affecting the city of Krems assuming a failure of the temporal flood protection due to the impact of a ship in the area of the pier. Moreover, both, hazard scenarios with and without a second line of defence were visualised. The set of maps includes (a) an evaluative risk map showing the risk qualitatively aggregated for each building exposed and the number of affected citizens, (b) an evaluative risk map showing the risk qualitatively aggregated per square footage for each building exposed and the number of affected citizens, (c) an evaluative risk map showing the risk quantitatively in monetary units per square footage for each building exposed and the number of affected citizens, and (d) as well as (e) risk maps according to (a) and (b) without the second line of defence in order to communicate the effectiveness of temporal flood protection

  1. More arrows in the quiver: new pathways and old problems with heavy alkaline earth metal diphenylmethanides.

    PubMed

    Alexander, Jacob S; Ruhlandt-Senge, Karin

    2004-03-05

    Progress in the field of sigma-bonded alkaline earth organometallics has been handicapped by numerous complications, such as high reactivity, low solubility, and the limited availability of suitable starting materials. Here we present two synthetic methods, hydrocarbon elimination and desilylation, as alternative routes into this chemistry. A novel barium diphenylmethanide was prepared using these routes delineating that both methods provide a powerful, versatile synthetic access route to an extended library of organometallic alkaline earth derivatives.

  2. Does soil compaction increase floods? A review

    NASA Astrophysics Data System (ADS)

    Alaoui, Abdallah; Rogger, Magdalena; Peth, Stephan; Blöschl, Günter

    2018-02-01

    Europe has experienced a series of major floods in the past years which suggests that flood magnitudes may have increased. Land degradation due to soil compaction from crop farming or grazing intensification is one of the potential drivers of this increase. A literature review suggests that most of the experimental evidence was generated at plot and hillslope scales. At larger scales, most studies are based on models. There are three ways in which soil compaction affects floods at the catchment scale: (i) through an increase in the area affected by soil compaction; (ii) by exacerbating the effects of changes in rainfall, especially for highly degraded soils; and (iii) when soil compaction coincides with soils characterized by a fine texture and a low infiltration capacity. We suggest that future research should focus on better synthesising past research on soil compaction and runoff, tailored field experiments to obtain a mechanistic understanding of the coupled mechanical and hydraulic processes, new mapping methods of soil compaction that combine mechanical and remote sensing approaches, and an effort to bridge all disciplines relevant to soil compaction effects on floods.

  3. Estimation of phosphorus flux in rivers during flooding.

    PubMed

    Chen, Yen-Chang; Liu, Jih-Hung; Kuo, Jan-Tai; Lin, Cheng-Fang

    2013-07-01

    Reservoirs in Taiwan are inundated with nutrients that result in algal growth and thus reservoir eutrophication. Controlling the phosphorus load has always been the most crucial issue for maintaining reservoir water quality. Numerous agricultural activities, especially the production of tea in riparian areas, are conducted in watersheds in Taiwan. Nutrients from such activities, including phosphorus, are typically flushed into rivers during flooding, when over 90% of the yearly total amount of phosphorus enters the reservoirs. Excessive or enhanced soil erosion from rainstorms can dramatically increase the river sediment load and the amount of particulate phosphorus flushed into rivers. When flow rates are high, particulate phosphorus is the dominant form of phosphorus, but sediment and discharge measurements are difficult during flooding, which makes estimating phosphorus flux in rivers difficult. This study determines total amounts of phosphorus transport by measuring flood discharge and phosphorus levels during flooding. Changes in particulate phosphorus, dissolved phosphorus, and their adsorption behavior during a 24-h period are analyzed, owing to the fact that the time for particulate phosphorus adsorption and desorption to approach equilibrium is about 16 h. Adsorption and desorption of phosphorus on suspended solids eroded from the reservoir watershed is a process which can be summarily described using the Langmuir isotherm. A method for estimating the phosphorus flux in the Daiyujay Creek during Typhoon Bilis in 2006 is presented in this study. Both sediment and phosphorus are affected by the drastic discharge during flooding. Water quality data were collected during two flood events, the flood of June 9, 2006 and Typhoon Bilis, and show that the concentrations of suspended solids and total phosphorus during floods are much higher than at normal stages. Therefore, the drastic changes of total phosphorus, particulate phosphorus, and dissolved phosphorus in
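
    Since the abstract invokes the Langmuir isotherm, a minimal sketch of fitting that isotherm, q = q_max·K·C/(1 + K·C), to equilibrium adsorption data is given below. The concentrations and adsorbed amounts are invented placeholders, not the study's measurements.

      import numpy as np
      from scipy.optimize import curve_fit

      def langmuir(c, q_max, k):
          # Langmuir isotherm: adsorbed P per unit sediment vs. dissolved concentration.
          return q_max * k * c / (1.0 + k * c)

      # Placeholder equilibrium data: dissolved P (mg/L) and adsorbed P (mg/g).
      c_eq = np.array([0.05, 0.1, 0.2, 0.5, 1.0, 2.0])
      q_ads = np.array([0.08, 0.14, 0.22, 0.35, 0.44, 0.50])

      (q_max, k), _ = curve_fit(langmuir, c_eq, q_ads, p0=[0.6, 2.0])
      print(round(float(q_max), 3), round(float(k), 3))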

  4. On the stationarity of Floods in west African rivers

    NASA Astrophysics Data System (ADS)

    NKA, B. N.; Oudin, L.; Karambiri, H.; Ribstein, P.; Paturel, J. E.

    2014-12-01

    West Africa has undergone a big change since the years 1970-1990, which were characterized by very low precipitation amounts, leading to low stream flows in river basins, except in the Sahelian region where the impact of human activities was pointed out to explain the substantial increase of floods in some catchments. More recently, studies showed an increase in the frequency of intense rainfall events, and according to observations made over the region, an increase in flood events is also noticeable during the rainy season. Therefore, the assumption of stationarity of flood events is questionable, and an analysis of flood evolution and climatic patterns is justified. In this work, we analyzed the trends in flood events for several catchments in the Sahelian and Sudanian regions of Burkina Faso. We used thirteen tributaries of large river basins (Niger, Nakambe, Mouhoun, Comoé) for which daily rainfall and flow data were collected from the national hydrological and meteorological services of the country. We used the Mann-Kendall and Pettitt tests to detect trends and break points in the annual time series of 8 rainfall indices and the annual maximum discharge records. We compare the trends of the precipitation indices and flood size records to analyze the possible causal link between flood size and rainfall patterns. We also analyze the stationarity of the frequency of floods exceeding the ten-year return period level. The samples were extracted by a peak-over-threshold method and the change in flood frequency was quantified using a test developed by Lang (1995). The results exhibit two principal behaviors. Generally speaking, no trend is detected in the catchments' annual maximum discharge, but positive break points are found in a group of three right-bank tributaries of the Niger river located in the Sahelian region between 300 mm and 650 mm of annual rainfall. These same catchments also show an increase in the yearly number of floods greater than the ten-year flood since
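
    The Mann-Kendall test mentioned above is straightforward to sketch; the version below omits the tie correction and uses an invented annual-maximum series rather than the Burkina Faso records.

      import numpy as np
      from scipy.stats import norm

      def mann_kendall(x):
          # Mann-Kendall trend test (no tie correction): returns S and a two-sided p-value.
          x = np.asarray(x, dtype=float)
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          if s > 0:
              z = (s - 1) / np.sqrt(var_s)
          elif s < 0:
              z = (s + 1) / np.sqrt(var_s)
          else:
              z = 0.0
          return s, 2.0 * (1.0 - norm.cdf(abs(z)))

      # Placeholder annual maximum discharge series (m^3/s).
      amax = [210, 180, 250, 230, 300, 280, 320, 310, 360, 340]
      print(mann_kendall(amax))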

  5. Flood-hazard mapping in Honduras in response to Hurricane Mitch

    USGS Publications Warehouse

    Mastin, M.C.

    2002-01-01

    The devastation in Honduras due to flooding from Hurricane Mitch in 1998 prompted the U.S. Agency for International Development, through the U.S. Geological Survey, to develop a country-wide systematic approach of flood-hazard mapping and a demonstration of the method at selected sites as part of a reconstruction effort. The design discharge chosen for flood-hazard mapping was the flood with an average return interval of 50 years, and this selection was based on discussions with the U.S. Agency for International Development and the Honduran Public Works and Transportation Ministry. A regression equation for estimating the 50-year flood discharge using drainage area and annual precipitation as the explanatory variables was developed, based on data from 34 long-term gaging sites. This equation, which has a standard error of prediction of 71.3 percent, was used in a geographic information system to estimate the 50-year flood discharge at any location for any river in the country. The flood-hazard mapping method was demonstrated at 15 selected municipalities. High-resolution digital-elevation models of the floodplain were obtained using an airborne laser-terrain mapping system. Field verification of the digital elevation models showed that the digital-elevation models had mean absolute errors ranging from -0.57 to 0.14 meter in the vertical dimension. From these models, water-surface elevation cross sections were obtained and used in a numerical, one-dimensional, steady-flow stepbackwater model to estimate water-surface profiles corresponding to the 50-year flood discharge. From these water-surface profiles, maps of area and depth of inundation were created at the 13 of the 15 selected municipalities. At La Lima only, the area and depth of inundation of the channel capacity in the city was mapped. At Santa Rose de Aguan, no numerical model was created. The 50-year flood and the maps of area and depth of inundation are based on the estimated 50-year storm tide.
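
    The report's regional equation estimates the 50-year flood from drainage area and annual precipitation; a minimal sketch of fitting such a power-law regression by least squares in log space is shown below. The gauge values and the resulting coefficients are placeholders, not the published equation.

      import numpy as np

      # Placeholder data for hypothetical long-term gauges:
      # drainage area (km^2), mean annual precipitation (mm), observed Q50 (m^3/s).
      area = np.array([120.0, 540.0, 1500.0, 3200.0, 80.0])
      precip = np.array([1400.0, 1800.0, 1600.0, 2100.0, 1200.0])
      q50 = np.array([95.0, 420.0, 900.0, 2300.0, 55.0])

      # Fit log10(Q50) = log10(a) + b*log10(A) + c*log10(P) by ordinary least squares.
      X = np.column_stack([np.ones_like(area), np.log10(area), np.log10(precip)])
      coef, *_ = np.linalg.lstsq(X, np.log10(q50), rcond=None)
      log_a, b, c = coef

      # Apply the fitted equation at an ungauged site (values illustrative only).
      q50_est = 10.0 ** (log_a + b * np.log10(800.0) + c * np.log10(1700.0))
      print(round(float(q50_est), 1))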

  6. Evaluation of various modelling approaches in flood routing simulation and flood area mapping

    NASA Astrophysics Data System (ADS)

    Papaioannou, George; Loukas, Athanasios; Vasiliades, Lampros; Aronica, Giuseppe

    2016-04-01

    An essential process in flood hazard analysis and mapping is floodplain modelling. The selection of the modelling approach, especially in complex riverine topographies such as urban and suburban areas and in ungauged watersheds, may affect the accuracy of the outcomes in terms of flood depths and flood inundation area. In this study, a sensitivity analysis was implemented using several hydraulic-hydrodynamic modelling approaches (1D, 2D, 1D/2D), and the effect of the modelling approach on flood modelling and flood mapping was investigated. The digital terrain model (DTM) used in this study was generated from Terrestrial Laser Scanning (TLS) point cloud data. The modelling approaches included 1-dimensional hydraulic-hydrodynamic models (1D), 2-dimensional hydraulic-hydrodynamic models (2D) and coupled 1D/2D models. The 1D hydraulic-hydrodynamic models used were HECRAS, MIKE11, LISFLOOD and XPSTORM. The 2D hydraulic-hydrodynamic models used were MIKE21, MIKE21FM, HECRAS (2D), XPSTORM, LISFLOOD and FLO2d. The coupled 1D/2D models employed were HECRAS (1D/2D), MIKE11/MIKE21 (MIKE FLOOD platform), MIKE11/MIKE21 FM (MIKE FLOOD platform) and XPSTORM (1D/2D). The validation of flood extent was carried out using 2x2 contingency tables between simulated and observed flooded areas for an extreme historical flash flood event, with the Critical Success Index used as the skill score. The modelling approaches were also evaluated in terms of simulation time and required computing power. The methodology was implemented in a suburban ungauged watershed of the Xerias river at Volos, Greece. The results of the analysis indicate the necessity of applying sensitivity analysis with the use of different hydraulic-hydrodynamic modelling approaches, especially for areas with complex terrain.
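
    The Critical Success Index used for validating flood extent reduces to hits / (hits + misses + false alarms) computed from a 2x2 contingency table of simulated versus observed flooded cells; a minimal sketch on illustrative boolean masks follows.

    ```python
    import numpy as np

    def critical_success_index(simulated, observed):
        """CSI = hits / (hits + misses + false alarms) for boolean flood masks."""
        sim = np.asarray(simulated, dtype=bool)
        obs = np.asarray(observed, dtype=bool)
        hits = np.sum(sim & obs)           # flooded in both model and observation
        misses = np.sum(~sim & obs)        # observed flooding the model missed
        false_alarms = np.sum(sim & ~obs)  # modelled flooding not observed
        return hits / (hits + misses + false_alarms)

    # Illustrative 4x4 flood-extent masks (True = flooded cell).
    observed  = np.array([[0, 1, 1, 0],
                          [1, 1, 1, 0],
                          [0, 1, 1, 1],
                          [0, 0, 1, 0]], dtype=bool)
    simulated = np.array([[0, 1, 1, 1],
                          [1, 1, 0, 0],
                          [0, 1, 1, 1],
                          [0, 0, 0, 0]], dtype=bool)
    print(f"CSI = {critical_success_index(simulated, observed):.2f}")
    ```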

  7. Flooding: A unique year

    USGS Publications Warehouse

    Putnam, A.L.

    1984-01-01

    Floods have been and continue to be one of the most destructive hazards facing the people of the United States. Of all the natural hazards, floods are the most widespread and the most ruinous to life and property. Today, floods are a greater menace to our welfare than ever before because we live in large numbers near water and have developed a complex reliance upon it. From large rivers to country creeks, from mountain rills to the trickles that occasionally dampen otherwise arid wastelands, every stream in the United States is subject to flooding at some time. Floods strike in myriad forms, including sea surges driven by wild winds or tsunamis churned into fury by seismic activity. By far the most frequent, however, standing in a class by themselves, are the inland, freshwater floods that are caused by rain, by melting snow and ice, or by the bursting of structures that man has erected to protect himself and his belongings from angry waters.

  8. Hurricane Harvey Riverine Flooding: Part 1 - Reconstruction of Hurricane Harvey Flooding for Harris County, TX using a GPU-accelerated 2D flood model for post-flood hazard analysis

    NASA Astrophysics Data System (ADS)

    Kalyanapu, A. J.; Dullo, T. T.; Gangrade, S.; Kao, S. C.; Marshall, R.; Islam, S. R.; Ghafoor, S. K.

    2017-12-01

    Hurricane Harvey, which made landfall in southern Texas in August 2017, is one of the most destructive hurricanes of the 2017 hurricane season. During its active period, many areas in the coastal Texas region received more than 40 inches of rain. This downpour caused significant flooding, resulting in about 77 casualties, displacing more than 30,000 people, inundating hundreds of thousands of homes, and causing an estimated total of more than $70 billion in direct damage. One of the most significantly affected areas is Harris County, where the city of Houston, TX is located. Covering two HUC-8 drainage basins (approximately 2,702 mi2), this county experienced more than 80% of its annual average rainfall during this event. This study presents an effort to reconstruct the flooding caused by extreme rainfall due to Hurricane Harvey in Harris County, Texas. This computationally intensive task was performed at a 30-m spatial resolution using a rapid flood model called Flood2D-GPU, a graphics processing unit (GPU) accelerated model, on Oak Ridge National Laboratory's (ORNL) Titan supercomputer. For this task, hourly rainfall estimates from the National Centers for Environmental Prediction Stage IV Quantitative Precipitation Estimate were fed into the Variable Infiltration Capacity (VIC) hydrologic model and the Routing Application for Parallel computation of Discharge (RAPID) routing model to estimate flow hydrographs at 69 locations for the Flood2D-GPU simulation. Preliminary results of the simulation, including flood inundation extents, maps of flood depths and inundation duration, will be presented. Future efforts will focus on calibrating and validating the simulation results and assessing the flood damage to better understand the impacts of Hurricane Harvey.

  9. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information

    PubMed Central

    Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-01-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation. PMID:27840456
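
    The record does not give the membership-function shapes used for specific historical sources; purely as an illustration of the fuzzy ingredient, the sketch below builds trapezoidal membership functions for a hypothetical historical peak discharge, widening the shoulders for a vaguer account. Membership values of this kind could then weight candidate discharges in a Bayesian flood-frequency fit, as the paper describes.

    ```python
    import numpy as np

    def trapezoidal_membership(q, a, b, c, d):
        """Trapezoidal membership function on discharge q:
        0 below a, rising a->b, 1 on [b, c], falling c->d, 0 above d."""
        q = np.asarray(q, dtype=float)
        rising  = np.clip((q - a) / (b - a), 0.0, 1.0)
        falling = np.clip((d - q) / (d - c), 0.0, 1.0)
        return np.minimum(rising, falling)

    # Hypothetical historical flood: "roughly between 800 and 1200 m3/s".
    # A vaguer account gets wider shoulders (more spread outside the core).
    q_grid = np.linspace(500, 1600, 111)
    precise_account = trapezoidal_membership(q_grid, 750, 800, 1200, 1250)
    vague_account   = trapezoidal_membership(q_grid, 600, 800, 1200, 1400)

    print(precise_account[q_grid == 700], vague_account[q_grid == 700])
    ```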

  10. Application of satellite radar altimetry for near-real time monitoring of floods

    NASA Astrophysics Data System (ADS)

    Lee, H.; Calmant, S.; Shum, C.; Kim, J.; Huang, Z.; Bettadpur, S. V.; Alsdorf, D. E.

    2011-12-01

    According to the 2004 UNESCO World Disasters Report, flooding affected an estimated 116 million people globally, causing about 7,000 deaths and leading to $7.5 billion in losses. The report also indicates that floods are the most frequently occurring type of natural disaster. Hence, timely monitoring of changing river, wetland and lake/reservoir levels is important to support disaster monitoring and proper response. Yet we have surprisingly poor knowledge of the spatial and temporal dynamics of surface water discharge and storage changes globally. Although satellite radar altimetry has been successfully used to observe water height changes over rivers, lakes, reservoirs, and wetlands, there have been few studies on near-real time monitoring of floods, mainly due to its limited spatial and temporal sampling of surface water elevations. In this study, we monitor flooding by examining the spatial and temporal origin of the flood and its propagation in time, using multiple altimeter-river intersections over the entire hydrologic basin. We apply our method to the 2009 Amazon flood event, which caused the most severe flooding in more than two decades. We also compare our results with inundated areas estimated from ALOS PALSAR ScanSAR measurements and with the GRACE 15-day Quick-Look (QL) gravity field data product. The developed method could potentially enhance the capability of satellite altimetry for near-real time monitoring of floods and mitigation of their hazards.

  11. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information

    NASA Astrophysics Data System (ADS)

    Salinas, José Luis; Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-09-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation.

  12. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information.

    PubMed

    Salinas, José Luis; Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-09-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation.

  13. A method for direct assessment of tissue-nonspecific alkaline phosphatase (TNAP) inhibitors in blood samples.

    PubMed

    Sergienko, Eduard A; Sun, Qing; Ma, Chen-Ting

    2013-01-01

    Tissue-nonspecific alkaline phosphatase (TNAP) is one of four human alkaline phosphatases (AP), a family of exocytic enzymes that catalyze the hydrolysis of phosphomonoesters in bone, liver, kidney, and various other tissues. Overexpression of TNAP gives rise to excessive bone and soft tissue mineralization, including blood vessel calcification. Our prior screening campaigns found several leads against this attractive therapeutic target using an in vitro assay with a recombinant enzyme; these compounds were further optimized using medicinal chemistry approaches. To prioritize compounds for use in animal models, we designed and developed a biomarker assay for in situ detection of TNAP activity within human and mouse blood samples at physiological pH. The assay is suitable for screening compounds in 1,536-well plates using blood plasma from different mammalian species. The user may choose between two different substrates based on the need for greater assay simplicity or sensitivity.

  14. Estimating the long-term historic evolution of exposure to flooding of coastal populations

    NASA Astrophysics Data System (ADS)

    Stevens, A. J.; Clarke, D.; Nicholls, R. J.; Wadey, M. P.

    2015-06-01

    Coastal managers face the task of assessing and managing flood risk. This requires knowledge of the area of land and the number of people, properties and other infrastructure potentially affected by floods. Such analyses are usually static; i.e., they only consider a snapshot of the current situation. This misses the opportunity to learn about the role of key drivers of historical changes in flood risk, such as development and population growth in the coastal flood plain, as well as sea-level rise. In this paper, we develop and apply a method to analyse the temporal evolution of residential population exposure to coastal flooding. It uses readily available data in a GIS environment. We examine how population and sea-level change have modified exposure over two centuries at two neighbouring coastal sites: Portsea and Hayling Islands on the UK south coast. The analysis shows that flood exposure changes as a result of increases in population, changes in coastal population density and sea-level rise. The results indicate that, to date, population change has been the dominant driver of the increase in exposure to flooding in the study sites, but climate change may outweigh it in the future. A full analysis of changing flood risk is not possible because data on historic defences and wider vulnerability are not available; hence, the historic evolution of flood exposure is as close as we can get to a historic evolution of flood risk. The method is applicable wherever suitable floodplain geometry, sea-level and population data sets are available, and will help inform coastal managers of the time evolution of coastal flood drivers.
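
    The exposure analysis described is a GIS overlay of historical population on flood-prone land; the sketch below illustrates the core calculation on hypothetical rasters, not the authors' actual workflow or data.

    ```python
    import numpy as np

    def exposed_population(pop_density, cell_area_km2, flood_mask):
        """Population inside the flood extent: density (people/km2) of the
        flooded cells times the cell area."""
        return float(np.sum(pop_density[flood_mask]) * cell_area_km2)

    # Hypothetical 3x3 rasters (people/km2) for two census epochs, and flood
    # extents under present-day and raised sea levels (True = inundated).
    pop_1900 = np.array([[ 50,   80,  20],
                         [120,  200,  60],
                         [ 10,   40,  15]], dtype=float)
    pop_2000 = np.array([[400,  900, 150],
                         [800, 2500, 300],
                         [ 60,  200,  80]], dtype=float)
    flood_now = np.array([[1, 1, 0],
                          [1, 0, 0],
                          [0, 0, 0]], dtype=bool)
    flood_slr = np.array([[1, 1, 1],
                          [1, 1, 0],
                          [0, 0, 0]], dtype=bool)  # larger extent with sea-level rise

    cell_area = 0.25  # km2 per cell
    for label, pop in [("1900", pop_1900), ("2000", pop_2000)]:
        print(label,
              exposed_population(pop, cell_area, flood_now),
              exposed_population(pop, cell_area, flood_slr))
    ```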

  15. Interactive Web-based Floodplain Simulation System for Realistic Experiments of Flooding and Flood Damage

    NASA Astrophysics Data System (ADS)

    Demir, I.

    2013-12-01

    Recent developments in web technologies make it easy to manage and visualize large data sets and to share them with the general public. Novel visualization techniques and dynamic user interfaces allow users to create realistic environments and interact with data to gain insight from simulations and environmental observations. The floodplain simulation system is a web-based 3D interactive flood simulation environment for creating real-world flooding scenarios. The simulation system provides a visually striking platform with realistic terrain information and water simulation. Students can create and modify predefined scenarios, control environmental parameters, and evaluate flood mitigation techniques. The web-based simulation system provides an environment for children and adults to learn about flooding, flood damage, and the effects of development and human activity in the floodplain. The system provides various scenarios customized to fit the age and education level of the users. This presentation provides an overview of the web-based flood simulation system and demonstrates its capabilities for various flooding and land use scenarios.

  16. Alkaline Phosphatase: MedlinePlus Lab Test Information

    MedlinePlus

    Alkaline Phosphatase. MedlinePlus lab test information page. Available from: https://medlineplus.gov/labtests/alkalinephosphatase.html

  17. Influence Assessment of Multiple Large-sized Reservoirs on Flooding in the Huai River Watershed, China

    NASA Astrophysics Data System (ADS)

    Wan, X. Y.

    2017-12-01

    The extensive construction of reservoirs changes the hydrologic characteristics of the associated watersheds, which considerably increases the complexity of watershed flood control decisions. By evaluating the impacts of a multi-reservoir system on the flood hydrograph, it becomes possible to improve the effectiveness of flood control decisions. In this paper we compare the non-reservoir flood hydrograph with the actual observed flood hydrograph, using the area upstream of Lutaizi on the Huai river in East China, where 20 large-scale reservoirs have been built, as a representative case. Based on the total impact of the multi-reservoir system, a novel strategy, the reservoir successively added (RSA) method, is presented to evaluate the contribution of each reservoir to the total impact. According to each reservoir's contribution, the "highly effective" reservoirs for watershed flood control are identified via hierarchical clustering. Moreover, we further estimate the degree of impact of the reservoirs' current operation rules on the flood hydrograph, beyond the impact of the dams themselves. We find that the RSA method provides a useful means of analysing multi-reservoir systems by partitioning the contribution of each reservoir to the total impact on flooding at the downstream section. For all the historical large floods examined, the multi-reservoir system in the Huai river watershed has a significant impact on flooding at the downstream Lutaizi section, on average reducing the flood volume and peak discharge by 13.92 × 10^8 m3 and 18.7%, respectively. It is more informative to evaluate the maximum impact of each reservoir on flooding at the downstream section than to examine the average impact. Each reservoir has a different impact on the flood hydrograph at the Lutaizi section. In particular, the Meishan, Xianghongdian, Suyahu, Nanwan, Nianyushan and Foziling reservoirs exert a strong influence on the flood hydrograph, and are therefore
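
    The reservoir successively added (RSA) method is only outlined in this record; the sketch below shows the successive-addition bookkeeping with a stand-in routing function (a toy attenuation rule, not the hydrologic model used in the study), attributing the incremental drop in downstream peak discharge to the reservoir just added. All names, capacities and discharges are hypothetical.

    ```python
    # Toy illustration of the successive-addition bookkeeping. The routing
    # function is a placeholder and is NOT the model used in the study.
    RESERVOIRS = {              # hypothetical peak-attenuation capacity, m3/s
        "Meishan": 900.0,
        "Xianghongdian": 700.0,
        "Suyahu": 500.0,
        "Nanwan": 400.0,
    }

    def downstream_peak(natural_peak, active_reservoirs):
        """Placeholder routing: each active reservoir shaves off up to its
        attenuation capacity, but the combined effect saturates at 40 percent
        of the natural peak (a purely illustrative assumption)."""
        reduction = sum(RESERVOIRS[name] for name in active_reservoirs)
        return max(natural_peak - reduction, 0.6 * natural_peak)

    natural_peak = 6000.0  # hypothetical non-reservoir peak at the downstream section
    active = []
    previous = downstream_peak(natural_peak, active)
    for name in RESERVOIRS:            # successively add one reservoir at a time
        active.append(name)
        current = downstream_peak(natural_peak, active)
        print(f"{name}: incremental peak reduction {previous - current:.0f} m3/s")
        previous = current
    ```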

  18. Robust Flood Monitoring Using Sentinel-1 SAR Time Series

    NASA Astrophysics Data System (ADS)

    DeVries, B.; Huang, C.; Armston, J.; Huang, W.

    2017-12-01

    The 2017 hurricane season in North and Central America has resulted in unprecedented levels of flooding that have affected millions of people and continue to impact communities across the region. The extent of casualties and damage to property incurred by these floods underscores the need for reliable systems to track flood location, timing and duration to aid response and recovery efforts. While a diverse range of data sources provide vital information on flood status in near real-time, only spaceborne Synthetic Aperture Radar (SAR) sensors can ensure wall-to-wall coverage over large areas, mostly independently of weather conditions or site accessibility. The European Space Agency's Sentinel-1 constellation represents the only SAR mission currently providing open access and systematic global coverage, allowing for a consistent stream of observations over flood-prone regions. Importantly, both the data and pre-processing software are freely available, enabling the development of improved methods, tools and data products to monitor floods in near real-time. We tracked flood onset and progression in Southeastern Texas, Southern Florida, and Puerto Rico using a novel approach based on temporal backscatter anomalies derived from time series of Sentinel-1 observations and historic baselines defined for each of the three sites. This approach was shown to provide a more objective measure of flood occurrence than the simple backscatter thresholds often employed in operational flood monitoring systems. Additionally, the use of temporal anomaly measures allowed us to partially overcome biases introduced by varying sensor view angles and image acquisition modes, enabling increased temporal resolution in areas where additional targeted observations are available. Our results demonstrate the distinct advantages offered by data from operational SAR missions such as Sentinel-1 and NASA's planned NISAR mission, and call attention to the continuing need for SAR Earth Observation
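
    The anomaly-based detection described compares current backscatter to a per-pixel historical baseline rather than applying a fixed threshold; a minimal sketch of that idea (synthetic backscatter values, a simple z-score rule) is given below.

    ```python
    import numpy as np

    def flood_flag(current_db, baseline_stack_db, z_threshold=-2.0):
        """Flag pixels whose current backscatter is anomalously low relative to
        the per-pixel historical baseline (open water usually darkens VV/VH)."""
        baseline_mean = baseline_stack_db.mean(axis=0)
        baseline_std = baseline_stack_db.std(axis=0) + 1e-6  # avoid divide-by-zero
        z = (current_db - baseline_mean) / baseline_std
        return z < z_threshold

    # Synthetic backscatter (dB): 10 historical scenes and one acquisition
    # during the flood, over a 100x100 pixel tile.
    rng = np.random.default_rng(0)
    baseline = rng.normal(loc=-8.0, scale=1.0, size=(10, 100, 100))
    current = rng.normal(loc=-8.0, scale=1.0, size=(100, 100))
    current[40:60, 40:60] -= 6.0   # simulate darkening where water ponds

    mask = flood_flag(current, baseline)
    print("flagged pixels:", int(mask.sum()))
    ```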

  19. Evaluation of ASR potential of quartz-rich rocks by alkaline etching of polished rock sections

    NASA Astrophysics Data System (ADS)

    Šachlová, Šárka; Kuchařová, Aneta; Pertold, Zdeněk; Přikryl, Richard

    2015-04-01

    The damaging effect of the alkali-silica reaction (ASR) on concrete structures has been observed in various countries all over the world. Civil engineers and real estate owners demand reliable methods for assessing the ASR potential of aggregates before they are used in construction. Methods that are feasible in time are expected, as well as methods that enable prediction of the long-term behaviour of aggregates in concrete. The most frequently employed accelerated mortar bar test (AMBT) quantifies the ASR potential of aggregates according to the expansion of mortar bars measured after a fourteen-day testing period. The current study aimed to develop a new methodological approach facilitating the identification and quantification of the ASR potential of aggregates. Polished rock sections of quartz and amorphous SiO2 (coming from orthoquartzite, quartz meta-greywacke, pegmatite, phyllite, chert, and flint) were subjected to experimental leaching in 1M NaOH solution at 80°C. After 14 days of alkaline etching, the rock sections were analyzed using a scanning electron microscope combined with an energy-dispersive spectrometer. Representative areas were documented in back-scattered electron (BSE) images and measured using fully automatic petrographic image analysis (PIA). Several features connected to alkaline etching were observed on the surface of the polished rock sections: deep alkaline etching, partial leach-out of quartz and amorphous particles, alkaline etching connected to quartz grain boundaries, and alkaline etching without any connection to grain boundaries. All of the features mentioned above had a significant influence on the grey-scale spectrum of the BSE images. A specific part of the grey-scale spectrum (grey shades 0-70) was characteristic of areas affected by alkaline etching (the ASR area). By measuring such areas we quantified the extent of alkaline etching in the studied samples. A very good correlation was found between the ASR area and the ASR potential of the investigated rocks measured according to the
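
    The quantification step (measuring the share of BSE-image pixels in the grey-shade range 0-70 as the ASR area) amounts to a threshold-and-count; the sketch below demonstrates it on a synthetic 8-bit image.

    ```python
    import numpy as np

    def asr_area_fraction(bse_image, low=0, high=70):
        """Fraction of pixels whose grey value falls in [low, high], the range
        the study associates with alkaline etching (the ASR area)."""
        img = np.asarray(bse_image)
        etched = (img >= low) & (img <= high)
        return etched.sum() / img.size

    # Synthetic 8-bit BSE image: mostly unetched quartz (bright) with a dark
    # etched patch standing in for a leached grain boundary.
    rng = np.random.default_rng(1)
    image = rng.integers(120, 255, size=(256, 256), dtype=np.uint8)
    image[60:110, 80:140] = rng.integers(0, 60, size=(50, 60), dtype=np.uint8)

    print(f"ASR area fraction: {asr_area_fraction(image):.3f}")
    ```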

  20. 21 CFR 862.1050 - Alkaline phosphatase or isoenzymes test system.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    Code of Federal Regulations, Title 21 (Food and Drugs), 2014 edition. § 862.1050 Alkaline phosphatase or isoenzymes test system. (a) Identification. An alkaline phosphatase or isoenzymes test system is a device intended to measure alkaline phosphatase or its isoenzymes...