Science.gov

Sample records for alkaline flooding methods

  1. Alkaline flooding for enhanced oil recovery

    SciTech Connect

    Gittler, W.E.

    1983-09-01

    There are over 12 active projects of varying size using one of three major types of alkaline agents: sodium silicate, caustic soda, and soda ash. Among the largest pilots currently is the THUMS project in the Wilmington field, California. Plans called for the injection of a 4% weight concentration of sodium orthosilicate over 60% PV. Through the first 3 yr, over 27 million bbl of chemical solution have been injected. Gulf Oil is operating several alkaline floods, one of which is located offshore in the Quarantine Bay field, Louisiana. In this pilot, sodium hydroxide in a weight concentration of 5 to 12% is being injected. Belco Petroleum Corp. has reported that its pilot in the Isenhour Unit in Wyoming is using a 0.5% weight concentration of soda ash in conjunction with a polymer. Other uses of alkaline agents in chemical flooding include silicate as a preflush or sacrificial agent in micellar/polymer and surfactant recovery systems. In addition, caustic has been tested in the surface-mixed caustic emulsion process, while orthosilicate has been tested in a recovery method known as mobility-controlled caustic flooding.

  2. Alkaline flood prediction studies, Ranger VII pilot, Wilmington Field, California

    SciTech Connect

    Mayer, E.H.; Breit, V.S.

    1982-01-01

    The paper discusses: (1) the design of a simulator to model alkaline displacement mechanisms and the current state-of-the-art understanding of in-situ caustic consumption; (2) assimilation of laboratory coreflood and rock consumption data, and use of these data in 1-D and 2-D limited-area simulations and in a 3-D model of the entire pilot project; (3) simulation studies of alkaline flood behavior in a small 2-D area of the field for various concentrations, slug sizes, long-term consumption functions, and two relative permeability adjustment mechanisms; (4) scale-up of the 2-D simulation results and their use in a 271-acre (1.097 x 10^6 m^2), 7-layered 3-D model of the pilot; (5) comparison of 3-D simulator results with initial field alkaline flood performance; and (6) recommended additional applications of the simulator methods developed in this pilot and in other alkaline floods. 10 refs.

  3. Surfactant-enhanced alkaline flooding field project

    SciTech Connect

    French, T.R.

    1991-10-01

    The Tucker sand of Hepler (KS) field is a candidate for surfactant-enhanced alkaline flooding. The geology of the Hepler site is typical of many DOE Class I reservoirs. The Tucker sand of Hepler field was deposited in a fluvial-dominated deltaic environment. Hepler oil can be mobilized with either chemical system 2 or chemical system 3, as described in this report. Oil fields in the Gulf Coast region are also good candidates for surfactant-enhanced alkaline flooding. The results from laboratory tests conducted in Berea sandstone cores with oil and brine from Hepler (KS) field are encouraging. The crude oil is viscous and non-acidic and yet was mobilized by the chemical formulations described in this report. Significant amounts of the oil were mobilized under simulated reservoir conditions. The results in Berea sandstone cores were encouraging and should be verified by tests with field core. Consumption of alkali, measured with field core, was very low. Surfactant loss appeared to be acceptable. Despite the good potential for mobilization of Hepler oil, certain reservoir characteristics such as low permeability, compartmentalization, and shallow depth place constraints on the application of any chemical system in the Tucker sand. These constraints are typical of many DOE Class I reservoirs. Although Hepler field is not a perfect reservoir in which to apply surfactant-enhanced alkaline flooding, Hepler oil is particularly amenable to mobilization by surfactant-enhanced alkaline systems. A field test is recommended, dependent upon final evaluation of well logs and cores from the proposed pilot area. 14 refs., 21 figs., 10 tabs.

  4. Alkaline flood prediction studies, Ranger VII pilot, Wilmington Field, California

    SciTech Connect

    Mayer, E.H.; Breit, V.S.

    1986-01-01

    This paper discusses the design of a simulator to model alkaline displacement mechanisms, along with the current understanding of in-situ caustic consumption. Assimilation of laboratory coreflood and rock consumption data, and their use in one- and two-dimensional (1D and 2D) limited area simulations and in three-dimensional (3D) models of the entire pilot project are given. This paper also reports simulation studies of alkaline flood behavior in a small 2D area of a field for various concentrations, slug sizes, long-term consumption functions, and two relative-permeability adjustment mechanisms. The scale-up of 2D simulation results and their use in a 271-acre (109.7-ha), seven-layered, 3D model of the pilot are also discussed and 3D simulator results are compared with initial field alkaline flood performance. Finally, recommended additional applications of the simulator methods developed in this pilot and in other alkaline floods are discussed.

  5. Preformed-surfactant-optimized aqueous alkaline flood

    SciTech Connect

    Thigpen, D.R.; Lawson, J.B.; Nelson, R.C.

    1991-11-26

    This paper describes an improvement in a process for recovering oil from an acidic oil reservoir by injecting an aqueous alkaline solution comprising water, sodium chloride, and alkaline material that reacts with the reservoir oil to form a petroleum acid soap, creating an in-situ surfactant system. The improvement comprises: selecting a preformed cosurfactant that is soluble in both the aqueous solution and the reservoir oil and has a solubility ratio greater than the solubility ratio of the petroleum acid soap, where the solubility ratio is the ratio of solubility in the aqueous alkaline solution to solubility in the reservoir oil; combining with the alkaline solution an amount of the preformed cosurfactant that results in the in-situ surfactant system having a salinity about equal to the salinity that gives minimal interfacial tension between the reservoir oil and the in-situ surfactant system at reservoir temperature, the amount of preformed cosurfactant being about 0.3 percent by weight of the aqueous alkaline solution; and injecting the cosurfactant-aqueous alkaline solution mixture into the reservoir to displace oil toward a fluid production location.
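
    Stated in symbols (our paraphrase of the claim, not notation taken from the patent), the solubility ratio R and the selection criterion for the preformed cosurfactant are:

      R \;=\; \frac{S_{\text{aqueous alkaline solution}}}{S_{\text{reservoir oil}}},
      \qquad R_{\text{cosurfactant}} \;>\; R_{\text{petroleum acid soap}}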

  6. Surfactant-enhanced low-pH alkaline flooding

    SciTech Connect

    Peru, D.A. (Research Div., Columbia, MD); Lorenz, P.B.

    1990-08-01

    This paper reports on sodium bicarbonate, investigated as a potential alkaline agent for surfactant-enhanced alkaline flooding because it has very little tendency to dissolve silicate minerals. In experiments performed with Wilmington, CA, crude oil and three types of surfactants, the bicarbonate/surfactant combination caused a marked lowering of interfacial tension (IFT). Bicarbonate protected the surfactant against divalent cations and reduced adsorption of surfactant and polymer on various minerals. Coreflood tests confirm that sodium bicarbonate plus surfactant can be an effective alternative to the high-pH flooding process.

  7. Surfactant-enhanced alkaline flooding with weak alkalis

    SciTech Connect

    French, T.R.; Josephson, C.B.

    1991-02-01

    The objective of Project BE4B in FY90 was to develop cost-effective and efficient chemical flooding formulations using surfactant-enhanced, lower-pH (weak) alkaline chemical systems. Chemical systems were studied that mitigate the deleterious effects of divalent ions. The experiments were conducted with carbonate mixtures and carbonate/phosphate mixtures of pH 10.5, where most of the phosphate ions exist as the monohydrogen phosphate species. Orthophosphate did not further reduce the deleterious effect of divalent ions on interfacial tension behavior in carbonate solutions, where the deleterious effect of the divalent ions is already very low. When added to a carbonate mixture, orthophosphate did substantially reduce the adsorption of an anionic surfactant, which was an expected result; however, there was no correlation between the amount of reduction and the divalent ion levels. For acidic oils, a variety of surfactants are available commercially that have potential for use between pH 8.3 and pH 9.5. Several of these surfactants were tested with oil from Wilmington (CA) field and found to be suitable for use in that field. Two low-acid crude oils, with acid numbers of 0.01 and 0.27 mg KOH/g of oil, were studied. It was shown that surfactant-enhanced alkaline flooding does have merit for use with these low-acid crude oils. However, each low-acid oil tested was found to behave differently, and it was concluded that the applicability of the method must be experimentally determined for any given low-acid crude oil. 19 refs., 10 figs., 4 tabs.

  8. Surfactant-enhanced alkaline flooding field project. Annual report, Revision

    SciTech Connect

    French, T.R.

    1991-10-01

    The Tucker sand of Hepler (KS) field is a candidate for surfactant-enhanced alkaline flooding. The geology of the Hepler site is typical of many DOE Class I reservoirs. The Tucker sand of Hepler field was deposited in a fluvial-dominated deltaic environment. Hepler oil can be mobilized with either chemical system 2 or chemical system 3, as described in this report. Oil fields in the Gulf Coast region are also good candidates for surfactant-enhanced alkaline flooding. The results from laboratory tests conducted in Berea sandstone cores with oil and brine from Hepler (KS) field are encouraging. The crude oil is viscous and non-acidic and yet was mobilized by the chemical formulations described in this report. Significant amounts of the oil were mobilized under simulated reservoir conditions. The results in Berea sandstone cores were encouraging and should be verified by tests with field core. Consumption of alkali, measured with field core, was very low. Surfactant loss appeared to be acceptable. Despite the good potential for mobilization of Hepler oil, certain reservoir characteristics such as low permeability, compartmentalization, and shallow depth place constraints on the application of any chemical system in the Tucker sand. These constraints are typical of many DOE Class I reservoirs. Although Hepler field is not a perfect reservoir in which to apply surfactant-enhanced alkaline flooding, Hepler oil is particularly amenable to mobilization by surfactant-enhanced alkaline systems. A field test is recommended, dependent upon final evaluation of well logs and cores from the proposed pilot area. 14 refs., 21 figs., 10 tabs.

  9. Interfacial activity in alkaline flooding enhanced oil recovery

    SciTech Connect

    Chan, M.K.

    1981-01-01

    The ionization of long-chain organic acids in the crude oil to form soaps was shown to be primarily responsible for the lowering of oil-water interfacial tension at alkaline pH. These active acids can be concentrated by silica gel chromatography into a minor polar fraction. An equilibrium chemical model was proposed based on two competing reactions: the ionization of acids to form active anions, and the formation of undissociated soap between acid anions and sodium ions. It correlates the interfacial activity with the interfacial concentration of active acid anions, which is expressed in terms of the concentrations of the chemical species in the system. The model successfully predicts the observed oil-alkaline solution interfacial phenomena, including their dependence on pH, alkali and salt concentrations, type of acid present, and type of soap formed. Flooding at different alkali concentrations to activate different acid species present in the crude was shown to give better recovery than flooding at a single high alkali concentration. Treating the crude oil with a dilute solution of mineral acids liberates additional free active acids and yields better interfacial activity during subsequent alkali contact.
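
    Written out (a schematic paraphrase of the abstract, with HA standing for a long-chain petroleum acid; not the paper's exact notation), the two competing reactions of the equilibrium model are:

      \mathrm{HA} + \mathrm{OH^{-}} \;\rightleftharpoons\; \mathrm{A^{-}} + \mathrm{H_{2}O}
      \qquad\text{and}\qquad
      \mathrm{A^{-}} + \mathrm{Na^{+}} \;\rightleftharpoons\; \mathrm{NaA\ (undissociated\ soap)}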

  10. Aqueous flooding methods for tertiary oil recovery

    SciTech Connect

    Peru, Deborah A.

    1989-01-01

    A method of aqueous flooding of a subterranean oil-bearing formation for tertiary oil recovery involves injecting into the formation, through a well, a low-alkaline-pH aqueous sodium bicarbonate flooding solution. The flooding solution has a pH of about 8.25 to 9.25 and comprises 0.25 to 5 weight percent (preferably about 0.75 to 3.0 weight percent) sodium bicarbonate, 0.05 to 1.0 weight percent of a petroleum recovery surfactant, and 1 to 20 weight percent sodium chloride. After flooding, an oil and water mixture is withdrawn from the well and the oil is separated from the mixture.

  11. Predicting Phosphorus Release from Anaerobic, Alkaline, Flooded Soils.

    PubMed

    Amarawansha, Geethani; Kumaragamage, Darshani; Flaten, Don; Zvomuya, Francis; Tenuta, Mario

    2016-07-01

    Anaerobic conditions induced by prolonged flooding often lead to an enhanced release of phosphorus (P) to floodwater; however, this effect is not consistent across soils. This study aimed to develop an index to predict the P release potential of alkaline soils under simulated flooded conditions. Twelve unamended or manure-amended surface soils from Manitoba were analyzed for basic soil properties, Olsen P (Ols-P), Mehlich-3 extractable total P, Mehlich-3 extractable molybdate-reactive P (M3P), water-extractable P (WEP), soil P fractions, single-point P sorption capacity, and Mehlich-3 extractable Ca (M3Ca) and Mg (M3Mg). Degree of P saturation (DPS) was calculated using Ols-P or M3P as the intensity factor, and an estimated adsorption maximum, based on either the single-point P sorption capacity or M3Ca + M3Mg, as the capacity factor. To develop the model, we used the previously reported floodwater dissolved reactive P (DRP) concentration changes during 8 wk of flooding for the same unamended and manured soils. The relative change in floodwater DRP concentration, calculated as the ratio of the maximum to the initial DRP concentration, ranged from 2 to 15 across ten of the soils but was ≤1.5 in the two soils with the greatest clay content. Partial least squares analysis indicated that a DPS calculated using M3P as the intensity factor and (2 × the single-point P sorption capacity) + M3P as the capacity factor, together with clay percentage, can effectively predict the relative DRP change (R² = 0.74). Results suggest that P release from a soil to floodwater may be predicted using simple, easily measured soil properties obtained before flooding, but validation with more soils is needed. PMID:27380097
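
    In one common convention for a degree-of-P-saturation index (the paper's exact symbols may differ, so this is only a sketch of the intensity/capacity form described above, using the best-performing capacity factor):

      \mathrm{DPS}\ (\%) \;=\; \frac{P_{\text{intensity}}}{P_{\text{capacity}}} \times 100,
      \qquad P_{\text{capacity}} \;=\; 2\,P_{\text{sorption}} + \mathrm{M3P}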

  12. Surfactant-enhanced alkaline flooding for light oil recovery. Annual report, 1992--1993

    SciTech Connect

    Wasan, D.T.

    1994-08-01

    In this report, the authors present the results of experimental and theoretical studies in surfactant-enhanced alkaline flooding for light oil recovery. The overall objective of this work is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultralow interfacial tension. In addition, the authors have (1) developed a theoretical interfacial activity model for determining equilibrium interfacial tension, (2) investigated the mechanisms for spontaneous emulsification, (3) developed a technique to monitor low water content in oil, and (4) developed a technique to study water-in-oil emulsion film properties.
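
    As a minimal illustration of how mixed alkalis can be tuned to a target pH (the sodium bicarbonate/sodium carbonate pair and the ideal-solution assumptions below are ours, not taken from the report), the required base/acid ratio follows from the Henderson-Hasselbalch relation:

      # Illustrative sketch: carbonate/bicarbonate ratio needed to buffer at a
      # target pH, ignoring ionic-strength corrections. pKa2 of carbonic acid
      # (about 10.33 at 25 C) is a textbook value.
      PKA2 = 10.33

      def carbonate_ratio_for_ph(target_ph: float) -> float:
          """Return the [CO3^2-]/[HCO3-] mole ratio that buffers at target_ph."""
          return 10.0 ** (target_ph - PKA2)

      for ph in (8.5, 9.5, 10.5, 11.5):
          print(f"pH {ph:4.1f}: CO3/HCO3 ratio = {carbonate_ratio_for_ph(ph):8.3f}")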

  13. Surfactant-enhanced alkaline flooding for light oil recovery. [Annual report], 1993--1994

    SciTech Connect

    Wasan, D.T.

    1995-03-01

    In this report, we present the results of our experimental and theoretical studies in surfactant-enhanced alkaline flooding for light oil recovery. The overall objective of this work is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultralow interfacial tension. In addition, we have (1) investigated the effect of surfactant on the equilibrium and transient interfacial tension, (2) investigated the kinetics of oil removal from a silica surface, and (3) developed a theoretical interfacial activity model for determining equilibrium interfacial tension. The results of the studies conducted during the course of this project are presented.

  14. Surfactant-enhanced alkaline flooding for light oil recovery. Quarterly report, April 1, 1995--June 30, 1995

    SciTech Connect

    Wasan, D.T.

    1995-09-01

    The overall objective of this project is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultra-low tension. In addition, the novel concept of pH gradient design to optimize flood water conditions will be tested.

  15. Surfactant-enhanced alkaline flooding field project. Annual report

    SciTech Connect

    French, T.R.; Josephson, C.B.

    1993-12-01

    The Tucker sand from Hepler field, Crawford County, Kansas, was characterized using routine and advanced analytical methods. The characterization is part of a chemical flooding pilot test to be conducted in the field, which is classified as a DOE Class I (fluvial-dominated delta) reservoir. Routine and advanced methods of characterization were compared. Traditional wireline logs indicate that the reservoir is vertically compartmentalized on the foot scale. Routine core analysis, X-ray computed tomography (CT), minipermeameter measurement, and petrographic analysis indicate that compartmentalization and lamination extend to the microscale. An idealized model of how the reservoir is probably structured (complex layering with small compartments) is presented. There was good agreement among the several methods used for characterization, and advanced characterization methods adequately explained the coreflood and tracer tests conducted with short core plugs. Tracer and chemical flooding tests were conducted in short core plugs while monitoring with CT to establish flow patterns and to monitor oil saturations in different zones of the core plugs. Channeling of injected fluids occurred in laboratory experiments because, on core plug scale, permeability streaks extended the full length of the core plugs. A graphic example of how channeling in field core plugs can affect oil recovery during chemical injection is presented. The small scale of compartmentalization indicated by plugs of the Tucker sand may actually help improve sweep between wells. The success of field-scale waterflooding and the fluid flow patterns observed in highly heterogeneous outcrop samples are reasons to expect that reservoir flow patterns are different from those observed with short core plugs, and better sweep efficiency may be obtained in the field than has been observed in laboratory floods conducted with short core plugs.

  16. Surfactant-enhanced alkaline flooding for light oil recovery. Final report 1994--1995

    SciTech Connect

    Wasan, D.T.

    1995-12-01

    In this report, the authors present the results of their experimental and theoretical studies in surfactant-enhanced alkaline flooding for light oil recovery. The overall objective of this work is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultralow interfacial tension. In addition, the authors have (1) developed a theoretical interfacial activity model for determining equilibrium interfacial tension, (2) investigated the mechanisms for spontaneous emulsification, (3) developed a technique to monitor low water content in oil, and (4) developed a technique to study water-in-oil emulsion film properties, (5) investigated the effect of surfactant on the equilibrium and transient interfacial tension, (6) investigated the kinetics of oil removal from a silica surface, and (7) developed a theoretical interfacial activity model for determining equilibrium interfacial tension, accounting for added surfactant. The results of the studies conducted during the course of this project are summarized.

  17. Surfactant-enhanced alkaline flooding for light oil recovery. Final report

    SciTech Connect

    Wasan, D.T.

    1996-05-01

    In this report, we present the results of our experimental and theoretical studies in surfactant-enhanced alkaline flooding for light oil recovery. The overall objective of this work is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultralow interfacial tension. In addition, we have (1) developed a theoretical interfacial activity model for determining equilibrium interfacial tension, (2) investigated the mechanisms for spontaneous emulsification, (3) developed a technique to monitor low water content in oil, (4) developed a technique to study water-in-oil emulsion film properties, (5) investigated the effect of surfactant on the equilibrium and transient interfacial tension, (6) investigated the kinetics of oil removal from a silica surface, and (7) developed a theoretical interfacial activity model for determining equilibrium interfacial tension, accounting for added surfactant. The results of the studies conducted during the course of this project are discussed.

  18. Surfactant-enhanced alkaline flooding for light oil recovery. Quarterly report, January 1--March 31, 1994

    SciTech Connect

    Wasan, D.T.

    1994-06-01

    The overall objective of this project is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultra-low tension. In addition, the novel concept of pH gradient design to optimize flood water conditions will be tested. Last quarter, we investigated the phase behavior and the regions in which the middle phase occurs. The optimum phase was found to go through a maximum with pH, sodium concentration, and surfactant concentration. The optimum pH is about 12.0 to 13.5, the optimum sodium concentration is about 0.513 mol/liter, and the optimum surfactant concentration is about 0.2%. The effect of surfactant type was also investigated; Petrostep B-105 was found to give the most middle-phase production. This quarter, we investigated the contact angle of Long Beach oil, Adena oil, and a model oil on a solid glass surface in contact with an aqueous alkaline solution, both with and without added preformed surfactant. The contact angles with Long Beach and Adena oils showed oil-wet conditions, whereas the model oil showed both oil-wet and water-wet conditions depending on the pH of the aqueous phase. The addition of surfactant to the alkaline solution made the system less oil-wet. Spreading of the oil on the glass surface was observed in all three systems investigated.

  19. Surfactant-enhanced alkaline flooding for light oil recovery. Quarterly report, October 1--December 30, 1994

    SciTech Connect

    Wasan, D.T.

    1994-12-31

    The overall objective of this project is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultra-low tension. In addition, the novel concept of pH gradient design to optimize flood water conditions will be tested. The problem of characterizing emulsions in porous media is very important in enhanced oil recovery applications. Recovery is usually accomplished by externally added or in-situ-generated surfactants that sweep the oil out of the reservoir; emulsification of the trapped oil is one of the mechanisms of recovery. The ability to detect emulsions in the porous medium is therefore crucial to designing profitable flood systems. The capability of microwave dielectric techniques to detect emulsions in porous media is demonstrated by mathematical modeling and by experiments. This quarter, the dielectric properties of porous media are shown to be predicted adequately by treating the medium as an O/W-type dispersion of sand grains in water. Dielectric measurements of emulsion flow in porous media show that dielectric techniques may be applied to determine emulsion characteristics in porous media. The experimental observations were confirmed by theoretical analysis.

  20. Phosphorus Mobilization from Manure-Amended and Unamended Alkaline Soils to Overlying Water during Simulated Flooding.

    PubMed

    Amarawansha, E A G S; Kumaragamage, D; Flaten, D; Zvomuya, F; Tenuta, M

    2015-07-01

    Anaerobic soil conditions resulting from flooding often enhance release of phosphorus (P) to overlying water. Enhanced P release is well documented for flooded acidic soils; however, there is little information for flooded alkaline soils. We examined the effect of flooding and anaerobic conditions on P mobilization using 12 alkaline soils from Manitoba that were either unamended or amended with solid cattle manure. Pore water and floodwater were analyzed over 8 wk of simulated flooding for dissolved reactive P (DRP), Ca, Mg, Fe, and Mn. As expected, manured soils had significantly greater pore and floodwater DRP concentrations than unamended. Flooding increased pore water DRP concentrations significantly in all soils and treatments except one manured clay in which concentrations increased initially and then decreased. Floodwater DRP concentrations increased significantly by two- to 15-fold in 10 soils regardless of amendment treatment but remained relatively stable in the two soils with greatest clay content. Phosphorus release at the onset of flooding was associated with the release of Ca, Mg, and Mn, suggesting that P release may be controlled by the dissolution of Mg and Ca phosphates and reductive dissolution of Mn phosphates. Thereafter, P release was associated with release of Fe, suggesting the reductive dissolution of Fe phosphates. Differences in pore water and floodwater DRP concentrations among soils and amendment treatments and the high variability in P mobilization from pore water to floodwater among soils indicate the need to further investigate chemical reactions responsible for P release and mobility under anaerobic conditions. PMID:26437107

  1. Phosphorus Mobilization from Manure-Amended and Unamended Alkaline Soils to Overlying Water during Simulated Flooding.

    PubMed

    Amarawansha, E A G S; Kumaragamage, D; Flaten, D; Zvomuya, F; Tenuta, M

    2015-07-01

    Anaerobic soil conditions resulting from flooding often enhance release of phosphorus (P) to overlying water. Enhanced P release is well documented for flooded acidic soils; however, there is little information for flooded alkaline soils. We examined the effect of flooding and anaerobic conditions on P mobilization using 12 alkaline soils from Manitoba that were either unamended or amended with solid cattle manure. Pore water and floodwater were analyzed over 8 wk of simulated flooding for dissolved reactive P (DRP), Ca, Mg, Fe, and Mn. As expected, manured soils had significantly greater pore and floodwater DRP concentrations than unamended. Flooding increased pore water DRP concentrations significantly in all soils and treatments except one manured clay in which concentrations increased initially and then decreased. Floodwater DRP concentrations increased significantly by two- to 15-fold in 10 soils regardless of amendment treatment but remained relatively stable in the two soils with greatest clay content. Phosphorus release at the onset of flooding was associated with the release of Ca, Mg, and Mn, suggesting that P release may be controlled by the dissolution of Mg and Ca phosphates and reductive dissolution of Mn phosphates. Thereafter, P release was associated with release of Fe, suggesting the reductive dissolution of Fe phosphates. Differences in pore water and floodwater DRP concentrations among soils and amendment treatments and the high variability in P mobilization from pore water to floodwater among soils indicate the need to further investigate chemical reactions responsible for P release and mobility under anaerobic conditions.

  2. Surfactant-enhanced alkaline flooding for light oil recovery. [Quarterly] report, March 31--June 30, 1993

    SciTech Connect

    Wasan, D.T.

    1993-09-01

    The overall objective of this project is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultra-low tension. In addition, the novel concept of pH gradient design to optimize flood water conditions will be tested. Last quarter, we investigated the mechanisms responsible for spontaneous emulsification in alkali/acidic crude oil systems with and without added surfactant. We observed that the roll cell size and formation time depend strongly on the pH and ionic strength of the alkaline solution. For a particular roll cell size, the addition of surfactant causes the cells to take longer to form, creating an interfacial resistance to mass transfer and making the interface more rigid. We have shown that interfacial turbulence is a necessary but not sufficient condition for spontaneous emulsification; low interfacial tension is also a necessary condition. This quarter, a microwave interferometric procedure was developed for determining low water contents (0.5 to 10 vol%) of water-in-oil macroemulsions. The apparatus operates at a frequency of 23.48 GHz in the K-band microwave region. The procedure is based on the large difference in dielectric properties between water and oil and utilizes the variation in phase shift as the sample path length is varied. Measurements are accurate to within 0.5 vol% water.

  3. Speciation and Release Kinetics of Cadmium in an Alkaline Paddy Soil Under Various Flooding Periods and Draining Conditions

    SciTech Connect

    S Khaokaew; R Chaney; G Landrot; M Ginder-Vogel; D Sparks

    2011-12-31

    This study determined Cd speciation and release kinetics in a Cd-Zn co-contaminated alkaline paddy soil under various flooding periods and draining conditions, employing synchrotron-based techniques and a stirred-flow kinetic method. Results revealed that varying flooding periods and draining conditions affected Cd speciation and its release kinetics. Linear least-squares fitting (LLSF) of bulk X-ray absorption fine structure (XAFS) spectra of the air-dried and 1-day-flooded soil samples showed that at least 50% of the Cd was bound to humic acid. Cadmium carbonates were found as the major species at most flooding periods, while a small amount of cadmium sulfide was found after the soils were flooded for longer periods. Under all flooding and draining conditions, at least 14 mg/kg Cd was desorbed from the soil after a 2-hour desorption experiment. The results obtained by micro X-ray fluorescence (µ-XRF) spectroscopy showed that Cd was less associated with Zn than with Ca in most soil samples. Therefore, Cd and Ca are more likely to be present in the same mineral phases than Cd and Zn, although the latter two elements may originate from the same surrounding Zn mines in the Mae Sot district.
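
    For readers unfamiliar with the LLSF step, a minimal sketch of fitting a bulk spectrum as a non-negative combination of reference spectra is shown below; the reference shapes and fractions are synthetic stand-ins, not data from the study.

      # Hedged sketch of linear least-squares fitting (LLSF) of a bulk spectrum
      # against reference spectra using non-negative least squares.
      import numpy as np
      from scipy.optimize import nnls

      energy = np.linspace(26680, 26780, 200)                  # eV, illustrative grid
      ref_humic = np.exp(-((energy - 26715) / 8.0) ** 2)       # synthetic "Cd-humic" reference
      ref_carbonate = np.exp(-((energy - 26725) / 6.0) ** 2)   # synthetic "CdCO3" reference
      A = np.column_stack([ref_humic, ref_carbonate])

      # Synthetic "measured" bulk spectrum: 50% humic-bound Cd, 50% carbonate, plus noise.
      rng = np.random.default_rng(1)
      measured = 0.5 * ref_humic + 0.5 * ref_carbonate + 0.01 * rng.normal(size=energy.size)

      fractions, _ = nnls(A, measured)
      fractions /= fractions.sum()                             # normalize to species fractions
      print({"humic": round(fractions[0], 2), "carbonate": round(fractions[1], 2)})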

  4. Surfactant-enhanced alkaline flooding for light oil recovery. Quarterly report, July 1--September 30, 1995

    SciTech Connect

    Wasan, D.T.

    1995-12-01

    The overall objective of this project is to develop a very cost-effective method for formulating a successful surfactant-enhanced alkaline flood by appropriately choosing mixed alkalis which form inexpensive buffers to obtain the desired pH (between 8.5 and 12.0) for ultimate spontaneous emulsification and ultra-low tension. In addition, the novel concept of pH gradient design to optimize flood water conditions will be tested. The problem of characterizing emulsions in porous media is very important in enhanced oil recovery applications. Recovery is usually accomplished by externally added or in situ generated surfactants that sweep the oil out of the reservoir; emulsification of the trapped oil is one of the mechanisms of recovery. The ability to detect emulsions in the porous medium is therefore crucial to designing profitable flood systems. The capability of microwave dielectric techniques to detect emulsions in porous media is demonstrated by mathematical modeling and by experiments. This quarter, the shape dependence of the complex dielectric properties of W/O- and O/W-type dispersions in the microwave frequency region was analyzed using the generalized effective medium theory of Hanai. The computations show that the authors' earlier finding for spherical dispersions can be extended to include nonspherical geometries. The computed results show that the difference in dielectric behavior of the two emulsion types is a strong function of the shape of the dispersions, with the difference vanishing when the two phases are oriented as layers parallel and perpendicular to the electromagnetic field.
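
    For reference, the spherical-dispersion form of the Hanai (Bruggeman) mixing rule that the report generalizes can be written as follows (a standard textbook form, not quoted from the report; the starred epsilons are the complex permittivities of the mixture, continuous phase c, and dispersed phase d, and phi is the dispersed-phase volume fraction):

      \frac{\varepsilon^{*} - \varepsilon_{d}^{*}}{\varepsilon_{c}^{*} - \varepsilon_{d}^{*}}
      \left( \frac{\varepsilon_{c}^{*}}{\varepsilon^{*}} \right)^{1/3} \;=\; 1 - \phi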

  5. Alkaline electrochemical cells and method of making

    NASA Technical Reports Server (NTRS)

    Hoyt, H. E.; Pfluger, H. L. (Inventor)

    1970-01-01

    Equilibrated cellulose ether membranes of increased electrolytic conductivity for use as separators in concentrated alkaline electrochemical cells are investigated. The method of making such membranes by equilibration, to the degree desired, in an aqueous alkali solution maintained at a temperature below about 10°C is described.

  6. Pilot test of alkaline surfactant polymer flooding in Daqing Oil Field

    SciTech Connect

    Wang Demin; Zhang Zhenhua; Cheng Jiecheng; Yang Jingchun; Gao Shutang; Li Lin

    1996-12-31

    After the success of polymer flooding in Daqing, two alkaline-surfactant-polymer (ASP) floods have been conducted to (1) further increase oil recovery, (2) study the feasibility of ASP flooding, and (3) provide technical and practical experience for expanding the ASP pilots. An inverted five-spot pattern was adopted in both pilots. Pilot 1 (PO) is located in the West Central area of Daqing Oil Field and consists of 4 injectors and 9 producers. Pilot 2 (XF) is located in the South area of Daqing Oil Field and has 1 injector and 4 producers. The crude oils of both pilots have a high paraffin content and a low acid value. Compared with PO, XF has lower heterogeneity, lighter oil, and higher recovery by waterflooding. For each pilot, after extensive screening, an ASP system was determined. The ASP systems all feature very low surfactant concentrations and maintain ultralow interfacial tension over a wide range of concentrations of any of the three components. Coreflooding and numerical simulation show more than 20% OOIP incremental recovery by ASP over waterflooding for both pilots. By the end of May 1995, 100% of the ASP slug and 100% of the polymer buffer had been injected in the pilots. Production wells showed good responses in terms of a large decrease in water cut and an increase in oil production. The performance of each pilot has followed the numerical simulation prediction very well, or even slightly better. Emulsions showed up in producers, but they are easily broken by a special demulsifier. No formation damage or scaling has been detected. The ASP flood pilot tests are technically successful and, based on the preliminary evaluation, economically feasible. Therefore, in the near future, much larger-scale ASP flood field tests will be performed in several districts of Daqing Oil Field.

  7. A method for making an alkaline battery electrode plate

    NASA Technical Reports Server (NTRS)

    Chida, K.; Ezaki, T.

    1983-01-01

    A method is described for making an alkaline battery electrode plate in which the desired active substances are filled into a nickel foam substrate. In this substrate, an electrolytic oxidation-reduction occurs in an alkaline solution containing lithium hydroxide.

  8. The effect of polymer-surfactant interaction on the rheological properties of surfactant enhanced alkaline flooding formulations

    SciTech Connect

    French, T.R.; Josephson, C.B.

    1993-02-01

    Surfactant-enhanced, lower-pH (weak) alkaline chemicals are effective for mobilizing residual oil. Polymer is used for mobility control because oil recovery is reduced if mobility control is lost. The ability to maintain mobility control during surfactant-alkaline flooding can be adversely affected by chemical interaction. In this work, interaction between polymers and surfactants was shown to be affected by pH, ionic strength, crude oil, and the properties of the polymers and surfactants. Polymer-surfactant interaction (phase separation, precipitation, and viscosity loss) occurred between most of the polymers and surfactants that were tested. Polymer-surfactant interaction is difficult to eliminate, and no method was found for completely eliminating it. Interaction occurred both at and below optimal salinity. Polymer-surfactant interaction had an adverse effect on polymer rheology; however, this adverse effect was lessened when oil was present, and increasing the pH of the chemical systems reduced it further.

  9. Process, mechanism and impacts of scale formation in alkaline flooding by a variable porosity and permeability model

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen; Li, Jiachun

    2016-06-01

    In spite of the role of alkali in enhancing oil recovery (EOR), the formation of precipitates during alkaline-surfactant-polymer (ASP) flooding can severely harm the reservoir strata; this has been observed in field tests, where scale deposits were found in the formation and at the bottom of oil wells. On the other hand, the resulting variation of stratum parameters, e.g., pore radius, porosity, and permeability, in turn considerably affects seepage flow and the alkaline flooding process. The objective of this study is first to examine these mutually influential phenomena and the corresponding mechanisms, along with EOR during alkaline flooding, when the effects of precipitation are no longer negligible. Chemical kinetic theory is applied to the specific fundamental reactions to describe the process of rock dissolution in silica-based reservoirs, and the solubility product principle is used to analyze the mechanism of alkali scale formation during flooding. A 3D alkaline flooding coupling model accounting for the variation of porosity and permeability is then established to quantitatively estimate the impact of alkali scale on the reservoir strata. The reliability of the model is verified by comparison with laboratory experiments and field tests from the Daqing oil field. Numerical simulations of a 1/4 well group in a 5-spot pattern show that precipitation grows with alkali concentration, temperature, and injection pressure and thus reduces reservoir permeability and oil recovery correspondingly. As a result, as a tradeoff, a weak alkali is preferable in ASP flooding.
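
    The abstract does not state the specific porosity-permeability coupling used; purely as an illustrative assumption, variable porosity/permeability models of this kind often adopt a Kozeny-Carman-type relation between the scale-reduced porosity phi and the permeability k (subscript 0 denotes initial values):

      \frac{k}{k_{0}} \;=\; \left( \frac{\phi}{\phi_{0}} \right)^{3}
      \left( \frac{1 - \phi_{0}}{1 - \phi} \right)^{2}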

  10. A rainfall design method for spatial flood risk assessment: considering multiple flood sources

    NASA Astrophysics Data System (ADS)

    Jiang, X.; Tatano, H.

    2015-08-01

    Information about the spatial distribution of flood risk is important for integrated urban flood risk management. Focusing on urban areas, spatial flood risk assessment must reflect all risk information derived from the multiple flood sources (rivers, drainage, coastal flooding, etc.) that may affect the area. However, conventional flood risk assessment deals with each flood source independently, which leads to an underestimation of flood risk in the floodplain. Even in floodplains that have no risk from coastal flooding, flooding from river channels and inundation caused by insufficient drainage capacity should be considered simultaneously. For integrated flood risk management, it is necessary to establish a methodology to estimate the flood risk distribution across a floodplain. In this paper, a rainfall design method for spatial flood risk assessment, which considers the joint effects of multiple flood sources, is proposed. The concept of a critical rainfall duration, determined by the concentration time of flooding, is introduced to connect the response characteristics of different flood sources with rainfall. A copula method is then adopted to capture the correlation of rainfall amounts over the different critical rainfall durations. Rainfall events are designed taking advantage of the copula structure of the correlation and the marginal distributions of rainfall amounts within the different critical rainfall durations. A case study in the Otsu River basin, Osaka Prefecture, Japan, was conducted to demonstrate the methodology.
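
    A minimal sketch of the copula step is given below; the Gaussian copula, the gamma marginals, and all parameter values are illustrative assumptions rather than choices reported by the authors.

      # Sample joint rainfall amounts for two critical rainfall durations with a
      # Gaussian copula and assumed gamma marginals.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      rho = 0.7                                   # assumed dependence between durations
      cov = [[1.0, rho], [rho, 1.0]]

      # Correlated standard normals -> uniforms (the copula layer).
      z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10000)
      u = stats.norm.cdf(z)

      # Assumed gamma marginals for the short- and long-duration rainfall amounts (mm).
      rain_short = stats.gamma.ppf(u[:, 0], a=2.0, scale=10.0)
      rain_long = stats.gamma.ppf(u[:, 1], a=3.0, scale=25.0)

      # Each sampled pair is a candidate design-rainfall event that preserves the
      # dependence between the two critical durations.
      print(round(float(np.corrcoef(rain_short, rain_long)[0, 1]), 2))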

  11. ALKALINE-SURFACTANT-POLYMER FLOODING AND RESERVOIR CHARACTERIZATION OF THE BRIDGEPORT AND CYPRESS RESERVOIRS OF THE LAWRENCE FIELD

    SciTech Connect

    Malcolm Pitts; Ron Damm; Bev Seyler

    2003-03-01

    Feasibility of alkaline-surfactant-polymer flood for the Lawrence Field in Lawrence County, Illinois is being studied. Two injected formulations are being designed; one for the Bridgeport A and Bridgeport B reservoirs and one for Cypress and Paint Creek reservoirs. Fluid-fluid and coreflood evaluations have developed a chemical solution that produces incremental oil in the laboratory from the Cypress and Paint Creek reservoirs. A chemical formulation for the Bridgeport A and Bridgeport B reservoirs is being developed. A reservoir characterization study is being done on the Bridgeport A, B, & D sandstones, and on the Cypress sandstone. The study covers the pilot flood area and the Lawrence Field.

  12. ALKALINE-SURFACTANT-POLYMER FLOODING AND RESERVOIR CHARACTERIZATION OF THE BRIDGEPORT AND CYPRESS RESERVOIRS OF THE LAWRENCE FIELD

    SciTech Connect

    Malcolm Pitts; Ron Damm; Bev Seyler

    2003-04-01

    Feasibility of alkaline-surfactant-polymer flood for the Lawrence Field in Lawrence County, Illinois is being studied. Two injected formulations are being designed; one for the Bridgeport A and Bridgeport B reservoirs and one for Cypress and Paint Creek reservoirs. Fluid-fluid and coreflood evaluations have developed a chemical solution that produces incremental oil in the laboratory from the Cypress and Paint Creek reservoirs. A chemical formulation for the Bridgeport A and Bridgeport B reservoirs is being developed. A reservoir characterization study is being done on the Bridgeport A, B, & D sandstones, and on the Cypress sandstone. The study covers the pilot flood area and the Lawrence Field.

  13. Method of increasing the sulfation capacity of alkaline earth sorbents

    DOEpatents

    Shearer, J.A.; Turner, C.B.; Johnson, I.

    1980-03-13

    A system and method for increasing the sulfation capacity of alkaline earth carbonates to scrub sulfur dioxide produced during the fluidized bed combustion of coal in which partially sulfated alkaline earth carbonates are hydrated in a fluidized bed to crack the sulfate coating and convert the alkaline earth oxide to the hydroxide. Subsequent dehydration of the sulfate-hydroxide to a sulfate-oxide particle produces particles having larger pore size, increased porosity, decreased grain size and additional sulfation capacity. A continuous process is disclosed.

  14. Method of increasing the sulfation capacity of alkaline earth sorbents

    DOEpatents

    Shearer, John A.; Turner, Clarence B.; Johnson, Irving

    1982-01-01

    A system and method for increasing the sulfation capacity of alkaline earth carbonates to scrub sulfur dioxide produced during the fluidized bed combustion of coal in which partially sulfated alkaline earth carbonates are hydrated in a fluidized bed to crack the sulfate coating and convert the alkaline earth oxide to the hydroxide. Subsequent dehydration of the sulfate-hydroxide to a sulfate-oxide particle produces particles having larger pore size, increased porosity, decreased grain size and additional sulfation capacity. A continuous process is disclosed.

  15. Chemical composition profiles during alkaline flooding at different temperatures and extended residence times

    SciTech Connect

    Aflaki, R.; Handy, L.L.

    1992-12-01

    The objective of this work was to investigate whether or not caustic sweeps the major portion of the reservoir efficiently during an alkaline flood process. A further objective was to study the state of final equilibrium during a caustic flood through determination of the pH and chemical composition profiles along the porous medium. For this purpose, a long porous medium providing extended residence times was required, set up so that changes in the pH and chemical composition of the solution could be monitored. Four Berea sandstone cores (8-in. length and 1-in. diameter) placed in series provided the desired length and the opportunity for sampling between cores, enabling pH and chemical composition profiles to be established. The experiments were run at temperatures up to 180°C, and the flow rates varied from 4.8 to 0.2 ft/day. The samples were analyzed for pH and for Si and Al concentrations. The results show that caustic consumption is insignificant at temperatures up to 100°C. Above 100°C, consumption increases and is accompanied by a significant decrease in pH. The sharp decline in pH also coincides with a sharp decline in the concentration of silica in solution. The results also show that alumina is removed from the solution and the solubility of alumina ultimately reaches zero. Sharp silica and pH declines take place even in the absence of any alumina in solution. As a result, removal of silica from solution is attributed to irreversible caustic/rock interaction. This interaction takes the form of chemisorption reactions in which silica is adsorbed onto the rock surface, consuming hydroxyl ion. Once these reactions are satisfied, caustic breakthrough occurs at a high pH. However, significant pore volumes of caustic must be injected for completion of the chemisorption.

  16. Chemical composition profiles during alkaline flooding at different temperatures and extended residence times

    SciTech Connect

    Aflaki, R.; Handy, L.L.

    1992-12-01

    The objective of this work was to investigate whether or not caustic sweeps the major portion of the reservoir efficiently during an alkaline flood process. A further objective was to study the state of final equilibrium during a caustic flood through determination of the pH and chemical composition profiles along the porous medium. For this purpose, a long porous medium providing extended residence times was required, set up so that changes in the pH and chemical composition of the solution could be monitored. Four Berea sandstone cores (8-in. length and 1-in. diameter) placed in series provided the desired length and the opportunity for sampling between cores, enabling pH and chemical composition profiles to be established. The experiments were run at temperatures up to 180°C, and the flow rates varied from 4.8 to 0.2 ft/day. The samples were analyzed for pH and for Si and Al concentrations. The results show that caustic consumption is insignificant at temperatures up to 100°C. Above 100°C, consumption increases and is accompanied by a significant decrease in pH. The sharp decline in pH also coincides with a sharp decline in the concentration of silica in solution. The results also show that alumina is removed from the solution and the solubility of alumina ultimately reaches zero. Sharp silica and pH declines take place even in the absence of any alumina in solution. As a result, removal of silica from solution is attributed to irreversible caustic/rock interaction. This interaction takes the form of chemisorption reactions in which silica is adsorbed onto the rock surface, consuming hydroxyl ion. Once these reactions are satisfied, caustic breakthrough occurs at a high pH. However, significant pore volumes of caustic must be injected for completion of the chemisorption.

  17. Evolution of methods for evaluating the occurrence of floods

    USGS Publications Warehouse

    Benson, M.A.

    1962-01-01

    A brief summary is given of the history of methods of expressing flood potentialities, proceeding from simple flood formulas to statistical methods of flood-frequency analysis on a regional basis. Current techniques are described and evaluated. Long-term flood records in the United States show no justification for the adoption of a single type of theoretical distribution of floods. The significance and predictive values of flood-frequency relations are considered. Because of the length of flood records available and the interdependence of flood events within a region, the probable long-term average magnitudes of floods of a given recurrence interval are uncertain. However, if the magnitudes defined by the records available are accepted, the relative effects of drainage-basin characteristics and climatic variables can be determined with a reasonable degree of assurance.

  18. Alkaline solution absorption of carbon dioxide method and apparatus

    DOEpatents

    Hobbs, D.T.

    1991-01-01

    Disclosed is a method for measuring the concentration of hydroxides (or pH) in alkaline solutions, using the tendency of hydroxides to absorb CO2. The method comprises passing CO2 over the surface of an alkaline solution in a remote tank before and after measurements of the CO2 concentration. Comparison of the measurements yields the absorption fraction, from which the hydroxide concentration can be calculated using a correlation of hydroxide or pH to absorption fraction. A schematic is given of a process system according to a preferred embodiment of the invention. 2 figs.

  19. Enzymatic method of determining lead using alkaline phosphatase

    SciTech Connect

    Shekhovtsova, T.N.; Kucheryaeva, V.V.; Dolmanova, I.F.

    1986-03-20

    The purpose of this work was to determine the possibility of using alkaline phosphatase to determine trace amounts of ions of a number of metals - Mg, Ba, Ca, Sr, Cd, Pb - for which there are virtually no sensitive and simple methods of determination.

  20. A comparison of computer methods for seawater alkalinity titrations

    NASA Astrophysics Data System (ADS)

    Barron, J. L.; Dyrssen, D.; Jones, E. P.; Wedborg, M.

    1983-04-01

    Potentiometric hydrochloric acid titration of seawater provides a powerful technique for determining components of the carbonate system. Recently, questions have been raised regarding older computer procedures for extracting the carbonate system parameters from the titration curve. We compare four evaluation methods: an early Gran method, the GEOSECS Gran method, a new modified Gran method, and a curve-fitting method. We conclude that the new modified Gran method and the curve-fitting method can achieve a precision of better than 0.1%, but, because of possible problems in representing all relevant chemical reactions during titration, an alkalinity standard must be established before accuracies of 0.1% can be achieved.
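
    As a pointer to what a simple (single) Gran evaluation looks like, the sketch below fits the Gran function F1 = (V0 + V)·10^(-pH) past the equivalence point and extrapolates it to zero; the titration data are synthetic, and, as the abstract stresses, a real evaluation must treat all relevant acid-base species with more care.

      # Hedged sketch of a single Gran-function alkalinity evaluation on synthetic data.
      import numpy as np

      V0 = 100.0e-3                      # sample volume, L (assumed)
      C_acid = 0.1                       # HCl concentration, mol/L (assumed)
      # titrant volume (L) and pH measured beyond the equivalence point
      V = np.array([3.0, 3.2, 3.4, 3.6, 3.8, 4.0]) * 1e-3
      pH = np.array([3.17, 3.06, 2.97, 2.90, 2.84, 2.79])

      F1 = (V0 + V) * 10.0 ** (-pH)      # Gran function, linear in V past equivalence
      slope, intercept = np.polyfit(V, F1, 1)
      Ve = -intercept / slope            # equivalence volume where F1 extrapolates to zero
      TA = Ve * C_acid / V0              # total alkalinity, mol per L of sample
      print(f"Ve = {Ve * 1e3:.2f} mL, TA = {TA * 1e6:.0f} umol/L")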

  1. Two mantle sources, two plumbing systems: Tholeiitic and alkaline magmatism of the Maymecha River basin, Siberian flood volcanic province

    USGS Publications Warehouse

    Arndt, N.; Chauvel, C.; Czamanske, G.; Fedorenko, V.

    1998-01-01

    Rocks of two distinctly different magma series are found in a ~4000-m-thick sequence of lavas and tuffs in the Maymecha River basin, which is part of the Siberian flood-volcanic province. The tholeiites are typical low-Ti continental flood basalts with remarkably restricted, petrologically evolved compositions. They have basaltic MgO contents, moderate concentrations of incompatible trace elements, moderate fractionation of incompatible from compatible elements, distinct negative Ta(Nb) anomalies, and εNd values of 0 to +2. The primary magmas were derived from a relatively shallow mantle source and evolved in large crustal magma chambers, where they acquired their relatively uniform compositions and became contaminated with continental crust. An alkaline series, in contrast, contains a wide range of rock types, from meymechite and picrite to trachyte, with a wide range of compositions (MgO from 0.7 to 38 wt%, SiO2 from 40 to 69 wt%, Ce from 14 to 320 ppm), high concentrations of incompatible elements, and extreme fractionation of incompatible from compatible elements (Al2O3/TiO2 ≈ 1; Sm/Yb up to 11). These rocks lack Ta(Nb) anomalies and have a broad range of εNd values, from -2 to +5. The parental magmas are believed to have formed by low-degree melting at extreme mantle depths (>200 km). They bypassed the large crustal magma chambers and ascended rapidly to the surface, a consequence, perhaps, of high volatile contents in the primary magmas. The tholeiitic series dominates the lower part of the sequence and the alkaline series the upper part; at the interface, the two types are interlayered. The succession thus provides evidence of a radical change in the site of mantle melting, and the simultaneous operation of two very different crustal plumbing systems, during the evolution of this flood-volcanic province. © Springer-Verlag 1998.

  2. A method for mapping flood hazard along roads.

    PubMed

    Kalantari, Zahra; Nickman, Alireza; Lyon, Steve W; Olofsson, Bo; Folkeson, Lennart

    2014-01-15

    A method was developed for estimating and mapping flood hazard probability along roads using road and catchment characteristics as physical catchment descriptors (PCDs). The method uses a Geographic Information System (GIS) to derive candidate PCDs and then identifies those PCDs that significantly predict road flooding using a statistical modelling approach. The method thus allows flood hazards to be estimated and also provides insights into the relative roles of landscape characteristics in determining road-related flood hazards. As a case study to demonstrate its utility, the method was applied to an area in western Sweden where severe road flooding had occurred during an intense rain event. The results suggest that for this case study area three categories of PCDs are useful for predicting critical spots prone to flooding along roads: i) topography, ii) soil type, and iii) land use. The main drivers among the PCDs considered were a topographical wetness index, road density in the catchment, soil properties in the catchment (mainly the amount of gravel substrate), and local channel slope at the site of a road-stream intersection. These can be proposed as strong indicators for predicting flood probability in ungauged river basins in this region, but some care is needed in generalising the case study results, as other potential factors are also likely to influence the flood hazard probability. Overall, the proposed method represents a straightforward and consistent way to estimate flooding hazards to inform both the planning of future roadways and the maintenance of existing roadways.
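
    The paper does not name the statistical model, so the sketch below is only one plausible setup: a logistic regression of flood/no-flood observations on GIS-derived PCDs. The column meanings and all numbers are assumptions for illustration.

      # Illustrative only: binary road-flooding classifier on assumed PCD columns.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)
      n = 500
      X = np.column_stack([
          rng.normal(8.0, 2.0, n),      # topographic wetness index
          rng.normal(1.5, 0.5, n),      # road density in catchment (km/km^2)
          rng.uniform(0.0, 0.6, n),     # gravel fraction of catchment soils
          rng.uniform(0.001, 0.05, n),  # channel slope at road-stream crossing
      ])
      # Synthetic labels: wetter, road-denser, flatter sites flood more often.
      logits = 0.6 * (X[:, 0] - 8.0) + 1.2 * (X[:, 1] - 1.5) - 20.0 * (X[:, 3] - 0.02)
      y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

      model = LogisticRegression().fit(X, y)
      print(model.coef_.round(2), model.intercept_.round(2))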

  3. Role of dynamic interfacial tensions in numerical simulation of cosurfactant/alkaline polymer floods

    SciTech Connect

    Islam, M.R.; Chakma, A.

    1988-01-01

    This paper presents a new mathematical formulation that provides a realistic and complete representation of surfactant-enhanced alkaline and alkaline/polymer processes. The model accounts for transient interfacial tension (IFT) and non-equilibrium mass transfer phenomena. The proposed model uses a newly developed surface excess model for adsorption and incorporates dispersion and diffusion in both oleic and aqueous phases. Also considered are the resistance factor and the modification in IFT behaviour due to the presence of polymer. The mathematical model is tested against experimental results, showing good agreement for both the alkaline/cosurfactant and the alkaline/polymer cases; the new surface excess model for adsorption is much more effective than the conventional Langmuir-type model. Numerical runs are also conducted to investigate the impact of dynamic interfacial tension on oil recovery. A detailed study has been performed to investigate the effect of surfactant and polymer concentrations, slug sizes, and oil viscosity.

  4. Use of indexed historical floods in flood frequency estimation with Fuzzy Bayesian methods

    NASA Astrophysics Data System (ADS)

    Salinas, Jose; Viglione, Alberto; Kiss, Andrea; Bloeschl, Guenter

    2015-04-01

    Efforts of the historical environmental extremes community during the last decades have resulted in long time series of floods, for example in Central Europe and the Mediterranean region, which in some cases extend more than 500 years into the past. In most cases the flood time series are presented in terms of indices representing a combination of socio-economic indicators of the flood impact, e.g. economic damage, flood duration and extension, ... In hydrological engineering, historical floods are very useful because they give additional information which reduces the uncertainty in estimates of discharges with low annual exceedance probabilities, i.e. with high return periods. In order to use historical floods in formal flood frequency analysis, the precise value of the peak discharges would ideally be known, but, as noted, they are usually given in terms of indices. This work presents a novel method for obtaining a prior distribution for the parameters of the annual peak discharge distribution from indexed historical flood time series. The prior distribution is incorporated in the flood frequency estimation via Bayesian methods (see e.g. Viglione et al., 2013) in order to reduce the uncertainties in the design flood estimates. The historical data used are subject to a high degree of uncertainty and imprecision. A framework is therefore presented in which the discharge thresholds between flood indices are modeled as fuzzy numbers. These fuzzy thresholds define a fuzzy prior distribution, which requires applying Fuzzy Bayesian Inference (Viertl, 2008a,b) to obtain fuzzy credibility intervals for the design floods. Viertl, R. (2008a) Foundations of Fuzzy Bayesian Inference, Journal of Uncertain Systems, 2, 187-191. Viertl, R. (2008b) Fuzzy Bayesian Inference. In: Soft Methods For Handling Variability And Imprecision. Advances In Soft Computing. Vol. 48. Springer-Verlag Berlin, pp 10-15. Viglione, A., R. Merz
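
    The fuzzy formulation itself is not reproduced here, but the sketch below illustrates the underlying idea of letting historical information shape the inference: a grid-based Bayesian estimate of Gumbel parameters that combines a short systematic record with a count of historical exceedances of a crisp (non-fuzzy) perception threshold. The record, the threshold x0 and the historical counts are all hypothetical.

      import numpy as np
      from scipy import stats

      # short systematic record of annual maximum discharges (m3/s), hypothetical
      amax = np.array([310., 450., 295., 520., 610., 388., 402., 275., 700., 455.,
                       330., 505., 290., 620., 415.])
      # historical information: k floods exceeded a perception threshold x0 during h years
      h, k, x0 = 300, 4, 800.0

      mus = np.linspace(200.0, 700.0, 200)                 # Gumbel location grid
      betas = np.linspace(40.0, 400.0, 200)                # Gumbel scale grid
      MU, BETA = np.meshgrid(mus, betas)

      # log-likelihood: systematic data (exact values) + historical data (binomial censoring)
      ll = stats.gumbel_r.logpdf(amax[:, None, None], loc=MU, scale=BETA).sum(axis=0)
      p_exc = stats.gumbel_r.sf(x0, loc=MU, scale=BETA)
      ll += k * np.log(p_exc) + (h - k) * np.log1p(-p_exc)

      post = np.exp(ll - ll.max())                          # flat prior on the grid
      post /= post.sum()

      # posterior-mean estimate of the 100-year flood
      q100 = stats.gumbel_r.ppf(0.99, loc=MU, scale=BETA)
      print("100-year flood (posterior mean):", round(float((q100 * post).sum()), 1))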

  5. Changes in the bacterial populations of the highly alkaline saline soil of the former lake Texcoco (Mexico) following flooding.

    PubMed

    Valenzuela-Encinas, César; Neria-González, Isabel; Alcántara-Hernández, Rocio J; Estrada-Alvarado, Isabel; Zavala-Díaz de la Serna, Francisco Javier; Dendooven, Luc; Marsch, Rodolfo

    2009-07-01

    Flooding an extreme alkaline-saline soil decreased alkalinity and salinity, which will change the bacterial populations. Bacterial 16S rDNA libraries were generated from three soils with different electrolytic conductivity (EC), i.e. soil with EC 1.7 dS m(-1) and pH 7.80 (LOW soil), with EC 56 dS m(-1) and pH 10.11 (MEDIUM soil) and with EC 159 dS m(-1) and pH 10.02 (HIGH soil), using universal bacterial oligonucleotide primers, and 463 clone 16S rDNA sequences were analyzed phylogenetically. Library proportions and clone identification of the phyla Proteobacteria, Actinobacteria, Acidobacteria, Cyanobacteria, Bacteroidetes, Firmicutes and Chloroflexi showed that the bacterial communities were different. Species and genera of the Rhizobiales, Rhodobacterales and Xanthomonadales orders of the alpha- and gamma-subdivisions of Proteobacteria were found at the three sites. Species and genera of the Rhodospirillales, Sphingobacteriales, Clostridiales, Oscillatoriales and Caldilineales were found only in the HIGH soil, Sphingomonadales, Burkholderiales and Pseudomonadales in the MEDIUM soil, Myxococcales in the LOW soil, and Actinomycetales in the MEDIUM and LOW soils. The largest diversity at the order and species level was found in the MEDIUM soil, as bacteria of both the HIGH and LOW soils were found in it.

  6. The effect of polymer-surfactant interaction on the rheological properties of surfactant enhanced alkaline flooding formulations. [Phase separation, precipitation and viscosity loss]

    SciTech Connect

    French, T.R.; Josephson, C.B.

    1993-02-01

    Surfactant-enhanced, lower pH (weak) alkaline chemicals are effective for mobilizing residual oil. Polymer is used for mobility control because if mobility control is lost, then oil recovery is reduced. The ability to maintain mobility control during surfactant-alkaline flooding can be adversely affected by chemical interaction. In this work, interaction between polymers and surfactants was shown to be affected by pH, ionic strength, crude oil, and the properties of the polymers and surfactants. Polymer-surfactant interaction (phase separation, precipitation, and viscosity loss) occurred between most of the polymers and surfactants that were tested. Polymer-surfactant interaction is difficult to eliminate, and no method was found for completely eliminating interaction. Polymer-surfactant interaction occurred at optimal salinity and below optimal salinity. Polymer-surfactant interaction had an adverse effect on polymer rheology; however, the adverse effect of interaction on polymer rheology was lessened when oil was present. Increasing the pH of chemical systems further reduced the adverse effects of interaction on polymer rheology.

  7. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods have historically been preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The objective of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in France: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), Agregee method (Margoum, 1992) and Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013, Garavaglia et al., 2010) and Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with return period, while staying relatively moderate up to the 100-year return level. Results and discussions are here illustrated throughout with the example
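
    A small illustration of the first method family listed above, at-site flood frequency analysis with Gumbel and GEV distributions. The sketch uses scipy's maximum-likelihood fits on a synthetic record (not the estimators used in EXTRAFLO); the regional, historical, rainfall-based and multifractal approaches compared in the project are not reproduced.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      amax = stats.genextreme.rvs(c=-0.1, loc=400.0, scale=120.0, size=60, random_state=rng)

      T = np.array([10.0, 100.0, 1000.0])           # return periods in years
      p = 1.0 - 1.0 / T                             # annual non-exceedance probabilities

      gum_params = stats.gumbel_r.fit(amax)         # (location, scale)
      gev_params = stats.genextreme.fit(amax)       # (shape, location, scale)

      print("Gumbel quantiles:", stats.gumbel_r.ppf(p, *gum_params).round(0))
      print("GEV quantiles:   ", stats.genextreme.ppf(p, *gev_params).round(0))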

  8. Efficiency of alkaline hydrolysis method in environment protection.

    PubMed

    Kricka, Tajana; Toth, Ivan; Kalambura, Sanja; Jovicić, Nives

    2014-06-01

    Development of new technologies for the efficient use of proteins of animal origin, beyond the heat treatment in rendering facilities used to date, has become a primary goal of the integral waste management system. The emergence of bovine spongiform encephalopathy in Europe and in the world in the 1990s opened up new questions regarding the medical safety and use in animal feed of meat bone meal, which is produced by processing animal waste. Animal waste is divided into three categories, of which the first category is high-risk waste. Alkaline hydrolysis is an alternative method for the management of animal by-products not intended for the human diet and is one possible solution for the disposal of high-risk proteins. The paper presents analyses of animal by-products not intended for the human diet treated in a laboratory reactor for alkaline hydrolysis, one of the two methods recognized in the EU for the disposal of this type of material and its use in fertilization. PMID:25144977

  9. Floods

    MedlinePlus

    Floods are common in the United States. Weather such as heavy rain, thunderstorms, hurricanes, or tsunamis can ... is breached, or when a dam breaks. Flash floods, which can develop quickly, often have a dangerous ...

  10. Quality assurance flood source and method of making

    SciTech Connect

    Fisher, Darrell R; Alexander, David L; Satz, Stanley

    2002-12-03

    Disclosed is an improved flood source, and a method of making the same, which emits an evenly distributed flow of energy from a gamma-emitting radionuclide dispersed throughout the volume of the flood source. The flood source is formed by filling a bottom pan with a mix of epoxy resin and cobalt-57, preferably at 10 to 20 millicuries, and then adding a hardener. The pan is secured to a flat, level surface to prevent the pan from warping and to act as a heat sink for removal of heat from the pan during the curing of the resin-hardener mixture.

  11. Separator for alkaline batteries and method of making same

    NASA Technical Reports Server (NTRS)

    Hoyt, H. E.; Pfluger, H. L. (Inventor)

    1970-01-01

    The preparation of membranes suitable for use as separators in concentrated alkaline battery cells, by selective solvolysis of copolymers of methacrylate esters with acrylate esters followed by addition of a base, and the resultant products are described. In the method, a methacrylate ester (or esters) is first copolymerized with a more readily hydrolyzable ester; a selective saponification follows, in which the methacrylate ester moieties remain essentially intact while the readily hydrolyzable ester moiety is saponified; and the relatively brittle copolymer acid is then partially or completely neutralized with a base. The resulting membranes are sufficiently flexible in the dry state that they may be wrapped around electrodes without damage from handling.

  12. A GIS-based method for flood risk assessment

    NASA Astrophysics Data System (ADS)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are physical global hazards with negative environmental and socio-economic impacts on local and regional scales. The technological evolution during the last decades, especially in the field of geoinformatics, has offered new advantages in hydrological modelling. This study seeks to use this technology in order to quantify flood risk. The study area is an ungauged catchment, and a series of outcomes has been obtained using mostly GIS-based hydrological and geomorphological analysis together with a GIS-based distributed Unit Hydrograph model. More specifically, this paper examined the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data as well as hypothetical storms. The hydrological analysis was carried out using a Digital Elevation Model of 5x5 m pixel size, while the quantitative drainage basin characteristics were calculated and studied in terms of stream order and its contribution to the flood. Unit Hydrographs are, as is well known, useful when there is a lack of data, and in this work a sequence of flood risk assessments has been made with the time-area method using GIS technology. Essentially, the proposed methodology estimates parameters such as discharge and flow velocity in order to quantify the flood risk. Keywords: Flood Risk Assessment Quantification; GIS; hydrological analysis; geomorphological analysis.
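
    A compact sketch of the time-area idea behind a GIS-based distributed Unit Hydrograph: the catchment is sliced into travel-time zones, the resulting time-area histogram acts as a unit hydrograph, and convolution with an effective-rainfall hyetograph yields the direct-runoff hydrograph. The area fractions and storm values below are hypothetical, not the Kladeos basin data.

      import numpy as np

      area_km2 = 35.0
      # fraction of the catchment area reaching the outlet in each hourly travel-time band,
      # normally derived in GIS from flow-path lengths and velocities (hypothetical here)
      time_area = np.array([0.05, 0.15, 0.30, 0.25, 0.15, 0.10])   # sums to 1

      eff_rain_mm = np.array([2.0, 8.0, 15.0, 5.0])                 # effective rainfall per hour

      # 1 mm of runoff over 1 km2 in 1 h corresponds to 1000 m3/h, i.e. 1000/3600 m3/s
      uh = time_area * area_km2 * 1000.0 / 3600.0                   # m3/s per mm of effective rain
      hydrograph = np.convolve(eff_rain_mm, uh)                     # direct-runoff hydrograph (m3/s)
      print(np.round(hydrograph, 2))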

  13. Flooding and Flood Management

    USGS Publications Warehouse

    Brooks, K.N.; Fallon, J.D.; Lorenz, D.L.; Stark, J.R.; Menard, Jason; Easter, K.W.; Perry, Jim

    2011-01-01

    Floods result in great human disasters globally and nationally, causing an average of $4 billion of damages each year in the United States. Minnesota has its share of floods and flood damages, and the state has awarded nearly $278 million to local units of government for flood mitigation projects through its Flood Hazard Mitigation Grant Program. Since 1995, flood mitigation in the Red River Valley has exceeded $146 million. Considerable local and state funding has been provided to manage and mitigate problems of excess stormwater in urban areas, flooding of farmlands, and flood damages at road crossings. The cumulative costs involved with floods and flood mitigation in Minnesota are not known precisely, but it is safe to conclude that flood mitigation is a costly business. This chapter begins with a description of floods in Minnesota to provide examples and contrasts across the state. Background material is presented to provide a basic understanding of floods and flood processes, prediction, and management and mitigation. Methods of analyzing and characterizing floods are presented because they affect how we respond to flooding and can influence relevant practices. The understanding and perceptions of floods and flooding commonly differ among those who work in flood forecasting, flood protection, or water resource management and citizens and businesses affected by floods. These differences can become magnified following a major flood, pointing to the need for better understanding of flooding as well as common language to describe flood risks and the uncertainty associated with determining such risks. Expectations of accurate and timely flood forecasts and our ability to control floods do not always match reality. Striving for clarity is important in formulating policies that can help avoid recurring flood damages and costs.

  14. Passive aerobic treatment of net-alkaline, iron-laden drainage from a flooded underground anthracite mine, Pennsylvania, USA

    USGS Publications Warehouse

    Cravotta, C.A.

    2007-01-01

    This report evaluates the results of a continuous 4.5-day laboratory aeration experiment and the first year of passive, aerobic treatment of abandoned mine drainage (AMD) from a typical flooded underground anthracite mine in eastern Pennsylvania, USA. During 1991-2006, the AMD source, locally known as the Otto Discharge, had flows from 20 to 270 L/s (median 92 L/s) and water quality that was consistently suboxic (median 0.9 mg/L O2) and circumneutral (pH ≥ 6.0; net alkalinity >10) with moderate concentrations of dissolved iron and manganese and low concentrations of dissolved aluminum (medians of 11, 2.2, and <0.2 mg/L, respectively). In 2001, the laboratory aeration experiment demonstrated rapid oxidation of ferrous iron (Fe2+) without supplemental alkalinity; the initial Fe2+ concentration of 16.4 mg/L decreased to less than 0.5 mg/L within 24 h; pH values increased rapidly from 5.8 to 7.2, ultimately attaining a steady-state value of 7.5. The increased pH coincided with a rapid decrease in the partial pressure of carbon dioxide (PCO2) from an initial value of 10^-1.1 atm to a steady-state value of 10^-3.1 atm. From these results, a staged aerobic treatment system was conceptualized consisting of a 2 m deep pond with innovative aeration and recirculation to promote rapid oxidation of Fe2+, two 0.3 m deep wetlands to facilitate iron solids removal, and a supplemental oxic limestone drain for dissolved manganese and trace-metal removal. The system was constructed, but without the aeration mechanism, and began operation in June 2005. During the first 12 months of operation, estimated detention times in the treatment system ranged from 9 to 38 h. However, in contrast with 80-100% removal of Fe2+ over similar elapsed times during the laboratory aeration experiment, the treatment system typically removed less than 35% of the influent Fe2+. Although concentrations of dissolved CO2 decreased progressively within the treatment system, the PCO2 values for treated effluent

  15. Evaluation of alkaline phosphatase detection in dairy products using a modified rapid chemiluminescent method and official methods.

    PubMed

    Albillos, S M; Reddy, R; Salter, R

    2011-07-01

    Alkaline phosphatase is a ubiquitous milk enzyme that historically has been used to verify adequate pasteurization of milk for public health purposes. Current approved methods for detection of alkaline phosphatase in milk include the use of enzyme photoactivated substrates to give readings in milliunits per liter. The U.S. and European public health limit for alkaline phosphatase in pasteurized drinks is 350 mU/liter. A modified chemiluminescent method, fast alkaline phosphatase, was compared with the approved fluorometric and chemiluminescent alkaline phosphatase methods to determine whether the modified method was equivalent to the approved methods and suitable for detecting alkaline phosphatase in milk. Alkaline phosphatase concentrations in cow's, goat's, and sheep's milk and in flavored drinks and cream were determined by three methods. Evaluations in each matrix were conducted with pasteurized samples spiked with raw milk to produce alkaline phosphatase concentrations of 2 to 5,000 mU/liter. The tests were performed by the method developer and then reproduced at a laboratory at the National Center for Food Safety and Technology following the criteria for a single laboratory validation. The results indicated that the fast alkaline phosphatase method was not significantly different from the approved chemiluminescent method, with a limit of detection of 20 to 50 mU/liter in all the studied matrices. This modified chemiluminescent method detects alkaline phosphatase in the 350 mU/liter range with absolute differences from triplicate data that are lower and within the range of the allowed intralaboratory repeatability values published for the approved chemiluminescent method.

  16. Laboratory methods for enhanced oil recovery core floods

    SciTech Connect

    Robertson, E.P.; Bala, G.A.; Thomas, C.P.

    1994-03-01

    Current research at the Idaho National Engineering Laboratory (INEL) is investigating microbially enhanced oil recovery (MEOR) systems for application to oil reservoirs. Laboratory corefloods are invaluable in developing technology necessary for a field application of MEOR. Methods used to prepare sandstone cores for experimentation, coreflooding techniques, and quantification of coreflood effluent are discussed in detail. A technique to quantify the small volumes of oil associated with laboratory core floods is described.

  17. Method of determining pH by the alkaline absorption of carbon dioxide

    DOEpatents

    Hobbs, David T.

    1992-01-01

    A method for measuring the concentration of hydroxides in alkaline solutions in a remote location, using the tendency of hydroxides to absorb carbon dioxide. The method includes passing carbon dioxide over the surface of an alkaline solution in a remote tank and measuring the carbon dioxide before and after contact with the solution. A comparison of the measurements yields the absorption fraction, from which the hydroxide concentration can be calculated using a correlation of hydroxide or pH to absorption fraction.

  18. Human papillomavirus DNA from warts for typing by endonuclease restriction patterns: purification by alkaline plasmid methods.

    PubMed

    Chinami, M; Tanikawa, E; Hachisuka, H; Sasai, Y; Shingu, M

    1990-01-01

    The alkaline plasmid DNA extraction method of Birnboim and Doly was applied for the isolation of human papillomavirus (HPV) from warts. Tissue from common and plantar warts was digested with proteinase K, and the extrachromosomal circular covalently-closed form of HPV-DNA was rapidly extracted by alkaline sodium dodecyl sulphate and phenol-chloroform treatment. Recovery of HPV-DNA from the tissue was sufficient for determination of endonuclease restriction patterns by agarose gel electrophoresis.

  19. Fluctuation Flooding Method (FFM) for accelerating conformational transitions of proteins

    NASA Astrophysics Data System (ADS)

    Harada, Ryuhei; Takano, Yu; Shigeta, Yasuteru

    2014-03-01

    A powerful conformational sampling method for accelerating structural transitions of proteins, "Fluctuation Flooding Method (FFM)," is proposed. In FFM, cycles of the following steps enhance the transitions: (i) extractions of largely fluctuating snapshots along anisotropic modes obtained from trajectories of multiple independent molecular dynamics (MD) simulations and (ii) conformational re-sampling of the snapshots via re-generations of initial velocities when re-starting MD simulations. In an application to bacteriophage T4 lysozyme, FFM successfully accelerated the open-closed transition with the 6 ns simulation starting solely from the open state, although the 1-μs canonical MD simulation failed to sample such a rare event.

  20. Analysis of flood modeling through innovative geomatic methods

    NASA Astrophysics Data System (ADS)

    Zazo, Santiago; Molina, José-Luis; Rodríguez-Gonzálvez, Pablo

    2015-05-01

    A suitable assessment and management of the exposure to natural flood risks necessarily requires an exhaustive knowledge of the terrain. This study, primarily aimed at evaluating flood risk, first assesses the suitability of an innovative technique called Reduced Cost Aerial Precision Photogrammetry (RC-APP), based on a motorized ultra-light aircraft (ULM, Ultra-Light Motor) together with a hybridization of reduced-cost sensors, for the acquisition of geospatial information. This research thus develops the RC-APP technique, which is found to be a more accurate and precise, more economical and less time-consuming geomatic product. The technique is applied in river engineering for geometric modelling and flood risk assessment. Through the application of RC-APP, a high-spatial-resolution image (orthophoto of 2.5 cm) and a Digital Elevation Model (DEM) of 0.10 m mesh size and high point density (about 100 points/m2), with an altimetric accuracy of -0.02 ± 0.03 m, have been obtained. These products have provided a detailed knowledge of the terrain, afterwards used for the hydraulic simulation, which has allowed a better definition of the inundated area, with important implications for flood risk assessment and management. In this sense, it should be noted that the achieved spatial resolution of the DEM is 0.10 m, which is especially interesting and useful in hydraulic simulations with 2D software. According to the results, the developed methodology and technology allow for a more accurate riverbed representation compared with other traditional techniques such as Light Detection and Ranging (LiDAR), with a Root-Mean-Square Error (RMSE) of ± 0.50 m. This comparison has revealed that RC-APP has an error one order of magnitude lower than the LiDAR method. Consequently, this technique arises as an efficient and appropriate tool, especially in areas with high exposure to flood risk. In hydraulic terms, the degree of detail achieved in the 3D model

  1. A study of farmers' flood perceptions based on the entropy method: an application from Jianghan Plain, China.

    PubMed

    Luo, Xiaofeng; Lone, Todd; Jiang, Songying; Li, Rongrong; Berends, Patrick

    2016-07-01

    Using survey data from 280 farmers in Jianghan Plain, China, this paper establishes an evaluation index system for three dimensions of farmers' flood perceptions and then uses the entropy method to estimate their overall flood perception. Farmers' flood perceptions exhibit the following characteristics: (i) their flood-occurrence, flood-prevention, and overall flood perceptions gradually increase with age, whereas their flood-effects perception gradually decreases; (ii) their flood-occurrence and flood-effects perceptions gradually increase with a higher level of education, whereas their flood-prevention perception gradually decreases and their overall flood perception shows nonlinear change; (iii) flood-occurrence, flood-effects, and overall flood perceptions are higher among farmers who serve in public offices than among those who do not do so; (iv) the flood-occurrence, flood-effects, and overall flood perceptions of farmers who work off-farm are higher than those of farmers who work solely on-farm, contrary to the flood-prevention perception; and (v) the flood-effects and flood-prevention perceptions of male farmers are lower than those of female farmers, but the flood-occurrence and overall flood perceptions of male farmers are higher than those of female farmers. PMID:26576512
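
    A minimal sketch of the entropy weighting step implied above: indicator columns are normalised to shares, an information entropy is computed per indicator, and weights proportional to (1 - entropy) are used to aggregate an overall perception score. The survey scores below are invented, and the paper's actual index system has more indicators.

      import numpy as np

      # rows = surveyed farmers, columns = perception indicators (hypothetical positive scores)
      X = np.array([[3.0, 4.0, 2.0],
                    [5.0, 2.0, 4.0],
                    [4.0, 4.0, 3.0],
                    [2.0, 5.0, 5.0],
                    [3.0, 3.0, 4.0]])

      n, m = X.shape
      P = X / X.sum(axis=0)                                 # share of each respondent per indicator
      k = 1.0 / np.log(n)
      entropy = -k * np.where(P > 0, P * np.log(P), 0.0).sum(axis=0)
      weights = (1.0 - entropy) / (1.0 - entropy).sum()     # entropy weights
      overall = X @ weights                                 # overall perception score per farmer
      print("indicator weights:", weights.round(3))
      print("overall scores:   ", overall.round(2))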

  2. A high-resolution, fluorescence-based method for localization of endogenous alkaline phosphatase activity.

    PubMed

    Cox, W G; Singer, V L

    1999-11-01

    We describe a high-resolution, fluorescence-based method for localizing endogenous alkaline phosphatase in tissues and cultured cells. This method utilizes ELF (Enzyme-Labeled Fluorescence)-97 phosphate, which yields an intensely fluorescent yellow-green precipitate at the site of enzymatic activity. We compared zebrafish intestine, ovary, and kidney cryosections stained for endogenous alkaline phosphatase using four histochemical techniques: ELF-97 phosphate, Gomori method, BCIP/NBT, and naphthol AS-MX phosphate coupled with Fast Blue BB (colored) and Fast Red TR (fluorescent) diazonium salts. Each method localized endogenous alkaline phosphatase to the same specific sample regions. However, we found that sections labeled using ELF-97 phosphate exhibited significantly better resolution than the other samples. The enzymatic product remained highly localized to the site of enzymatic activity, whereas signals generated using the other methods diffused. We found that the ELF-97 precipitate was more photostable than the Fast Red TR azo dye adduct. Using ELF-97 phosphate in cultured cells, we detected an intracellular activity that was only weakly labeled with the other methods, but co-localized with an antibody against alkaline phosphatase, suggesting that the ELF-97 phosphate provided greater sensitivity. Finally, we found that detecting endogenous alkaline phosphatase with ELF-97 phosphate was compatible with the use of antibodies and lectins. (J Histochem Cytochem 47:1443-1455, 1999)

  3. Method of determining pH by the alkaline absorption of carbon dioxide

    DOEpatents

    Hobbs, D.T.

    1992-10-06

    A method is described for measuring the concentration of hydroxides in alkaline solutions in a remote location, using the tendency of hydroxides to absorb carbon dioxide. The method includes passing carbon dioxide over the surface of an alkaline solution in a remote tank and measuring the carbon dioxide before and after contact with the solution. A comparison of the measurements yields the absorption fraction, from which the hydroxide concentration can be calculated using a correlation of hydroxide or pH to absorption fraction. 2 figs.

  4. Removal of dissolved actinides from alkaline solutions by the method of appearing reagents

    DOEpatents

    Krot, Nikolai N.; Charushnikova, Iraida A.

    1997-01-01

    A method of reducing the concentration of neptunium and plutonium in alkaline radwastes containing plutonium and neptunium values along with other transuranic values produced during the course of plutonium production. The OH- concentration of the alkaline radwaste is adjusted to between about 0.1 M and about 4 M. [UO2(O2)3]4- ion is added to the radwastes in the presence of catalytic amounts of Cu2+, Co2+ or Fe2+ with heating to a temperature in excess of about 60°C or 85°C, depending on the catalyst, to coprecipitate plutonium and neptunium from the radwaste. Thereafter, the coprecipitate is separated from the alkaline radwaste.

  5. Alkaline resistant phosphate glasses and method of preparation and use thereof

    DOEpatents

    Brow, Richard K.; Reis, Signo T.; Velez, Mariano; Day, Delbert E.

    2010-01-26

    A substantially alkaline resistant calcium-iron-phosphate (CFP) glass and methods of making and using thereof. In one application, the CFP glass is drawn into a fiber and dispersed in cement to produce glass fiber reinforced concrete (GFRC) articles having the high compressive strength of concrete with the high impact, flexural and tensile strength associated with glass fibers.

  6. Frequency, predisposition, and triggers of floods in flysch Carpathians: regional study using dendrogeomorphic methods

    NASA Astrophysics Data System (ADS)

    Šilhán, Karel

    2015-04-01

    Dangerous overland flood events in the foothills of the flysch Carpathians often result from the cumulative effect of floods in high-gradient channels. A detailed understanding of the origin of floods in these catchments is only possible if the occurrence of past floods has been thoroughly studied. Yet no gauging stations exist in the local catchments. The reconstruction of floods in ungauged catchments has so far been performed using dendrogeomorphic methods. Stems or branches floating in the floodwater can strike the stems or roots of living trees and injure them, and the trees record these signals in their tree-ring series. Within the flysch Carpathians, floods have been reconstructed based on the analysis of 446 cross sections from scarred tree roots and 192 increment cores from the stems of affected trees in a studied area of 10 catchments surrounding the highest peak of the Moravskoslezské Beskydy Mts, the Lysá hora Mt. The dating comprised 64 floods (in different catchments) in 28 flood years over the maximal period 1883-2012. Most catchments (nine out of ten) were affected by floods in the year 1997. An above-average frequency of floods has also been found for the last two decades, largely thanks to numerous samples taken from young tree roots that revealed more flood impacts. By contrast, although tree-ring series enabled the reconstruction of a longer time series, they only recorded the major floods. The most significant factor affecting the frequency of floods is the orientation of catchments toward the prevailing wind direction. The positive influence of catchment gradient on flood frequency and the higher occurrence of floods in the period of intensive slope deforestation show that the floods of the 1950s to 1970s could have had the character of flash floods. This assumption is also supported by the character of the probable triggering precipitation (high-magnitude, short-duration precipitation) of this period. Generally, the most frequent probable

  7. Reservoir Characterization of Bridgeport and Cypress Sandstones in Lawrence Field Illinois to Improve Petroleum Recovery by Alkaline-Surfactant-Polymer Flood

    SciTech Connect

    Seyler, Beverly; Grube, John; Huff, Bryan; Webb, Nathan; Damico, James; Blakley, Curt; Madhavan, Vineeth; Johanek, Philip; Frailey, Scott

    2012-12-21

    Within the Illinois Basin, most of the oilfields are mature and have been extensively waterflooded, with water cuts that range up to 99% in many of the larger fields. In order to maximize production of the significant remaining mobile oil from these fields, new recovery techniques need to be researched and applied. The purpose of this project was to conduct reservoir characterization studies supporting Alkaline-Surfactant-Polymer Floods in two distinct sandstone reservoirs in Lawrence Field, Lawrence County, Illinois. A project using alkaline-surfactant-polymer (ASP) flooding has been established in the century-old Lawrence Field in southeastern Illinois, where original oil in place (OOIP) is estimated at over a billion barrels and 400 million barrels have been recovered, leaving more than 600 million barrels as an EOR target. Radial core flood analysis using core from the field demonstrated recoveries greater than 20% of OOIP. While the lab results are likely optimistic relative to actual field performance, the ASP tests indicate that substantial reserves could be recovered even if the field results are 5 to 10% of OOIP. Reservoir characterization is a key factor in the success of any EOR application. Reservoirs within the Illinois Basin are frequently characterized as being highly compartmentalized, resulting in multiple flow unit configurations. The research conducted on Lawrence Field focused on characteristics that define reservoir compartmentalization in order to delineate preferred target areas so that the chemical flood can be designed and implemented for the greatest recovery potential. Along with traditional facies mapping, core analyses and petrographic analyses, conceptual geological models were constructed and used to develop 3D geocellular models, a valuable tool for visualizing reservoir architecture and also a prerequisite for reservoir simulation modeling. Cores were described and potential permeability barriers were correlated using geophysical logs. Petrographic analyses

  8. Why does Japan use the probability method to set design flood?

    NASA Astrophysics Data System (ADS)

    Nakamura, S.; Oki, T.

    2015-12-01

    A design flood is a hypothetical flood used to make a flood prevention plan. In Japan, a probability method based on precipitation data is used to define the scale of the design flood: for the Tone River, the biggest river in Japan, it is 1 in 200 years, for the Shinano River 1 in 150 years, and so on. How to set a reasonable and acceptable design flood in a changing world is an important socio-hydrological issue. The method used to set the design flood varies among countries. The probability method is also used in the Netherlands, but there the base data are water levels or discharges and the probability is 1 in 1250 years (in the fresh water section). On the other hand, the USA and China apply the maximum flood method, which sets the design flood based on the historical or probable maximum flood. These cases lead to the questions: "what is the reason why the method varies among countries?" and "why does Japan use the probability method?" The purpose of this study is to clarify, based on the literature, the historical process by which the probability method was developed in Japan. In the late 19th century, the concept of "discharge" and modern river engineering were imported by Dutch engineers, and modern flood prevention plans were developed in Japan. In these plans, the design floods were set based on the historical maximum method. The historical maximum method was used until World War 2; however, it was then changed to the probability method because of limitations of the historical maximum method under the specific socio-economic situation: (1) budget limitations due to the war and the GHQ occupation, and (2) historical floods (the Makurazaki typhoon in 1945, Kathleen typhoon in 1947, Ione typhoon in 1948, and so on) that attacked Japan, broke the records of historical maximum discharge in the main rivers, and made the flood prevention projects difficult to complete. Then, Japanese hydrologists imported hydrological probability statistics from the West to take account of

  9. An at-site flood estimation method in the context of nonstationarity II. Statistical analysis of floods in Quebec

    NASA Astrophysics Data System (ADS)

    Gado, Tamer A.; Nguyen, Van-Thanh-Van

    2016-04-01

    This paper, the second of a two-part paper, investigates the nonstationary behaviour of flood peaks in Quebec (Canada) by analyzing the annual maximum flow series (AMS) available for the common 1966-2001 period from a network of 32 watersheds. Temporal trends in the mean of flood peaks were examined by the nonparametric Mann-Kendall test. The significance of the detected trends over the whole province is also assessed by a bootstrap test that preserves the cross-correlation structure of the network. Furthermore, the LM-NS method (introduced in the first part) is used to parametrically model the AMS, investigating its applicability to real data, to account for temporal trends in the moments of the time series. In this study two probability distributions (GEV & Gumbel) were selected to model four different types of time-varying moments of the historical time series considered, comprising eight competing models. The selected models are: two stationary models (GEV0 & Gumbel0), two nonstationary models in the mean as a linear function of time (GEV1 & Gumbel1), two nonstationary models in the mean as a parabolic function of time (GEV2 & Gumbel2), and two nonstationary models in the mean and the log standard deviation as linear functions of time (GEV11 & Gumbel11). The eight models were applied to flood data available for each watershed and their performance was compared to identify the best model for each location. The comparative methodology involves two phases: (1) a descriptive ability based on likelihood-based optimality criteria such as the Bayesian Information Criterion (BIC) and the deviance statistic; and (2) a predictive ability based on the residual bootstrap. According to the Mann-Kendall test and the LM-NS method, a quarter of the analyzed stations show significant trends in the AMS. All of the significant trends are negative, indicating decreasing flood magnitudes in Quebec. It was found that the LM-NS method could provide accurate flood estimates in the
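
    A small sketch of the Mann-Kendall trend test mentioned above, written in Python for a synthetic annual-maximum series; ties are ignored in the variance formula and the bootstrap assessment of field significance is not reproduced.

      import numpy as np
      from scipy import stats

      def mann_kendall(x):
          """Mann-Kendall test for a monotonic trend: returns S, the Z score and a two-sided p-value."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0        # variance of S, ignoring ties
          z = 0.0 if s == 0 else (s - np.sign(s)) / np.sqrt(var_s)
          return s, z, 2 * stats.norm.sf(abs(z))

      # hypothetical 1966-2001 annual flood peaks with a weak negative trend
      rng = np.random.default_rng(3)
      peaks = 400.0 - 1.5 * np.arange(36) + rng.normal(0.0, 60.0, 36)
      print(mann_kendall(peaks))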

  10. Methods for estimating peak discharge and flood boundaries of streams in Utah

    USGS Publications Warehouse

    Thomas, B.E.; Lindskov, K.L.

    1983-01-01

    Equations for estimating 2-, 5-, 10-, 25-, 50-, and 100-year peak discharges and flood depths at ungaged sites in Utah were developed using multiple-regression techniques. Ratios of 500- to 100-year values also were determined. The peak discharge equations are applicable to unregulated streams and the flood depth equations are applicable to the unregulated flow in natural stream channels. The flood depth data can be used to approximate flood prone areas. Drainage area and mean basin elevation are the two basin characteristics needed to use these equations. The standard error of estimate ranges from 38% to 74% for the 100-year peak discharge and from 23% to 33% for the 100-year flood depth. Five different flood mapping methods are described. Streams are classified into four categories as a basis for selecting a flood mapping method. Procedures for transferring flood depths obtained from the regression equations to a flood boundary map are outlined. Also, previous detailed flood mapping by government agencies and consultants is summarized to assist the user in quality control and to minimize duplication of effort. Methods are described for transferring flood frequency data from gaged to ungaged sites on the same stream. Peak discharge and flood depth frequency relations and selected basin characteristics data, updated through the 1980 water year, are tabulated for more than 300 gaging stations in Utah and adjoining states. In addition, weighted estimates of peak discharge relations based on the station data and the regression estimates are provided for each gaging station used in the regression analysis. (Author's abstract)
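
    A hedged sketch of the regression idea described above: a peak-discharge quantile regressed on drainage area and mean basin elevation in log space, a common form for such USGS equations (the report's exact formulation and coefficients are not given in the abstract). The station values below are invented.

      import numpy as np

      # hypothetical stations: drainage area (mi2), mean basin elevation (thousands of ft), Q100 (ft3/s)
      area = np.array([12.0, 55.0, 130.0, 240.0, 610.0, 45.0, 95.0, 310.0, 820.0, 150.0])
      elev = np.array([5.2, 6.8, 7.4, 8.1, 6.1, 9.0, 5.5, 7.9, 6.6, 8.4])
      q100 = np.array([900.0, 2600.0, 4800.0, 7200.0, 15000.0, 1500.0, 4200.0, 8100.0, 19000.0, 4400.0])

      # log10(Q100) = b0 + b1*log10(AREA) + b2*log10(ELEV), fitted by least squares
      X = np.column_stack([np.ones_like(area), np.log10(area), np.log10(elev)])
      (b0, b1, b2), *_ = np.linalg.lstsq(X, np.log10(q100), rcond=None)
      print(f"Q100 ~ {10**b0:.1f} * AREA^{b1:.2f} * ELEV^{b2:.2f}")

      # estimate for an ungaged site with 75 mi2 drainage area and mean elevation 7,000 ft
      print(round(10 ** (b0 + b1 * np.log10(75.0) + b2 * np.log10(7.0)), 0), "ft3/s")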

  11. Characterization of rice starch and protein obtained by a fast alkaline extraction method.

    PubMed

    Souza, Daiana de; Sbardelotto, Arthur Francisco; Ziegler, Denize Righetto; Marczak, Ligia Damasceno Ferreira; Tessaro, Isabel Cristina

    2016-01-15

    This study evaluated the characteristics of rice starch and protein obtained by a fast alkaline extraction method applied to rice flour (RF) derived from broken rice. The extraction was conducted using 0.18% NaOH at 30°C for 30 min, followed by centrifugation to separate the starch-rich and the protein-rich fractions. This fast extraction method allowed an isoelectric precipitation protein concentrate (IPPC) with 79% protein and a starchy product with low protein content to be obtained. The amino acid content of the IPPC was practically unchanged compared to the protein in RF. The proteins of the IPPC underwent denaturation during extraction and some of the starch underwent cold gelatinization, due to the alkaline treatment. With some modifications, the fast method can be interesting from a technological point of view, as it enables process cost reduction and the production of useful ingredients for the food and chemical industries. PMID:26258699

  12. Comparison of floods non-stationarity detection methods: an Austrian case study

    NASA Astrophysics Data System (ADS)

    Salinas, Jose Luis; Viglione, Alberto; Blöschl, Günter

    2016-04-01

    Non-stationarities in flood regimes have a huge impact on any mid- and long-term flood management strategy. In particular, the estimation of design floods is very sensitive to any kind of flood non-stationarity, as design floods should be linked to a return period, a concept that can be ill-defined in a non-stationary context. It is therefore crucial, when analyzing existing flood time series, to detect and, where possible, attribute flood non-stationarities to changing hydroclimatic and land-use processes. This work presents the preliminary results of applying different non-stationarity detection methods to annual peak discharge time series from more than 400 gauging stations in Austria. The kinds of non-stationarities analyzed include trends (linear and non-linear), breakpoints, clustering beyond stochastic randomness, and detection of flood-rich/flood-poor periods. Austria presents a large variety of landscapes, elevations and climates that allow us to interpret the spatial patterns obtained with the non-stationarity detection methods in terms of the dominant flood generation mechanisms.
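
    One breakpoint detector commonly used for this purpose is the Pettitt change-point test; the sketch below is a generic rank-based implementation on a synthetic series with a step change, and is not necessarily the variant applied in the study.

      import numpy as np

      def pettitt(x):
          """Pettitt change-point test (no ties assumed): most probable break index and approximate p-value."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          ranks = np.argsort(np.argsort(x)) + 1
          # U_t = 2 * (sum of ranks up to t) - t*(n+1), evaluated for t = 1 .. n-1
          u = 2.0 * np.cumsum(ranks)[:-1] - np.arange(1, n) * (n + 1)
          k = np.max(np.abs(u))
          t_break = int(np.argmax(np.abs(u))) + 1
          p = min(1.0, 2.0 * np.exp(-6.0 * k ** 2 / (n ** 3 + n ** 2)))
          return t_break, p

      # synthetic annual peaks with an upward shift after year 25
      rng = np.random.default_rng(5)
      peaks = np.concatenate([rng.normal(300.0, 50.0, 25), rng.normal(380.0, 50.0, 25)])
      print(pettitt(peaks))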

  13. New methods to assess severity and likelihood of urban flood risk from intense rainfall

    NASA Astrophysics Data System (ADS)

    Fewtrell, Tim; Foote, Matt; Bates, Paul; Ntelekos, Alexandros

    2010-05-01

    the construction of appropriate probabilistic flood models. This paper will describe new research being undertaken to assess the practicality of ultra-high resolution, ground-based laser-scanner data for flood modelling in urban centres, using new hydraulic propagation methods to determine the feasibility of applying such data within stochastic event models. Results from 'point cloud' data collected with a mobile terrestrial laser-scanner system in a key urban centre, combined with appropriate datasets, will be summarized here, and an initial assessment of the potential for the use of such data in stochastic event sets will be made. Conclusions are drawn from comparisons with previous studies and underlying DEM products of similar resolutions in terms of computational time, flood extent and flood depth. Based on the above, the study provides some current recommendations on the most appropriate resolution of input data for urban hydraulic modelling.

  14. Novel Flood Detection and Analysis Method Using Recurrence Property

    NASA Astrophysics Data System (ADS)

    Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert

    2016-04-01

    Temporal changes in flood hazard are known to be difficult to detect and attribute due to multiple drivers that include processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defences, river training, or land use change, can act variably across space-time scales and influence or mask each other. Flood time series may show complex behaviour that varies over a range of time scales and may cluster in time. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is an effective tool for visualizing the dynamics of phase-space trajectories, i.e. trajectories reconstructed from a time series using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The emphasis is on the identification of characteristic recurrence properties that can associate typical dynamic behaviour with certain flood situations.
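
    A bare-bones sketch of building a recurrence plot from a univariate flood series: time-delay embedding, a pairwise distance matrix, and thresholding. The embedding dimension, delay and threshold below are arbitrary illustrative choices, not the study's settings.

      import numpy as np

      def recurrence_plot(x, dim=2, delay=1, eps=None):
          """Binary recurrence matrix of a time-delay embedded series."""
          x = np.asarray(x, dtype=float)
          n = len(x) - (dim - 1) * delay
          emb = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
          dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
          if eps is None:
              eps = 0.1 * np.ptp(dist)                 # threshold: 10 % of the distance range
          return (dist <= eps).astype(int)

      # hypothetical annual flood series with a flood-rich cluster in the middle
      rng = np.random.default_rng(7)
      q = rng.gumbel(300.0, 80.0, 120)
      q[50:70] += 250.0                                 # flood-rich period
      rp = recurrence_plot(q)
      print("recurrence rate:", round(float(rp.mean()), 3))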

  15. A simple statistical method for analyzing flood susceptibility with incorporating rainfall and impervious surface

    NASA Astrophysics Data System (ADS)

    Chiang, Shou-Hao; Chen, Chi-Farn

    2016-04-01

    Flooding, known as the most frequent natural hazard in Taiwan, has caused severe damage to residents and properties in urban areas. Flood risk has become even more severe in Tainan since the 1990s, with the significant urban development over recent decades. Previous studies have indicated that the characteristics and the vulnerability of floods are affected by the increase of impervious surface area (ISA) and the changing climate. Tainan City, in southern Taiwan, is selected as the study area. This study uses logistic regression to model the relationship between rainfall variables, ISA and historical flood events. Specifically, rainfall records from 2001 to 2014 were collected and mapped, and Landsat images of the years 2001, 2004, 2007, 2010 and 2014 were used to generate the ISA with an SVM (support vector machine) classifier. The result shows that rainfall variables and ISA are significantly correlated with flood occurrence in Tainan City. By applying the logistic function, the likelihood of flood occurrence can be estimated and mapped over the study area. This study suggests that the method is simple and feasible for rapid flood susceptibility mapping when real-time rainfall observations are available, and that it has potential for future flood assessment by incorporating climate change projections and urban growth predictions.
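
    A short sketch of the logistic-regression step: flood/no-flood labels for grid cells are regressed on event rainfall and impervious surface fraction, and the fitted model returns a flood likelihood for new conditions. The cells, labels and coefficients are synthetic; the study's predictors and resolution differ.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)
      rain = rng.uniform(20.0, 250.0, 500)              # event rainfall per cell (mm)
      isa = rng.uniform(0.0, 0.9, 500)                  # impervious surface fraction per cell
      X = np.column_stack([rain, isa])

      # synthetic "historical flood" labels: flooding more likely with high rainfall and high ISA
      p_true = 1.0 / (1.0 + np.exp(-(0.02 * rain + 4.0 * isa - 5.0)))
      y = (rng.random(500) < p_true).astype(int)

      model = LogisticRegression().fit(X, y)
      # estimated flood likelihood for a cell receiving 180 mm of rain with 60 % impervious cover
      print(round(model.predict_proba([[180.0, 0.6]])[0, 1], 2))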

  16. Hot-Alkaline DNA Extraction Method for Deep-Subseafloor Archaeal Communities

    PubMed Central

    Terada, Takeshi; Hoshino, Tatsuhiko; Inagaki, Fumio

    2014-01-01

    A prerequisite for DNA-based microbial community analysis is even and effective cell disruption for DNA extraction. With a commonly used DNA extraction kit, roughly two-thirds of subseafloor sediment microbial cells remain intact on average (i.e., the cells are not disrupted), indicating that microbial community analyses may be biased at the DNA extraction step, prior to subsequent molecular analyses. To address this issue, we standardized a new DNA extraction method using alkaline treatment and heating. Upon treatment with 1 M NaOH at 98°C for 20 min, over 98% of microbial cells in subseafloor sediment samples collected at different depths were disrupted. However, DNA integrity tests showed that such strong alkaline and heat treatment also cleaved DNA molecules into short fragments that could not be amplified by PCR. Subsequently, we optimized the alkaline and temperature conditions to minimize DNA fragmentation and retain high cell disruption efficiency. The best conditions produced a cell disruption rate of 50 to 80% in subseafloor sediment samples from various depths and retained sufficient DNA integrity for amplification of the complete 16S rRNA gene (i.e., ∼1,500 bp). The optimized method also yielded higher DNA concentrations in all samples tested compared with extractions using a conventional kit-based approach. Comparative molecular analysis using real-time PCR and pyrosequencing of bacterial and archaeal 16S rRNA genes showed that the new method produced an increase in archaeal DNA and its diversity, suggesting that it provides better analytical coverage of subseafloor microbial communities than conventional methods. PMID:24441163

  17. A method of estimating flood volumes in western Kansas

    USGS Publications Warehouse

    Perry, C.A.

    1984-01-01

    Relationships between flood volume and peak discharge in western Kansas were developed considering basin and climatic characteristics in order to evaluate the availability of surface water in the area. Multiple-regression analyses revealed a relationship between flood volume, peak discharge, channel slope, and storm duration for basins smaller than 1,503 square miles. The equation VOL = 0.536 PEAK^1.71 SLOPE^-0.85 DUR^0.24 had a correlation coefficient of R = 0.94 and a standard error of 0.33 log units (-53 and +113 percent). A better relationship for basins smaller than 228 square miles resulted in the equation VOL = 0.483 PEAK^0.98 SLOPE^-0.74 AREA^0.30, which had a correlation coefficient of R = 0.90 and a standard error of 0.23 log units (-41 and +70 percent). (USGS)
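
    The two regressions quoted above translate directly into code; the sketch below simply evaluates them. The abstract does not restate the variable units (they are defined in the original report), so the sample numbers are purely illustrative.

      def flood_volume_large_basins(peak, slope, dur):
          """VOL = 0.536 PEAK^1.71 SLOPE^-0.85 DUR^0.24 (basins smaller than 1,503 square miles)."""
          return 0.536 * peak**1.71 * slope**-0.85 * dur**0.24

      def flood_volume_small_basins(peak, slope, area):
          """VOL = 0.483 PEAK^0.98 SLOPE^-0.74 AREA^0.30 (basins smaller than 228 square miles)."""
          return 0.483 * peak**0.98 * slope**-0.74 * area**0.30

      # illustrative call with hypothetical values for peak, slope and basin area
      print(flood_volume_small_basins(peak=1200.0, slope=12.0, area=150.0))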

  18. An improved method for analysis of hydroxide and carbonate in alkaline electrolytes containing zinc

    NASA Technical Reports Server (NTRS)

    Reid, M. A.

    1978-01-01

    A simplified method for titration of carbonate and hydroxide in alkaline battery electrolyte is presented involving a saturated KSCN solution as a complexing agent for zinc. Both hydroxide and carbonate can be determined in one titration, and the complexing reagent is readily prepared. Since the pH at the end point is shifted from 8.3 to 7.9-8.0, m-cresol purple or phenol red are used as indicators rather than phenolphthalein. Bromcresol green is recommended for determination of the second end point of a pH of 4.3 to 4.4.
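
    The arithmetic behind the two-endpoint titration described above follows the standard hydroxide/carbonate relationships; the sketch below applies them to hypothetical burette readings. It does not model the KSCN zinc-masking step, which affects only the chemistry, not the calculation.

      def hydroxide_carbonate(v1_ml, v2_ml, acid_molarity, sample_ml):
          """Hydroxide and carbonate molarities from a two-endpoint acid titration.

          v1_ml: acid volume to the first endpoint (pH about 8; OH- and CO3^2- -> HCO3-)
          v2_ml: total acid volume to the second endpoint (pH about 4.3; HCO3- -> H2CO3)
          """
          n_first = acid_molarity * v1_ml / 1000.0               # mol of acid to the first endpoint
          n_between = acid_molarity * (v2_ml - v1_ml) / 1000.0   # mol of acid between the endpoints
          carbonate = n_between                                   # equals mol of CO3^2-
          hydroxide = n_first - n_between                         # remainder neutralised OH-
          litres = sample_ml / 1000.0
          return hydroxide / litres, carbonate / litres

      # hypothetical titration of a 5 mL electrolyte sample with 1.0 M HCl
      print(hydroxide_carbonate(v1_ml=36.0, v2_ml=38.5, acid_molarity=1.0, sample_ml=5.0))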

  19. Hydrophilic Electrode For An Alkaline Electrochemical Cell, And Method Of Manufacture

    DOEpatents

    Senyarich, Stephane; Cocciantelli, Jean-Michel

    2000-03-07

    A negative electrode for an alkaline electrochemical cell. The electrode comprises an active material and a hydrophilic agent constituted by small cylindrical rods of polyolefin provided with hydrophilic groups. The mean length of the rods is less than 50 microns and the mean diameter thereof is less than 20 microns. A method of manufacturing a negative electrode in which hydrophilic rods are made by fragmenting long polyolefin fibers having a mean diameter of less than 20 microns by oxidizing them, with the rods being mixed with the active material and the mixture being applied to a current conductor.

  1. Flood profiles of the Alafia River, west-central Florida, computed by step-backwater method

    USGS Publications Warehouse

    Robertson, A.F.

    1977-01-01

    The Alafia River is a coastal stream that discharges into Hillsborough Bay. The river and its two principal tributaries, North Prong Alafia River and South Prong Alafia River, drain an area of 420 sq mi of predominantly rural land. However, near the coast, urban residential developments are increasing. The flood plain of the river is subject to flooding, particularly during large regional storms. Peak-discharge frequencies have been determined for data available at two gaging stations in the basin. The flood profiles for peak discharges of recurrence intervals of 2.33, 5, 10, 25, 50, 100, and 200 years have been determined using the step-backwater method. These profiles can be used in conjunction with topographic maps to delineate the area of flooding. Flood profiles were not determined for the tidally affected area near the mouth of the river. Flood marks were located that can be associated with the 1960 flood which occurred when Hurricane Donna passed over the area. (Woodard-USGS)
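
    A very simplified sketch of the step-backwater idea used for these profiles: an energy balance with Manning friction between successive cross-sections, here for a prismatic rectangular channel in SI units with hypothetical reach properties. The USGS computation handles surveyed, irregular cross-sections and is considerably more involved.

      import numpy as np

      def manning(q, y, b, n):
          """Friction slope and velocity from Manning's equation for a rectangular section (SI units)."""
          area = b * y
          radius = area / (b + 2.0 * y)
          v = q / area
          return (n * v) ** 2 / radius ** (4.0 / 3.0), v

      def step_backwater(q, b, n, s0, y_downstream, dx, nsteps):
          """Water-surface depths marching upstream from a known downstream depth (subcritical flow)."""
          g = 9.81
          depths = [y_downstream]
          for _ in range(nsteps):
              y1 = depths[-1]
              sf1, v1 = manning(q, y1, b, n)
              e1 = y1 + v1 ** 2 / (2.0 * g)                   # specific energy at the downstream section
              y2 = y1                                         # initial guess for the upstream depth
              for _ in range(50):                             # fixed-point iteration on the energy balance
                  sf2, v2 = manning(q, y2, b, n)
                  e2 = e1 + dx * (sf1 + sf2) / 2.0 - s0 * dx  # friction loss minus bed rise
                  y_new = e2 - v2 ** 2 / (2.0 * g)
                  if abs(y_new - y2) < 1e-6:
                      break
                  y2 = y_new
              depths.append(y2)
          return np.array(depths)

      # hypothetical reach: 150 m3/s in a 40 m wide channel, bed slope 0.0005, Manning n = 0.035
      print(step_backwater(q=150.0, b=40.0, n=0.035, s0=0.0005, y_downstream=4.0, dx=200.0, nsteps=10).round(2))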

  2. Determination of Lutein from Fruit and Vegetables Through an Alkaline Hydrolysis Extraction Method and HPLC Analysis.

    PubMed

    Fratianni, Alessandra; Mignogna, Rossella; Niro, Serena; Panfili, Gianfranco

    2015-12-01

    A simple and rapid analytical method for the determination of lutein content, successfully used for cereal matrices, was evaluated in fruit and vegetables. The method involved the determination of lutein after an alkaline hydrolysis of the sample matrix, followed by extraction with solvents and analysis by normal phase HPLC. The optimized method was simple, precise, and accurate and it was characterized by few steps that could prevent loss of lutein and its degradation. The optimized method was used to evaluate the lutein amounts in several fruit and vegetables. Rich sources of lutein were confirmed to be green vegetables such as parsley, spinach, chicory, chard, broccoli, courgette, and peas, even if in a range of variability. Taking into account the suggested reference values these vegetables can be stated as good sources of lutein. PMID:26540023

  3. Management of hazardous waste at RCRA facilities during the flood of '93 -- Methods used and lessons learned

    SciTech Connect

    Martin, T.; Jacko, R.B.

    1996-11-01

    During the summer of 1993, the state of Iowa experienced severe flooding that caused the release of many hazardous materials into the environment. Six months after the flood, the Iowa section of the RCRA branch, US EPA Region 7, sent inspectors to survey every RCRA facility in Iowa. Information was gathered through questionnaires to determine the flood's impact and to learn potential lessons that could be beneficial in future flood disasters. The objective of this project was to use the information gathered to determine effective storage methods and emergency procedures for handling hazardous material during flood disasters. Additional data were obtained through record searches, phone interviews, and site visits. Data files and statistics were analyzed, then the evident trends and specific insights observed were utilized to create recommendations for RCRA facilities in the flood plain and for the federal EPA and state regulatory agencies. The recommendations suggest that RCRA regulated facilities in the flood plain should: employ the safest storage methods possible; have a flood emergency plan that includes the most effective release prevention available; and take advantage of several general suggestions for flood protection. The recommendations suggest that the federal EPA and state regulatory agencies consider: including a provision requiring large quantity generators of hazardous waste in the flood plain to include flood procedures in the contingency plans; establishing remote emergency storage areas during the flood disasters; encouraging small quantity generators (SQGs) within the flood plain to establish flood contingency plans; and promoting sound flood protection engineering practices for all RCRA facilities in the flood plain.

  5. An at-site flood estimation method in the context of nonstationarity I. A simulation study

    NASA Astrophysics Data System (ADS)

    Gado, Tamer A.; Nguyen, Van-Thanh-Van

    2016-04-01

    The stationarity of annual flood peak records is the traditional assumption of flood frequency analysis. In some cases, however, as a result of land-use and/or climate change, this assumption is no longer valid. Therefore, new statistical models are needed to capture dynamically the change of probability density functions over time, in order to obtain reliable flood estimation. In this study, an innovative method for nonstationary flood frequency analysis was presented. Here, the new method is based on detrending the flood series and applying the L-moments along with the GEV distribution to the transformed "stationary" series (hereafter, this is called the LM-NS). The LM-NS method was assessed through a comparative study with the maximum likelihood (ML) method for the nonstationary GEV model, as well as with the stationary (S) GEV model. The comparative study, based on Monte Carlo simulations, was carried out for three nonstationary GEV models: a linear dependence of the mean on time (GEV1), a quadratic dependence of the mean on time (GEV2), and linear dependence in both the mean and log standard deviation on time (GEV11). The simulation results indicated that the LM-NS method performs better than the ML method for most of the cases studied, whereas the stationary method provides the least accurate results. An additional advantage of the LM-NS method is to avoid the numerical problems (e.g., convergence problems) that may occur with the ML method when estimating parameters for small data samples.
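    A minimal Python sketch of the detrend-then-fit idea behind LM-NS is given below, assuming only that the annual peaks are available as a numpy array; the L-moment estimators and the GEV fit follow Hosking's standard formulas, but this is an illustration rather than the authors' exact implementation, and the trend model (linear in the mean only) and the example data are hypothetical.

      import numpy as np
      from math import gamma, log

      def sample_l_moments(x):
          """First three sample L-moments via probability-weighted moments (Hosking)."""
          x = np.sort(np.asarray(x, dtype=float))
          n = len(x)
          i = np.arange(1, n + 1)
          b0 = x.mean()
          b1 = np.sum((i - 1) / (n - 1) * x) / n
          b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
          l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
          return l1, l2, l3 / l2  # mean, L-scale, L-skewness

      def gev_from_l_moments(l1, l2, t3):
          """GEV parameters (location xi, scale alpha, shape k) from L-moments."""
          c = 2.0 / (3.0 + t3) - log(2) / log(3)
          k = 7.8590 * c + 2.9554 * c ** 2              # Hosking's approximation
          alpha = l2 * k / ((1 - 2.0 ** (-k)) * gamma(1 + k))
          xi = l1 - alpha * (1 - gamma(1 + k)) / k
          return xi, alpha, k

      def lm_ns_quantile(years, peaks, T):
          """Detrend the peak series, fit a GEV to the 'stationary' residuals by
          L-moments, and express the T-year quantile about the trend of the last year."""
          slope, intercept = np.polyfit(years, peaks, 1)     # linear trend in the mean
          detrended = peaks - (slope * years + intercept)
          xi, alpha, k = gev_from_l_moments(*sample_l_moments(detrended))
          p = 1.0 - 1.0 / T
          q_detrended = xi + alpha / k * (1 - (-np.log(p)) ** k)   # GEV quantile
          return q_detrended + (slope * years[-1] + intercept)

      # Hypothetical peak series with an upward trend in the mean (GEV1-type case).
      rng = np.random.default_rng(42)
      years = np.arange(1976, 2016)
      peaks = 150 + 2.0 * (years - years[0]) + rng.gumbel(0.0, 40.0, size=years.size)
      print(lm_ns_quantile(years, peaks, T=100))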

  6. Evaluation of Outlier Detection and Modification Methods Used in Flood Frequency Analysis

    NASA Astrophysics Data System (ADS)

    Senarath, S. U. S.

    2014-12-01

    Accurate and reliable information on flood frequency and magnitude is vital for the design of hydraulic structures and for floodplain mapping. The time-series data used for flood frequency analysis are obtained from both gauged systematic records and also historical floods. These time-series data sets, however, may include outliers. Outliers are defined as values that are much larger or smaller than most of the values in a given data set. Therefore, by definition, outliers include both very high and very low values. However, the main focus of this study is only on high outliers (hereafter referred to as simply outliers). The availability of a wide-array of tests and techniques makes the outlier detection and modification procedure anything but straightforward. The set of tests and techniques that is selected to detect and modify these outliers can have a direct impact on flood frequency and magnitude estimates. Many different approaches, based on the principles of hypothesis testing, are available for the detection of outliers. Several such methods are examined in detail in this study by using data from Southeast Europe. Outliers detected using these approaches can be eliminated, modified or disregarded (i.e., by treating them as non-outliers) prior to being used in flood frequency analyses. Eliminating or disregarding the outliers typically requires additional gauge- or event-specific information in order to be fully implemented. In the absence of such information, the practice of using modified outliers in flood frequency analysis provides the best compromise. Nevertheless, the effects of all three options on flood frequency and magnitude estimates are investigated to provide a more meaningful comparison. Flood magnitude-frequency estimates based on the log Pearson type 3 distribution are used in this study to evaluate the effect of outlier modification. The study findings have important implications for estimation of flood frequency and magnitude with time

  7. Alkaline pretreatment methods followed by acid hydrolysis of Saccharum spontaneum for bioethanol production.

    PubMed

    Chaudhary, Gaurav; Singh, Lalit Kumar; Ghosh, Sanjoy

    2012-11-01

    Different alkaline pretreatment methods (NaOH, NaOH + 10% urea, and aqueous ammonia) were optimized for maximum delignification of Saccharum spontaneum at 30°C. Maximum delignification of 47.8%, 51%, and 48% was obtained with NaOH (7% NaOH, 48 h, 10% biomass loading), NaOH + urea (7% NaOH + 10% urea, 48 h, 10% biomass loading), and 30% ammonia (40 days, 10% biomass loading), respectively. Treatment with 60% (v/v) H2SO4 at 10% biomass loading and 30°C for 4 h was the optimized condition to solubilize the cellulose and hemicellulose from the solid residue obtained after the different optimized alkaline pretreatments. The slurry thus obtained was diluted to a final acid concentration of 10% (v/v) for the hydrolysis of cellulose and hemicellulose at 100°C for 1 h. Among all pretreatment methods applied, the best result, 0.58 g (85%) reducing sugars per g of initial biomass after acid hydrolysis, was obtained from the aqueous ammonia pretreated biomass. Scheffersomyces stipitis CBS6054 was used to ferment the hydrolysate; the ethanol yield (Yp/s) and productivity (rp) were 0.35 g/g and 0.22 g/L/h, respectively.

  8. A new DNA extraction method by controlled alkaline treatments from consolidated subsurface sediments.

    PubMed

    Kouduka, Mariko; Suko, Takeshi; Morono, Yuki; Inagaki, Fumio; Ito, Kazumasa; Suzuki, Yohey

    2012-01-01

    Microbial communities that thrive in subterranean consolidated sediments are largely unknown owing to the difficulty of extracting DNA. As this difficulty is often attributed to DNA binding onto the silica-bearing sediment matrix, we developed a DNA extraction method for consolidated sediment from the deep subsurface in which silica minerals were dissolved by being heated under alkaline conditions. NaOH concentrations (0.07 and 0.33 N), incubation temperatures (65 and 94 °C) and incubation times (30-90 min) before neutralization were evaluated based on the copy number of extracted prokaryotic DNA. Prokaryotic DNA was detected by quantitative PCR analysis after heating the sediment sample at 94 °C in 0.33 N NaOH solution for 50-80 min. Results of 16S rRNA gene sequence analysis of the extracted DNA were all consistent with regard to the dominant occurrence of the metallophilic bacterium, Cupriavidus metallidurans, and Pseudomonas spp. Mineralogical analysis revealed that the dissolution of a silica mineral (opal-CT) during alkaline treatment was maximized at 94 °C in 0.33 N NaOH solution for 50 min, which may have resulted in the release of DNA into solution. Because the optimized protocol for DNA extraction is applicable to subterranean consolidated sediments from a different locality, the method developed here has the potential to expand our understanding of the microbial community structure of the deep biosphere.

  9. Improving the quantification of flash flood hydrographs and reducing their uncertainty using noncontact streamgauging methods

    NASA Astrophysics Data System (ADS)

    Branger, Flora; Dramais, Guillaume; Horner, Ivan; Le Boursicaud, Raphaël; Le Coz, Jérôme; Renard, Benjamin

    2015-04-01

    Continuous river discharge data are crucial for the study and management of floods. In most river discharge monitoring networks, these data are obtained at gauging stations, where the stage-discharge relation is modelled with a rating curve to derive discharge from the measurement of water level in the river. Rating curves are usually established using individual ratings (or gaugings). However, using traditional gauging methods during flash floods is challenging for many reasons including hazardous flow conditions (for both equipment and people), short duration of the flood events, transient flows during the time needed to perform the gauging, etc. The lack of gaugings implies that the rating curve is often extrapolated well beyond the gauged range for the highest floods, inducing large uncertainties in the computed discharges. We deployed two remote techniques for gauging floods and improving stage-discharge relations for high flow conditions at several hydrometric stations throughout the Ardèche river catchment in France : (1) permanent video-recording stations enabling the implementation of the image analysis LS-PIV technique (Large Scale Particle Image Velocimetry) ; (2) and mobile gaugings using handheld Surface Velocity Radars (SVR). These gaugings were used to estimate the rating curve and its uncertainty using the Bayesian method BaRatin (Le Coz et al., 2014). Importantly, this method explicitly accounts for the uncertainty of individual gaugings, which is especially relevant for remote gaugings since their uncertainty is generally much higher than that of standard intrusive gauging methods. Then, the uncertainty of streamflow records was derived by combining the uncertainty of the rating curve and the uncertainty of stage records. We assessed the impact of these methodological developments for peak flow estimation and for flood descriptors at various time steps. The combination of field measurement innovation and statistical developments allows
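    BaRatin itself is a Bayesian rating-curve method; as a simpler illustration of how a handful of gaugings (including remote LS-PIV or SVR ones) can constrain a stage-discharge relation and its extrapolation uncertainty, the sketch below fits the usual power-law rating curve Q = a(h - h0)^b by nonlinear least squares. The gauging values and the stage used for extrapolation are hypothetical.

      import numpy as np
      from scipy.optimize import curve_fit

      def rating_curve(h, a, h0, b):
          """Power-law stage-discharge relation Q = a * (h - h0)**b."""
          return a * np.clip(h - h0, 1e-6, None) ** b

      # Hypothetical gaugings: stage (m) and discharge (m3/s).
      stage = np.array([0.8, 1.2, 1.9, 2.6, 3.4, 4.1])
      discharge = np.array([6.0, 18.0, 55.0, 120.0, 230.0, 340.0])

      params, cov = curve_fit(rating_curve, stage, discharge, p0=(10.0, 0.3, 1.7))
      a, h0, b = params
      print(f"Q = {a:.1f} * (h - {h0:.2f})^{b:.2f}")

      # Rough propagation of parameter uncertainty to an extrapolated flood stage.
      samples = np.random.multivariate_normal(params, cov, size=2000)
      q_flood = rating_curve(5.5, *samples.T)
      print("Q(5.5 m), 5-50-95 percentiles:", np.percentile(q_flood, [5, 50, 95]))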

  10. Augmented digestion of lignocellulose by steam explosion, acid and alkaline pretreatment methods: a review.

    PubMed

    Singh, Joginder; Suhag, Meenakshi; Dhaka, Anil

    2015-03-01

    Lignocellulosic materials can be explored as one of the sustainable substrates for bioethanol production through microbial intervention as they are abundant, cheap and renewable. But at the same time, their recalcitrant structure makes the conversion process more cumbersome owing to their chemical composition which adversely affects the efficiency of bioethanol production. Therefore, the technical approaches to overcome recalcitrance of biomass feedstock has been developed to remove the barriers with the help of pretreatment methods which make cellulose more accessible to the hydrolytic enzymes, secreted by the microorganisms, for its conversion to glucose. Pretreatment of lignocellulosic biomass in cost effective manner is a major challenge to bioethanol technology research and development. Hence, in this review, we have discussed various aspects of three commonly used pretreatment methods, viz., steam explosion, acid and alkaline, applied on various lignocellulosic biomasses to augment their digestibility alongwith the challenges associated with their processing.

  11. Comparison of methods for estimating flood magnitudes on small streams in Georgia

    USGS Publications Warehouse

    Hess, Glen W.; Price, McGlone

    1989-01-01

    The U.S. Geological Survey has collected flood data for small, natural streams at many sites throughout Georgia during the past 20 years. Flood-frequency relations were developed for these data using four methods: (1) observed (log-Pearson Type III analysis) data, (2) rainfall-runoff model, (3) regional regression equations, and (4) map-model combination. The results of the latter three methods were compared to the analyses of the observed data in order to quantify the differences in the methods and determine if the differences are statistically significant.
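    For context, the "observed" benchmark in this comparison is a log-Pearson Type III analysis; the sketch below shows a generic LP3 quantile computation using the common Wilson-Hilferty approximation of the frequency factor. The annual-peak values are hypothetical, and the procedure omits the regional-skew weighting and low-outlier handling of a full Bulletin 17B analysis.

      import numpy as np
      from scipy.stats import norm, skew

      def lp3_quantile(peaks, T):
          """T-year flood from a log-Pearson Type III fit to annual peaks.
          Uses the Wilson-Hilferty approximation for the frequency factor K(Cs, T)."""
          y = np.log10(np.asarray(peaks, dtype=float))
          mean, std = y.mean(), y.std(ddof=1)
          cs = skew(y, bias=False)                 # station skew of the logs
          z = norm.ppf(1.0 - 1.0 / T)              # standard normal variate
          if abs(cs) < 1e-6:
              k = z
          else:
              k = (2.0 / cs) * ((1 + cs * z / 6 - cs ** 2 / 36) ** 3 - 1)
          return 10 ** (mean + k * std)

      peaks = [310, 540, 220, 870, 1260, 450, 690, 980, 380, 760]   # hypothetical peaks
      print(lp3_quantile(peaks, T=100))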

  12. The effect of alkaline pretreatment methods on cellulose structure and accessibility

    DOE PAGES

    Bali, Garima; Meng, Xianzhi; Deneff, Jacob I.; Sun, Qining; Ragauskas, Arthur J.

    2014-11-24

    The effects of different alkaline pretreatments on cellulose structural features and accessibility are compared and correlated with the enzymatic hydrolysis of Populus. The pretreatments are shown to modify polysaccharides and lignin content to enhance the accessibility for cellulase enzymes. The highest increase in the cellulose accessibility was observed in dilute sodium hydroxide, followed by methods using ammonia soaking and lime (Ca(OH)2). The biggest increase of cellulose accessibility occurs during the first 10 min of pretreatment, with further increases at a slower rate as severity increases. Low temperature ammonia soaking at longer residence times dissolved a major portion of hemicellulose and exhibited higher cellulose accessibility than high temperature soaking. Moreover, the most significant reduction of degree of polymerization (DP) occurred for dilute sodium hydroxide (NaOH) and ammonia pretreated Populus samples. The study thus identifies important cellulose structural features and relevant parameters related to biomass recalcitrance.

  13. The effect of alkaline pretreatment methods on cellulose structure and accessibility

    SciTech Connect

    Bali, Garima; Meng, Xianzhi; Deneff, Jacob I.; Sun, Qining; Ragauskas, Arthur J.

    2014-11-24

    The effects of different alkaline pretreatments on cellulose structural features and accessibility are compared and correlated with the enzymatic hydrolysis of Populus. The pretreatments are shown to modify polysaccharides and lignin content to enhance the accessibility for cellulase enzymes. The highest increase in the cellulose accessibility was observed in dilute sodium hydroxide, followed by methods using ammonia soaking and lime (Ca(OH)2). The biggest increase of cellulose accessibility occurs during the first 10 min of pretreatment, with further increases at a slower rate as severity increases. Low temperature ammonia soaking at longer residence times dissolved a major portion of hemicellulose and exhibited higher cellulose accessibility than high temperature soaking. Moreover, the most significant reduction of degree of polymerization (DP) occurred for dilute sodium hydroxide (NaOH) and ammonia pretreated Populus samples. The study thus identifies important cellulose structural features and relevant parameters related to biomass recalcitrance.

  14. Single well surfactant test to evaluate surfactant floods using multi tracer method

    DOEpatents

    Sheely, Clyde Q.

    1979-01-01

    Data useful for evaluating the effectiveness of or designing an enhanced recovery process said process involving mobilizing and moving hydrocarbons through a hydrocarbon bearing subterranean formation from an injection well to a production well by injecting a mobilizing fluid into the injection well, comprising (a) determining hydrocarbon saturation in a volume in the formation near a well bore penetrating formation, (b) injecting sufficient mobilizing fluid to mobilize and move hydrocarbons from a volume in the formation near the well bore, and (c) determining the hydrocarbon saturation in a volume including at least a part of the volume of (b) by an improved single well surfactant method comprising injecting 2 or more slugs of water containing the primary tracer separated by water slugs containing no primary tracer. Alternatively, the plurality of ester tracers can be injected in a single slug said tracers penetrating varying distances into the formation wherein the esters have different partition coefficients and essentially equal reaction times. The single well tracer method employed is disclosed in U.S. Pat. No. 3,623,842. This method designated the single well surfactant test (SWST) is useful for evaluating the effect of surfactant floods, polymer floods, carbon dioxide floods, micellar floods, caustic floods and the like in subterranean formations in much less time and at much reduced cost compared to conventional multiwell pilot tests.

  15. Theoretical considerations and a simple method for measuring alkalinity and acidity in low-pH waters by gran titration

    USGS Publications Warehouse

    Barringer, J.L.; Johnsson, P.A.

    1996-01-01

    Titrations for alkalinity and acidity using the technique described by Gran (1952, Determination of the equivalence point in potentiometric titrations, Part II: The Analyst, v. 77, p. 661-671) have been employed in the analysis of low-pH natural waters. This report includes a synopsis of the theory and calculations associated with Gran's technique and presents a simple and inexpensive method for performing alkalinity and acidity determinations. However, potential sources of error introduced by the chemical character of some waters may limit the utility of Gran's technique. Therefore, the cost- and time-efficient method for performing alkalinity and acidity determinations described in this report is useful for exploring the suitability of Gran's technique in studies of water chemistry.
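    As a sketch of the calculation behind such a titration, the code below applies Gran's F1 function for alkalinity, F1 = (V0 + V)·10^(-pH), to a strong-acid titration: F1 becomes linear in added acid volume beyond the equivalence point, and extrapolating the line to F1 = 0 gives the equivalence volume. The titration data and the choice of which points lie past the equivalence are hypothetical.

      import numpy as np

      def gran_alkalinity(v0_ml, acid_normality, v_acid_ml, ph):
          """Alkalinity (eq/L) from a strong-acid titration using Gran's F1 function."""
          v = np.asarray(v_acid_ml, dtype=float)
          f1 = (v0_ml + v) * 10.0 ** (-np.asarray(ph, dtype=float))
          # Fit only the low-pH tail, where F1 is (approximately) linear in V.
          slope, intercept = np.polyfit(v[-4:], f1[-4:], 1)
          ve_ml = -intercept / slope                     # equivalence volume
          return ve_ml * acid_normality / v0_ml          # eq/L (x 50,000 for mg/L as CaCO3)

      # Hypothetical titration of 50 mL of a low-pH sample with 0.01 N HCl.
      v_acid = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
      ph     = [6.3, 5.8, 5.2, 4.6, 4.1, 3.9, 3.75, 3.63]
      print(gran_alkalinity(50.0, 0.01, v_acid, ph))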

  16. Global river flood hazard maps: hydraulic modelling methods and appropriate uses

    NASA Astrophysics Data System (ADS)

    Townend, Samuel; Smith, Helen; Molloy, James

    2014-05-01

    Flood hazard is not well understood or documented in many parts of the world. Consequently, the (re-)insurance sector now needs to better understand where the potential for considerable river flooding aligns with significant exposure. For example, international manufacturing companies are often attracted to countries with emerging economies, meaning that events such as the 2011 Thailand floods have resulted in many multinational businesses with assets in these regions incurring large, unexpected losses. This contribution addresses and critically evaluates the hydraulic methods employed to develop a consistent global scale set of river flood hazard maps, used to fill the knowledge gap outlined above. The basis of the modelling approach is an innovative, bespoke 1D/2D hydraulic model (RFlow) which has been used to model a global river network of over 5.3 million kilometres. Estimated flood peaks at each of these model nodes are determined using an empirically based rainfall-runoff approach linking design rainfall to design river flood magnitudes. The hydraulic model is used to determine extents and depths of floodplain inundation following river bank overflow. From this, deterministic flood hazard maps are calculated for several design return periods between 20-years and 1,500-years. Firstly, we will discuss the rationale behind the appropriate hydraulic modelling methods and inputs chosen to produce a consistent global scaled river flood hazard map. This will highlight how a model designed to work with global datasets can be more favourable for hydraulic modelling at the global scale and why using innovative techniques customised for broad scale use are preferable to modifying existing hydraulic models. Similarly, the advantages and disadvantages of both 1D and 2D modelling will be explored and balanced against the time, computer and human resources available, particularly when using a Digital Surface Model at 30m resolution. Finally, we will suggest some

  17. Multireservoir real-time operations for flood control using balanced water level index method.

    PubMed

    Wei, Chih-Chiang; Hsu, Nien-Sheng

    2008-09-01

    This paper presents a real-time simulation-optimization operation procedure for determining the reservoir releases at each time step during a flood. The proposed procedure involves two models, i.e., a hydrological forecasting model and a reservoir operation model. In the reservoir operation model, this paper compares two flood-control operation strategies for a multipurpose multireservoir system. While Strategy 1 is the real-time joint reservoir operations without using the balanced water level index (BWLI) method, Strategy 2 involves real-time joint reservoir operations using the BWLI method. The two strategies presented are formulated as mixed-integer linear programming (MILP) problems. The idea of using the BWLI method is derived from the HEC-5 program developed by the US Army Corps of Engineers. The proposed procedure has been applied to the Tanshui River Basin system in Taiwan using the 6h ahead forecast data of six typhoons. A comparison of the results obtained from the two strategies reveals that Strategy 2 performs much better than Strategy 1 in determining the reservoir real-time releases throughout the system during flood emergencies in order to minimize flooding, while maintaining all reservoirs in the system in balance if possible. Consequently, the proposed model using the BWLI method demonstrates its effectiveness in estimating real-time releases.
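    The paper's formulation is a mixed-integer linear program built around the HEC-5-style balanced water level index; the toy sketch below only illustrates the balancing idea (release more from reservoirs sitting higher in their flood-control pools so that the storage indices stay close to equal) and is not the authors' MILP model or the exact HEC-5 index definition. All numbers are hypothetical.

      import numpy as np

      def balanced_releases(storages, s_min, s_max, total_release):
          """Toy allocation of a required flood release so that the reservoirs'
          water-level indices, (S - S_min) / (S_max - S_min), stay roughly equal."""
          s = np.asarray(storages, dtype=float)
          lo, hi = np.asarray(s_min, dtype=float), np.asarray(s_max, dtype=float)
          index = np.clip((s - lo) / (hi - lo), 0.0, None)
          weights = index / index.sum() if index.sum() > 0 else np.full(len(s), 1.0 / len(s))
          return np.minimum(weights * total_release, s - lo)   # never draw below S_min

      print(balanced_releases([80, 55, 95], [20, 20, 30], [100, 90, 120], total_release=60.0))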

  18. Multireservoir real-time operations for flood control using balanced water level index method.

    PubMed

    Wei, Chih-Chiang; Hsu, Nien-Sheng

    2008-09-01

    This paper presents a real-time simulation-optimization operation procedure for determining the reservoir releases at each time step during a flood. The proposed procedure involves two models, i.e., a hydrological forecasting model and a reservoir operation model. In the reservoir operation model, this paper compares two flood-control operation strategies for a multipurpose multireservoir system. While Strategy 1 is the real-time joint reservoir operations without using the balanced water level index (BWLI) method, Strategy 2 involves real-time joint reservoir operations using the BWLI method. The two strategies presented are formulated as mixed-integer linear programming (MILP) problems. The idea of using the BWLI method is derived from the HEC-5 program developed by the US Army Corps of Engineers. The proposed procedure has been applied to the Tanshui River Basin system in Taiwan using the 6h ahead forecast data of six typhoons. A comparison of the results obtained from the two strategies reveals that Strategy 2 performs much better than Strategy 1 in determining the reservoir real-time releases throughout the system during flood emergencies in order to minimize flooding, while maintaining all reservoirs in the system in balance if possible. Consequently, the proposed model using the BWLI method demonstrates its effectiveness in estimating real-time releases. PMID:17923249

  19. Methods for delineating flood-prone areas in the Great Basin of Nevada and adjacent states

    USGS Publications Warehouse

    Burkham, D.E.

    1988-01-01

    The Great Basin is a region of about 210,000 square miles having no surface drainage to the ocean; it includes most of Nevada and parts of Utah, California, Oregon, Idaho, and Wyoming. The area is characterized by many parallel mountain ranges and valleys trending north-south. Stream channels usually are well defined and steep within the mountains, but on reaching the alluvial fan at the canyon mouth, they may diverge into numerous distributary channels, be discontinuous near the apex of the fan, or be deeply entrenched in the alluvial deposits. Larger rivers normally have well-defined channels to or across the valley floors, but all terminate at lakes or playas. Major floods occur in most parts of the Great Basin and result from snowmelt, frontal-storm rainfall, and localized convective rainfall. Snowmelt floods typically occur during April-June. Floods resulting from frontal rain and frontal rain on snow generally occur during November-March. Floods resulting from convective-type rainfall during localized thunderstorms occur most commonly during the summer months. Methods for delineating flood-prone areas are grouped into five general categories: Detailed, historical, analytical, physiographic, and reconnaissance. The detailed and historical methods are comprehensive methods; the analytical and physiographic are intermediate; and the reconnaissance method is only approximate. Other than the reconnaissance method, each method requires determination of a T-year discharge (the peak rate of flow during a flood with long-term average recurrence interval of T years) and T-year profile and the development of a flood-boundary map. The procedure is different, however, for each method. Appraisal of the applicability of each method included consideration of its technical soundness, limitations and uncertainties, ease of use, and costs in time and money. Of the five methods, the detailed method is probably the most accurate, though most expensive. It is applicable to

  20. Methods for estimating flood frequency in Montana based on data through water year 1998

    USGS Publications Warehouse

    Parrett, Charles; Johnson, Dave R.

    2004-01-01

    Annual peak discharges having recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years (T-year floods) were determined for 660 gaged sites in Montana and in adjacent areas of Idaho, Wyoming, and Canada, based on data through water year 1998. The updated flood-frequency information was subsequently used in regression analyses, either ordinary or generalized least squares, to develop equations relating T-year floods to various basin and climatic characteristics, equations relating T-year floods to active-channel width, and equations relating T-year floods to bankfull width. The equations can be used to estimate flood frequency at ungaged sites. Montana was divided into eight regions, within which flood characteristics were considered to be reasonably homogeneous, and the three sets of regression equations were developed for each region. A measure of the overall reliability of the regression equations is the average standard error of prediction. The average standard errors of prediction for the equations based on basin and climatic characteristics ranged from 37.4 percent to 134.1 percent. Average standard errors of prediction for the equations based on active-channel width ranged from 57.2 percent to 141.3 percent. Average standard errors of prediction for the equations based on bankfull width ranged from 63.1 percent to 155.5 percent. In most regions, the equations based on basin and climatic characteristics generally had smaller average standard errors of prediction than equations based on active-channel or bankfull width. An exception was the Southeast Plains Region, where all equations based on active-channel width had smaller average standard errors of prediction than equations based on basin and climatic characteristics or bankfull width. Methods for weighting estimates derived from the basin- and climatic-characteristic equations and the channel-width equations also were developed. The weights were based on the cross correlation of residuals from the

  1. Review Article: Economic evaluation of flood damage to agriculture - review and analysis of existing methods

    NASA Astrophysics Data System (ADS)

    Brémond, P.; Grelot, F.; Agenais, A.-L.

    2013-10-01

    In Europe, economic evaluation of flood management projects is increasingly used to help decision making. At the same time, the management of flood risk is shifting towards new concepts such as giving more room to water by restoring floodplains. Agricultural areas are particularly targeted by projects following those concepts since they are frequently located in floodplain areas and since the potential damage to such areas is expected to be lower than to cities or industries for example. Additional or avoided damage to agriculture may have a major influence on decisions concerning these projects and the economic evaluation of flood damage to agriculture is thus an issue that needs to be tackled. The question of flood damage to agriculture can be addressed in different ways. This paper reviews and analyzes existing studies which have developed or used damage functions for agriculture in the framework of an economic appraisal of flood management projects. A conceptual framework of damage categories is proposed for the agricultural sector. The damage categories were used to structure the review. Then, a total of 42 studies are described, with a detailed review of 26 of them, based on the following criteria: types of damage considered, the influential flood parameters chosen, and monetized damage indicators used. The main recommendations resulting from this review are that even if existing methods have already focused on damage to crops, still some improvement is needed for crop damage functions. There is also a need to develop damage functions for other agricultural damage categories, including farm buildings and their contents. Finally, to cover all possible agricultural damage, and in particular loss of activity, a farm scale approach needs to be used.

  2. A flood map based DOI decoding method for block detector: a GATE simulation study.

    PubMed

    Shi, Han; Du, Dong; Su, Zhihong; Peng, Qiyu

    2014-01-01

    Positron Emission Tomography (PET) systems using detectors with Depth of Interaction (DOI) capabilities could achieve higher spatial resolution and better image quality than those without DOI. Up till now, most DOI methods developed are not cost-efficient for a whole body PET system. In this paper, we present a DOI decoding method based on flood map for low-cost conventional block detector with four-PMT readout. Using this method, the DOI information can be directly extracted from the DOI-related crystal spot deformation in the flood map. GATE simulations are then carried out to validate the method, confirming a DOI sorting accuracy of 85.27%. Therefore, we conclude that this method has the potential to be applied in conventional detectors to achieve a reasonable DOI measurement without dramatically increasing their complexity and cost of an entire PET system.

  3. Development of Flood Forecasting Using Statistical Method in Four River Basins in Terengganu, Malaysia

    NASA Astrophysics Data System (ADS)

    Noor, M. S. F. M.; Sidek, L. M.; Basri, H.; Husni, M. M. M.; Jaafar, A. S.; Kamaluddin, M. H.; Majid, W. H. A. W. A.; Mohammad, A. H.; Osman, S.

    2016-03-01

    Terengganu, located on the east coast of Peninsular Malaysia, is one of the country's flood-critical regions. Floods occur there regularly because of its topography and climate, including the northeast monsoon, with high-intensity rainfall from November to February acting as the main forcing factor. The main objectives of this study are water stage forecasting and the derivation of the related equations based on the least squares method. Two methods are compared: inclusion of the residual (Method A) and non-inclusion of the residual (Method B). The results show that Method B outperformed Method A in forecasting the water stage at the selected case studies (Besut, Dungun, Kemaman, and Terengganu).

  4. Estimation of flood frequency by SCHADEX method - in Nysa Kłodzka catchment

    NASA Astrophysics Data System (ADS)

    Osuch, M.; Romanowicz, R. J.; Paquet, E.; Garavaglia, F.

    2012-04-01

    Estimation of design floods using Continuous Simulation (CS) has emerged as a very active research topic across academic institutions in Europe. CS is based on the use of rainfall-runoff models of various complexity for transforming precipitation data into river flow. By coupling a rainfall-runoff model with a stochastic rainfall model, Monte Carlo simulations can generate long series of synthetic rainfall that are transformed into river flow, from which flood frequency characteristics can be deduced. This approach is favoured by politicians and water managers, as it allows the influence of water management and climatic changes to be taken into account during the estimation of flood frequency curves. The other approach to FFA is based on the available historical maximum annual or seasonal flow data and consists of fitting theoretical cumulative distributions to observations. These theoretical, parameterised distributions are used in practical applications to derive flow quantiles with a desired probability of exceedance for the purpose of water management. The aim of this work is an application of a continuous simulation approach to flood frequency analysis (FFA) using the Nysa Kłodzka catchment as a case study. The applied method is SCHADEX, a probabilistic method for extreme flood estimation which combines a weather-pattern-based probabilistic rainfall model and a conceptual rainfall-runoff model within a stochastic event simulation framework. In that method, the distribution of areal precipitation is described by a compound probabilistic distribution based on weather pattern sub-sampling (MEWP distribution). These patterns represent synoptic situations and allow for disaggregation of heavy rainfall data into homogeneous subsamples (Garavaglia et al. 2010 a and b). Extreme flood estimation is then achieved by stochastic simulation using the MORDOR rainfall-runoff model. The resulting FFA curve is compared to an outcome of a seasonal maxima approach (recommended

  5. Combined fluvial and pluvial urban flood hazard analysis: method development and application to Can Tho City, Mekong Delta, Vietnam

    NASA Astrophysics Data System (ADS)

    Apel, H.; Trepat, O. M.; Hung, N. N.; Chinh, D. T.; Merz, B.; Dung, N. V.

    2015-08-01

    Many urban areas experience both fluvial and pluvial floods, because locations next to rivers are preferred settlement areas, and the predominantly sealed urban surface prevents infiltration and facilitates surface inundation. The latter problem is enhanced in cities with insufficient or non-existent sewer systems. While there are a number of approaches to analyse either fluvial or pluvial flood hazard, studies of combined fluvial and pluvial flood hazard are hardly available. Thus this study aims at the analysis of fluvial and pluvial flood hazard individually, but also at developing a method for the analysis of combined pluvial and fluvial flood hazard. This combined fluvial-pluvial flood hazard analysis is performed taking Can Tho city, the largest city in the Vietnamese part of the Mekong Delta, as example. In this tropical environment the annual monsoon triggered floods of the Mekong River can coincide with heavy local convective precipitation events causing both fluvial and pluvial flooding at the same time. Fluvial flood hazard was estimated with a copula based bivariate extreme value statistic for the gauge Kratie at the upper boundary of the Mekong Delta and a large-scale hydrodynamic model of the Mekong Delta. This provided the boundaries for 2-dimensional hydrodynamic inundation simulation for Can Tho city. Pluvial hazard was estimated by a peak-over-threshold frequency estimation based on local rain gauge data, and a stochastic rain storm generator. Inundation was simulated by a 2-dimensional hydrodynamic model implemented on a Graphical Processor Unit (GPU) for time-efficient flood propagation modelling. All hazards - fluvial, pluvial and combined - were accompanied by an uncertainty estimation considering the natural variability of the flood events. This resulted in probabilistic flood hazard maps showing the maximum inundation depths for a selected set of probabilities of occurrence, with maps showing the expectation (median) and the uncertainty by
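    The pluvial component rests on a peak-over-threshold frequency estimation; as a generic illustration of that step (not the authors' exact procedure or data), the sketch below fits a Generalized Pareto distribution to daily-rainfall excesses over a high threshold and converts a return period into a rainfall depth. The rainfall series, threshold choice, and record length are hypothetical.

      import numpy as np
      from scipy.stats import genpareto

      def pot_return_level(daily_rain_mm, threshold_mm, return_period_yr, years_of_record):
          """Peak-over-threshold estimate of the T-year daily rainfall depth using a
          Generalized Pareto distribution fitted to excesses over the threshold."""
          rain = np.asarray(daily_rain_mm, dtype=float)
          excesses = rain[rain > threshold_mm] - threshold_mm
          rate = len(excesses) / years_of_record          # mean exceedances per year
          shape, loc, scale = genpareto.fit(excesses, floc=0.0)
          p = 1.0 - 1.0 / (return_period_yr * rate)       # conditional non-exceedance prob.
          return threshold_mm + genpareto.ppf(p, shape, loc=loc, scale=scale)

      # Hypothetical 20-year daily rainfall record; threshold at the 99th percentile.
      rng = np.random.default_rng(1)
      rain = rng.gamma(shape=0.4, scale=8.0, size=20 * 365)
      thr = np.percentile(rain, 99)
      print(pot_return_level(rain, thr, return_period_yr=100, years_of_record=20))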

  6. An automated method for predicting full-scale CO2 flood performance based on detailed pattern flood simulations

    SciTech Connect

    Rester, S.; Todd, M.R.

    1984-04-01

    A procedure is described for estimating the response of a field scale CO2 flood from a limited number of simulations of pattern flood symmetry elements. This procedure accounts for areally varying reservoir properties, areally varying conditions when CO2 injection is initiated, phased conversion of injectors to CO2, and shut in criteria for producers. Examples of the use of this procedure are given.

  7. Sparsity-weighted outlier FLOODing (OFLOOD) method: Efficient rare event sampling method using sparsity of distribution.

    PubMed

    Harada, Ryuhei; Nakamura, Tomotake; Shigeta, Yasuteru

    2016-03-30

    As an extension of the Outlier FLOODing (OFLOOD) method [Harada et al., J. Comput. Chem. 2015, 36, 763], the sparsity of the outliers defined by a hierarchical clustering algorithm, FlexDice, was considered to achieve an efficient conformational search as sparsity-weighted "OFLOOD." In OFLOOD, FlexDice detects areas of sparse distribution as outliers. The outliers are regarded as candidates that have high potential to promote conformational transitions and are employed as initial structures for conformational resampling by restarting molecular dynamics simulations. When detecting outliers, FlexDice defines a rank in the hierarchy for each outlier, which relates to sparsity in the distribution. In this study, we define a lower rank (first ranked), a medium rank (second ranked), and the highest rank (third ranked) outliers, respectively. For instance, the first-ranked outliers are located in a given conformational space away from the clusters (highly sparse distribution), whereas those with the third-ranked outliers are nearby the clusters (a moderately sparse distribution). To achieve the conformational search efficiently, resampling from the outliers with a given rank is performed. As demonstrations, this method was applied to several model systems: Alanine dipeptide, Met-enkephalin, Trp-cage, T4 lysozyme, and glutamine binding protein. In each demonstration, the present method successfully reproduced transitions among metastable states. In particular, the first-ranked OFLOOD highly accelerated the exploration of conformational space by expanding the edges. In contrast, the third-ranked OFLOOD reproduced local transitions among neighboring metastable states intensively. For quantitatively evaluations of sampled snapshots, free energy calculations were performed with a combination of umbrella samplings, providing rigorous landscapes of the biomolecules.

  8. Flood Frequency Analyses Using a Modified Stochastic Storm Transposition Method

    NASA Astrophysics Data System (ADS)

    Fang, N. Z.; Kiani, M.

    2015-12-01

    Research shows that areas with similar topography and climatic environment have comparable precipitation occurrences. Reproduction and realization of historical rainfall events provide foundations for frequency analysis and the advancement of meteorological studies. Stochastic Storm Transposition (SST) is a method for such a purpose and enables us to perform hydrologic frequency analyses by transposing observed historical storm events to the sites of interest. However, many previous studies in SST reveal drawbacks from simplified Probability Density Functions (PDFs) without considering restrictions for transposing rainfalls. The goal of this study is to stochastically examine the impacts of extreme events on all locations in a homogeneity zone. Since storms with the same probability of occurrence on homogenous areas do not have the identical hydrologic impacts, the authors utilize detailed precipitation parameters including the probability of occurrence of certain depth and the number of occurrence of extreme events, which are both incorporated into a joint probability function. The new approach can reduce the bias from uniformly transposing storms which erroneously increases the probability of occurrence of storms in areas with higher rainfall depths. This procedure is iterated to simulate storm events for one thousand years as the basis for updating frequency analysis curves such as IDF and FFA. The study area is the Upper Trinity River watershed including the Dallas-Fort Worth metroplex with a total area of 6,500 mi2. It is the first time that SST method is examined in such a wide scale with 20 years of radar rainfall data.

  9. Mapping flood prone areas in southern Brazil: a combination of frequency analysis, HAND algorithm and remote sensing methods

    NASA Astrophysics Data System (ADS)

    Fabris Goerl, Roberto; Borges Chaffe, Pedro Luiz; Marcel Pellerin, Joel Robert; Altamirano Flores, Juan Antonio; Josina Abreu, Janete; Speckhann, Gustavo Andrei; Mattos Sanchez, Gerly

    2015-04-01

    Flood disasters affect many people around the world, and the frequency of natural disasters and their negative impacts show a worldwide increasing trend related to population growth and intense urbanization in natural hazard zones. In Santa Catarina state, as in almost all of southern Brazil, floods are a frequent hydrological disaster. In this context, a map of flood-prone areas is an important tool for preventing the construction of new settlements in areas unsuitable for urbanization. The present work aimed to map flood-prone areas in Palhoça City, southern Brazil, by combining high-resolution digital elevation data, remote sensing information, frequency analysis, and the Height Above the Nearest Drainage (HAND) algorithm. We used 17 years of daily discharge and stage data to calculate flood probability and return period. A CBERS HRC remote sensing image with 2.7 m resolution, taken one day after a flood occurrence, was used, and a band difference was applied to extract the flood extent. HAND uses the DEM to calculate the altimetric difference between each terrain pixel and its nearest drainage (channel) pixel. All morphometric attributes used in HAND were extracted directly from the high-resolution DEM (1 m). From the CBERS image, areas where the flood level was higher than 0.5 m were mapped. There is some uncertainty in establishing the HAND classes, since only the distance to the channel is taken into account; using other hydrological or spatial information can reduce this uncertainty. To produce the final flood-prone map, all these methods were combined, and the map was classified into three main classes based on return period. It was noticed that there is a strong spatial correlation between high-susceptibility flood areas and geomorphological features such as floodplains and Holocene beach ridges, places where the water table emerges frequently. The final map was classified using three colors (red, yellow, and green) corresponding to high, medium, and low susceptibility flood areas. This mapping
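    A minimal sketch of the HAND idea is given below: every cell's D8 flow path is followed downstream to the nearest channel cell, and the elevation difference is reported. The D8 direction coding, the tiny DEM, and the channel mask are hypothetical, and a real implementation would derive flow directions and the drainage network from the DEM itself.

      import numpy as np

      # Assumed D8 flow-direction codes mapped to (row, col) steps.
      D8 = {1: (0, 1), 2: (1, 1), 4: (1, 0), 8: (1, -1),
            16: (0, -1), 32: (-1, -1), 64: (-1, 0), 128: (-1, 1)}

      def hand(dem, flowdir, channel_mask):
          """Height Above the Nearest Drainage: follow each cell's flow path
          downstream until a channel cell is reached and subtract elevations."""
          nrow, ncol = dem.shape
          out = np.full(dem.shape, np.nan)
          for r in range(nrow):
              for c in range(ncol):
                  i, j, steps = r, c, 0
                  while not channel_mask[i, j] and steps < nrow * ncol:
                      di, dj = D8.get(int(flowdir[i, j]), (0, 0))
                      ni, nj = i + di, j + dj
                      if (di, dj) == (0, 0) or not (0 <= ni < nrow and 0 <= nj < ncol):
                          break                          # pit, or flow leaves the grid
                      i, j, steps = ni, nj, steps + 1
                  if channel_mask[i, j]:
                      out[r, c] = dem[r, c] - dem[i, j]
          return out

      # Tiny hypothetical example: a 3x3 DEM draining east into a channel column.
      dem = np.array([[5.0, 3.0, 1.0], [6.0, 4.0, 1.2], [7.0, 5.0, 1.5]])
      flowdir = np.full((3, 3), 1)              # every cell drains to the east
      channel = np.zeros((3, 3), dtype=bool)
      channel[:, 2] = True                      # the easternmost column is the channel
      print(hand(dem, flowdir, channel))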

  10. A simple solvent method for the recovery of LixCoO2 and its applications in alkaline rechargeable batteries

    NASA Astrophysics Data System (ADS)

    Xu, Yanan; Song, Dawei; Li, Li; An, Cuihua; Wang, Yijing; Jiao, LiFang; Yuan, Huatang

    2014-04-01

    A simple solvent method is proposed for the recovery of waste LixCoO2 from lithium-ion batteries, which employs inexpensive DMF to remove the PVDF binder. This method is easy to carry out and low-cost to apply. Electrochemical investigations indicate that recovered LixCoO2 materials with a small amount of S-doping exhibit excellent properties as negative electrode materials for alkaline rechargeable Ni/Co batteries. At a discharge current density of 100 mA g-1, the LixCoO2 + 1% S electrode delivers a maximum discharge capacity of 357 mAh g-1 and an outstanding capacity retention of 85.5% after 100 cycles. The approach overcomes not only the complexity and energy intensity of conventional recycling methods, but also the high-cost restriction on alkaline rechargeable Ni/Co batteries.

  11. Methods of use of calcium hexa aluminate refractory linings and/or chemical barriers in high alkali or alkaline environments

    SciTech Connect

    McGowan, Kenneth A; Cullen, Robert M; Keiser, James R; Hemrick, James G; Meisner, Roberta A

    2013-10-22

    A method for improving the insulating character and/or penetration resistance of a liner in contact with alkali and/or alkaline environments is provided. The method comprises lining a surface that is subject to wear by an alkali environment and/or an alkaline environment with a refractory composition comprising a refractory aggregate consisting essentially of a calcium hexa aluminate clinker having the formula CA6, wherein C is calcium oxide and A is aluminum oxide, wherein the hexa aluminate clinker has from zero to less than about fifty weight percent C12A7, and wherein greater than 98 weight percent of the calcium hexa aluminate clinker has a particle size ranging from -20 microns to +3 millimeters, for forming a liner on the surface. This method improves the insulating character and/or penetration resistance of the liner.

  12. Effects of the low-temperature thermo-alkaline method on the rheological properties of sludge.

    PubMed

    Wang, Ruikun; Zhao, Zhenghui; Yin, Qianqian; Liu, Jianzhong

    2016-07-15

    Municipal sewage sludge (hereafter referred to as sludge) in increasing amounts is a serious threat to the environment and human health. Sludge is difficult to dispose because of its complex properties, such as high water content, viscosity, and hazardous compound concentration. The rheological properties of sludge also significantly influence treatment processes, including stirring, mixing, pumping, and conveying. Improving the rheological properties and reducing the apparent viscosity of sludge are conducive to economic and safe sludge treatment. In this study, the low-temperature thermo-alkaline (LTTA) method was used to modify sludge. Compared with the original sludge with an apparent viscosity at 100 s(-1) (η100) of 979.3 mPa s, the sludge modified under 90 °C-Ca(OH)2-1 h and 90 °C-NaOH-1 h conditions exhibited lower η100 values of 208.7 and 110.8 mPa s respectively. The original sludge exhibited a pseudoplastic behavior. After modification, the pseudoplastic behavior was weakened, and the sludge gradually tended to behave as Newton fluids. The hysteresis loop observed during the shear rate cycle was mainly caused by the viscoelasticity of the sludge. The hysteresis loop area (Hla) reflected to a certain extent the energy required to break the elastic solid structure of the sludge. The larger the Hla, the more energy was needed. However, this result should be evaluated comprehensively by considering other sludge parameters, such as yield stress and apparent viscosity. Hla may also reflect the damage degree of the sludge structure after shearing action. The irreversible destruction of the structure during shearing may also increase Hla. PMID:27082259
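    To make the two quantities discussed here concrete, the sketch below fits the Ostwald-de Waele power-law model (tau = K * gamma_dot^n, with n < 1 for pseudoplastic behaviour) to an ascending flow curve and computes a hysteresis loop area as the integral between the ascending and descending curves; the flow-curve data are hypothetical and this is not the authors' measurement protocol.

      import numpy as np

      def power_law_fit(shear_rate, shear_stress):
          """Fit tau = K * gamma_dot**n in log-log space; n < 1 means shear-thinning."""
          n, log_k = np.polyfit(np.log(shear_rate), np.log(shear_stress), 1)
          return np.exp(log_k), n

      def hysteresis_loop_area(rate, stress_up, stress_down):
          """Area (Pa/s) enclosed between the ascending and descending flow curves,
          evaluated on a common shear-rate grid."""
          grid = np.linspace(rate.min(), rate.max(), 200)
          up = np.interp(grid, rate, stress_up)
          down = np.interp(grid, rate, stress_down)
          return np.trapz(up - down, grid)

      # Hypothetical flow curves before and after part of the structure is broken down.
      rate = np.linspace(1.0, 100.0, 20)
      stress_up = 12.0 * rate ** 0.45
      stress_down = 10.5 * rate ** 0.45
      K, n = power_law_fit(rate, stress_up)
      print(f"K = {K:.1f} Pa*s^n, n = {n:.2f}")
      print("hysteresis loop area:", hysteresis_loop_area(rate, stress_up, stress_down))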

  13. Adaptive finite volume methods with well-balanced Riemann solvers for modeling floods in rugged terrain: Application to the Malpasset dam-break flood (France, 1959)

    USGS Publications Warehouse

    George, D.L.

    2011-01-01

    The simulation of advancing flood waves over rugged topography, by solving the shallow-water equations with well-balanced high-resolution finite volume methods and block-structured dynamic adaptive mesh refinement (AMR), is described and validated in this paper. The efficiency of block-structured AMR makes large-scale problems tractable, and allows the use of accurate and stable methods developed for solving general hyperbolic problems on quadrilateral grids. Features indicative of flooding in rugged terrain, such as advancing wet-dry fronts and non-stationary steady states due to balanced source terms from variable topography, present unique challenges and require modifications such as special Riemann solvers. A well-balanced Riemann solver for inundation and general (non-stationary) flow over topography is tested in this context. The difficulties of modeling floods in rugged terrain, and the rationale for and efficacy of using AMR and well-balanced methods, are presented. The algorithms are validated by simulating the Malpasset dam-break flood (France, 1959), which has served as a benchmark problem previously. Historical field data, laboratory model data and other numerical simulation results (computed on static fitted meshes) are shown for comparison. The methods are implemented in GEOCLAW, a subset of the open-source CLAWPACK software. All the software is freely available at. Published in 2010 by John Wiley & Sons, Ltd.

  14. A multiple-method approach to flood assessment at a low-level radioactive waste site in southern Nevada

    SciTech Connect

    Miller, J.J.; Gustafson, D.L.; Schmeltzer, J.S.

    1994-12-31

    Flood hazard analyses on alluvial fans using Federal Emergency Management Agency (FEMA) methods are not limited to the FEMA Alluvial Fan Methodology (FEMA AFM). Flood hazard delineations using a combination of methods provide a more thorough assessment than using only the FEMA AFM. Other FEMA-accepted methods, such as the HEC-2 model for shallow concentrated flow and the Manning equation for sheetflow, may be more appropriate. A flood assessment using a multiple-method approach was performed to determine the 100-year flood hazard in this arid region. Understanding the limitations and assumptions of these methods is important for determining which method is applicable and when a method can provide reasonable results.
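    One of the FEMA-accepted methods mentioned, the Manning equation for sheetflow, is simple enough to sketch directly; the values below (roughness, slope, depth, flow width) are hypothetical and the wide-channel approximation R ≈ depth is assumed.

      def manning_discharge(n, slope, depth_m, width_m):
          """Manning's equation for wide, shallow sheetflow (SI units); for a wide
          flow the hydraulic radius R is approximately the flow depth."""
          velocity = (1.0 / n) * depth_m ** (2.0 / 3.0) * slope ** 0.5   # m/s
          return velocity * depth_m * width_m                            # m3/s

      # Hypothetical 100-year sheetflow across a 50 m wide alluvial surface.
      print(manning_discharge(n=0.05, slope=0.02, depth_m=0.15, width_m=50.0))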

  15. Understanding flood-induced water chemistry variability extracting temporal patterns with the LDA method

    NASA Astrophysics Data System (ADS)

    Aubert, A. H.; Tavenard, R.; Emonet, R.; De Lavenne, A.; Malinowski, S.; Guyet, T.; Quiniou, R.; Odobez, J.; Merot, P.; Gascuel-odoux, C.

    2013-12-01

    events. The patterns themselves are carefully studied, as well as their repartition along the year and along the 12 years of the dataset. We would recommend the use of such model to any study based on patterns or signature extraction. It could be well suited to compare different geographical locations and analyzing the resulting different pattern distributions. (1) Aubert, A.H., Gascuel-Odoux, C., Gruau, G., Akkal, N., Faucheux, M., Fauvel, Y., Grimaldi, C., Hamon, Y., Jaffrezic, A., Lecoz Boutnik, M., Molenat, J., Petitjean, P., Ruiz, L., Merot, Ph. (2013), Solute transport dynamics in small, shallow groundwater-dominated agricultural catchments: insights from a high-frequency, multisolute 10 yr-long monitoring study. Hydrol. Earth Syst. Sci., 17(4): 1379-1391. (2) Aubert, A.H., Tavenard, R, Emonet, R., de Lavenne, A., Malinowski, S., Guyet, T., Quiniou, R., Odobez, J.-M., Merot, Ph., Gascuel-Odoux, C., submitted to WRR. Clustering with a probabilistic method newly applied in hydrology: application on flood events from water quality time-series.

  16. An improved method to represent DEM uncertainty in glacial lake outburst flood propagation using stochastic simulations

    NASA Astrophysics Data System (ADS)

    Watson, Cameron S.; Carrivick, Jonathan; Quincey, Duncan

    2015-10-01

    Modelling glacial lake outburst floods (GLOFs) or 'jökulhlaups', necessarily involves the propagation of large and often stochastic uncertainties throughout the source to impact process chain. Since flood routing is primarily a function of underlying topography, communication of digital elevation model (DEM) uncertainty should accompany such modelling efforts. Here, a new stochastic first-pass assessment technique was evaluated against an existing GIS-based model and an existing 1D hydrodynamic model, using three DEMs with different spatial resolution. The analysis revealed the effect of DEM uncertainty and model choice on several flood parameters and on the prediction of socio-economic impacts. Our new model, which we call MC-LCP (Monte Carlo Least Cost Path) and which is distributed in the supplementary information, demonstrated enhanced 'stability' when compared to the two existing methods, and this 'stability' was independent of DEM choice. The MC-LCP model outputs an uncertainty continuum within its extent, from which relative socio-economic risk can be evaluated. In a comparison of all DEM and model combinations, the Shuttle Radar Topography Mission (SRTM) DEM exhibited fewer artefacts compared to those with the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Digital Elevation Model (ASTER GDEM), and were comparable to those with a finer resolution Advanced Land Observing Satellite Panchromatic Remote-sensing Instrument for Stereo Mapping (ALOS PRISM) derived DEM. Overall, we contend that the variability we find between flood routing model results suggests that consideration of DEM uncertainty and pre-processing methods is important when assessing flow routing and when evaluating potential socio-economic implications of a GLOF event. Incorporation of a stochastic variable provides an illustration of uncertainty that is important when modelling and communicating assessments of an inherently complex process.
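    The MC-LCP model itself is distributed with the paper's supplementary information; the sketch below only illustrates the general Monte Carlo least-cost-path idea (perturb the DEM with its vertical error, route a least-cost path each time, and count how often each cell is visited to obtain a relative likelihood continuum). It relies on scikit-image's route_through_array, uses terrain elevation as a proxy cost, and all DEM values and error magnitudes are hypothetical.

      import numpy as np
      from skimage.graph import route_through_array

      def mc_least_cost_path(dem, dem_sigma, start, end, n_runs=200, seed=0):
          """Monte Carlo least-cost-path sketch: returns the per-cell visit frequency
          of paths routed over randomly perturbed copies of the DEM."""
          rng = np.random.default_rng(seed)
          counts = np.zeros_like(dem, dtype=float)
          for _ in range(n_runs):
              noisy = dem + rng.normal(0.0, dem_sigma, size=dem.shape)
              cost = noisy - noisy.min() + 1.0        # lower terrain -> cheaper to traverse
              path, _ = route_through_array(cost, start, end, fully_connected=True)
              for r, c in path:
                  counts[r, c] += 1
          return counts / n_runs

      # Hypothetical valley DEM (60 x 60), 2 m vertical error, lake at top, outlet at bottom.
      x = np.linspace(0.0, 1.0, 60)
      dem = 50.0 * np.abs(x[None, :] - 0.5) + 0.2 * np.arange(60)[:, None]
      likelihood = mc_least_cost_path(dem, dem_sigma=2.0, start=(0, 30), end=(59, 30))
      print(likelihood.max(), int((likelihood > 0).sum()), "cells touched")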

  17. Stainless steel anodes for alkaline water electrolysis and methods of making

    SciTech Connect

    Soloveichik, Grigorii Lev

    2014-01-21

    The corrosion resistance of stainless steel anodes for use in alkaline water electrolysis was increased by immersion of the stainless steel anode into a caustic solution prior to electrolysis. Also disclosed herein are electrolyzers employing the so-treated stainless steel anodes. The pre-treatment process provides a stainless steel anode that has a higher corrosion resistance than an untreated stainless steel anode of the same composition.

  18. Anodes for alkaline electrolysis

    DOEpatents

    Soloveichik, Grigorii Lev

    2011-02-01

    A method of making an anode for alkaline electrolysis cells includes adsorption of precursor material on a carbonaceous material, conversion of the precursor material to hydroxide form and conversion of precursor material from hydroxide form to oxy-hydroxide form within the alkaline electrolysis cell.

  19. Combining Neural Networks with Existing Methods to Estimate 1 in 100-Year Flood Event Magnitudes

    NASA Astrophysics Data System (ADS)

    Newson, A.; See, L.

    2005-12-01

    Over the last fifteen years artificial neural networks (ANN) have been shown to be advantageous for the solution of many hydrological modelling problems. The use of ANNs for flood magnitude estimation in ungauged catchments, however, is a relatively new and under-researched area. In this paper ANNs are used to make estimates of the magnitude of the 100-year flood event (Q100) for a number of ungauged catchments. The data used in this study were provided by the Centre for Ecology and Hydrology's Flood Estimation Handbook (FEH), which contains information on catchments across the UK. Sixteen catchment descriptors for 719 catchments were used to train an ANN; the data were split into training, validation, and test sets. The goodness-of-fit statistics on the test data set indicated good model performance, with an r-squared value of 0.8 and a coefficient of efficiency of 79 percent. Data for twelve ungauged catchments were then put through the trained ANN to produce estimates of Q100. Two other accepted methodologies were also employed: the FEH statistical method and the FSR (Flood Studies Report) design storm technique, both of which are used to produce flood frequency estimates. The advantage of developing an ANN model is that it provides a third figure to aid a hydrologist in making an accurate estimate. For six of the twelve catchments, there was a relatively low spread between estimates. In these instances, an estimate of Q100 could be made with a fair degree of certainty. Of the remaining six catchments, three had areas greater than 1,000 km2, which means the FSR design storm estimate cannot be used. Armed with the ANN model and the FEH statistical method, the hydrologist still has two possible estimates to consider. For these three catchments, the estimates were also fairly similar, providing additional confidence to the estimation. In summary, the findings of this study have shown that an accurate estimation of Q100 can be made using the catchment descriptors of
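    As a rough illustration of the setup described (catchment descriptors in, log flood magnitude out), the sketch below trains a small multilayer perceptron on synthetic stand-in data; the descriptor names, the synthetic relationship, and the network size are all assumptions, not the FEH dataset or the authors' architecture.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # Synthetic stand-in for catchment descriptors (area km2, rainfall mm, baseflow index)
      # and log10 of the 100-year flood at 719 "gauged" catchments.
      rng = np.random.default_rng(0)
      X = rng.uniform([10.0, 600.0, 0.2], [1500.0, 2500.0, 0.9], size=(719, 3))
      y = 0.8 * np.log10(X[:, 0]) + 0.0005 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.1, 719)

      X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1)
      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                         random_state=1))
      model.fit(X_train, y_train)
      print("r-squared on held-out catchments:", round(model.score(X_test, y_test), 2))

      # Estimate Q100 for an "ungauged" catchment from its descriptors alone.
      ungauged = np.array([[350.0, 1100.0, 0.45]])
      print("Q100 estimate (m3/s):", 10 ** model.predict(ungauged)[0])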

  20. A method for estimating magnitude and frequency of floods in Montana

    USGS Publications Warehouse

    Johnson, M.V.; Omang, R.J.

    1976-01-01

    This report provides methods for estimating flood characteristics at most natural flow sites on rural streams in Montana. It also contains significant flood data and related information for many gaged sites on Montana streams. Frequency curves are provided for 442 gaged sites as defined by log-Pearson Type III analysis. To allow estimates at ungaged sites, mathematical equations relate the 2-, 5-, 10-, 25-, 50-, and 100-year flood magnitudes to basin characteristics. Drainage area, main channel slope, and mean annual precipitation were found to be the most significant estimating variables. Equations presented are limited to use on streams with drainage areas from about 0.1 to 2,600 square miles (0.3 to 6,700 square kilometres), with slope from about 5 to 1,200 feet per mile (1.5 to 366 metres per kilometre), and with precipitation from 10 to 100 inches (250 to 2,500 millimetres). Nomographs provide a simple graphical means of solving the estimating relations, and illustrative examples are presented.
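    The regression equations behind such reports generally take a log-linear power-law form; the sketch below fits Q100 = a * A^b * S^c * P^d by ordinary least squares on log-transformed data. The gaged-site values are hypothetical and the fitted coefficients are not those of the report.

      import numpy as np

      def fit_regional_regression(area, slope, precip, q_t):
          """Fit log10(Q_T) = log10(a) + b*log10(A) + c*log10(S) + d*log10(P) by OLS."""
          X = np.column_stack([np.ones_like(area),
                               np.log10(area), np.log10(slope), np.log10(precip)])
          coef, *_ = np.linalg.lstsq(X, np.log10(q_t), rcond=None)
          return coef                                  # [log10(a), b, c, d]

      # Hypothetical sites: drainage area (mi2), channel slope (ft/mi),
      # mean annual precipitation (in), and the 100-year peak (cfs).
      area   = np.array([12.0, 85.0, 240.0, 610.0, 1500.0, 3200.0])
      slope  = np.array([220.0, 95.0, 60.0, 35.0, 18.0, 10.0])
      precip = np.array([14.0, 18.0, 22.0, 16.0, 25.0, 30.0])
      q100   = np.array([900.0, 3800.0, 7600.0, 11000.0, 21000.0, 38000.0])

      log_a, b, c, d = fit_regional_regression(area, slope, precip, q100)
      print(f"Q100 = {10 ** log_a:.2f} * A^{b:.2f} * S^{c:.2f} * P^{d:.2f}")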

  1. Thermal fluids for CSP systems: Alkaline nitrates/nitrites thermodynamics modelling method

    NASA Astrophysics Data System (ADS)

    Tizzoni, A. C.; Sau, S.; Corsaro, N.; Giaconia, A.; D'Ottavi, C.; Licoccia, S.

    2016-05-01

    Molten salt (MS) mixtures are used for the transport (HTF, heat transfer fluid) and storage of heat (HSM, heat storage material) in Concentrating Solar Power (CSP) plants. In general, alkaline and earth-alkaline nitrate/nitrite mixtures are employed. Along with its upper stability temperature, the melting point (liquidus point) of a MS mixture is one of the main parameters that defines its usefulness as an HTF and HSM medium. We would therefore like to develop a predictive model that allows freezing points to be forecast for different MS mixture compositions, thus circumventing the need to determine the phase diagram of each MS mixture experimentally. To model ternary/quaternary phase diagrams, parameters for the binary subsystems must be determined, which is the purpose of the present work. In a binary system with components A and B, under phase equilibrium conditions (e.g. liquid and solid) the chemical potentials (partial molar Gibbs energies) of each component in each phase are equal. For an ideal solution the mixing (A+B) Gibbs energy can be calculated as $\Delta G = \Delta H - T\Delta S = RT\,(x_A \ln x_A + x_B \ln x_B)$. For non-ideal solid/liquid mixtures, such as the nitrate/nitrite compositions investigated in this work, the actual value differs from the ideal one by an amount defined as the "mixing" (mix) Gibbs free energy. If the resulting mixture is assumed, as indicated in the previous literature, to follow a "regular solution" model, where all the non-ideality is included in the enthalpy of mixing, then considering, for instance, component A: $\Delta G \equiv 0 = (\Delta H_A - T\Delta S_A) + (\Delta\bar{H}_{mix,A}^{L} - T\Delta\bar{S}_{mix,A}^{L}) - (\Delta\bar{H}_{mix,A}^{S} - T\Delta\bar{S}_{mix,A}^{S})$, where the partial molar quantities can be calculated from the total values by the Gibbs-Duhem equation, $\Delta\bar{H}_{mix,A}^{L} = \Delta H_{mix}^{L} - x_B^{L}\,\frac{d\Delta H_{mix}^{L}}{dx_B^{L}}$ and $\Delta\bar{H}_{mix,A}^{S} = \Delta H_{mix}^{S} - x_B^{S}\,\frac{d\Delta H_{mix}^{S}}{dx_B^{S}}$, and, in general, it is possible to express the mixing enthalpy for solids and liquids as a function of the mol
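    For the ideal-solution limit mentioned in the abstract, the liquidus of each component follows the Schroeder-van Laar relation ln x_A = -(dH_fus,A/R)(1/T - 1/T_m,A); the sketch below traces both branches of a binary nitrate system and takes the crossing as an estimated ideal eutectic. The fusion temperatures and enthalpies are only approximate illustrative values, and this omits the regular-solution (mixing enthalpy) correction that the paper is actually after.

      import numpy as np

      R = 8.314  # J/(mol K)

      def ideal_liquidus(x_a, t_m, dh_fus):
          """Ideal (Schroeder-van Laar) liquidus: T = 1 / (1/T_m - R*ln(x_A)/dH_fus)."""
          return 1.0 / (1.0 / t_m - R * np.log(x_a) / dh_fus)

      # Approximate, illustrative fusion data for NaNO3 and KNO3.
      x = np.linspace(0.01, 0.99, 99)                     # mole fraction of NaNO3
      t_nano3 = ideal_liquidus(x, t_m=579.0, dh_fus=15.0e3)
      t_kno3  = ideal_liquidus(1.0 - x, t_m=607.0, dh_fus=10.0e3)
      liquidus = np.maximum(t_nano3, t_kno3)              # stable branch at each composition
      i_eut = np.argmin(liquidus)
      print(f"estimated ideal eutectic: x(NaNO3) = {x[i_eut]:.2f}, T = {liquidus[i_eut]:.0f} K")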

  2. Discovering temporal patterns in water quality time series, focusing on floods with the LDA method

    NASA Astrophysics Data System (ADS)

    Aubert, Alice Hélène; Tavenard, Romain; Emonet, Rémi; Malinowski, Simon; Guyet, Thomas; Quiniou, René; Odobez, Jean-Marc; Gascuel-Odoux, Chantal

    2013-04-01

    of several flood patterns. The output of LDA is a set of patterns that can easily be represented graphically. These patterns correspond to typical reactions to rainfall events. The patterns themselves are carefully studied, as is their distribution across the year and across the 12 years of the dataset. The novelties are fourfold. First, from a methodological point of view, we learn that hydrological data can be analysed with this LDA model, giving a typology of the multivariate chemical signature of floods. Second, we show that chemistry parameters alone are sufficient to obtain meaningful patterns; there is no need to include hydro-meteorological parameters to define the patterns, although they are useful for understanding the processes leading to these patterns. Third, our hypothesis of a season-specific reaction to rainfall is verified and further detailed, as is our hypothesis of different reactions to rainfall in years with different hydro-meteorological conditions. Fourth, this method allows the consideration of overlapping floods, which are usually not studied. We would recommend the use of such a model to study the chemical reaction of streams after rainfall events, or more broadly after any hydrological event. The typology provided by this method is a kind of bar code of water chemistry during floods. It could be well suited to comparing different geographical locations by using the same patterns and analysing the resulting differences in pattern distributions. (1) Aubert, A.H. et al., 2012. The chemical signature of a livestock farming catchment: synthesis from a high-frequency multi-element long term monitoring. HESSD, 9(8): 9715-9741. (2) Aubert, A.H., Gascuel-Odoux, C., Merot, P., 2013. Annual hysteresis of water quality: A method to analyse the effect of intra- and inter-annual climatic conditions. Journal of Hydrology, 478(0): 29-39. (3) Blei, D. M.; Ng, A. Y.; Jordan, M. I., 2003. Latent Dirichlet allocation. Journal of Machine
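
    Purely as an illustration of the modelling step described here, the sketch below (Python with scikit-learn, assumed available) fits an LDA topic model to flood "documents" built from discretized water-quality "words"; the bag-of-words construction, the counts and the number of patterns are all assumptions, not the authors' pipeline.

      import numpy as np
      from sklearn.decomposition import LatentDirichletAllocation

      rng = np.random.default_rng(1)
      # Each row is one flood event; each column counts how often a discretized
      # chemistry "word" (e.g. "nitrate_high", "DOC_rising") occurred in that event.
      word_counts = rng.integers(0, 10, size=(120, 40))

      lda = LatentDirichletAllocation(n_components=5, random_state=0)
      event_pattern_weights = lda.fit_transform(word_counts)  # mixture of patterns per flood
      pattern_word_profiles = lda.components_                 # chemical signature of each pattern
      print(event_pattern_weights.shape, pattern_word_profiles.shape)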

  3. Alkaline battery operational methodology

    DOEpatents

    Sholklapper, Tal; Gallaway, Joshua; Steingart, Daniel; Ingale, Nilesh; Nyce, Michael

    2016-08-16

    Methods of using specific operational charge and discharge parameters to extend the life of alkaline batteries are disclosed. The methods can be used with any commercial primary or secondary alkaline battery, as well as with newer alkaline battery designs, including batteries with flowing electrolyte. The methods include cycling batteries within a narrow operating voltage window, with minimum and maximum cut-off voltages that are set based on battery characteristics and environmental conditions. The narrow voltage window decreases available capacity but allows the batteries to be cycled for hundreds or thousands of times.
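
    As a hypothetical illustration of the narrow-window cycling idea described in this record, the sketch below (Python) switches between charge and discharge at cut-off voltages set inside the cell's full range; the toy cell model and every numeric value are placeholders, not parameters taken from the patent.

      class ToyCell:
          """Crude stand-in for a battery with a linear voltage response to current."""
          def __init__(self, voltage=1.20, volts_per_amp_second=1e-4):
              self.voltage = voltage
              self.k = volts_per_amp_second
          def step(self, current, dt=1.0):
              self.voltage += self.k * current * dt
              return self.voltage

      V_MIN, V_MAX = 1.10, 1.35   # hypothetical narrow-window cut-off voltages (V)

      def cycle_once(cell, i_charge=1.0, i_discharge=-1.0):
          while cell.step(i_charge) < V_MAX:      # charge until the upper cut-off
              pass
          while cell.step(i_discharge) > V_MIN:   # discharge until the lower cut-off
              pass

      cell = ToyCell()
      for _ in range(3):
          cycle_once(cell)
      print("voltage after three narrow-window cycles:", round(cell.voltage, 3))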

  4. A hybrid method for flood simulation in small catchments combining hydrodynamic and hydrological techniques

    NASA Astrophysics Data System (ADS)

    Bellos, Vasilis; Tsakiris, George

    2016-09-01

    The study presents a new hybrid method for the simulation of flood events in small catchments. It combines a physically-based two-dimensional hydrodynamic model with the hydrological unit hydrograph theory. Unit hydrographs are derived using the FLOW-R2D model, which is based on the full form of the two-dimensional Shallow Water Equations, solved by a modified McCormack numerical scheme. The method is tested on a small catchment in a suburb of Athens, Greece, for a storm event which occurred in February 2013. The catchment is divided into three friction zones and unit hydrographs of 15 and 30 min are produced. The infiltration process is simulated by the empirical Kostiakov equation and the Green-Ampt model. The results from the implementation of the proposed hybrid method are compared with recorded data at the hydrometric station at the outlet of the catchment and with the results derived from the fully hydrodynamic model FLOW-R2D. It is concluded that, for the case studied, the proposed hybrid method produces results close to those of the fully hydrodynamic simulation at substantially shorter computational time. This finding, if further verified in a variety of case studies, can be useful in devising effective hybrid tools for two-dimensional flood simulations which lead to accurate and considerably faster results than those achieved by fully hydrodynamic simulations.
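
    As a small illustration of one ingredient of the method, the sketch below (Python) solves the Green-Ampt cumulative-infiltration relation F = K·t + ψ·Δθ·ln(1 + F/(ψ·Δθ)) by fixed-point iteration; the soil parameters are placeholders, not values calibrated for the Athens catchment.

      import math

      def green_ampt_F(t_hr, K=1.0, psi=10.0, d_theta=0.3, iters=50):
          """Cumulative infiltration F (cm) after t_hr hours (K in cm/h, psi in cm)."""
          F = max(K * t_hr, 1e-9)
          for _ in range(iters):
              F = K * t_hr + psi * d_theta * math.log(1.0 + F / (psi * d_theta))
          return F

      for t in (0.5, 1.0, 2.0):
          print(t, "h ->", round(green_ampt_F(t), 3), "cm")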

  5. Advanced inorganic separators for alkaline batteries and method of making the same

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W. (Inventor)

    1983-01-01

    A flexible, porous battery separator includes a coating applied to a porous, flexible substrate. The coating comprises: (1) a thermoplastic rubber-based resin which is insoluble and unreactive in the alkaline electrolyte, (2) a polar organic plasticizer which is reactive with the alkaline electrolyte to produce a reaction product which contains a hydroxyl group and/or a carboxylic acid group, and (3) a mixture of polar particulate filler materials which are unreactive with the electrode. The mixture comprises at least one first filler material having a surface area of greater than 25 sq meters/gram and at least one second filler material having a surface area of 10 to 25 sq meters/gram. The volume of the mixture of filler materials is less than 45% of the total volume of the fillers and the binder. The filler surface area per gram of binder is about 20 to 60 sq meters/gram, and the amount of plasticizer is sufficient to coat each filler particle.

  6. Quantitative Analysis of Burden of Infectious Diarrhea Associated with Floods in Northwest of Anhui Province, China: A Mixed Method Evaluation

    PubMed Central

    Ding, Guoyong; Zhang, Ying; Gao, Lu; Ma, Wei; Li, Xiujun; Liu, Jing; Liu, Qiyong; Jiang, Baofa

    2013-01-01

    Background Persistent and heavy rainfall in the upper and middle reaches of the Huaihe River in China brought about severe floods during the end of June and July 2007. However, there has been no assessment of the association between the floods and infectious diarrhea. This study aimed to quantify the impact of the 2007 floods on the burden of disease due to infectious diarrhea in the northwest of Anhui Province. Methods A time-stratified case-crossover analysis was first conducted to examine the relationship between daily cases of infectious diarrhea and the 2007 floods in Fuyang and Bozhou of Anhui Province. Odds ratios (ORs) of the flood risk were quantified by conditional logistic regression. The years lived with disability (YLDs) of infectious diarrhea attributable to the floods were then estimated based on the WHO framework for calculating the potential impact fraction in the Burden of Disease study. Results A total of 197 cases of infectious diarrhea were notified during the exposure and control periods in the two study areas. The strongest effect was shown with a 2-day lag in Fuyang and a 5-day lag in Bozhou. Multivariable analysis showed that floods were significantly associated with an increased number of cases of infectious diarrhea (OR = 3.175, 95%CI: 1.126–8.954 in Fuyang; OR = 6.754, 95%CI: 1.954–23.344 in Bozhou). The attributable YLD per 1000 of infectious diarrhea resulting from the floods was 0.0081 in Fuyang and 0.0209 in Bozhou. Conclusions Our findings confirm that the floods significantly increased the risk of infectious diarrhea in the study areas. In addition, a prolonged moderate flood may cause a greater burden of infectious diarrhea than a severe flood of shorter duration. More attention should be paid to particularly vulnerable groups, including young children and the elderly, in developing public health preparation and intervention programs. The findings have significant implications for developing strategies to prevent and reduce the health impact of floods
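
    The following sketch (Python) only illustrates the general logic of an attributable-YLD calculation of the kind cited here: YLD per 1,000 population is scaled by an attributable fraction derived from the odds ratio, which is used as an approximation of relative risk. The population size, disability weight and duration are placeholders rather than the study's inputs, so the output will not reproduce the published figures.

      def attributable_yld_per_1000(cases, population, odds_ratio,
                                    disability_weight, duration_years):
          """Attributable YLD per 1,000 population, using the OR as an approximate RR."""
          yld_per_1000 = (cases / population) * disability_weight * duration_years * 1000.0
          attributable_fraction = (odds_ratio - 1.0) / odds_ratio  # among the exposed
          return yld_per_1000 * attributable_fraction

      # Placeholder inputs (only the odds ratio is taken from the abstract).
      print(attributable_yld_per_1000(cases=197, population=500000, odds_ratio=3.175,
                                      disability_weight=0.1, duration_years=0.02))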

  7. Development of a method for evaluating carbon dioxide miscible flooding prospects. Final report

    SciTech Connect

    Green, D.W.; Swift, G.W.

    1985-03-01

    Research was undertaken to develop a method of evaluating reservoirs as prospects for carbon dioxide flooding. Evaluation was to be based on a determination of miscibility pressure and displacement efficiency under idealized conditions. To reach the objective, project work was divided into five areas: (1) conducting of phase-equilibrium studies of carbon dioxide with synthetic oils; (2) application of an equation of state to simulate the phase behavior of carbon dioxide - oil systems; (3) conducting of linear displacements of crude oils and synthetic oils by carbon dioxide in a slim-tube apparatus; (4) application of the equation of state, the phase-behavior data and slim-tube data to develop a method of screening reservoirs for carbon dioxide flooding based on determination of minimum miscibility pressure and displacement efficiency; (5) development of a one-dimensional mathematical model, based on the equation of state, for application in conjunction with the results of Parts 1 to 4. The accomplishments for these five areas are discussed in five chapters. 44 references, 90 figures, 42 tables.

  8. Alkaline injection for enhanced oil recovery: a status report

    SciTech Connect

    Mayer, E.H.; Berg, R.L.; Carmichael, J.D.; Weinbrandt, R.M.

    1983-01-01

    In the past several years, there has been renewed interest in enhanced oil recovery (EOR) by alkaline injection. Alkaline solutions also are being used as preflushes in micellar/polymer projects. Several major field tests of alkaline flooding are planned, are in progress, or recently have been completed. Considerable basic research on alkaline injection has been published recently, and more is in progress. This paper summarizes known field tests and, where available, the amount of alkali injected and the performance results. Recent laboratory work, much of it sponsored by the U.S. DOE, is described along with its findings. Alkaline flood field test plans for new projects are summarized.

  9. A Comparison of Multisensor Precipitation Estimation Methods in Complex Terrain for Flash Flood Warning and Mitigation

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Chen, H.; Chandrasekar, C. V.; Willie, D.; Reynolds, D.; Campbell, C.; Zhang, Y.; Sukovich, E.

    2012-12-01

    the NWS forecast offices for issuance of flash flood warnings. This study will evaluate the performance of MPE and NMQ QPE products using independent gauges, object identification techniques for spatial verification and impact on surface runoff using a distributed hydrologic model. The effort will consist of baseline evaluations of these QPE systems to determine which combination of algorithm features is appropriate as well as investigate new methods for combining the gage and radar data. The Russian River Basin in California is used to demonstrate the comparison methodology with data collected from several rainfall events in March 2012.

  10. Investigation on the co-precipitation of transuranium elements from alkaline solutions by the method of appearing reagents

    SciTech Connect

    Krot, N.; Shilov, V.; Bessonov, A.; Budantseva, N.; Charushnikova, I.; Perminov, V.; Astafurova, L.

    1996-06-06

    Highly alkaline radioactive waste solutions originating from production of plutonium for military purposes are stored in underground tanks at the U.S. Department of Energy Hanford Site. The purification of alkaline solutions from neptunium and plutonium is important in the treatment and disposal of these wastes. This report describes scoping tests with sodium hydroxide solutions, where precipitation techniques were investigated to perform the separation. Hydroxides of iron (III), manganese (II), cobalt (II, III), and chromium (III); manganese (IV) oxide, and sodium uranate were investigated as carriers. The report describes the optimum conditions that were identified to precipitate these carriers homogeneously throughout the solution by reductive, hydrolytic, or catalytic decomposition of alkali-soluble precursor compounds by a technique called the Method of Appearing Reagents. The coprecipitation of pentavalent and hexavalent neptunium and plutonium was investigated for the candidate agents under optimum conditions and is described in this report along with the following results. Plutonium coprecipitated well with all tested materials except manganese (IV) oxide. Neptunium only coprecipitated well with uranate. The report presents a hypothesis to explain these behaviors. Further tests with more complex solution matrices must be performed.

  11. Designing a Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claim notices during that period, and insurance companies had to pay a total of $40 million for claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million for claims. To solve these kinds of problems, modern tools such as GIS and Remote Sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology and land use, which were extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 meter spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest neighbor classification by image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi Criteria Decision Analysis (MCDA) part of this software. Criteria and their sub-criteria were weighted and flood vulnerability was determined with MCDA-AHP. Also, daily flood data were collected from Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service-Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. Obtained results were verified using ground truth data and it has been clearly
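
    As an illustration of the AHP weighting step mentioned in this record, the sketch below (Python with numpy, assumed available) derives criterion weights from the principal eigenvector of a pairwise-comparison matrix and checks the consistency ratio; the comparison values are hypothetical, not those used in the study's software.

      import numpy as np

      criteria = ["slope", "aspect", "elevation", "geology", "land use"]
      # A[i, j] states how much more important criterion i is than criterion j.
      A = np.array([[1,    3,   2,   5,   4],
                    [1/3,  1,   1/2, 2,   1],
                    [1/2,  2,   1,   3,   2],
                    [1/5,  1/2, 1/3, 1,   1/2],
                    [1/4,  1,   1/2, 2,   1]])

      eigvals, eigvecs = np.linalg.eig(A)
      principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
      weights = principal / principal.sum()

      # Consistency ratio (random index RI = 1.12 for a 5 x 5 matrix).
      ci = (np.max(np.real(eigvals)) - len(A)) / (len(A) - 1)
      print(dict(zip(criteria, weights.round(3))), "CR =", round(ci / 1.12, 3))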

  12. Density Measurement of Molten Alkaline-Earth Fluorides Using Archimedean Dual-Sinker Method

    NASA Astrophysics Data System (ADS)

    Takeda, Osamu; Yanagase, Kei-ichi; Anbo, Yusuke; Aono, Masahiro; Hoshino, Yosuke; Sato, Yuzuru

    2015-11-01

    The densities of molten alkaline-earth fluorides (MgF2, CaF2, SrF2, and BaF2) were measured over the temperature range from 1526 K to 1873 K at ambient pressure using an Archimedean dual-sinker densitometer designed and set up by the authors. The volume difference between two sinkers was precisely determined by considering the wetting conditions between tungsten sinkers and water; appropriate experimental techniques were developed. The wetting condition became unstable when the sinkers were being moved for immersion in water, because the sinkers were moved in a direction that increased the contact angle. The wetting condition became stable when the sinkers were pulled up from the water, because the sinkers were moved in a direction that decreased the contact angle. The force exerted by the surface tension was efficiently canceled, and the volume difference became constant when the sinkers were pulled up. In this study, the total uncertainty was about 0.3 % at a maximum. The densities measured at high temperatures showed good linearity, with small scatter, over a wide temperature range. The densities and molar volumes increased in the following order: MgF2, CaF2, SrF2, and BaF2. The thermal-expansion coefficients showed anomalous behavior. The large thermal-expansion coefficient of MgF2 is attributed to a decrease in the cohesive force as a result of a partial loss of the coulombic force, because of the high charge density.

  13. Climate-related hazards: a method for global assessment of urban and rural population exposure to cyclones, droughts, and floods.

    PubMed

    Christenson, Elizabeth; Elliott, Mark; Banerjee, Ovik; Hamrick, Laura; Bartram, Jamie

    2014-02-21

    Global climate change (GCC) has led to increased focus on the occurrence of, and preparation for, climate-related extremes and hazards. Population exposure, the relative likelihood that a person in a given location was exposed to a given hazard event(s) in a given period of time, was the outcome for this analysis. Our objectives were to develop a method for estimating the population exposure at the country level to the climate-related hazards cyclone, drought, and flood; develop a method that readily allows the addition of better datasets to an automated model; differentiate population exposure of urban and rural populations; and calculate and present the results of exposure scores and ranking of countries based on the country-wide, urban, and rural population exposures to cyclone, drought, and flood. Gridded global datasets on cyclone, drought and flood occurrence as well as population density were combined and analysis was carried out using ArcGIS. Results presented include global maps of ranked country-level population exposure to cyclone, drought, flood and multiple hazards. Analyses by geography and human development index (HDI) are also included. The results and analyses of this exposure assessment have implications for country-level adaptation. It can also be used to help prioritize aid decisions and allocation of adaptation resources between countries and within a country. This model is designed to allow flexibility in applying cyclone, drought and flood exposure to a range of outcomes and adaptation measures.
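
    A minimal sketch of the gridded-overlay idea is given below (Python with numpy, assumed available): population exposure to a hazard is approximated by summing population over the cells in which the hazard occurred and normalising by total population. The arrays are random placeholders standing in for the global datasets the authors processed in ArcGIS.

      import numpy as np

      rng = np.random.default_rng(2)
      population = rng.random((180, 360)) * 1e4         # people per grid cell (placeholder)
      flood_occurrence = rng.random((180, 360)) > 0.9   # cells with recorded flood events

      exposed = population[flood_occurrence].sum()
      exposure_score = exposed / population.sum()
      print(f"share of population exposed to flood: {exposure_score:.1%}")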

  14. Climate-Related Hazards: A Method for Global Assessment of Urban and Rural Population Exposure to Cyclones, Droughts, and Floods

    PubMed Central

    Christenson, Elizabeth; Elliott, Mark; Banerjee, Ovik; Hamrick, Laura; Bartram, Jamie

    2014-01-01

    Global climate change (GCC) has led to increased focus on the occurrence of, and preparation for, climate-related extremes and hazards. Population exposure, the relative likelihood that a person in a given location was exposed to a given hazard event(s) in a given period of time, was the outcome for this analysis. Our objectives were to develop a method for estimating the population exposure at the country level to the climate-related hazards cyclone, drought, and flood; develop a method that readily allows the addition of better datasets to an automated model; differentiate population exposure of urban and rural populations; and calculate and present the results of exposure scores and ranking of countries based on the country-wide, urban, and rural population exposures to cyclone, drought, and flood. Gridded global datasets on cyclone, drought and flood occurrence as well as population density were combined and analysis was carried out using ArcGIS. Results presented include global maps of ranked country-level population exposure to cyclone, drought, flood and multiple hazards. Analyses by geography and human development index (HDI) are also included. The results and analyses of this exposure assessment have implications for country-level adaptation. It can also be used to help prioritize aid decisions and allocation of adaptation resources between countries and within a country. This model is designed to allow flexibility in applying cyclone, drought and flood exposure to a range of outcomes and adaptation measures. PMID:24566046

  15. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the

  16. What Happens During a Minor Flood: Observations of Bedload Transport in a Gravel Bed River using New Methods

    NASA Astrophysics Data System (ADS)

    Bray, E. N.; Dunne, T.

    2015-12-01

    The question of "does the streambed change over a flood" does not have a clear answer due to lack of measurement methods during high flows. We seek to inform our understanding of bedload transport by linking field measurements using fiber optic distributed temperature sensing (DTS) cable, calculations of disentrainment over time and distance, and in situ measurements of streambed permeability with sediment transport theory and an existing explicit analytical solution to predict depth of sediment deposition and one-dimensional fluid velocity from amplitude and phase information. The method facilitates the study of gravel transport by using near-bed temperature time series to estimate rates of sediment deposition continuously over the duration of a minor flood coinciding with bar formation, including (1) a field method for measuring local rates of deposition and bed elevation change during a minor flood to compute rates of bedload transport, (2) use of an existing analytical solution to quantify the depth of sediment deposition over distance and time from temperature amplitude and phase information, (3) observational and theoretical evidence that incipient motion occurs during a minor flood, (4) observational evidence that suggests rates of sediment transport are not necessarily constant during a constant flow, and (5) field evidence for the persistence of armor layers in gravel bed rivers during a minor flood. These observations of partial bedload transport, taken along a 2 km gravel bed reach of the San Joaquin River, CA, USA during an experimental flow release, suggest that the discharge needed to create the boundary shear is lower than previous estimates, and that partial transport of grain sizes on the bed, including the median particle size, occurs during a minor flood with a current recurrence interval of approximately 1-2 years.

  17. Effects of Phytophthora cinnamomi isolate, inoculum delivery method, flood, and drought on vigor, disease severity and mortality of blueberry plants

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Four studies evaluated the effects of cultivar, inoculum delivery method, flood, and drought on plant vigor, disease severity, and mortality of blueberry plants grown in pots in the greenhouse. Phytophthora cinnamomi isolates were obtained from the root zone of blueberry plants displaying symptoms...

  18. Technical Note: Initial assessment of a multi-method approach to spring-flood forecasting in Sweden

    NASA Astrophysics Data System (ADS)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2016-02-01

    Hydropower is a major energy source in Sweden, and proper reservoir management prior to the spring-flood onset is crucial for optimal production. This requires accurate forecasts of the accumulated discharge in the spring-flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialized set-up of the HBV model. In this study, a number of new approaches to spring-flood forecasting that reflect the latest developments with respect to analysis and modelling on seasonal timescales are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for the Swedish river Vindelälven over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogous years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring-flood period. In the third approach, statistical relationships between the SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for early forecasts improvements of up to 25 % are found. This potential is reasonably well realized in a multi-method system, which over all forecast dates reduced the error in the SFV by ~4 %. This improvement is limited but potentially significant for e.g. energy trading.

  19. Development of a national Flash flood warning system in France using the AIGA method: first results and main issues

    NASA Astrophysics Data System (ADS)

    Javelle, Pierre; Organde, Didier; Demargne, Julie; de Saint-Aubin, Céline; Garandeau, Léa; Janet, Bruno; Saint-Martin, Clotilde; Fouchier, Catherine

    2016-04-01

    Developing a national flash flood (FF) warning system is an ambitious and difficult task. On the one hand it raises huge expectations from exposed populations and authorities, since the induced damages are considerable (e.g. 20 casualties in the October 2015 flood on the French Riviera). On the other hand, many practical and scientific issues have to be addressed and the limitations should be clearly stated. The FF warning system to be implemented by 2016 in France by the SCHAPI (the French national service in charge of flood forecasting) will be based on a discharge-threshold flood warning method called AIGA (Javelle et al., 2014). The AIGA method has been tested in real time in the south of France in the RHYTMME project (http://rhytmme.irstea.fr). It consists in comparing discharges generated by a simple conceptual hourly hydrologic model, run at a 1-km² resolution, to reference flood quantiles of different return periods, at any point along the river network. The hydrologic model ingests operational rainfall radar-gauge products from Météo-France. Model calibration was based on ~700 hydrometric stations over the 2002-2015 period, and hourly discharges were then computed at ~76,000 catchment outlets, with areas ranging from 10 to 3,500 km², over the last 19 years. This product makes it possible to calculate reference flood quantiles at each outlet. The on-going evaluation of the FF warnings is currently made at two levels: in a 'classical' way, using discharges available at the hydrometric stations, but also in a more 'exploratory' way, by comparing past flood reports with warnings issued by the system over the 76,000 catchment outlets. The interest of the latter method is that it better fits the system objectives, since the system is designed to monitor small ungauged catchments. Javelle, P., Demargne, J., Defrance, D., Pansu, J., Arnaud, P. (2014). Evaluating flash-flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system
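
    A much simplified sketch of a discharge-threshold warning rule of the AIGA type is shown below (Python): the simulated discharge at an outlet is compared with reference flood quantiles and mapped to a warning level. The quantile values and the three-colour scale are illustrative assumptions, not the operational thresholds.

      def warning_level(q_sim, q2, q10, q50):
          """Map a simulated discharge onto a warning colour using reference flood quantiles."""
          if q_sim >= q50:
              return "red"
          if q_sim >= q10:
              return "orange"
          if q_sim >= q2:
              return "yellow"
          return "green"

      print(warning_level(q_sim=135.0, q2=60.0, q10=120.0, q50=210.0))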

  20. A quantitative evaluation method of flood risks in low-lying areas associated with increase of heavy rainfall in Japan

    NASA Astrophysics Data System (ADS)

    Minakawa, H.; Masumoto, T.

    2012-12-01

    An increase in flood risk, especially in low-lying areas, is predicted as a consequence of global climate change or other causes. Immediate measures such as the strengthening of drainage capacity are needed to minimize the damage caused by more-frequent flooding. Typically, drainage pump capacities in paddy areas are planned using the results of a drainage analysis with a design rainfall (e.g. the 3-day rainfall amount with a 10-year return period). However, the result depends on the hyetograph of the input rainfall even if the total rainfall amount is equal, and the flood risk may differ between rainfall patterns. Therefore, it is important to assume various patterns of heavy rainfall for flood risk assessment. On the other hand, rainfall synthesis simulation is useful for generating many patterns of rainfall data for flood studies. We previously proposed a rainfall simulation method called the diurnal rainfall pattern generator, which can generate short-time-step rainfall and its internal pattern. This study discusses a quantitative evaluation method for detecting the relationship between flood damage risk and heavy rainfall scale by using the diurnal rainfall pattern generator. In addition, we also estimated flood damage, focusing on rice yield. Our study area was the Kaga three-lagoon basin in Ishikawa Prefecture, Japan. There are two lagoons in the study area, and the low-lying paddy areas extend over about 4,000 ha in the lower reaches of the basin. First, we developed a drainage analysis model that incorporates kinematic and diffusive runoff models for calculating water levels in channels and paddies. Next, the heavy rainfall data for drainage analysis were generated. Here, 3-day rainfall amounts for nine different return periods (2-, 3-, 5-, 8-, 10-, 15-, 50-, 100-, and 200-year) were derived, and three hundred hyetograph patterns were generated for each rainfall amount by using the diurnal rainfall pattern generator. Finally, all data

  1. Extreme flood estimation by the SCHADEX method in a snow-driven catchment: application to Atnasjø (Norway)

    NASA Astrophysics Data System (ADS)

    Paquet, Emmanuel; Lawrence, Deborah

    2013-04-01

    The SCHADEX method for extreme flood estimation was developed by Paquet et al. (2006, 2013), and since 2008 it has been the reference method used by Electricité de France (EDF) for dam spillway design. SCHADEX is a so-called "semi-continuous" stochastic simulation method in that flood events are simulated on an event basis and are superimposed on a continuous simulation of the catchment saturation hazard using rainfall-runoff modelling. The MORDOR hydrological model (Garçon, 1999) has thus far been used for the rainfall-runoff modelling. MORDOR is a conceptual, lumped, reservoir model with daily areal rainfall and air temperature as the driving input data. The principal hydrological processes represented are evapotranspiration, direct and indirect runoff, ground water, snow accumulation and melt, and routing. The model has been used intensively at EDF for more than 15 years, in particular for inflow forecasts for French mountainous catchments. SCHADEX has now also been applied to the Atnasjø catchment (463 km²), a well-documented inland catchment in south-central Norway, dominated by snowmelt flooding during spring/early summer. To support this application, a weather pattern classification based on extreme rainfall was first established for Norway (Fleig, 2012). This classification scheme was then used to build a Multi-Exponential Weather Pattern (MEWP) distribution, as introduced by Garavaglia et al. (2010) for extreme rainfall estimation. The MORDOR model was then calibrated against daily discharge data for Atnasjø. Finally, a SCHADEX simulation was run to build a daily discharge distribution with a sufficient number of simulations for assessing the extreme quantiles. Detailed results are used to illustrate how SCHADEX handles the complex and interacting hydrological processes driving flood generation in this snow-driven catchment. Seasonal and monthly distributions, as well as statistics for several thousand simulated events reaching a 1000-year return level

  2. Development of a method for the regeneration of an alkaline electrolyte in an air-aluminum chemical power supply

    NASA Astrophysics Data System (ADS)

    Pushkin, K. V.; Sevruk, S. D.; Suvorova, E. V.; Farmakovskaya, A. A.

    2015-12-01

    The results of a study on the development of a regeneration technology for the spent alkaline electrolyte in an air-aluminum chemical power supply are presented. The application of this technology is a component of the wasteless and environmentally friendly operation of an energy installation based on an air-aluminum chemical power supply. The operability of an energy installation based on the air-aluminum chemical power supply using regenerated alkaline electrolytes is experimentally confirmed. Technical requirements for the technological equipment for alkaline electrolyte regeneration are developed on the basis of the obtained results.

  3. A simple alkaline method for decellularizing human amniotic membrane for cell culture.

    PubMed

    Saghizadeh, Mehrnoosh; Winkler, Michael A; Kramerov, Andrei A; Hemmati, David M; Ghiam, Chantelle A; Dimitrijevich, Slobodan D; Sareen, Dhruv; Ornelas, Loren; Ghiasi, Homayon; Brunken, William J; Maguen, Ezra; Rabinowitz, Yaron S; Svendsen, Clive N; Jirsova, Katerina; Ljubimov, Alexander V

    2013-01-01

    Human amniotic membrane is a standard substratum used to culture limbal epithelial stem cells for transplantation to patients with limbal stem cell deficiency. Various methods were developed to decellularize amniotic membrane, because denuded membrane is poorly immunogenic and better supports repopulation by dissociated limbal epithelial cells. Amniotic membrane denuding usually involves treatment with EDTA and/or proteolytic enzymes; in many cases additional mechanical scraping is required. Although ensuring limbal cell proliferation, these methods are not standardized, require relatively long treatment times and can result in membrane damage. We propose to use 0.5 M NaOH to reliably remove amniotic cells from the membrane. This method was used before to lyse cells for DNA isolation and radioactivity counting. Gently rubbing a cotton swab soaked in NaOH over the epithelial side of amniotic membrane leads to nearly complete and easy removal of adherent cells in less than a minute. The denuded membrane is subsequently washed in a neutral buffer. Cell removal was more thorough and uniform than with EDTA, or EDTA plus mechanical scraping with an electric toothbrush, or n-heptanol plus EDTA treatment. NaOH-denuded amniotic membrane did not show any perforations compared with mechanical or thermolysin denuding, and showed excellent preservation of immunoreactivity for major basement membrane components including laminin α2, γ1-γ3 chains, α1/α2 and α6 type IV collagen chains, fibronectin, nidogen-2, and perlecan. Sodium hydroxide treatment was efficient with fresh or cryopreserved (10% dimethyl sulfoxide or 50% glycerol) amniotic membrane. The latter method is a common way of membrane storage for subsequent grafting in the European Union. NaOH-denuded amniotic membrane supported growth of human limbal epithelial cells, immortalized corneal epithelial cells, and induced pluripotent stem cells. This simple, fast and reliable method can be used to standardize

  4. A novel fluorescence detection method for in situ hybridization, based on the alkaline phosphatase-fast red reaction.

    PubMed

    Speel, E J; Schutte, B; Wiegant, J; Ramaekers, F C; Hopman, A H

    1992-09-01

    We have used naphthol-ASMX-phosphate and Fast Red TR in combination with alkaline phosphatase (APase) to produce fluorescent precipitated reaction products in a non-radioactive in situ hybridization (ISH) method. To obtain optimal and discrete localization of the strongly red fluorescent ISH signals, the enzyme precipitation procedure was optimized. The optimal reaction time and the concentrations of substrate and capture agent were determined. Furthermore, polyvinyl alcohol (PVA) was used to increase the viscosity of the reaction mixture and thus to reduce diffusion of the reaction product. Our results show that the APase-Fast Red detection method has at least the same sensitivity as currently observed in other immunofluorescent detection systems. A single copy DNA sequence of 15.8 kb could be localized with high efficiency in metaphase spreads and in interphase nuclei. Double labeling procedures, in which the FITC- and azo-dye fluorescence are combined, are also feasible. The red fluorescent ISH signals showed hardly any fading as compared with FITC fluorescence on exposure to either light from the mercury-arc lamp or laser light. Therefore, these red fluorescent signals with a virtually permanent character allow a better analysis and three-dimensional localization of such cytochemically detected genomic fractions by means of confocal scanning laser microscopy as compared with the use of FITC, TRITC, or Texas Red as label. PMID:1506667

  5. Fire flood method for recovering petroleum from oil reservoirs of low permeability and temperature

    DOEpatents

    Kamath, Krishna

    1984-08-14

    The present invention is directed to a method of enhanced oil recovery by fire flooding petroleum reservoirs characterized by a temperature of less than the critical temperature of carbon dioxide, a pore pressure greater than the saturated vapor pressure of carbon dioxide at said temperature (87.7 °F at 1070 psia), and a permeability in the range of about 20 to 100 millidarcies. The in situ combustion of petroleum in the reservoir is provided by injecting into the reservoir a combustion-supporting medium consisting essentially of oxygen, ozone, or a combination thereof. The heat of combustion and the products of this combustion, which consist essentially of gaseous carbon dioxide and water vapor, sufficiently decrease the viscosity of oil adjacent to the fire front to form an oil bank which moves through the reservoir towards a recovery well ahead of the fire front. The gaseous carbon dioxide and the water vapor are driven into the reservoir ahead of the fire front by pressure at the injection well. As the gaseous carbon dioxide cools to less than about 88 °F it is converted to liquid which is dissolved in the oil bank, further increasing its mobility. By using essentially pure oxygen, ozone, or a combination thereof as the combustion-supporting medium in these reservoirs, the permeability requirements of the reservoirs are significantly decreased, since the liquid carbon dioxide requires substantially less voidage volume than that required for gaseous combustion products.

  6. Physicochemical properties of the alumina produced by alkaline and acidic methods

    NASA Astrophysics Data System (ADS)

    Vetchinkina, T. N.

    2009-04-01

    Crystal-optical, X-ray diffraction, and thermogravimetric methods are used to study the polymorphic transformations in the products of calcination of the aluminum hydroxide produced by the decomposition and carbonization of aluminate solutions; the aluminum oxide produced by the decomposition of pure grade crystal hydrates of aluminum salts; and the alumina extracted upon the beneficiation of the mineral part of coaly rock with sulfuric, hydrochloric, and nitric acids. The morphology of the products of the thermal decomposition of the initial compounds is examined. The effect of impurities and a reducing agent on the formation of the structural modifications of alumina during heat treatment is revealed.

  7. A Mixed-Method Study of Princeville's Rebuilding from the Flood of 1999: Lessons on the Importance of Invisible Community Assets

    ERIC Educational Resources Information Center

    Yoon, Intae

    2009-01-01

    Guided by previous studies and the community assets perspective, a concurrent mixed-method case study was conducted five years after a devastating flood to investigate how invisible community assets played a role in Princeville's rebuilding process from the flood of 1999. The independent variables in this study included retrospectively assessed…

  8. Evaluation of various parameters of calcium-alginate immobilization method for enhanced alkaline protease production by Bacillus licheniformis NCIM-2042 using statistical methods.

    PubMed

    Potumarthi, Ravichandra; Subhakar, Ch; Pavani, A; Jetty, Annapurna

    2008-04-01

    The calcium-alginate immobilization method for the production of alkaline protease by Bacillus licheniformis NCIM-2042 was optimized statistically. Four variables, namely sodium-alginate concentration, calcium chloride concentration, inoculum size and agitation speed, were optimized by a 2^4 full factorial central composite design, with subsequent analysis and model validation by a second-order regression equation. Eleven carbon, 11 organic nitrogen and seven inorganic nitrogen sources were screened by a two-level Plackett-Burman design for maximum alkaline protease production using the optimized immobilization conditions. The levels of the four variables, namely Na-alginate, 2.78%; CaCl2, 2.15%; inoculum size, 8.10%; and agitation, 139 rpm, were found to be optimal for maximal protease production. Glucose, soybean meal and ammonium sulfate resulted in maximum protease production of 644 U/ml, 720 U/ml, and 806 U/ml when screened as carbon, organic nitrogen and inorganic nitrogen sources, respectively, using the optimized immobilization conditions. Repeated fed-batch operation, using the optimized immobilization conditions, allowed continuous operation for 12 cycles without disintegration of the beads. Cross-sectional scanning electron microscope images showed the growth pattern of B. licheniformis in the Ca-alginate immobilized beads.

  9. Development of Alkaline Oxidative Dissolution Methods for Chromium (III) Compounds Present in Hanford Site Tank Sludges

    SciTech Connect

    NN Krot; VP Shilov; AM Fedoseev; NA Budantseva; MV Nikonov; AB Yusov; AYu Garnov; IA Charushnikova; VP Perminov; LN Astafurova; TS Lapitskaya; VI Makarenkov

    1999-07-02

    The high-level radioactive waste sludge in the underground storage tanks at the Hanford Site contains various chromium(III) solid phases. Dissolution and removal of chromium from tank waste sludges is desirable prior to high-level waste vitrification because increased volume is required to incorporate the residual chromium. Unfortunately, dissolution of chromium from the sludge to form Cr(OH)4^- through treatment with heated NaOH solution (also used to dissolve aluminum phases and metathesize phosphates to sodium salts) generally has been unsuccessful in tests with both simulated and genuine Hanford waste sludges. Oxidative dissolution of the Cr(III) compounds to form soluble chromate has been proposed as an alternative chromium solid phase dissolution method and results of limited prior testing have been reported.

  10. Flood hazard assessment in areas prone to flash flooding

    NASA Astrophysics Data System (ADS)

    Kvočka, Davor; Falconer, Roger A.; Bray, Michaela

    2016-04-01

    Contemporary climate projections suggest that there will be an increase in the occurrence of high-intensity rainfall events in the future. These precipitation extremes are usually the main cause for the emergence of extreme flooding, such as flash flooding. Flash floods are among the most unpredictable, violent and fatal natural hazards in the world. Furthermore, it is expected that flash flooding will occur even more frequently in the future due to more frequent development of extreme weather events, which will greatly increase the danger to people caused by flash flooding. This being the case, there will be a need for high resolution flood hazard maps in areas susceptible to flash flooding. This study investigates what type of flood hazard assessment methods should be used for assessing the flood hazard to people caused by flash flooding. Two different types of flood hazard assessment methods were tested: (i) a widely used method based on an empirical analysis, and (ii) a new, physically based and experimentally calibrated method. Two flash flood events were considered herein, namely: the 2004 Boscastle flash flood and the 2007 Železniki flash flood. The results obtained in this study suggest that in the areas susceptible to extreme flooding, the flood hazard assessment should be conducted using methods based on a mechanics-based analysis. In comparison to standard flood hazard assessment methods, these physically based methods: (i) take into account all of the physical forces, which act on a human body in floodwater, (ii) successfully adapt to abrupt changes in the flow regime, which often occur for flash flood events, and (iii) rapidly assess a flood hazard index in a relatively short period of time.
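
    For context on the empirical class of methods referred to in this record, one widely used hazard rating from UK guidance combines flow depth, velocity and a debris factor as HR = d(v + 0.5) + DF; the sketch below (Python) evaluates it for a few depth-velocity pairs. This is only an illustration of the empirical approach and does not reproduce the study's mechanics-based method.

      def hazard_rating(depth_m, velocity_ms, debris_factor=0.5):
          """Empirical flood hazard rating HR = d * (v + 0.5) + DF (UK guidance form)."""
          return depth_m * (velocity_ms + 0.5) + debris_factor

      for d, v in [(0.3, 0.5), (0.8, 2.0), (1.5, 3.5)]:
          print(d, v, "->", round(hazard_rating(d, v), 2))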

  11. A fast method for optical simulation of flood maps of light-sharing detector modules

    PubMed Central

    Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W.; Peng, Qiyu

    2016-01-01

    Optical simulation of the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200–600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials.

  12. A fast method for optical simulation of flood maps of light-sharing detector modules

    PubMed Central

    Shi, Han; Du, Dong; Xu, JianFeng; Moses, William W.; Peng, Qiyu

    2016-01-01

    Optical simulation of the detector module level is highly desired for Positron Emission Tomography (PET) system design. Commonly used simulation toolkits such as GATE are not efficient in the optical simulation of detector modules with complicated light-sharing configurations, where a vast amount of photons need to be tracked. We present a fast approach based on a simplified specular reflectance model and a structured light-tracking algorithm to speed up the photon tracking in detector modules constructed with polished finish and specular reflector materials. We simulated conventional block detector designs with different slotted light guide patterns using the new approach and compared the outcomes with those from GATE simulations. While the two approaches generated comparable flood maps, the new approach was more than 200–600 times faster. The new approach has also been validated by constructing a prototype detector and comparing the simulated flood map with the experimental flood map. The experimental flood map has nearly uniformly distributed spots similar to those in the simulated flood map. In conclusion, the new approach provides a fast and reliable simulation tool for assisting in the development of light-sharing-based detector modules with a polished surface finish and using specular reflector materials. PMID:27660376

  13. A time-series analysis framework for the flood-wave method to estimate groundwater model parameters

    NASA Astrophysics Data System (ADS)

    Obergfell, Christophe; Bakker, Mark; Maas, Kees

    2016-06-01

    The flood-wave method is implemented within the framework of time-series analysis to estimate aquifer parameters for use in a groundwater model. The resulting extended flood-wave method is applicable to situations where groundwater fluctuations are affected significantly by time-varying precipitation and evaporation. Response functions for time-series analysis are generated with an analytic groundwater model describing stream-aquifer interaction. Analytical response functions play the same role as the well function in a pumping test, which is to translate observed head variations into groundwater model parameters by means of a parsimonious model equation. An important difference as compared to the traditional flood-wave method and pumping tests is that aquifer parameters are inferred from the combined effects of precipitation, evaporation, and stream stage fluctuations. Naturally occurring fluctuations are separated in contributions from different stresses. The proposed method is illustrated with data collected near a lowland river in the Netherlands. Special emphasis is put on the interpretation of the streambed resistance. The resistance of the streambed is the result of stream-line contraction instead of a semi-pervious streambed, which is concluded through comparison with the head loss calculated with an analytical two-dimensional cross-section model.

  14. Assimilation of spatially distributed water levels into a shallow-water flood model. Part I: Mathematical method and test case

    NASA Astrophysics Data System (ADS)

    Lai, X.; Monnier, J.

    2009-10-01

    Recent applications of remote sensing techniques produce rich spatially distributed observations for flood monitoring. In order to improve numerical flood prediction, we have developed a variational data assimilation method (4D-var) that combines remote sensing data (spatially distributed water levels extracted from spatial images) and a 2D shallow water model. In the present paper (part I), we demonstrate the efficiency of the method with a test case. First, we assimilated a single fully observed water level image to identify time-independent parameters (e.g. Manning coefficients and initial conditions) and time-dependent parameters (e.g. inflow). Second, we combined incomplete observations (a time series of water elevations at certain points and one partial image). This last configuration was very similar to the real case we analyze in a forthcoming paper (part II). In addition, a temporal strategy with time overlapping is suggested to decrease the amount of memory required for long-duration simulation.

  15. Ambient formic acid in southern California air: A comparison of two methods, Fourier transform infrared spectroscopy and alkaline trap-liquid chromatography with UV detection

    SciTech Connect

    Grosjean, D. ); Tuazon, E.C. ); Fujita, E. )

    1990-01-01

    Formic acid is a ubiquitous component of urban smog. Sources of formic acid in urban air include direct emissions from vehicles and the in situ reaction of ozone with olefins. Ambient levels of formic acid in southern California air were first measured some 15 years ago by Hanst et al. using long-path Fourier transform infrared spectroscopy (FTIR). All subsequent studies of formic acid in the Los Angeles area have involved the use of two methods, either FTIR or collection on alkaline traps followed by gas chromatography, ion chromatography, or liquid chromatography analysis with UV detection (ATLC-UV). The Carbon Species Methods Comparison Study (CSMCS), a multilaboratory air quality study carried out in August 1986 at a southern California smog receptor site, provided an opportunity for direct field comparison of the FTIR and alkaline trap methods. The results of the comparison are presented in this brief report.

  16. The effect of alkaline agents on retention of EOR chemicals

    SciTech Connect

    Lorenz, P.B.

    1991-07-01

    This report summarizes a literature survey on how alkaline agents reduce losses of surfactants and polymers in oil recovery by chemical injection. Data are reviewed for crude sulfonates, clean anionic surfactants, nonionic surfactants, and anionic and nonionic polymers. The role of mineral chemistry is briefly described. Specific effects of various alkaline anions are discussed. Investigations needed to improve the design of alkaline-surfactant-polymer floods are suggested. 62 refs., 28 figs., 6 tabs.

  17. A method to calibrate channel friction and bathymetry parameters of a Sub-Grid hydraulic model using SAR flood images

    NASA Astrophysics Data System (ADS)

    Wood, M.; Neal, J. C.; Hostache, R.; Corato, G.; Chini, M.; Giustarini, L.; Matgen, P.; Wagener, T.; Bates, P. D.

    2015-12-01

    Synthetic Aperture Radar (SAR) satellites are capable of all-weather day and night observations that can discriminate between land and smooth open water surfaces over large scales. Because of this, there has been much interest in the use of SAR satellite data to improve our understanding of water processes, in particular for fluvial flood inundation mechanisms. Past studies have shown that integrating SAR-derived data with hydraulic models can improve simulations of flooding. However, while much of this work focusses on improving model channel roughness values or inflows in ungauged catchments, improvement of model bathymetry is often overlooked. The provision of good bathymetric data is critical to the performance of hydraulic models but there are only a small number of ways to obtain bathymetry information where no direct measurements exist. Spatially distributed river depths are also rarely available. We present a methodology for calibration of model average channel depth and roughness parameters concurrently using SAR images of flood extent and a Sub-Grid model utilising hydraulic geometry concepts. The methodology uses real data from the European Space Agency's archive of ENVISAT[1] Wide Swath Mode images of the River Severn between Worcester and Tewkesbury during flood peaks between 2007 and 2010. Historic ENVISAT WSM images are currently free and easy to access from archive but the methodology can be applied with any available SAR data. The approach makes use of the SAR image processing algorithm of Giustarini[2] et al. (2013) to generate binary flood maps. A unique feature of the calibration methodology is to also use parameter 'identifiability' to locate the parameters with higher accuracy from a pre-assigned range (adopting the DYNIA method proposed by Wagener[3] et al., 2003). [1] https://gpod.eo.esa.int/services/ [2] Giustarini. 2013. 'A Change Detection Approach to Flood Mapping in Urban Areas Using TerraSAR-X'. IEEE Transactions on Geoscience and Remote

  18. A novel method to suppress the dispersal of Japanese cedar pollen by inducing morphologic changes with weak alkaline solutions.

    PubMed

    Ishii, K; Hamamoto, H; Sekimizu, K

    2007-10-01

    Inhalation of airborne pollen causes irritative symptoms in humans, known as pollinosis. The changing global climate and increased pollution contribute to enhanced pollen release, thereby increasing the number of people suffering from allergies. We examined the effect on pollen release of spraying weak alkaline solutions onto cedar trees, the main allergenic culprit in Japan. Weak alkaline solutions were sprayed onto Japanese cedar blossoms to disrupt the external walls of the pollen and to induce swelling of the cytosolic components containing the nucleus. This morphologic change of the pollen grains depended on the pH of the suspending solution, with a threshold near pH 7.5. Because the breakdown of the external walls and swelling of the cytosolic components were inhibited by high osmolarity, the influx of water is what triggered the morphologic changes. Weak alkaline solutions sprayed onto cedar blossoms decreased the amount of pollen released from the anthers in a pH-dependent manner. The addition of detergent to the sodium bicarbonate solution facilitated this effect on cedar pollen release. We suggest that spraying cedar and cypress forests with a weak alkaline solution might prevent the scattering of pollen that causes allergies in humans.

  19. A new method for the determination of the nitrogen content of nitrocellulose based on the molar ratio of nitrite-to-nitrate ions released after alkaline hydrolysis.

    PubMed

    Alinat, Elodie; Delaunay, Nathalie; Archer, Xavier; Mallet, Jean-Maurice; Gareil, Pierre

    2015-04-01

    A new method was proposed to determine the nitrogen content of nitrocelluloses (NCs). It is based on the finding of a linear relationship between the nitrogen content and the molar ratio of nitrite-to-nitrate ions released after alkaline hydrolysis. Capillary electrophoresis was used to monitor the concentrations of nitrite and nitrate ions. The influences of hydrolysis time and molar mass of NC on the molar ratio of nitrite-to-nitrate ions were investigated, and new insights into the alkaline denitration mechanism of NCs underlying this analytical strategy are provided. The method was then tested successfully with various explosive and non-explosive NC-containing samples such as various daily products and smokeless gunpowders. Inherently to its principle of exploiting a concentration ratio, this method shows very good repeatability in the determination of nitrogen content in real samples, with relative standard deviations (n = 3) below 1.5%. It also provides very significant advantages with respect to sample extraction, analysis time (1 h for alkaline hydrolysis, 3 min for electrophoretic separation), which was about 5 times shorter than for the classical Devarda's method currently used in industry, and safety conditions (no need for preliminary drying of NC samples, mild hydrolysis conditions with 1 M sodium hydroxide for 1 h at 60 °C). PMID:25562808
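
    As a minimal sketch of how such a calibration would be used, the snippet below converts CE-measured nitrite and nitrate concentrations into a nitrogen content via an assumed linear relation; the slope and intercept are placeholders for illustration only, not the calibration published in the paper.

      # Hypothetical linear calibration: %N = INTERCEPT + SLOPE * ([NO2-]/[NO3-]).
      # The coefficients are placeholders, not the published values.
      SLOPE = 10.0       # %N per unit molar ratio (assumed)
      INTERCEPT = 5.0    # %N (assumed)

      def nitrogen_content(c_nitrite, c_nitrate):
          # Concentrations in the same units, measured by CE after alkaline hydrolysis.
          return INTERCEPT + SLOPE * (c_nitrite / c_nitrate)

      print(round(nitrogen_content(1.8, 1.5), 2))   # ratio 1.2 -> 17.0 %N with these placeholders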

  20. A new method for the determination of the nitrogen content of nitrocellulose based on the molar ratio of nitrite-to-nitrate ions released after alkaline hydrolysis.

    PubMed

    Alinat, Elodie; Delaunay, Nathalie; Archer, Xavier; Mallet, Jean-Maurice; Gareil, Pierre

    2015-04-01

    A new method was proposed to determine the nitrogen content of nitrocelluloses (NCs). It is based on the finding of a linear relationship between the nitrogen content and the molar ratio of nitrite-to-nitrate ions released after alkaline hydrolysis. Capillary electrophoresis was used to monitor the concentrations of nitrite and nitrate ions. The influences of hydrolysis time and molar mass of NC on the molar ratio of nitrite-to-nitrate ions were investigated, and new insights into the alkaline denitration mechanism of NCs underlying this analytical strategy are provided. The method was then tested successfully with various explosive and non-explosive NC-containing samples such as various daily products and smokeless gunpowders. Inherently to its principle of exploiting a concentration ratio, this method shows very good repeatability in the determination of nitrogen content in real samples, with relative standard deviations (n = 3) below 1.5%. It also provides very significant advantages with respect to sample extraction, analysis time (1 h for alkaline hydrolysis, 3 min for electrophoretic separation), which was about 5 times shorter than for the classical Devarda's method currently used in industry, and safety conditions (no need for preliminary drying of NC samples, mild hydrolysis conditions with 1 M sodium hydroxide for 1 h at 60 °C).

  1. Methods for determining magnitude and frequency of floods in California, based on data through water year 2006

    USGS Publications Warehouse

    Gotvald, Anthony J.; Barth, Nancy A.; Veilleux, Andrea G.; Parrett, Charles

    2012-01-01

    Methods for estimating the magnitude and frequency of floods in California that are not substantially affected by regulation or diversions have been updated. Annual peak-flow data through water year 2006 were analyzed for 771 streamflow-gaging stations (streamgages) in California having 10 or more years of data. Flood-frequency estimates were computed for the streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to logarithms of annual peak flows for each streamgage. Low-outlier and historic information were incorporated into the flood-frequency analysis, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low outliers. Special methods for fitting the distribution were developed for streamgages in the desert region in southeastern California. Additionally, basin characteristics for the streamgages were computed by using a geographical information system. Regional regression analysis, using generalized least squares regression, was used to develop a set of equations for estimating flows with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities for ungaged basins in California that are outside of the southeastern desert region. Flood-frequency estimates and basin characteristics for 630 streamgages were combined to form the final database used in the regional regression analysis. Five hydrologic regions were developed for the area of California outside of the desert region. The final regional regression equations are functions of drainage area and mean annual precipitation for four of the five regions. In one region, the Sierra Nevada region, the final equations are functions of drainage area, mean basin elevation, and mean annual precipitation. Average standard errors of prediction for the regression equations in all five regions range from 42.7 to 161.9 percent. For the desert region of California, an analysis of 33 streamgages was used to develop regional estimates
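
    The core of such an at-site analysis is fitting a Pearson Type III distribution to the logarithms of the annual peaks. The sketch below uses a simple method-of-moments fit with a Wilson-Hilferty frequency factor on synthetic data; the USGS study itself uses the more elaborate expected moments algorithm with low-outlier screening, which is not reproduced here.

      # Simplified log-Pearson Type III quantile estimate from annual peak flows
      # (method of moments on log10 peaks, Wilson-Hilferty frequency factor).
      import numpy as np
      from scipy import stats

      def lp3_quantile(peaks, aep):
          # Flow with annual exceedance probability `aep` (e.g. 0.01 = 1-percent chance).
          logs = np.log10(np.asarray(peaks, dtype=float))
          mean, std = logs.mean(), logs.std(ddof=1)
          skew = stats.skew(logs, bias=False)
          z = stats.norm.ppf(1.0 - aep)
          k = (2.0 / skew) * ((1.0 + skew * z / 6.0 - skew**2 / 36.0) ** 3 - 1.0)
          return 10.0 ** (mean + k * std)

      peaks = [3200, 4100, 2800, 5100, 6000, 3500, 4500, 3900, 7200, 3100,
               2600, 4800, 5600, 3300, 4200]                    # synthetic record, cfs
      print(round(lp3_quantile(peaks, 0.01)))                   # 1-percent AEP estimate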

  2. The alkaline and alkaline-carbonatite magmatism from Southern Brazil

    NASA Astrophysics Data System (ADS)

    Ruberti, E.; Gomes, C. D. B.; Comin-Chiaramonti, P.

    2015-12-01

    Alkaline magmatism in southern Brazil, lasting from the Early Cretaceous to the Paleocene, is found associated with major extensional structural features in and around the Paraná Basin and is grouped into various provinces on the basis of several lines of data. The magmatism is variable in size, mode of occurrence and composition. The alkaline rocks are dominantly potassic, with a few occurrences showing sodic affinity. The most abundant silicate rocks are evolved silica-undersaturated to silica-saturated syenites, displaying large variation in igneous forms. Less evolved types are restricted to subvolcanic environments, and outcrops of effusive suites occur rarely. Cumulatic mafic and ultramafic rock types are very common, particularly in the alkali-carbonatitic complexes. Carbonatite bodies are represented by Ca-carbonatites and Mg-carbonatites and more scarcely by Fe-carbonatites. Available radiometric ages for the alkaline rocks fall into three main chronological groups: around 130 Ma, broadly coeval with the Early Cretaceous flood tholeiites of the Paraná Basin, 100-110 Ma, and 80-90 Ma (Late Cretaceous). The alkaline magmatism also extends into Paleocene times, as indicated by ages from some volcanic lavas. Geochemically, alkaline potassic and sodic rock types are distinguished by their negative and positive Nb-Ta anomalies, respectively. Negative spikes in Nb-Ta are also a feature common to the associated tholeiitic rocks. Sr-Nd-Pb systematics confirm the contribution of both HIMU and EMI mantle components in the formation of the alkaline rocks. Notably, Early and Late Cretaceous carbonatites have the same initial Sr-Nd isotopic ratios as the associated alkaline rocks. C-O isotopic ratios indicate a typical mantle signature for some carbonatites and the influence of post-magmatic processes in others. Immiscibility of liquids of phonolitic composition, derived from mafic alkaline parental magmas, has been responsible for the origin of the carbonatites. Close association of alkaline

  3. Evaluation of Alkaline Cleaner Materials

    NASA Technical Reports Server (NTRS)

    Partz, Earl

    1998-01-01

    Alkaline cleaners used to process aluminum substrates have contained chromium as the corrosion inhibitor. Chromium is a hazardous substance whose use and control are governed by environmental laws. Replacement materials with the characteristics of chromated alkaline cleaners need to be found that address both the cleaning requirements and the environmental impacts. This report will review environmentally friendly candidates evaluated as non-chromium alkaline cleaner replacements and the methods used to compare those candidates against one another. The report will also list the characteristics used to select candidates based on their declared contents. It will also describe and evaluate the methods used to discriminate among the large number of prospective candidates.

  4. A simple method for the determination of ionic diffusion coefficients in flooded soils

    NASA Astrophysics Data System (ADS)

    Gardner, P. J.; Flynn, N.; Maltby, E.

    2001-02-01

    Soil cores from river marginal wetlands from the Torridge and Severn catchments in the UK were collected to study rates of soil denitrification at different sites and at two stations (levee and backplain depression) at the river margin. Half the cores were sterilized prior to flooding to destroy the denitrifying bacteria. After flooding and equilibration, monitoring the concentration of amended nitrate in the supernatant of the sterile cores over a period of 7 days provided a simple procedure for the estimation of the diffusion coefficient of the nitrate ion in the flooded soils. An expression was developed that permitted this diffusion coefficient to be extracted from the slope of a plot of supernatant concentration versus (time)^(1/2). The values obtained, at 15 °C, varied from 2.4 to 6.8 × 10⁻¹⁰ m² s⁻¹. Sterile cores are usually treated as controls in denitrification experiments; this work develops a procedure whereby they may yield useful soil process information.
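
    A minimal sketch of the slope-based estimate is given below. It assumes the early-time form C(t) = C0·(1 - (2/h)·sqrt(D·t/pi)) for a well-mixed supernatant of depth h over a semi-infinite soil column; this is a standard simplification and may differ from the exact expression derived in the paper.

      # Recover D from the slope of supernatant concentration vs sqrt(time),
      # assuming C(t) = C0 * (1 - (2/h) * sqrt(D*t/pi)) (assumed early-time form).
      import numpy as np

      def diffusion_coefficient(t_s, conc, c0, h_m):
          slope = np.polyfit(np.sqrt(np.asarray(t_s, float)),
                             np.asarray(conc, float), 1)[0]
          return np.pi * (slope * h_m / (2.0 * c0)) ** 2

      # Synthetic 7-day check: h = 0.05 m, C0 = 1.0 mM, true D = 4e-10 m2/s
      t = np.arange(1, 8) * 86400.0
      c = 1.0 * (1.0 - (2.0 / 0.05) * np.sqrt(4.0e-10 * t / np.pi))
      print(diffusion_coefficient(t, c, 1.0, 0.05))    # recovers ~4e-10 m2/s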

  5. Development of a capillary electrophoresis method for the analysis in alkaline media as polyoxoanions of two strategic metals: Niobium and tantalum.

    PubMed

    Deblonde, Gauthier J-P; Chagnes, Alexandre; Cote, Gérard; Vial, Jérôme; Rivals, Isabelle; Delaunay, Nathalie

    2016-03-11

    Tantalum (Ta) and niobium (Nb) are two strategic metals essential to several key sectors, like the aerospace, gas and oil, nuclear and electronic industries, but their separation is very difficult owing to their almost identical chemical properties. Whereas they are currently produced by hydrometallurgical processes using fluoride-based solutions, efforts are being made to develop cleaner processes by replacing the fluoride media by alkaline ones. However, methods to analyze Nb and Ta simultaneously in alkaline samples are lacking. In this work, we developed a capillary zone electrophoresis (CE) method able to separate and quantify Nb and Ta directly in alkaline media. This method takes advantage of the hexaniobate and hexatantalate ions, which are naturally formed at pH > 9 and absorb in the UV domain. First, the detection conditions, the background electrolyte (BGE) pH, the nature of the BGE co-ion and the internal standard (IS) were optimized by a systematic approach. As the BGE counter-ion nature modified the speciation of both ions, sodium- and lithium-based BGEs were tested. For each alkaline cation, the BGE ionic strength and separation temperature were optimized using experimental designs. Since changes in the migration order of IS, Nb and Ta were observed within the experimental domain, the resolution was not a monotonic function of ionic strength and separation temperature. This forced us to develop an original data treatment for the prediction of the optimum separation conditions. Depending on the consideration of either peak widths or peak symmetries, with or without additional robustness constraints, four optima were predicted for each tested alkaline cation. The eight predicted optima were tested experimentally and the best experimental optimum was selected considering analysis time, resolution and robustness. The best separation was obtained at 31.0 °C in a BGE containing 10 mM LiOH and 35 mM LiCH3COO. The separation voltage was finally optimized

  6. Development of a capillary electrophoresis method for the analysis in alkaline media as polyoxoanions of two strategic metals: Niobium and tantalum.

    PubMed

    Deblonde, Gauthier J-P; Chagnes, Alexandre; Cote, Gérard; Vial, Jérôme; Rivals, Isabelle; Delaunay, Nathalie

    2016-03-11

    Tantalum (Ta) and niobium (Nb) are two strategic metals essential to several key sectors, like the aerospace, gas and oil, nuclear and electronic industries, but their separation is very difficult owing to their almost identical chemical properties. Whereas they are currently produced by hydrometallurgical processes using fluoride-based solutions, efforts are being made to develop cleaner processes by replacing the fluoride media by alkaline ones. However, methods to analyze Nb and Ta simultaneously in alkaline samples are lacking. In this work, we developed a capillary zone electrophoresis (CE) method able to separate and quantify Nb and Ta directly in alkaline media. This method takes advantage of the hexaniobate and hexatantalate ions, which are naturally formed at pH > 9 and absorb in the UV domain. First, the detection conditions, the background electrolyte (BGE) pH, the nature of the BGE co-ion and the internal standard (IS) were optimized by a systematic approach. As the BGE counter-ion nature modified the speciation of both ions, sodium- and lithium-based BGEs were tested. For each alkaline cation, the BGE ionic strength and separation temperature were optimized using experimental designs. Since changes in the migration order of IS, Nb and Ta were observed within the experimental domain, the resolution was not a monotonic function of ionic strength and separation temperature. This forced us to develop an original data treatment for the prediction of the optimum separation conditions. Depending on the consideration of either peak widths or peak symmetries, with or without additional robustness constraints, four optima were predicted for each tested alkaline cation. The eight predicted optima were tested experimentally and the best experimental optimum was selected considering analysis time, resolution and robustness. The best separation was obtained at 31.0 °C in a BGE containing 10 mM LiOH and 35 mM LiCH3COO. The separation voltage was finally optimized

  7. A simple-potentiometric method for determination of acid and alkaline phosphatase enzymes in biological fluids and dairy products using a nitrophenylphosphate plastic membrane sensor.

    PubMed

    Hassan, Saad S M; Sayour, Hossam E M; Kamel, Ayman H

    2009-04-27

    A novel poly(vinyl chloride) matrix membrane sensor responsive to 4-nitrophenylphosphate (4-NPP) substrate is described, characterized and used for the potentiometric assay of acid (ACP) and alkaline (ALP) phosphatase enzymes. The sensor is based on the use of the ion-association complex of the 4-NPP anion with nickel(II)-bathophenanthroline cation as an electroactive material and nitrophenyloctyl ether (NPOE) as a solvent mediator. The sensor displays good selectivity and stability and demonstrates a near-Nernstian response for 4-NPP over the concentration range 9.6 × 10⁻⁶ to 1.0 × 10⁻² M with an anionic slope of 28.6 ± 0.3 mV decade⁻¹ and a detection limit of 6.3 × 10⁻⁶ M over the pH range 4.5-10. The sensor is used to measure the decrease of a fixed concentration of 4-NPP substrate as a function of acid and alkaline phosphatase enzyme activities at optimized conditions of pH and temperature. A linear relationship between the initial rate of 4-NPP substrate hydrolysis and enzyme activity holds over 0.05-3.0 and 0.03-3.4 IU L⁻¹ for ACP and ALP enzymes, respectively. Validation of the method by measuring the lower detection limit, range, accuracy, precision, within-day repeatability and between-day variability reveals good performance characteristics of the proposed sensor. The sensor is used for the determination of acid and alkaline phosphatase enzyme activities in biological fluids of some patients suffering from alcoholic cirrhosis, acute myelocytic leukemia, pre-eclampsia and prostatic cancer. The sensor is also utilized for assessment of alkaline phosphatase enzyme in milk and dairy products. The results obtained agree fairly well with data obtained by the standard spectrophotometric methods.
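
    A small sketch of the underlying potentiometry is given below: with the near-Nernstian anionic slope quoted in the abstract (about 28.6 mV per decade), a measured potential maps back to a 4-NPP concentration, and the initial rate of decline of that concentration is what is related to enzyme activity. The standard-potential value E0 is a placeholder; it would come from calibration of the actual sensor.

      # Map a measured potential to 4-NPP concentration via E = E0 - S*log10(C)
      # (anionic response). S is taken from the abstract; E0 is a placeholder.
      SLOPE_MV = 28.6      # mV per decade, from the abstract
      E0_MV = -120.0       # placeholder cell potential at 1 M 4-NPP

      def npp_concentration(e_mv):
          return 10.0 ** ((E0_MV - e_mv) / SLOPE_MV)

      # Rising potential during hydrolysis corresponds to falling substrate level.
      for e in (0.0, 10.0, 20.0):
          print(f"{e:5.1f} mV -> {npp_concentration(e):.2e} M 4-NPP")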

  8. Molten metal reactor and method of forming hydrogen, carbon monoxide and carbon dioxide using the molten alkaline metal reactor

    DOEpatents

    Bingham, Dennis N.; Klingler, Kerry M.; Turner, Terry D.; Wilding, Bruce M.

    2012-11-13

    A molten metal reactor for converting a carbon material and steam into a gas comprising hydrogen, carbon monoxide, and carbon dioxide is disclosed. The reactor includes an interior crucible having a portion contained within an exterior crucible. The interior crucible includes an inlet and an outlet; the outlet leads to the exterior crucible and may comprise a diffuser. The exterior crucible may contain a molten alkaline metal compound. Contained between the exterior crucible and the interior crucible is at least one baffle.

  9. Flood frequency in Alaska

    USGS Publications Warehouse

    Childers, J.M.

    1970-01-01

    Records of peak discharge at 183 sites were used to study flood frequency in Alaska. The vast size of Alaska, its great ranges of physiography, and the lack of data for much of the State precluded a comprehensive analysis of all flood determinants. Peak stream discharges, where gaging-station records were available, were analyzed for 2-year, 5-year, 10-year, 25-year, and 50-year average-recurrence intervals. A regional analysis of the flood characteristics by multiple-regression methods gave a set of equations that can be used to estimate floods of selected recurrence intervals up to 50 years for any site on any stream in Alaska. The equations relate floods to drainage-basin characteristics. The study indicates that in Alaska the 50-year flood can be estimated from 10-year gaging-station records with a standard error of 22 percent, whereas the 50-year flood can be estimated from the regression equation with a standard error of 53 percent. Also, maximum known floods at more than 500 gaging stations and miscellaneous sites in Alaska were related to drainage-area size. An envelope curve of 500 cubic feet per second per square mile covered all but 2 floods in the State.

  10. Methods for Estimating Magnitude and Frequency of Floods in Rural Basins in the Southeastern United States: South Carolina

    USGS Publications Warehouse

    Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis

    2009-01-01

    For more than 50 years, the U.S. Geological Survey (USGS) has been developing regional regression equations that can be used to estimate flood magnitude and frequency at ungaged sites. Flood magnitude relates to the volume of flow that occurs over some period of time and usually is presented in cubic feet per second. Flood frequency relates to the probability of occurrence of a flood; that is, on average, what is the likelihood that a flood with a specified magnitude will occur in any given year (1 percent chance, 10 percent chance, 50 percent chance, and so on). Such flood estimates are needed for the efficient design of bridges, highway embankments, levees, and other structures near streams. In addition, these estimates are needed for the effective planning and management of land and water resources, to protect lives and property in flood-prone areas, and to determine flood-insurance rates.

  11. Flood Impact Modelling and Natural Flood Management

    NASA Astrophysics Data System (ADS)

    Owen, Gareth; Quinn, Paul; O'Donnell, Greg

    2016-04-01

    Local implementation of Natural Flood Management (NFM) methods is now being proposed in many flood schemes. In principle, NFM offers a cost-effective solution to a number of catchment-based problems, as it tackles both flood risk and WFD issues. However, within larger catchments there is the issue of which subcatchments to target first and how much NFM to implement. If each catchment has its own configuration of subcatchments and rivers, how can the issues of flood synchronisation and strategic investment be addressed? In this study we show two key aspects to resolving these issues. Firstly, a multi-scale network of water level recorders is placed throughout the system to capture the flow concentration and travel times operating in the catchment being studied. The second is a Flood Impact Model (FIM), a subcatchment-based model that can generate runoff in any location using any hydrological model. The key aspect of the model is that it has a function to represent the impact of NFM in any subcatchment and the ability to route that flood wave to the outfall. This allows a realistic representation of the synchronisation issues for that catchment. By running the model in interactive mode, the user can define an appropriate scheme that minimises or removes the risk of synchronisation and gives confidence that the NFM investment has a good level of impact downstream in large flood events.
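
    The synchronisation idea can be illustrated with a toy calculation (this is not the FIM itself): generate a hydrograph per subcatchment, scale it by a crude NFM attenuation factor, lag it by its travel time to the outfall, and superpose. All shapes, lags and factors below are invented for illustration; real NFM storage would also delay and flatten the peak rather than simply scaling it.

      # Toy superposition of lagged subcatchment hydrographs at an outfall,
      # with a crude multiplicative "NFM" attenuation factor per subcatchment.
      import numpy as np

      def triangular_hydrograph(n_steps, peak, t_peak, t_base):
          t = np.arange(n_steps, dtype=float)
          rising = np.clip(t / t_peak, 0.0, 1.0)
          falling = np.clip((t_base - t) / (t_base - t_peak), 0.0, 1.0)
          return peak * np.minimum(rising, falling)

      def route_to_outfall(hydrographs, lags, factors):
          out = np.zeros(max(len(h) + lag for h, lag in zip(hydrographs, lags)))
          for h, lag, f in zip(hydrographs, lags, factors):
              out[lag:lag + len(h)] += f * h          # attenuate, then lag-shift
          return out

      subs = [triangular_hydrograph(48, p, 6, 24) for p in (10.0, 8.0, 12.0)]
      lags = [0, 3, 6]                                # travel times to outfall (steps)
      baseline = route_to_outfall(subs, lags, [1.0, 1.0, 1.0])
      with_nfm = route_to_outfall(subs, lags, [1.0, 0.8, 0.7])
      print("outfall peak without NFM:", baseline.max(), " with NFM:", round(with_nfm.max(), 2))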

  12. Pakistan Flooding

    Atmospheric Science Data Center

    2013-04-16

    Article title: Flooding in Pakistan. In late July 2010, flooding caused by heavy monsoon rains began in several regions of Pakistan, ... Images: Pakistan Flood (... and Aug 11, 2010). Location: Asia.

  13. Evaluation of streams in selected communities for the application of limited-detail study methods for flood-insurance studies

    USGS Publications Warehouse

    Cobb, Ernest D.

    1986-01-01

    The U.S. Geological Survey evaluated 2,349 communities in 1984 for the application of limited-detail flood-insurance study methods, that is, methods with a reduced effort and cost compared to the detailed studies. Limited-detail study methods were found to be appropriate for 1,705 communities, while detailed studies were appropriate for 62 communities and no studies were appropriate for 582 communities. The total length of streams for which limited-detail studies are recommended is 9,327 miles, with a corresponding cost of $23,007,000. This results in average estimated costs for conducting limited-detail studies of $2,500 per mile of studied stream length. The purpose of the report is to document the limited-detail study methods and the results of the evaluation. (USGS)

  14. Methods for predicting peak discharge of floods caused by failure of natural and constructed earthen dams

    USGS Publications Warehouse

    Walder, J.S.; O'Connor, J. E.

    1997-01-01

    Floods from failures of natural and constructed dams constitute a widespread hazard to people and property. Expeditious means of assessing flood hazards are necessary, particularly in the case of natural dams, which may form suddenly and unexpectedly. We revise statistical relations (derived from data for past constructed and natural dam failures) between peak discharge (Qp) and water volume released (V0) or drop in lake level (d) but assert that such relations, even when cast into a dimensionless form, are of limited utility because they fail to portray the effect of breach-formation rate. We then analyze a simple, physically based model of dam-breach formation to show that the hydrograph at the breach depends primarily on a dimensionless parameter η = kV0/(g^(1/2) d^(7/2)), where k is the mean erosion rate of the breach and g is acceleration due to gravity. The functional relationship between Qp and η takes asymptotically distinct forms depending on whether η << 1 (relatively slow breach formation or small lake volume) or η >> 1 (relatively fast breach formation or large lake volume). Theoretical predictions agree well with data from dam failures for which k, and thus η, can be estimated. The theory thus provides a rapid means of predicting the plausible range of values of peak discharge at the breach in an earthen dam as long as the impounded water volume and the water depth at the dam face can be estimated.
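
    The dimensionless parameter itself is easy to evaluate; the sketch below computes it for an example dam and reports which asymptotic regime applies. The asymptotic peak-discharge expressions themselves are given in the paper and are not reproduced here; the example numbers are illustrative only.

      # eta = k * V0 / (g**0.5 * d**3.5): breach erosion rate k, released volume V0,
      # lake-level drop d. The regime changes around eta ~ 1 (see abstract).
      import math

      G = 9.81  # m/s^2

      def breach_parameter(k_m_per_s, v0_m3, d_m):
          return k_m_per_s * v0_m3 / (math.sqrt(G) * d_m ** 3.5)

      eta = breach_parameter(10.0 / 3600.0, 1.0e7, 20.0)   # 10 m/h erosion, 1e7 m3, 20 m drop
      regime = ("slow breach / small lake" if eta < 1.0 else "fast breach / large lake")
      print(f"eta = {eta:.2f}: {regime}")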

  15. Contribution of an exposure indicator to better anticipate damages with the AIGA flood warning method: a case study in the South of France

    NASA Astrophysics Data System (ADS)

    Saint-Martin, Clotilde; Fouchier, Catherine; Douvinet, Johnny; Javelle, Pierre; Vinet, Freddy

    2016-04-01

    On 3 October 2015, heavy localized precipitation occurred in southeastern France, leading to major flash floods on the Mediterranean coast. These severe floods caused 20 fatalities and major damage in almost 50 municipalities in the French administrative area of Alpes-Maritimes. The local recording rain gauges show how fast the event happened: 156 mm of rain were recorded in Mandelieu-la-Napoule and 145 mm in Cannes within 2 hours. As the affected rivers are not monitored, no anticipation was possible by the authorities in charge of risk management. Forecasting such floods is complex because of the small size of the watersheds, which implies a short catchment response time. To meet the need for flood warnings on unmonitored small catchments, Irstea and Météo-France have developed an alternative warning system for ungauged basins called the AIGA method. AIGA is a flood warning system based on a simple distributed hydrological model run at a 1 km² resolution using real-time radar rainfall information (Javelle, Demargne, Defrance, Pansu, & Arnaud, 2014). The flood warnings, produced every 15 minutes, result from the comparison of the real-time runoff produced by the model with statistical runoff values. AIGA is running in real time in the South of France within the RHYTMME project (https://rhytmme.irstea.fr/), and work is ongoing to offer a similar service for the whole French territory. More than 200 impacts of the 3 October floods have been located using media, social networks and fieldwork. The first comparisons between these impacts and the AIGA warning levels computed for this event show several discrepancies. However, these discrepancies appear to be explained by land use. An indicator of the exposure of territories to flooding has thus been created to weight the levels of the AIGA hydrological warnings with the land use of the area surrounding the streams

  16. A finite volume method for a two-phase multicomponent polymer flooding

    NASA Astrophysics Data System (ADS)

    K, Sudarshan Kumar; C, Praveen; D Veerappa Gowda, G.

    2014-10-01

    Multicomponent polymer flooding used in enhanced oil recovery is governed by a system of coupled non-strictly hyperbolic conservation laws. In the presence of gravity, the flux functions need not be monotone, and hence designing Godunov-type upwind schemes is difficult and computationally expensive. To overcome this difficulty, we use the basic idea of discontinuous flux to reduce the coupled system to an uncoupled system of scalar conservation laws with discontinuous coefficients. For these scalar equations we use the DFLU flux developed in [5] to construct a second-order scheme. The scheme is shown to satisfy a maximum principle, and its performance is demonstrated on both one- and two-dimensional test problems.
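
    As background for the finite volume idea (though not the DFLU-based second-order scheme of the paper), the sketch below applies a first-order upwind update to a single scalar conservation law with a Buckley-Leverett-type fractional-flow flux and no gravity, where the flux is monotone and upwinding is straightforward.

      # First-order upwind finite volume update for u_t + f(u)_x = 0 with a
      # monotone Buckley-Leverett-type flux (no gravity). Simplified background
      # illustration only; the paper's DFLU-based scheme is second order and
      # handles non-monotone, discontinuous-coefficient fluxes.
      import numpy as np

      def flux(u):
          return u**2 / (u**2 + 0.5 * (1.0 - u) ** 2)   # fractional flow, mobility ratio 0.5

      def step(u, dx, dt):
          f = flux(u)
          f_left = np.concatenate(([flux(1.0)], f[:-1]))  # upwind; water injected at x = 0
          return u - dt / dx * (f - f_left)

      nx = 200
      dx = 1.0 / nx
      u = np.zeros(nx)                 # initial water saturation
      dt = 0.4 * dx                    # CFL-limited (max |f'(u)| is about 2 here)
      for _ in range(200):
          u = step(u, dx, dt)
      print("saturation front near x =", round(np.argmax(u < 0.05) * dx, 3))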

  17. A data-based comparison of flood frequency analysis methods used in France

    NASA Astrophysics Data System (ADS)

    Kochanek, K.; Renard, B.; Arnaud, P.; Aubert, Y.; Lang, M.; Cipriani, T.; Sauquet, E.

    2014-02-01

    Flood frequency analysis (FFA) aims at estimating quantiles with large return periods for an extreme discharge variable. Many FFA implementations are used in operational practice in France. These implementations range from the estimation of a pre-specified distribution to continuous simulation approaches using a rainfall simulator coupled with a rainfall-runoff model. This diversity of approaches raises questions regarding the limits of each implementation and calls for a nation-wide comparison of their predictive performances. This paper presents the results of a national comparison of the main FFA implementations used in France. More accurately, eight implementations are considered, corresponding to the local, regional and local-regional estimation of Gumbel and Generalized Extreme Value (GEV) distributions, as well as the local and regional versions of a continuous simulation approach. A data-based comparison framework is applied to these eight competitors to evaluate their predictive performances in terms of reliability and stability, using daily flow data from more than 1000 gauging stations in France. Results from this comparative exercise suggest that two implementations dominate their competitors in terms of predictive performances, namely the local version of the continuous simulation approach and the local-regional estimation of a GEV distribution. More specific conclusions include the following: (i) the Gumbel distribution is not suitable for Mediterranean catchments, since this distribution demonstrably leads to an underestimation of flood quantiles; (ii) the local estimation of a GEV distribution is not recommended, because the difficulty in estimating the shape parameter results in frequent predictive failures; (iii) all the purely regional implementations evaluated in this study displayed a quite poor reliability, suggesting that prediction in completely ungauged catchments remains a challenge.
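
    As a minimal illustration of the "local estimation of a GEV distribution" option (the regional and continuous-simulation implementations compared in the paper are considerably more involved), the snippet below fits a GEV to a synthetic series of annual maxima and reads off a 100-year quantile.

      # Local GEV fit to annual maximum discharges and a 100-year quantile.
      # Note: scipy's shape-parameter sign convention is the opposite of the
      # one commonly used in hydrology.
      import numpy as np
      from scipy import stats

      annual_maxima = np.array([310., 420., 275., 515., 640., 380., 455., 398.,
                                720., 305., 260., 488., 560., 330., 410., 295.,
                                505., 450., 615., 370.])      # m3/s, synthetic record

      shape, loc, scale = stats.genextreme.fit(annual_maxima)
      q100 = stats.genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
      print(f"GEV shape = {shape:.2f}, 100-year flood ~ {q100:.0f} m3/s")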

  18. A data-based comparison of flood frequency analysis methods used in France

    NASA Astrophysics Data System (ADS)

    Kochanek, K.; Renard, B.; Arnaud, P.; Aubert, Y.; Lang, M.; Cipriani, T.; Sauquet, E.

    2013-09-01

    Many flood frequency analysis (FFA) implementations are used in operational practice in France. These implementations range from the estimation of a pre-specified distribution to continuous simulation approaches using a rainfall simulator coupled with a rainfall-runoff model. This diversity of approaches raises questions regarding the domain of applicability of each implementation and calls for a nation-wide comparison of their predictive performances. This paper presents the results of a national comparison of the main FFA implementations used in France. More precisely, eight implementations are considered, corresponding to the local, regional and local-regional estimation of Gumbel and Generalized Extreme Value (GEV) distributions, as well as the local and regional versions of a continuous simulation approach. A data-based comparison framework is applied to these eight competitors to evaluate their predictive performances in terms of reliability and stability, using daily flow data from more than one thousand gauging stations in France. Results from this comparative exercise suggest that two implementations dominate their competitors in terms of predictive performances, namely the local version of the continuous simulation approach and the local-regional estimation of a GEV distribution. More specific conclusions include the following: (i) the Gumbel distribution is not suitable for Mediterranean catchments, since this distribution demonstrably leads to an underestimation of flood quantiles; (ii) the local estimation of a GEV distribution is not recommended, because the difficulty in estimating the shape parameter results in frequent predictive failures; (iii) all the purely regional implementations evaluated in this study displayed a quite poor reliability, suggesting that prediction in completely ungauged catchments remains a challenge.

  19. Coupling the Alkaline-Surfactant-Polymer Technology and The Gelation Technology to Maximize Oil Production

    SciTech Connect

    Malcolm Pitts; Jie Qi; Dan Wilson; Phil Dowling; David Stewart; Bill Jones

    2005-12-01

    Performance and produced polymer evaluation of four alkaline-surfactant-polymer projects concluded that only one of the projects could have benefited from combining the alkaline-surfactant-polymer and gelation technologies. The Cambridge, 1993 Daqing, Mellott Ranch, and Wardlaw alkaline-surfactant-polymer floods were studied. An initial gel treatment followed by an alkaline-surfactant-polymer flood in the Wardlaw field would have been beneficial owing to reduction of fracture flow. Numerical simulation demonstrated that reducing the permeability of a high permeability zone of a reservoir with gel improved both waterflood and alkaline-surfactant-polymer flood oil recovery. A Minnelusa reservoir with both A and B sand production was simulated. A and B sands are separated by a shale layer. A sand and B sand waterflood oil recovery was improved by 196,000 bbls or 3.3% OOIP when a gel was placed in the B sand. Alkaline-surfactant-polymer flood oil recovery improvement over a waterflood was 392,000 bbls or 6.5% OOIP. Placing a gel into the B sand prior to an alkaline-surfactant-polymer flood resulted in 989,000 bbl or 16.4% OOIP more oil than only water injection. A sand and B sand alkaline-surfactant-polymer flood oil recovery was improved by 596,000 bbls or 9.9% OOIP when a gel was placed in the B sand.

  20. Regional flood frequency analysis

    SciTech Connect

    Singh, V.P.

    1987-01-01

    This book, the fourth of a four volume set, contains five sections encompassing major aspects of regional flood frequency analysis. Each section starts usually with an invited state-of-the-art paper followed by contributed papers. The first section provides an assessment of regional flood frequency analysis. Methods for performing regional frequency analysis for ungaged watersheds are presented in Section 2. More discussion on regional frequency analysis is provided in Section 3. Selection and comparison of regional frequency methods are dealt with in Section 4; these are of great interest to the user. Increasing attention is being focused these days on paleohydrologic flood analysis. This topic is covered in Section 5.

  1. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    PubMed Central

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land use, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. An Urban Flood Simulation Model (UFSM) and an Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, and the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable to describe the changes of flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation. PMID:27527202
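
    The role of the S-shaped return period-damage (R-D) curve can be sketched as follows: represent damage as a logistic function of log return period and integrate it over annual exceedance probability to get an expected annual damage, with and without flood control works. All curve parameters below are invented for illustration and are not taken from the Pudong study.

      # Toy expected-annual-damage comparison using S-shaped (logistic) R-D curves.
      import numpy as np

      def damage(T, d_max, t_mid, steepness):
          # S-shaped damage as a function of return period T (years)
          return d_max / (1.0 + np.exp(-steepness * (np.log(T) - np.log(t_mid))))

      def expected_annual_damage(damage_curve):
          T = np.logspace(0.0, 3.0, 400)          # return periods 1 to 1000 years
          p = 1.0 / T                              # annual exceedance probability
          d = damage_curve(T)
          return float(np.sum(0.5 * (d[1:] + d[:-1]) * (p[:-1] - p[1:])))  # trapezoid over AEP

      ead_without = expected_annual_damage(lambda T: damage(T, 10.0, 20.0, 2.0))
      ead_with = expected_annual_damage(lambda T: damage(T, 10.0, 80.0, 2.0))   # works shift the curve
      print(f"risk reduced by {(1.0 - ead_with / ead_without) * 100.0:.1f}%")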

  2. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas.

    PubMed

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land use, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. An Urban Flood Simulation Model (UFSM) and an Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, and the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable to describe the changes of flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation. PMID:27527202

  3. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas.

    PubMed

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-08-05

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land use, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. An Urban Flood Simulation Model (UFSM) and an Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, and the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable to describe the changes of flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation.

  4. Flash flood warning in mountainous areas: using damage reports to evaluate the method at small ungauged catchments

    NASA Astrophysics Data System (ADS)

    Defrance, Dimitri; Javelle, Pierre; Ecrepont, Stéphane; Andreassian, Vazken

    2013-04-01

    In Europe, flash floods mainly occur in the Mediterranean area on small catchments with a short concentration time. Anticipating this kind of event is a major issue in order to reduce the resulting damages, but for many of the impacted catchments no data are available to calibrate and evaluate hydrological models. In this context, the aim of this study is to develop and evaluate a warning method for the Southern French Alps. This area is of particular interest because it groups different hydrological regimes, from purely Mediterranean to purely Alpine influences. Two main issues should be addressed: How to define the hydrological model and its parameterization for an application in an ungauged context? How to evaluate the final results on 'real' ungauged catchments? The first issue is a classic one. Using an 'observed' data set (154 streamflow stations with catchment areas ranging from 5 to 1000 km² and distributed rainfall available for the 1997-2006 period), we developed a regional model specifically for the studied area. For this purpose, the AIGA method, initially developed for Mediterranean catchments, was adapted in order to take into account snowmelt and to produce baseflows. Then, different parameterizations were tested, derived from different simple regionalisation techniques: the same parameter set for the whole area, defined as the median of the locally calibrated parameters; the same technique as the previous case, but considering different sub-areas defined as "hydro-climatically" homogeneous by previous studies; and finally the neighbours method. The second issue is more original. Indeed, in most studies the final evaluation is done using gauged stations as if they were 'ungauged', i.e., keeping the at-site discharge data only for validation and not for calibration. The main disadvantage of this approach is that the evaluation is made at the scale of the gauged catchments, which are in general larger than the catchments impacted by flash

  5. Reinforcing flood-risk estimation.

    PubMed

    Reed, Duncan W

    2002-07-15

    Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin? PMID:12804255

  6. Reinforcing flood-risk estimation.

    PubMed

    Reed, Duncan W

    2002-07-15

    Flood-frequency estimation is inherently uncertain. The practitioner applies a combination of gauged data, scientific method and hydrological judgement to derive a flood-frequency curve for a particular site. The resulting estimate can be thought fully satisfactory only if it is broadly consistent with all that is reliably known about the flood-frequency behaviour of the river. The paper takes as its main theme the search for information to strengthen a flood-risk estimate made from peak flows alone. Extra information comes in many forms, including documentary and monumental records of historical floods, and palaeological markers. Meteorological information is also useful, although rainfall rarity is difficult to assess objectively and can be a notoriously unreliable indicator of flood rarity. On highly permeable catchments, groundwater levels present additional data. Other types of information are relevant to judging hydrological similarity when the flood-frequency estimate derives from data pooled across several catchments. After highlighting information sources, the paper explores a second theme: that of consistency in flood-risk estimates. Following publication of the Flood estimation handbook, studies of flood risk are now using digital catchment data. Automated calculation methods allow estimates by standard methods to be mapped basin-wide, revealing anomalies at special sites such as river confluences. Such mapping presents collateral information of a new character. Can this be used to achieve flood-risk estimates that are coherent throughout a river basin?

  7. Flood hazard energy in urban areas: a new integrated method for flood risk analysis in synthesizing interactions with urban boundary layer

    NASA Astrophysics Data System (ADS)

    Park, S. Y.; Schmidt, A.

    2015-12-01

    Since urban physical characteristics (such as morphology and land-use/land-cover) are different from those of nature, altered interactions between the surface and atmosphere (especially the urban boundary layer, UBL) or between the surface and subsurface can affect the hydrologic behavior and hence the flood hazards. In this research we focus on three main aspects of the urban surface/atmosphere interactions that affect flood hazard: the urban heat island (UHI) effect, increased surface roughness, and accumulated aerosols. These factors, along with the uncertainties in quantifying them, make risk analysis intractable. In order to perform a risk analysis, the impact of these components needs to be mapped to a variable that can be mathematically described in a risk-analysis framework. We propose defining hazard energy as a surrogate for the combined effect of these three components. Perturbations that can change the hazard energy come from diverse sources in urban areas, and these otherwise disconnected factors can be combined through the energy concept to characterize the impacts of urban areas in risk assessment. This approach synthesizes across hydrological and hydraulic processes in the UBL, land surface, subsurface, and sewer network by scrutinizing energy exchange across locations. This can extend our understanding not only of the influence of cities on local climate at rural or larger scales but also of how cities and nature interact and affect each other.

  8. ALP (Alkaline Phosphatase) Test

    MedlinePlus

    Also known as: ALK PHOS; Alkp. Formal name: Alkaline Phosphatase. Related tests: AST; ALT; GGT; Bilirubin; Liver Panel; Bone Markers; Alkaline Phosphatase Isoenzymes; Bone Specific ALP.

  9. The application of the Contingent Valuation method towards the assessment of the impacts emerged from the March 2006 floods in the Evros River. An experts-based survey.

    NASA Astrophysics Data System (ADS)

    Markantonis, V.; Bithas, K.

    2009-04-01

    In March 2006 Greece was struck by severe flooding, which caused significant damage in the Prefecture of Evros, on the eastern border of Greece. 250 million m² of farmland was flooded, causing severe damage to agriculture, transport and water supply networks. Total direct damages are estimated at €372 million. The negative effect on economic activity caused by the floods, considered the worst of the last 50 years, took place in an area that had already been severely affected by floods in 2005. Apart from the direct damages, the indirect impacts at the environmental and social level were also critical. The need for economic analysis in the design and implementation of efficient flood management policies is well emphasized in natural hazards policies. Within this framework, the present paper analyzes the application of stated preference valuation techniques for the assessment of the damages caused in the Prefecture of Evros by the severe floods of March 2006. The objective of this paper is to define the role of economic valuation techniques in assisting the design of efficient and sustainable policies for flood management. More specifically, the Contingent Valuation (CV) method is applied in order to value the impacts of the March 2006 floods, including the environmental impacts on the soil, the biodiversity and the aesthetic environment of the flooded areas. The paper begins with a discussion of the theoretical economic framework and, particularly, the contingent valuation framework that can be used to evaluate flood impacts. Understanding public preferences for complex environmental policy changes, such as flood impacts, is a preeminent challenge for environmental economists and other social scientists. Information issues are central to the design and application of the survey-based contingent valuation (CV) method for valuing environmental goods. While content is under the control of the analyst, how this

  10. Purification of alkaline solutions and wastes from actinides and technetium by coprecipitation with some carriers using the method of appearing reagents: Final Report

    SciTech Connect

    Peretrukhin, V.F.; Silin, V.I.; Kareta, A.V.; Gelis, A.V.; Shilov, V.P.; German, K.E.; Firsova, E.V.; Maslennikov, A.G.; Trushina, V.E.

    1998-09-01

    The coprecipitation of transuranium elements (TRU) and technetium from alkaline solutions and from simulants of Hanford Site tank wastes has been studied in reducing and oxidizing conditions on uranium(IV,VI) hydroxocompounds, tetraalkylammonium perrhenate and perchlorate, and on hydroxides of Fe(III), Co(III), Mn(II), and Cr(III) using the method of appearing reagents (MAR). Coprecipitations in alkaline solution have been shown to give high decontamination factors (DF) at low content of carrier and in the presence of high salt concentrations. Uranium(IV) hydroxide in concentrations higher than 3 × 10⁻³ M coprecipitates Pu and Cm in any oxidation state from 0.2 to 4 M NaOH with DFs of 110 to 1000, and Np and Tc with DFs of 51 to 176. Technetium(VII) coprecipitates with (5 to 8) × 10⁻⁴ M tetrabutylammonium (TBA) perrhenate in 0.01 to 0.02 M TBA hydroxide from 0.5 to 1.5 M NaOH to give DFs of 150 to 200. Coprecipitations of Np and Pu with Co(OH)₃, Fe(OH)₃, Cr(OH)₃, and Mn(OH)₂ obtained by the MAR from precursors in the range from pH 10.5 to 0.4 M NaOH give DFs from 80 to 400.

  11. Bipolar concept for alkaline fuel cells

    NASA Astrophysics Data System (ADS)

    Gülzow, E.; Schulze, M.; Gerke, U.

    Alkaline fuel cell stacks are mostly built in a monopolar configuration of the cells. At the German Aerospace Center, a bipolar plate for alkaline fuel cells has been developed and characterized in a short stack. As a consequence of the sealing concept of the stack, two different bipolar plate types are needed; therefore, the number of cells can only be varied by 2 if the end plates are not changed. The single cell as well as the short stack is characterized by various methods, e.g. V-i characteristics and electrochemical impedance spectroscopy (EIS). As a result of the specific electrodes used, the differential pressure between the electrolyte and the gas phase is limited to a few tens of mbar. At higher differential pressures, gas crossover through the electrodes and electrolyte takes place, with the result that the electrolyte may flood the flow fields. In contrast to the PEFC, electrodes supported by a metal net acting as conductor and mechanical support can be used in the AFC. Therefore, the structure of the flow field can be quite simple; this means flow fields with channels of large width and depth are possible. Consequently, the pressure loss over the flow field is very low. The single cell as well as the short stack was operated at overpressures of a few tens of mbar. The AFC can thus be operated without a compressor, using a simple fan. The developed cell design is also used for the characterization of fuel cell components such as electrodes and diaphragms. The test facility for the single cell and for the stack is fully computer controlled and allows variation of the operating conditions, e.g. electrolyte flow, hydrogen flow, oxygen or air flow and cell temperature.

  12. Assessment of Different Sampling Methods for Measuring and Representing Macular Cone Density Using Flood-Illuminated Adaptive Optics

    PubMed Central

    Feng, Shu; Gale, Michael J.; Fay, Jonathan D.; Faridi, Ambar; Titus, Hope E.; Garg, Anupam K.; Michaels, Keith V.; Erker, Laura R.; Peters, Dawn; Smith, Travis B.; Pennesi, Mark E.

    2015-01-01

    Purpose To describe a standardized flood-illuminated adaptive optics (AO) imaging protocol suitable for the clinical setting and to assess sampling methods for measuring cone density. Methods Cone density was calculated following three measurement protocols: 50 × 50-μm sampling window values every 0.5° along the horizontal and vertical meridians (fixed-interval method), the mean density of expanding 0.5°-wide arcuate areas in the nasal, temporal, superior, and inferior quadrants (arcuate mean method), and the peak cone density of a 50 × 50-μm sampling window within expanding arcuate areas near the meridian (peak density method). Repeated imaging was performed in nine subjects to determine intersession repeatability of cone density. Results Cone density montages could be created for 67 of the 74 subjects. Image quality was determined to be adequate for automated cone counting for 35 (52%) of the 67 subjects. We found that cone density varied with different sampling methods and regions tested. In the nasal and temporal quadrants, peak density most closely resembled histological data, whereas the arcuate mean and fixed-interval methods tended to underestimate the density compared with histological data. However, in the inferior and superior quadrants, arcuate mean and fixed-interval methods most closely matched histological data, whereas the peak density method overestimated cone density compared with histological data. Intersession repeatability testing showed that repeatability was greatest when sampling by arcuate mean and lowest when sampling by fixed interval. Conclusions We show that different methods of sampling can significantly affect cone density measurements. Therefore, care must be taken when interpreting cone density results, even in a normal population. PMID:26325414

  13. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory : evaluation of alkaline persulfate digestion as an alternative to Kjeldahl digestion for determination of total and dissolved nitrogen and phosphorus in water

    USGS Publications Warehouse

    Patton, Charles J.; Kryskalla, Jennifer R.

    2003-01-01

    Alkaline persulfate digestion was evaluated and validated as a more sensitive, accurate, and less toxic alternative to Kjeldahl digestion for routine determination of nitrogen and phosphorus in surface- and ground-water samples in a large-scale and geographically diverse study conducted by the U.S. Geological Survey (USGS) between October 1, 2001, and September 30, 2002. Data for this study were obtained from about 2,100 surface- and ground-water samples that were analyzed for Kjeldahl nitrogen and Kjeldahl phosphorus in the course of routine operations at the USGS National Water Quality Laboratory (NWQL). These samples were analyzed independently for total nitrogen and total phosphorus using an alkaline persulfate digestion method developed by the NWQL Methods Research and Development Program. About half of these samples were collected during nominally high-flow (April-June) conditions and the other half were collected during nominally low-flow (August-September) conditions. The number of filtered and whole-water samples analyzed from each flow regime was about equal. By operational definition, Kjeldahl nitrogen (ammonium + organic nitrogen) and alkaline persulfate digestion total nitrogen (ammonium + nitrite + nitrate + organic nitrogen) are not equivalent. It was necessary, therefore, to reconcile this operational difference by subtracting nitrate + nitrite concentrations from alkaline persulfate dissolved and total nitrogen concentrations prior to graphical and statistical comparisons with dissolved and total Kjeldahl nitrogen concentrations. On the basis of two-population paired t-test statistics, the difference between the means of nitrate-corrected alkaline persulfate nitrogen and Kjeldahl nitrogen concentrations (2,066 paired results) was significantly different from zero at the p = 0.05 level. Statistically, the means of Kjeldahl nitrogen concentrations were greater than those of nitrate-corrected alkaline persulfate nitrogen concentrations. Experimental evidence strongly
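
    The statistical comparison described above amounts to a nitrate + nitrite correction followed by a paired t-test. The sketch below runs that test on synthetic data (the real study used about 2,100 paired field analyses).

      # Nitrate + nitrite correction of persulfate total N, then a paired t-test
      # against Kjeldahl N. Data below are synthetic, for illustration only.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      kjeldahl_n = rng.uniform(0.2, 3.0, 50)                        # mg-N/L
      no3_no2_n = rng.uniform(0.0, 1.0, 50)                         # mg-N/L
      persulfate_total_n = kjeldahl_n + no3_no2_n - rng.normal(0.05, 0.03, 50)

      corrected = persulfate_total_n - no3_no2_n    # now operationally comparable to Kjeldahl N
      t_stat, p_value = stats.ttest_rel(kjeldahl_n, corrected)
      print(f"mean difference = {np.mean(kjeldahl_n - corrected):.3f} mg-N/L, p = {p_value:.4f}")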

  14. Geomorphological method in the elaboration of hazard maps for flash-floods in the municipality of Jucuarán (El Salvador)

    NASA Astrophysics Data System (ADS)

    Fernández-Lavado, C.; Furdada, G.; Marqués, M. A.

    2007-07-01

    This work deals with the elaboration of flood hazard maps. These maps reflect the areas prone to floods based on the effects of Hurricane Mitch in the Municipality of Jucuarán, El Salvador. Stream channels located in the coastal range in the SE of El Salvador flow into the Pacific Ocean and generate alluvial fans. Communities often inhabit these fans and can be affected by floods. The geomorphology of these stream basins is associated with small areas, steep slopes, well-developed regolith and extensive deforestation. These features play a key role in the generation of flash-floods. This zone lacks comprehensive rainfall data and gauging stations. The most detailed topographic maps are on a scale of 1:25 000. Given that the scale was not sufficiently detailed, we used aerial photographs enlarged to the scale of 1:8000. The effects of Hurricane Mitch mapped on these photographs were regarded as the reference event. Flood maps have a dual purpose: (1) community emergency plans, and (2) regional land use planning carried out by local authorities. The geomorphological method is based on mapping the geomorphological evidence (alluvial fans, preferential stream channels, erosion and sedimentation, man-made terraces). Following the interpretation of the photographs, this information was validated in the field and complemented by eyewitness reports such as the height of water and flow typology. In addition, community workshops were organized to obtain information about the evolution and the impact of the phenomena. The superimposition of this information enables us to obtain a comprehensive geomorphological map. Another aim of the study was the calculation of the peak discharge using the Manning and the paleohydraulic methods and estimates based on geomorphological criteria. The results were compared with those obtained using the rational method. Significant differences in the order of magnitude of the calculated discharges were noted. The rational method underestimated the

  15. Free nitrous acid serving as a pretreatment method for alkaline fermentation to enhance short-chain fatty acid production from waste activated sludge.

    PubMed

    Zhao, Jianwei; Wang, Dongbo; Li, Xiaoming; Yang, Qi; Chen, Hongbo; Zhong, Yu; Zeng, Guangming

    2015-07-01

    Alkaline conditions (especially pH 10) have been demonstrated to be a promising method for short-chain fatty acid (SCFA) production from waste activated sludge anaerobic fermentation, because they can effectively inhibit the activities of methanogens. However, because the rate of sludge solubilization is limited, a long fermentation time is required and the SCFA yield is still limited. This paper reports a new pretreatment method for alkaline fermentation, i.e., using free nitrous acid (FNA) to pretreat sludge for 2 d, which remarkably shortens the fermentation time while significantly enhancing SCFA production. Experimental results showed that the highest SCFA production of 370.1 mg COD/g VSS (volatile suspended solids) was achieved with 1.54 mg FNA/L pretreatment integrated with 2 d of pH 10 fermentation, which was 4.7- and 1.5-fold that of the blank (uncontrolled) and sole pH 10 systems, respectively. The total time of this integration system was only 4 d, whereas the corresponding time was 15 d in the blank and 8 d in the sole pH 10 systems. The mechanism study showed that, compared with pH 10, FNA pretreatment accelerated disruption of both extracellular polymeric substances and the cell envelope. After FNA pretreatment, pH 10 treatment (1 d) caused 38.0% higher substrate solubilization than FNA alone, indicating that integrating FNA with pH 10 produces a positive synergy in sludge solubilization. It was also observed that this integration method benefited the hydrolysis and acidification processes. Therefore, more SCFA was produced, and less fermentation time was required, in the integrated system.

  16. Sampling variance of flood quantiles from the generalised logistic distribution estimated using the method of L-moments

    NASA Astrophysics Data System (ADS)

    Kjeldsen, Thomas R.; Jones, David A.

    The method of L-moments is the recommended method for fitting the three parameters (location, scale and shape) of a Generalised Logistic (GLO) distribution when conducting flood frequency analyses in the UK. This paper examines the sampling uncertainty of quantile estimates obtained using the GLO distribution for single site analysis using the median to estimate the location parameter. Analytical expressions for the mean and variance of the quantile estimates were derived, based on asymptotic theory. This has involved deriving expressions for the covariance between the sampling median (location parameter) and the quantiles of the estimated unit-median GLO distribution (growth curve). The accuracy of the asymptotic approximations for many of these intermediate results and for the quantile estimates was investigated by comparing the approximations to the outcome of a series of Monte Carlo experiments. The approximations were found to be adequate for GLO shape parameter values between -0.35 and 0.25, which is an interval that includes the shape parameter estimates for most British catchments. An investigation into the contribution of different components to the total uncertainty showed that for large return periods, the variance of the growth curve is larger than the contribution of the median. Therefore, statistical methods using regional information to estimate the growth curve should be considered when estimating design events at large return periods.
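
    A minimal sketch of the GLO/L-moment machinery referred to above, assuming Hosking's standard parameterisation and unbiased sample L-moments (the full L-moment fit is shown; the paper itself estimates the location parameter with the sample median, and the names and data below are illustrative only):

        import numpy as np

        def sample_l_moments(x):
            """Unbiased sample L-moments l1, l2 and L-skewness t3 (via probability-weighted moments)."""
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            i = np.arange(1, n + 1)
            b0 = x.mean()
            b1 = np.sum((i - 1) / (n - 1) * x) / n
            b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
            l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
            return l1, l2, l3 / l2

        def glo_from_l_moments(l1, l2, t3):
            """GLO location xi, scale alpha and shape k from L-moments (Hosking & Wallis, 1997)."""
            k = -t3
            alpha = l2 * np.sin(k * np.pi) / (k * np.pi)
            xi = l1 - alpha * (1.0 / k - np.pi / np.sin(k * np.pi))
            return xi, alpha, k

        def glo_quantile(F, xi, alpha, k):
            """GLO quantile at non-exceedance probability F (k != 0)."""
            return xi + alpha / k * (1.0 - ((1.0 - F) / F) ** k)

        # Illustrative 100-year flood (F = 0.99) from a short series of annual maxima.
        amax = np.array([310.0, 270.0, 450.0, 220.0, 390.0, 510.0, 280.0, 340.0, 600.0, 250.0])
        q100 = glo_quantile(0.99, *glo_from_l_moments(*sample_l_moments(amax)))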

  17. Methods for estimating the magnitude and frequency of floods for urban and small, rural streams in Georgia, South Carolina, and North Carolina, 2011

    USGS Publications Warehouse

    Feaster, Toby D.; Gotvald, Anthony J.; Weaver, J. Curtis

    2014-01-01

    Reliable estimates of the magnitude and frequency of floods are essential for the design of transportation and water-conveyance structures, flood-insurance studies, and flood-plain management. Such estimates are particularly important in densely populated urban areas. In order to increase the number of streamflow-gaging stations (streamgages) available for analysis, expand the geographical coverage that would allow for application of regional regression equations across State boundaries, and build on a previous flood-frequency investigation of rural U.S. Geological Survey streamgages in the Southeast United States, a multistate approach was used to update methods for determining the magnitude and frequency of floods in urban and small, rural streams that are not substantially affected by regulation or tidal fluctuations in Georgia, South Carolina, and North Carolina. The at-site flood-frequency analysis of annual peak-flow data for urban and small, rural streams (through September 30, 2011) included 116 urban streamgages and 32 small, rural streamgages, defined in this report as basins draining less than 1 square mile. The regional regression analysis included annual peak-flow data from an additional 338 rural streamgages previously included in U.S. Geological Survey flood-frequency reports and 2 additional rural streamgages in North Carolina that were not included in the previous Southeast rural flood-frequency investigation for a total of 488 streamgages included in the urban and small, rural regression analysis. The at-site flood-frequency analyses for the urban and small, rural streamgages included the expected moments algorithm, which is a modification of the Bulletin 17B log-Pearson type III method for fitting the statistical distribution to the logarithms of the annual peak flows. Where applicable, the flood-frequency analysis also included low-outlier and historic information. Additionally, the application of a generalized Grubbs-Beck test allowed for the

  18. A low-cost method to measure the timing of post-fire flash floods and debris flows relative to rainfall

    USGS Publications Warehouse

    Kean, Jason W.; Staley, Dennis M.; Leeper, Robert J.; Schmidt, Kevin Michael; Gartner, Joseph E.

    2012-01-01

    Data on the specific timing of post-fire flash floods and debris flows are very limited. We describe a method to measure the response times of small burned watersheds to rainfall using a low-cost pressure transducer, which can be installed quickly after a fire. Although the pressure transducer is not designed for sustained sampling at the fast rates (≤2 sec) used at more advanced debris-flow monitoring sites, comparisons with high-data rate stage data show that measured spikes in pressure sampled at 1-min intervals are sufficient to detect the passage of most debris flows and floods. Post-event site visits are used to measure the peak stage and identify flow type based on deposit characteristics. The basin response timescale (tb) to generate flow at each site was determined from an analysis of the cross correlation between time series of flow pressure and 5-min rainfall intensity. This timescale was found to be less than 30 minutes for 40 post-fire floods and 11 post-fire debris flows recorded in 15 southern California watersheds (≤1.4 km2). Including data from 24 other debris flows recorded at 5 more instrumentally advanced monitoring stations, we find there is not a substantial difference in the median tb for floods and debris flows (11 and 9 minutes, respectively); however, there are slight, statistically significant differences in the trends of flood and debris-flow tb with basin area, which are presumably related to differences in flow speed between floods and debris flows.

  19. Alkaline-resistance model of subtilisin ALP I, a novel alkaline subtilisin.

    PubMed

    Maeda, H; Mizutani, O; Yamagata, Y; Ichishima, E; Nakajima, T

    2001-05-01

    The alkaline-resistance mechanism of the alkaline-stable enzymes is not yet known. To clarify the mechanism of alkaline-resistance of alkaline subtilisin, structural changes of two typical subtilisins, subtilisin ALP I (ALP I) and subtilisin Sendai (Sendai), were studied by means of physicochemical methods. Subtilisin NAT (NAT), which exhibits no alkaline resistance, was examined as a control. ALP I gradually lost its activity, accompanied by protein degradation, but, on the contrary, Sendai was stable under alkaline conditions. CD spectral measurements at neutral and alkaline pH indicated no apparent differences between ALP I and Sendai. A significant difference was observed on measurement of fluorescence emission spectra of the tryptophan residues of ALP I that were exposed on the enzyme surface. The fluorescence intensity of ALP I was greatly reduced under alkaline conditions; moreover, the reduction was reversed when alkaline-treated ALP I was neutralized. The fluorescence spectrum of Sendai remained unchanged. The enzymatic and optical activities of NAT were lost at high pH, indicating a lack of functional and structural stability in an alkaline environment. Judging from these results, the alkaline resistance is closely related to the surface structure of the enzyme molecule.

  20. Simulating spatial aspects of a flash flood using the Monte Carlo method and GRASS GIS: a case study of the Malá Svinka Basin (Slovakia)

    NASA Astrophysics Data System (ADS)

    Hofierka, Jaroslav; Knutová, Monika

    2015-04-01

    This paper focuses on the flash flood assessment using a spatially-distributed hydrological model based on the Monte Carlo simulation method. The model is implemented as r.sim.water module in GRASS GIS and was applied to the Malá Svinka Basin in Eastern Slovakia where a heavy rainfall (100 mm/hr.) caused a flash flood event with deadly consequences in July 1998. The event was simulated using standard datasets representing elevation, soils and land cover. The results were captured in time series of water depth maps showing gradual changes in water depths across the basin. The hydrological effects of roads in the study area were simulated using the preferential flow feature of the model. This simulation helped to identify source areas contributing to flooding in built-up areas. The implementation in a GIS environment simplifies the data preparation and eventual modification for various scenarios and flood protection measures. The simulation confirmed excellent robustness and flexibility of the method.

  1. Methods for estimating magnitude and frequency of 1-, 3-, 7-, 15-, and 30-day flood-duration flows in Arizona

    USGS Publications Warehouse

    Kennedy, Jeffrey R.; Paretti, Nicholas V.; Veilleux, Andrea G.

    2014-01-01

    Regression equations, which allow predictions of n-day flood-duration flows for selected annual exceedance probabilities at ungaged sites, were developed using generalized least-squares regression and flood-duration flow frequency estimates at 56 streamgaging stations within a single, relatively uniform physiographic region in the central part of Arizona, between the Colorado Plateau and Basin and Range Province, called the Transition Zone. Drainage area explained most of the variation in the n-day flood-duration annual exceedance probabilities, but mean annual precipitation and mean elevation were also significant variables in the regression models. Standard error of prediction for the regression equations varies from 28 to 53 percent and generally decreases with increasing n-day duration. Outside the Transition Zone there are insufficient streamgaging stations to develop regression equations, but flood-duration flow frequency estimates are presented at select streamgaging stations.

  2. A Conceptual Model for Floodplains in California's Central Valley and a Method for Identifying Representative Floods and Floodplains

    NASA Astrophysics Data System (ADS)

    Opperman, J. J.; Andrews, E.; Bozkurt, S.; Mount, J. F.; Moyle, P. B.

    2005-05-01

    Currently, significant resources are being invested in restoring native species and ecosystems in California's Central Valley and the Sacramento-San Joaquin Delta, led by the California Bay-Delta Authority (CBDA). Functioning floodplains provide numerous ecological benefits and floodplain restoration is emerging as an important component of ecosystem restoration in this region. We developed a conceptual model that describes the linkages between physical (hydrologic and geomorphic) processes and ecosystem processes and responses on Central Valley floodplains. Central to this model is the role of hydrological variability in driving topographic diversity, ecosystem heterogeneity and ecological processes. We attempt to capture the extremely complex linkages between hydrological variability and ecosystem response through 'representative floods'. A representative flood encompasses a set of hydrological variables, such as frequency and duration, which produce a characteristic suite of ecological benefits. For example, frequent, long duration flooding in the spring provides spawning and rearing habitat for native fish and promotes high phytoplankton productivity which can be exported to riverine and delta ecosystems. Less frequent, higher magnitude floods drive extensive geomorphic change upon the floodplain, creating topographic and, ultimately, ecological heterogeneity. Here we describe a process to define, map, and quantify the area inundated by a particular representative flood in the Sacramento River valley. To illustrate, we identify the area inundated by a frequent (exceedance probability of 67%), long duration (> 7 days) flood that occurs in the spring. We used paired gauges to find the stage corresponding to the representative flood parameters and compared a plane connecting the gauges to topography in the intervening reach of river. We found that this type of representative flood inundates very little area in the Sacramento Valley; primary areas of inundation are

  3. Estimates of evapotranspiration in alkaline scrub and meadow communities of Owens Valley, California, using the Bowen-ratio, eddy-correlation, and Penman-combination methods

    USGS Publications Warehouse

    Duell, L. F. W.

    1988-01-01

    In Owens Valley, evapotranspiration (ET) is one of the largest components of outflow in the hydrologic budget and the least understood. ET estimates for December 1983 through October 1985 were made for seven representative locations selected on the basis of geohydrology and the characteristics of phreatophytic alkaline scrub and meadow communities. The Bowen-ratio, eddy-correlation, and Penman-combination methods were used to estimate ET. The results of the analyses appear satisfactory when compared to other estimates of ET. Results by the eddy-correlation method are for a direct and a residual latent-heat flux that is based on sensible-heat flux and energy budget measurements. Penman-combination potential ET estimates were determined to be unusable because they overestimated actual ET. Modification of the psychrometer constant of this method to account for differences between heat-diffusion resistance and vapor-diffusion resistance permitted actual ET to be estimated. The methods may be used for studies in similar semiarid and arid rangeland areas in the Western United States. Meteorological data for three field sites are included in the appendix. Simple linear regression analysis indicates that ET estimates are correlated to air temperature, vapor-density deficit, and net radiation. Estimates of annual ET range from 300 mm at a low-density scrub site to 1,100 mm at a high-density meadow site. The monthly percentage of annual ET was determined to be similar for all sites studied. (Author's abstract)
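
    A minimal sketch of the Bowen-ratio energy-balance partitioning mentioned above (standard textbook form; the psychrometric constant and the sample numbers are illustrative assumptions, not values from the study):

        def bowen_ratio_latent_heat(Rn, G, dT, de, gamma=0.066):
            """Latent-heat flux (W/m2) from the Bowen-ratio energy-balance method.

            Rn: net radiation (W/m2); G: soil heat flux (W/m2);
            dT: air-temperature difference between two heights (K);
            de: vapour-pressure difference between the same heights (kPa);
            gamma: psychrometric constant (kPa/K), roughly 0.066 near sea level.
            """
            beta = gamma * dT / de           # Bowen ratio = sensible / latent heat flux
            return (Rn - G) / (1.0 + beta)

        # Illustrative mid-day values for a sparse alkaline-scrub site.
        LE = bowen_ratio_latent_heat(Rn=450.0, G=80.0, dT=1.2, de=0.35)
        et_mm_per_hour = LE * 3600.0 / 2.45e6   # W/m2 -> mm of evaporated water per hour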

  4. Flood regionalization: A hybrid geographic and predictor-variable region-of-influence regression method

    USGS Publications Warehouse

    Eng, K.; Milly, P.C.D.; Tasker, Gary D.

    2007-01-01

    To facilitate estimation of streamflow characteristics at an ungauged site, hydrologists often define a region of influence containing gauged sites hydrologically similar to the estimation site. This region can be defined either in geographic space or in the space of the variables that are used to predict streamflow (predictor variables). These approaches are complementary, and a combination of the two may be superior to either. Here we propose a hybrid region-of-influence (HRoI) regression method that combines the two approaches. The new method was applied with streamflow records from 1,091 gauges in the southeastern United States to estimate the 50-year peak flow (Q50). The HRoI approach yielded lower root-mean-square estimation errors and produced fewer extreme errors than either the predictor-variable or geographic region-of-influence approaches. It is concluded, for Q50 in the study region, that similarity with respect to the basin characteristics considered (area, slope, and annual precipitation) is important, but incomplete, and that the consideration of geographic proximity of stations provides a useful surrogate for characteristics that are not included in the analysis. © 2007 ASCE.
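
    A minimal sketch of how a hybrid region of influence could be assembled (the blending weight, distance definitions and variable names are illustrative assumptions, not the weighting scheme actually used in the paper): standardise the predictor variables, compute a geographic distance and a predictor-space distance to every gauged site, and rank sites by a weighted combination of the two.

        import numpy as np

        def hybrid_region_of_influence(target_xy, target_pred, site_xy, site_pred,
                                       n_sites=30, w_geo=0.5):
            """Indices of the n_sites gauges closest to the target in a blended
            geographic/predictor-variable distance (both distances rescaled to [0, 1])."""
            d_geo = np.linalg.norm(site_xy - target_xy, axis=1)
            # Standardise predictors (e.g., log area, slope, precipitation) before distancing.
            mu, sd = site_pred.mean(axis=0), site_pred.std(axis=0)
            d_pred = np.linalg.norm((site_pred - mu) / sd - (target_pred - mu) / sd, axis=1)
            blend = w_geo * d_geo / d_geo.max() + (1 - w_geo) * d_pred / d_pred.max()
            return np.argsort(blend)[:n_sites]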

  5. A modified and automated version of the 'Fluorimetric Detection of Alkaline DNA Unwinding' method to quantify formation and repair of DNA strand breaks

    PubMed Central

    Moreno-Villanueva, María; Pfeiffer, Ragen; Sindlinger, Thilo; Leake, Alan; Müller, Marcus; Kirkwood, Thomas BL; Bürkle, Alexander

    2009-01-01

    Background: Formation and repair of DNA single-strand breaks are important parameters in the assessment of DNA damage and repair occurring in live cells. The 'Fluorimetric Detection of Alkaline DNA Unwinding (FADU)' method [Birnboim HC, Jevcak JJ. Cancer Res (1981) 41:1889–1892] is a sensitive procedure to quantify DNA strand breaks, yet it is very tedious to perform. Results: In order (i) to render the FADU assay more convenient and robust, (ii) to increase throughput, and (iii) to reduce the number of cells needed, we have established a modified assay version that is largely automated and is based on the use of a liquid handling device. The assay is operated in a 96-well format, thus greatly increasing throughput. The number of cells required has been reduced to less than 10,000 per data point. The threshold for detection of X-ray-induced DNA strand breaks is 0.13 Gy. The total assay time required for a typical experiment to assess DNA strand break repair is 4–5 hours. Conclusion: We have established a robust and convenient method for measuring the formation and repair of DNA single-strand breaks in live cells. While the sensitivity of our method is comparable to current assays, throughput is massively increased while operator time is decreased. PMID:19389244

  6. Strange Floods: The Upper Tail of Flood Peaks in the Conterminous US

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Baeck, M. L.

    2015-12-01

    The strangest flood in US history is arguably the 14 June 1903 flood that devastated Heppner, Oregon. The notion of strange floods is based on the assumption that there are flood agents that dominate the upper tail of flood distributions for a region (severe thunderstorms in complex terrain in the case of the Heppner flood) and are exceedingly poorly characterized by conventional flood records. The orographic thunderstorm systems in the central Appalachians that dominate envelope curves of flood peaks in the eastern US for basin areas less than 1,000 sq. km. (and control portions of the global envelope curve of rainfall accumulations at time scales shorter than 6 hours) provide a well-documented example of strange floods. Despite extensive evidence of their occurrence, principally from field-based case studies, they are poorly represented in conventional USGS flood records. We develop methods for examining strange floods based on analyses of the complete record of USGS annual peak observations and on hydrometeorological analyses of the most extreme floods in the US flood record. The methods we present are grounded in extreme value theory and designed to enhance our understanding of extreme floods and improve methods for estimating extreme flood magnitudes.

  7. LOWER COST METHODS FOR IMPROVED OIL RECOVERY (IOR) VIA SURFACTANT FLOODING

    SciTech Connect

    William A. Goddard III; Yongchun Tang; Patrick Shuler; Mario Blanco; Seung Soon Jang; Shiang-Tai Lin; Prabal Maiti; Yongfu Wu; Stefan Iglauer; Xiaohang Zhang

    2004-09-01

    This report provides a summary of the work performed in this 3-year project sponsored by DOE. The overall objective of this project is to identify new, potentially more cost-effective surfactant formulations for improved oil recovery (IOR). The general approach is to use an integrated experimental and computational chemistry effort to improve our understanding of the link between surfactant structure and performance, and from this knowledge, develop improved IOR surfactant formulations. Accomplishments for the project include: (1) completion of a literature review to assemble current and new surfactant IOR ideas, (2) development of new atomistic-level MD (molecular dynamics) modeling methodologies to calculate IFT (interfacial tension) rigorously from first principles, (3) exploration of less computationally intensive mesoscale methods to estimate IFT, Quantitative Structure Property Relationship (QSPR), and cohesive energy density (CED) calculations, (4) experiments to screen many surfactant structures for desirable low IFT and solid adsorption behavior, and (5) further experimental characterization of the more promising new candidate formulations (based on alkyl polyglycosides (APG) and alkyl propoxy sulfate surfactants). Important findings from this project include: (1) the IFT between two pure substances may be calculated quantitatively from fundamental principles using molecular dynamics; the same approach can provide qualitative results for ternary systems containing a surfactant, (2) low concentrations of alkyl polyglycoside surfactants have potential for IOR (Improved Oil Recovery) applications from a technical standpoint (if formulated properly with a cosurfactant, they can create a low IFT at low concentration) and also are viable economically as they are available commercially, and (3) the alkylpropoxy sulfate surfactants also have promising IFT performance; in addition, these surfactants can have high optimal salinity and so may be attractive for use in higher

  8. Alkaline "Permanent" Paper.

    ERIC Educational Resources Information Center

    Pacey, Antony

    1991-01-01

    Discussion of paper manufacturing processes and their effects on library materials focuses on the promotion of alkaline "permanent" paper, with less acid, by Canadian library preservation specialists. Standards for paper acidity are explained; advantages of alkaline paper are described, including decreased manufacturing costs; and recyclability is…

  9. FUNDAMENTAL STUDY ON REAL-TIME FLOOD FORECASTING METHOD FOR LOCALLY HEAVY RAINFALL IN URBAN DRAINAGE AREAS

    NASA Astrophysics Data System (ADS)

    Kimura, Makoto; Kido, Yoshinobu; Nakakita, Eiichi

    Recently, locally heavy rainfall has occurred frequently in highly urbanized areas and causes serious personal accidents, so the importance of flood forecasting systems is growing in order to reduce inundation damage. However, flood forecasting that secures lead-time for evacuation is extremely difficult, because the rainfall runs off rapidly. In this study, a numerical simulation model that can finely represent the inundation mechanism of urban drainage areas was applied with the most recent available data and analysis tools. The influence of the factors that affect the inundation mechanism (i.e., the sewer system, the overland surface, and rainfall information) was evaluated through a sensitivity analysis with this model, and the evaluation results indicate requirements on the model conditions and on the temporal and spatial resolution of information for real-time flood forecasting.

  10. Flooding in Bifurcation

    NASA Astrophysics Data System (ADS)

    Aoki, Masakazu; Matumoto, Aoki

    2010-05-01

    The Edo River, which diverges from the Tone River on its right side, flows through downtown Tokyo into Tokyo Bay. The main stem of the Tone River flows through the northern Kanto region into Chiba Prefecture, a rural area. The Tone River originally flowed along the course of the present Edo River into downtown Tokyo, so when Tokyo (Edo) became the political center of Japan 400 years ago, the city suffered from floods caused by rainfall runoff collected over the upstream catchment. The Edo government extended nearly independent small rivers, connected them to the Tone River, and diverted most of the flood water into Chiba Prefecture, the rural region. The present route of the river was largely determined during the 16th century. The artificial Edo River was designed to carry 40 percent, and the artificial Tone River 60 percent, of the flood water. The Japanese Government later confirmed the safety of Tokyo against floods using the SFM (storage function method) and the SNFM (steady non-uniform flow method). The Government estimated a Plan High Water Discharge of 17,500 m3/s upstream of the divergence point, with the Edo River carrying 40 percent (7,000 m3/s), the same ratio as in the Edo era. However, the SFM and SNFM cannot describe dynamic flow phenomena. We surveyed how much channel storage exists in this river system using the UFM (unsteady flow method), reproduced the actual flow behaviour, and examined the dynamic phenomena in more detail. In this research, we analysed 11 diverse floods since 1981. These floods showed that less than 40 percent of the flow was bifurcated into the Edo River. Large floods do not always have a high diversion ratio into the Edo River; the ratio is usually lower rather than higher. For example, the flood of August 1982, with a peak discharge of 11,117 m3/s, diverted about 20 percent of that discharge into the Edo River. A small flood with a peak discharge of 1,030 m3/s in August 1992

  11. Methods for estimating annual exceedance-probability discharges and largest recorded floods for unregulated streams in rural Missouri

    USGS Publications Warehouse

    Southard, Rodney E.; Veilleux, Andrea G.

    2014-01-01

    Regression analysis techniques were used to develop a set of equations for rural ungaged stream sites for estimating discharges with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. Basin and climatic characteristics were computed using geographic information software and digital geospatial data. A total of 35 characteristics were computed for use in preliminary statewide and regional regression analyses. Annual exceedance-probability discharge estimates were computed for 278 streamgages by using the expected moments algorithm to fit a log-Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data from water year 1844 to 2012. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized multiple Grubbs-Beck test was used to detect potentially influential low floods. Annual peak flows less than a minimum recordable discharge at a streamgage were incorporated into the at-site station analyses. An updated regional skew coefficient was determined for the State of Missouri using Bayesian weighted least-squares/generalized least squares regression analyses. At-site skew estimates for 108 long-term streamgages with 30 or more years of record and the 35 basin characteristics defined for this study were used to estimate the regional variability in skew. However, a constant generalized-skew value of -0.30 and a mean square error of 0.14 were determined in this study. Previous flood studies indicated that the distinct physical features of the three physiographic provinces have a pronounced effect on the magnitude of flood peaks. Trends in the magnitudes of the residuals from preliminary statewide regression analyses from previous studies confirmed that regional analyses in this study were
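
    The expected moments algorithm itself (which handles censored, historical and low-outlier information) is beyond a few lines, but the underlying log-Pearson Type III quantile calculation can be sketched as follows (a simple method-of-moments fit on the log flows with the Wilson-Hilferty approximation to the frequency factor; an illustrative simplification, not the procedure used in the report, and the data are hypothetical):

        import numpy as np
        from scipy.stats import norm, skew

        def lp3_quantile(annual_peaks, aep):
            """Flow with the given annual exceedance probability from a log-Pearson III fit."""
            y = np.log10(annual_peaks)
            m, s, g = y.mean(), y.std(ddof=1), skew(y, bias=False)
            z = norm.ppf(1.0 - aep)   # standard normal variate for the target probability
            # Wilson-Hilferty approximation to the Pearson III frequency factor.
            K = (2.0 / g) * ((1.0 + g * z / 6.0 - g**2 / 36.0) ** 3 - 1.0) if abs(g) > 1e-6 else z
            return 10.0 ** (m + K * s)

        # Illustrative: 1-percent AEP (100-year) flood from a short synthetic record.
        q100 = lp3_quantile(np.array([120.0, 95.0, 210.0, 330.0, 150.0, 180.0, 260.0, 140.0]), aep=0.01)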

  12. Evolution of flood typology across Europe

    NASA Astrophysics Data System (ADS)

    Hundecha, Yeshewatesfa; Parajka, Juraj; Viglione, Alberto

    2016-04-01

    Following the frequent occurrence of severe flood events in different parts of Europe in the recent past, there has been a rise in interest in understanding the mechanisms by which the different events have been triggered and how they have been evolving over time. This study was carried out to establish the characteristics of observed flood events in the past across Europe in terms of their spatial extent and the processes leading up to the events using a process based hydrological model. To this end, daily discharge data from more than 750 stations of the Global Runoff Data Center were used to identify flood events at the stations based on a threshold method for the period 1961-2010. The identified events at the different stations were further analyzed to determine whether they form the same flood event, thereby delineating the spatial extent of the flood events. The pan-European hydrological model, E-HYPE, which runs at a daily time step, was employed to estimate a set of catchment hydrological and hydro-meteorological state variables that are relevant in the flood generating process for each of the identified spatially delineated flood events. A subsequent clustering of the events based on the simulated state variables, together with the spatial extent of the flood events, was used to identify the flood generating mechanism of each flood event. Four general flood generation mechanisms were identified: long-rain flood, short-rain flood, snowmelt flood, and rain-on-snow flood. A trend analysis was performed to investigate how the frequency of each of the flood types has changed over time. In order to investigate whether there is a regional and seasonal pattern in the dominant flood generating mechanisms, this analysis was performed separately for winter and summer seasons and three different regions of Europe: Northern, Western, and Eastern Europe. The results show a regional difference both in the dominant flood generating mechanism and the corresponding trends.

  13. Analysis of an influence of the bias correction method on the projected changes of flood indices in the selected catchments in Poland

    NASA Astrophysics Data System (ADS)

    Osuch, Marzena; Lawrence, Deborah; Meresa, Hadush K.; Napiórkowski, Jaroslaw J.; Romanowicz, Renata J.

    2016-04-01

    The aim of the study is an estimation of the uncertainty in flood indices introduced by bias correction of climate model variables in ten catchments in Poland. A simulation approach is used to obtain daily flows in catchments under changing climatic conditions, following the RCP4.5 and RCP8.5 emission scenarios. Climate projections were obtained from the EURO-CORDEX initiative, and time series of precipitation and air temperature from different RCM/GCMs for the periods: 1971-2000, 2021-2050 and 2071-2100 were used. The climate model outputs in the Poland area are highly biased; therefore, additional post-processing in the form of bias correction of precipitation and temperature is needed. In this work we used four versions of the quantile mapping method (empirical quantile mapping, and three distribution-based mappings: double gamma, single gamma and Birnbaum-Saunders) for correction of the precipitation time series and one method for air temperature correction (empirical quantile method). The HBV rainfall-runoff catchment-based model is used to estimate future flow time series. The models are calibrated using the available precipitation, air temperature, and flow observations for the period 1971-2000. Model performance is evaluated using observed data for the period 2001-2010. We also verify performance using the EURO-CORDEX simulations for the reference period (1971-2000), both with and without bias correction of the RCM/GCM outputs. Finally, the models are run for the future climate simulated by the RCM/GCM models for the years: 2021-2050 and 2071-2100. Changes in the mean annual flood and in flood quantiles are analysed and the effect of bias correction on the estimated changes is also considered. The results indicate substantial differences between climate models and catchments. The regional variability has a close relationship with the flood regime type. Catchments where high flows are expected to increase have a rainfall-dominated flood regime in the current
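
    A minimal sketch of the empirical quantile-mapping variant named above (the simplest of the four versions; the arrays are hypothetical, and the gamma- and Birnbaum-Saunders-based variants would replace the empirical distributions with fitted parametric ones):

        import numpy as np

        def empirical_quantile_mapping(obs_ref, mod_ref, mod_fut):
            """Bias-correct model output by mapping its empirical quantiles onto observations.

            obs_ref: observed values in the reference period (e.g., daily precipitation)
            mod_ref: model values in the reference period
            mod_fut: model values to correct (reference or future period)
            """
            # Non-exceedance probability of each value within the model's reference climate.
            probs = np.searchsorted(np.sort(mod_ref), mod_fut, side="right") / len(mod_ref)
            probs = np.clip(probs, 0.001, 0.999)        # avoid the extreme tails
            # Read the same probabilities off the observed distribution.
            return np.quantile(obs_ref, probs)

        # Illustrative synthetic data: observations, biased model reference run, future run.
        corrected = empirical_quantile_mapping(np.random.gamma(2.0, 3.0, 1000),
                                               np.random.gamma(2.0, 4.0, 1000),
                                               np.random.gamma(2.0, 4.0, 500))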

  14. Estimates of evapotranspiration in alkaline scrub and meadow communities of Owens Valley, California, using the Bowen-ratio, eddy-correlation, and penman-combination methods

    USGS Publications Warehouse

    Duell, Lowell F. W.

    1990-01-01

    In Owens Valley, evapotranspiration (ET) is one of the largest components of outflow in the hydrologic budget and the least understood. ET estimates for December 1983 through October 1985 were made for seven representative locations selected on the basis of geohydrology and the characteristics of phreatophytic alkaline scrub and meadow communities. The Bowen-ratio, eddy-correlation, and Penman-combination methods were used to estimate ET. The results of the analyses appear satisfactory when compared with other estimates of ET. Results by the eddy-correlation method are for a direct and a residual latent-heat flux that is based on sensible-heat flux and energy-budget measurements. Penman-combination potential-ET estimates were determined to be unusable because they overestimated actual ET. Modification of the psychrometer constant of this method to account for differences between heat-diffusion resistance and vapor-diffusion resistance permitted actual ET to be estimated. The methods described in this report may be used for studies in similar semiarid and arid rangeland areas in the Western United States. Meteorological data for three field sites are included in the appendix of this report. Simple linear regression analysis indicates that ET estimates are correlated to air temperature, vapor-density deficit, and net radiation. Estimates of annual ET range from 301 millimeters at a low-density scrub site to 1,137 millimeters at a high-density meadow site. The monthly percentage of annual ET was determined to be similar for all sites studied.

  15. Instrumentation, methods of flood-data collection and transmission, and evaluation of streamflow-gaging network in Indiana

    USGS Publications Warehouse

    Glatfelter, D.R.; Butch, G.K.

    1994-01-01

    The study results indicate that installation of streamflow-gaging stations at 15 new sites would improve collection of flood data. Instrumenting the 15 new sites plus 26 existing streamflow-gaging stations with telemetry, preferably data-collection platforms with satellite transmitters, would improve transmission of data to users of the information.

  16. Flood information for flood-plain planning

    USGS Publications Warehouse

    Bue, Conrad D.

    1967-01-01

    Floods are natural and normal phenomena. They are catastrophic simply because man occupies the flood plain, the high-water channel of a river. Man occupies flood plains because it is convenient and profitable to do so, but he must purchase his occupancy at a price: either sustain flood damage, or provide flood-control facilities. Although large sums of money have been, and are being, spent for flood control, flood damage continues to mount. However, neither complete flood control nor abandonment of the flood plain is practicable. Flood plains are a valuable resource and will continue to be occupied, but the nature and degree of occupancy should be compatible with the risk involved and with the degree of protection that is practicable to provide. It is primarily to meet the needs for defining the risk that the flood-inundation maps of the U.S. Geological Survey are prepared.

  17. Technetium recovery from high alkaline solution

    DOEpatents

    Nash, Charles A.

    2016-07-12

    Disclosed are methods for recovering technetium from a highly alkaline solution. The highly alkaline solution can be a liquid waste solution from a nuclear waste processing system. Methods can include combining the solution with a reductant capable of reducing technetium at the high pH of the solution and adding to or forming in the solution an adsorbent capable of adsorbing the precipitated technetium at the high pH of the solution.

  18. A new simple method to incorporate climate variability in probabilistic climate change scenarios, applied to assessing future river flooding in the UK.

    NASA Astrophysics Data System (ADS)

    Ledbetter, Ralph; Prudhomme, Christel; Arnell, Nigel

    2010-05-01

    Understanding the impacts of climate change is crucial for adaptation and mitigation policy decisions. This is particularly true for the water sector and flood risk as they have a direct link with climate, and climate change might result in a potentially changed risk to society. Increasingly, water managers request probabilistic projections of climate change impacts so they can incorporate uncertainty in their strategic planning. Climate change impact studies often rely on scenarios from global climate models (GCMs) or regional climate models (RCMs), but until recently, very few probabilistic climate change scenarios were available, thus making it difficult to generate probabilistic impact assessments. In addition, climate variability, which is known to play a significant role in the generation of floods and in the management of flood risk, is not always explicitly accounted for in climate change impact studies. In particular, natural variability is often treated as stationary in future impact assessments. A new simple methodology is presented here that develops probabilistic climate change scenarios incorporating baseline and future variability of GCM outputs. The method is based on the change factor method, where the changing climate is defined as monthly differences between a future climate and a baseline climate. This change factor method has been widely used as a simple method to remove bias in GCMs, which is particularly important for precipitation. Usually, for a GCM, both future and baseline climate are taken as the average of a 30-year GCM output time series, typically 2071-2100 for the 2080s future climate, and 1961-1990 as baseline climate. Instead, for each future and baseline, we randomly sample (with replacement) any monthly average from the relevant 30-year period to build multiple synthetic 30-year time series. Each multiple time series can then be used to calculate change factors, exactly as it is done for a single GCM realisation. By repeating the process
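
    A minimal sketch of the resampling idea described above for a single calendar month (illustrative; the exact sampling scheme and the month-by-month bookkeeping in the paper may differ, and all names and numbers are assumptions):

        import numpy as np

        def resampled_change_factors(baseline_monthly, future_monthly,
                                     n_samples=1000, ratio=True, seed=0):
            """Distribution of change factors obtained by resampling, with replacement,
            the 30 monthly values of the baseline and future GCM periods."""
            rng = np.random.default_rng(seed)
            factors = np.empty(n_samples)
            for j in range(n_samples):
                base = rng.choice(baseline_monthly, size=len(baseline_monthly), replace=True).mean()
                fut = rng.choice(future_monthly, size=len(future_monthly), replace=True).mean()
                factors[j] = fut / base if ratio else fut - base
            return factors

        # Illustrative July precipitation change factors (ratios) for one GCM grid cell.
        cf = resampled_change_factors(np.random.gamma(4.0, 20.0, 30), np.random.gamma(4.0, 18.0, 30))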

  19. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk in reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict the inflows via scenarios, capturing not only the marginal distributions but also their persistence. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed by using the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of failing forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk by defining the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where the parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov Chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the real operated and scenario optimization schemes, are evaluated for flood risk and hydropower profit analysis. With the 2010 flood, it is found that improvement of the hydrologic forecast accuracy is not necessary for decreasing the reservoir real-time operation risk, and most of the risk comes from the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts, with less bias, for reservoir operational purposes.
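
    A minimal sketch of the scenario-counting part of the first stage (illustrative names, levels and thresholds; the routing of design floods over the unpredicted period that forms the second stage is not reproduced here):

        import numpy as np

        def lead_time_flood_risk(ensemble_levels, critical_level):
            """Fraction of forecast scenarios in which the reservoir water level
            exceeds the critical value at any time within the forecast lead-time.

            ensemble_levels: array of shape (n_scenarios, n_timesteps) of simulated levels.
            """
            exceed = (ensemble_levels > critical_level).any(axis=1)
            return exceed.mean()

        # Illustrative: 500 forecast scenarios over a 72-hour lead-time (random-walk levels).
        levels = 170.0 + np.cumsum(np.random.normal(0.0, 0.1, size=(500, 72)), axis=1)
        risk = lead_time_flood_risk(levels, critical_level=175.0)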

  20. A novel cobalt tetranitrophthalocyanine/graphene composite assembled by an in situ solvothermal synthesis method as a highly efficient electrocatalyst for the oxygen reduction reaction in alkaline medium.

    PubMed

    Lv, Guojun; Cui, Lili; Wu, Yanying; Liu, Ying; Pu, Tao; He, Xingquan

    2013-08-21

    A novel micro/nano-composite, based on cobalt(II) tetranitrophthalocyanine (CoTNPc) grown on poly(sodium-p-styrenesulfonate) modified graphene (PGr), as a non-noble-metal catalyst for the oxygen reduction reaction (ORR), is fabricated by an in situ solvothermal synthesis method. The CoTNPc/PGr is characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), ultraviolet-visible (UV-vis) absorption spectroscopy, Fourier transform infrared spectroscopy (FTIR) and X-ray photoelectron spectroscopy (XPS). The electrocatalytic activity of the CoTNPc/PGr composite toward the ORR is evaluated using cyclic voltammetry and linear sweep voltammetry methods. The CoTNPc/PGr composite exhibits an unexpectedly high ORR activity compared with CoTNPc or PGr. The onset potential for ORR on CoTNPc/PGr is found to be around -0.10 V vs. SCE in 0.1 M NaOH solution, which is 30 mV and 70 mV more positive than that on PGr and CoTNPc, respectively. The peak current density on CoTNPc/PGr is about 2 times that on PGr and CoTNPc. Rotating disk electrode (RDE) measurements reveal that the ORR mechanism is nearly via a four-electron pathway on CoTNPc/PGr. The current density for ORR on CoTNPc/PGr still remains 69.9% of its initial value after chronoamperometric measurements for 24 h. Pt/C catalyst, on the other hand, only retains 13.3% of its initial current. The peak potential shifts slightly and the current barely changes when 3 M methanol is added. The fabricated composite catalyst for ORR displays high activity, good stability and excellent tolerance to the crossover effect, and may be used as a promising Pt-free catalyst in alkaline direct methanol fuel cells (DMFCs). PMID:23820483

  1. Tsunami flooding

    USGS Publications Warehouse

    Geist, Eric; Jones, Henry; McBride, Mark; Fedors, Randy

    2013-01-01

    Panel 5 focused on tsunami flooding with an emphasis on Probabilistic Tsunami Hazard Analysis (PTHA) as derived from its counterpart, Probabilistic Seismic Hazard Analysis (PSHA) that determines seismic ground-motion hazards. The Panel reviewed current practices in PTHA and determined the viability of extending the analysis to extreme design probabilities (i.e., 10^-4 to 10^-6). In addition to earthquake sources for tsunamis, PTHA for extreme events necessitates the inclusion of tsunamis generated by submarine landslides, and treatment of the large attendant uncertainty in source characterization and recurrence rates. Tsunamis can be caused by local and distant earthquakes, landslides, volcanism, and asteroid/meteorite impacts. Coastal flooding caused by storm surges and seiches is covered in Panel 7. Tsunamis directly tied to earthquakes, the similarities with (and path forward offered by) the PSHA approach for PTHA, and especially submarine landslide tsunamis were a particular focus of Panel 5.

  2. Regionalisation of a distributed method for flood quantiles estimation: Revaluation of local calibration hypothesis to enhance the spatial structure of the optimised parameter

    NASA Astrophysics Data System (ADS)

    Odry, Jean; Arnaud, Patrick

    2016-04-01

    The SHYREG method (Aubert et al., 2014) associates a stochastic rainfall generator and a rainfall-runoff model to produce rainfall and flood quantiles on a 1 km2 mesh covering the whole French territory. The rainfall generator is based on the description of rainy events by descriptive variables following probability distributions and is characterised by high stability. This stochastic generator is fully regionalised, and the rainfall-runoff transformation is calibrated with a single parameter. Thanks to the stability of the approach, calibration can be performed against only the flood quantiles associated with observed frequencies, which can be extracted from relatively short time series. The aggregation of SHYREG flood quantiles to the catchment scale is performed using an areal reduction factor technique that is uniform over the whole territory. Past studies demonstrated the accuracy of SHYREG flood quantile estimation for catchments where flow data are available (Arnaud et al., 2015). Nevertheless, the parameter of the rainfall-runoff model is independently calibrated for each target catchment. As a consequence, this parameter plays a corrective role and compensates for approximations and modelling errors, which makes it difficult to identify its proper spatial pattern. It is an inherent objective of the SHYREG approach to be completely regionalised in order to provide a complete and accurate flood quantile database throughout France. Consequently, it appears necessary to identify the model configuration in which the calibrated parameter could be regionalised with acceptable performance. The re-evaluation of some of the method's hypotheses is a necessary step before regionalisation. In particular, the inclusion or modification of the spatial variability of imposed parameters (such as production and transfer reservoir size, base-flow addition and the quantile aggregation function) should lead to more realistic values of the single calibrated parameter. The objective of the work presented

  3. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, B.M.; Karlinger, M.R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank-transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on average every 4.5 years.
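
    A minimal sketch of the Monte Carlo idea (illustrative; a Gaussian copula stands in here for the spatial correlation model, and the at-site marginal distributions drop out because exceedance of the at-site T-year flood depends only on ranks):

        import numpy as np
        from scipy.stats import norm

        def regional_flood_probability(corr, T, n_years=100000, seed=0):
            """Probability that at least one of the sites sees its T-year flood in a year.

            corr: (n_sites, n_sites) spatial correlation matrix of annual maxima.
            """
            rng = np.random.default_rng(seed)
            z = rng.multivariate_normal(np.zeros(len(corr)), corr, size=n_years)
            u = norm.cdf(z)                              # uniform marginals (rank scale)
            exceed_any = (u > 1.0 - 1.0 / T).any(axis=1)
            return exceed_any.mean()

        # Illustrative: three sites with moderate spatial correlation, T = 100 years.
        C = np.array([[1.0, 0.6, 0.4], [0.6, 1.0, 0.5], [0.4, 0.5, 1.0]])
        rfp = regional_flood_probability(C, T=100)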

  4. Quantifying Floods of a Flood Regime in Space and Time

    NASA Astrophysics Data System (ADS)

    Whipple, A. A.; Fleenor, W. E.; Viers, J. H.

    2015-12-01

    Interaction between a flood hydrograph and floodplain topography results in spatially and temporally variable conditions important for ecosystem process and function. Individual floods whose frequency and dimensionality comprise a river's flood regime contribute to that variability and in aggregate are important drivers of floodplain ecosystems. Across the globe, water management actions, land use changes as well as hydroclimatic change associated with climate change have profoundly affected natural flood regimes and their expression within the floodplain landscape. Homogenization of riverscapes has degraded once highly diverse and productive ecosystems. Improved understanding of the range of flood conditions and spatial variability within floodplains, or hydrospatial conditions, is needed to improve water and land management and restoration activities to support the variable conditions under which species adapted. This research quantifies the flood regime of a floodplain site undergoing restoration through levee breaching along the lower Cosumnes River of California. One of the few lowland alluvial rivers of California with an unregulated hydrograph and regular floodplain connectivity, the Cosumnes River provides a useful test-bed for exploring river-floodplain interaction. Representative floods of the Cosumnes River are selected from previously-established flood types comprising the flood regime and applied within a 2D hydrodynamic model representing the floodplain restoration site. Model output is analyzed and synthesized to quantify and compare conditions in space and time, using metrics such as depth and velocity. This research establishes methods for quantifying a flood regime's floodplain inundation characteristics, illustrates the role of flow variability and landscape complexity in producing heterogeneous floodplain conditions, and suggests important implications for managing more ecologically functional floodplains.

  5. Spatio-temporal characteristics of the extreme precipitation by L-moment-based index-flood method in the Yangtze River Delta region, China

    NASA Astrophysics Data System (ADS)

    Yin, Yixing; Chen, Haishan; Xu, Chong-Yu; Xu, Wucheng; Chen, Changchun; Sun, Shanlei

    2016-05-01

    The regionalization methods, which "trade space for time" by pooling information from different locations in the frequency analysis, are efficient tools to enhance the reliability of extreme quantile estimates. This paper aims at improving the understanding of the regional frequency of extreme precipitation by using regionalization methods, and providing scientific background and practical assistance in formulating the regional development strategies for water resources management in one of the most developed and flood-prone regions in China, the Yangtze River Delta (YRD) region. To achieve the main goals, the L-moment-based index-flood (LMIF) method, one of the most popular regionalization methods, is used in the regional frequency analysis of extreme precipitation with special attention paid to inter-site dependence and its influence on the accuracy of quantile estimates, which has not been considered by most of the studies using the LMIF method. Extensive data screening of stationarity, serial dependence, and inter-site dependence was carried out first. The entire YRD region was then categorized into four homogeneous regions through cluster analysis and homogeneity analysis. Based on goodness-of-fit statistics and L-moment ratio diagrams, generalized extreme-value (GEV) and generalized normal (GNO) distributions were identified as the best fitted distributions for most of the sub-regions, and estimated quantiles for each region were obtained. Monte Carlo simulation was used to evaluate the accuracy of the quantile estimates taking inter-site dependence into consideration. The results showed that the root-mean-square errors (RMSEs) were bigger and the 90% error bounds were wider with inter-site dependence than those without inter-site dependence for both the regional growth curve and quantile curve. The spatial patterns of extreme precipitation with a return period of 100 years were finally obtained which indicated that there are two regions with highest precipitation
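
    A minimal sketch of the index-flood scaling step at the heart of the LMIF method (illustrative; the GEV growth-curve parameters below are assumptions, not the regional values estimated in the paper, and the index flood is taken here as the at-site mean of the annual maxima):

        import numpy as np

        def gev_growth_curve(F, xi=0.85, alpha=0.25, k=-0.10):
            """Dimensionless regional GEV growth curve (illustrative parameters, mean near 1)."""
            return xi + alpha / k * (1.0 - (-np.log(F)) ** k)

        def index_flood_quantile(site_annual_maxima, F):
            """Site quantile = at-site index flood (mean annual maximum) x regional growth factor."""
            return np.mean(site_annual_maxima) * gev_growth_curve(F)

        # Illustrative 100-year value at a site with six years of annual maxima (mm of daily rain).
        q100 = index_flood_quantile([62.0, 80.0, 45.0, 110.0, 73.0, 95.0], F=0.99)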

  6. How to decide which oblique image has the highest mapping potential for monoplotting method: a case studies on river erosion and floods

    NASA Astrophysics Data System (ADS)

    Triglav-Čekada, M.; Bric, V.; Zorn, M.

    2014-05-01

    When studying the development of different geomorphic processes, floods, glaciers or even cultural heritage through time, one cannot rely only on regular photogrammetric procedures and metric images. In the majority of cases the only available images are archive images with unknown parameters of interior orientation, showing the object of interest in an oblique view. With the help of modern high-resolution digital elevation models derived from aerial or terrestrial laser scanning (lidar), or from photogrammetric stereo-images by automatic image-matching techniques, even a single non-metric high- or low-oblique image from the past can be applied in the monoplotting procedure to enable 3D data extraction of changes through time. The first step of the monoplotting procedure is the orientation of an image in space with the help of a digital elevation model (DEM). When using oblique images, tie points between an image and the DEM are usually too sparse to enable automatic exterior orientation; still, manual interactive orientation using common features can overcome this shortage. Manual interactive orientation can be very time consuming. Therefore, before starting the manual interactive orientation one should be certain that useful results can be expected from the chosen image. But how can one decide which image has the highest mapping potential before introducing a given oblique image into the orientation procedure? The test examples presented in this paper provide guidance for the use of the monoplotting method in different geoscience applications. The most important factors are the resolution of the digital elevation model (lidar-derived ones are best), the presence of appropriate common features, and the incidence angle of the oblique images (low-oblique images or almost vertical aerial images are better). First, the very oblique example of riverbank erosion on the Dragonja River, Slovenia, is presented. Then the test example of the September 2010 floods on the Ljubljana moor is

  7. New petrophysical magnetic methods MACC and MAFM in permeability characterisation of petroleum reservoir rock cleaning, flooding modelling and determination of fines migration in formation damage

    NASA Astrophysics Data System (ADS)

    Ivakhnenko, O. P.

    2012-04-01

    Potential applications of magnetic techniques and methods in petroleum engineering and petrophysics (Ivakhnenko, 1999, 2006; Ivakhnenko & Potter, 2004) reveal their vast advantages for petroleum reservoir characterisation and formation evaluation. In this work the author proposes, for the first time, systematic methods of Magnetic Analysis of Core Cleaning (MACC) and Magnetic Analysis of Fines Migration (MAFM) for the characterisation of reservoir core cleaning and for modelling estimates of fines migration in petroleum reservoir formations. Using the example of one oil field, we demonstrate the results of applying these methods to reservoir samples. Petroleum reservoir core samples were collected within the reservoir using routine techniques of reservoir sampling and preservation for PVT analysis. Immediately before the MACC and MAFM studies, the samples were exposed to atmospheric air for a few days. The selected samples were characterised in detail after fluid cleaning and core flooding in terms of their mineralogical compositions and petrophysical parameters. Mineralogical composition was estimated using XRD techniques. The petrophysical parameters, such as permeability and porosity, were measured on the basis of total core analysis. The results demonstrate the effectiveness and importance of the MACC and MAFM methods for routine core analysis (RCAL) and special core analysis (SCAL) in reservoir characterisation, core flooding, and formation damage analysis.

  8. Flood Inundation Analysis Considering Mega Floods in PyeonChang River Basin of South Korea

    NASA Astrophysics Data System (ADS)

    Kim, D.; Han, D.; Choi, C.; Lee, J.; Kim, H. S.

    2015-12-01

    Recently, abnormal climate has frequently occurred around the world due to global warming. In South Korea, more than 90% of the damage due to natural disasters has been caused by extreme events such as strong wind and heavy rainfall. Most studies regarding the impact of extreme events on flood damage have focused on a single heavy rainfall event, but several heavy rainfall events can occur in succession, and such sequences can cause huge flood damage. This study explores the impact of continuous extreme events on flood damage. Here we call this type of flood, caused by continuous extreme events, a Mega flood. The Inter Event Time Definition (IETD) method is applied to construct Mega flood scenarios from independent rainfall event scenarios. Flood inundation is estimated for each of the Mega flood scenarios, and the flood damages are estimated using a Multi-Dimensional Flood Damage Analysis (MD-FDA) method. As a result, we expect that flood damage caused by a Mega flood is much greater than the damage driven by a single rainfall event. The results of this study can contribute to guidelines and design criteria for reducing flood damage. This work was supported by the National Research Foundation of Korea (NRF) and a grant funded by the Korean government (MEST; No. 2011-0028564).
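
    The IETD step named above is straightforward to sketch: a rainfall series is split into independent events wherever the dry spell between wet time steps exceeds the chosen inter-event time, and consecutive events can then be chained into "Mega flood" scenarios. The sketch below uses a synthetic hourly series and illustrative threshold values; it is not the study's code.

```python
# Minimal sketch of Inter Event Time Definition (IETD) event separation on hourly rain.
import numpy as np

def split_events(rain, ietd_hours=6, wet_threshold=0.1):
    """Return (start_index, end_index) pairs for independent rainfall events."""
    wet = np.where(rain >= wet_threshold)[0]
    if wet.size == 0:
        return []
    events, start = [], wet[0]
    for prev, cur in zip(wet[:-1], wet[1:]):
        if cur - prev > ietd_hours:          # dry spell longer than IETD -> new event
            events.append((start, prev))
            start = cur
    events.append((start, wet[-1]))
    return events

rain = np.zeros(200)
rain[10:20] = 5.0    # first storm (mm/h)
rain[40:55] = 8.0    # second storm, separated by a dry spell longer than the IETD
print(split_events(rain))            # -> [(10, 19), (40, 54)]
```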

  9. Spatio-temporal analysis of the extreme precipitation by the L-moment-based index-flood method in the Yangtze River Delta region, China

    NASA Astrophysics Data System (ADS)

    Yin, Yixing; Chen, Haishan; Xu, Chongyu; Xu, Wucheng; Chen, Changchun

    2014-05-01

    Regionalization methods, which 'trade space for time' by including several at-site data records in the frequency analysis, are an efficient tool to improve the reliability of extreme quantile estimates. The main aims of this paper are to improve the understanding of the regional frequency of extreme precipitation and to provide scientific and practical background and assistance in formulating regional development strategies for water resources management in one of the most developed and flood-prone regions in China, the Yangtze River Delta (YRD) region. To this end, the L-moment-based index-flood (LMIF) method, one of the popular regionalization methods, is used in the regional frequency analysis of extreme precipitation; attention is paid to inter-site dependence and its influence on the accuracy of quantile estimates, which has not been considered in most studies using the LMIF method. Extensive data screening for stationarity, serial dependence and inter-site dependence was carried out first. The entire YRD region was then categorized into four homogeneous regions through cluster analysis and homogeneity analysis. Based on goodness-of-fit statistics and L-moment ratio diagrams, generalized extreme-value (GEV) and generalized normal (GNO) distributions were identified as the best-fit distributions for most of the sub-regions. Estimated quantiles for each region were further obtained. Monte Carlo simulation was used to evaluate the accuracy of the quantile estimates taking inter-site dependence into consideration. The results showed that the root-mean-square errors (RMSEs) were larger and the 90% error bounds were wider with inter-site dependence than with no inter-site dependence, for both the regional growth curve and the quantile curve. The spatial patterns of extreme precipitation with a return period of 100 years were obtained which indicated that there are two regions with the highest precipitation extremes (southeastern coastal area of Zhejiang Province and the

  10. Evaluation on an original resistivity inversion method of water flooding a conglomerate reservoir based on petrophysical analysis

    NASA Astrophysics Data System (ADS)

    Liu, Renqiang; Duan, Yonggang; Tan, Fengqi; Wang, Guochang; Qin, Jianhua; Neupane, Bhupati

    2015-10-01

    An accurate inversion of original reservoir resistivity is an important problem for waterflood development of oilfields in the middle-late development period. This paper describes a theoretical model for recovering the original resistivity of a conglomerate reservoir, established from petrophysical models and based on a stratigraphic model of vertical invasion in the conglomerate reservoir of an oilfield. Two factors influencing the resistivity change of a water-flooded reservoir were analyzed. The first is the decrease in clay volume as injected water washes out argillaceous particles, which changes the reservoir resistivity; the second is the displacement of crude oil in the pore space by injected water, which increases the water-bearing volume. Moreover, conductive ions in the injected water and the original formation water exchange and equilibrate because of their salinity difference, which also affects the reservoir resistivity. Through analysis of these influencing factors, based on the fine identification of conglomerate lithologies, inversion models of three variables, namely the change in clay volume, the resistivity of the irreducible water, and the increase in water-bearing volume, were established from core analysis data, production performance, and well logging curves, and the original resistivity of the conglomerate reservoir was accurately recovered. The original oil saturation of the reservoir was calculated from multiple linear regression models. Finally, the produced index is defined as the ratio of the difference between the original and current oil saturations to the original oil saturation; by using relative values it eliminates the effects of conglomerate lithologies and heterogeneity in the quantitative evaluation of flooded layers. Compared with traditional flooding sensitive parameters which are oil saturation and water
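
    As a rough illustration of the two quantities named above, the sketch below fits a generic least-squares regression for original oil saturation from hypothetical log-derived variables and evaluates the produced index (So_original - So_current) / So_original; the feature names, coefficients and data are invented and are not the paper's actual model.

```python
# Minimal sketch: multiple linear regression for original oil saturation plus the
# "produced index" defined in the abstract. All inputs are synthetic placeholders.
import numpy as np

def fit_saturation_model(features, so_measured):
    """Least-squares multiple linear regression: So = b0 + b1*x1 + ... + bn*xn."""
    X = np.column_stack([np.ones(len(so_measured)), features])
    coeffs, *_ = np.linalg.lstsq(X, so_measured, rcond=None)
    return coeffs

def produced_index(so_original, so_current):
    """Relative drop in oil saturation; ~0 = unflooded layer, ->1 = strongly flooded."""
    return (so_original - so_current) / so_original

# Illustrative use: three hypothetical log-derived variables per sample.
rng = np.random.default_rng(0)
features = rng.random((30, 3))
so_measured = 0.7 - 0.2 * features[:, 0] + 0.1 * features[:, 2] + rng.normal(0, 0.01, 30)
coeffs = fit_saturation_model(features, so_measured)
print(produced_index(so_original=0.70, so_current=0.45))   # -> ~0.36
```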

  11. Evaluation of the alkaline electrolysis of zinc

    SciTech Connect

    Meisenhelder, J.H.; Brown, A.P.; Loutfy, R.O.; Yao, N.P.

    1981-05-01

    The alkaline leach and electrolysis process for zinc production is compared to the conventional acid-sulfate process in terms of both energy saving and technical merit. In addition, the potential for industrial application of the alkaline process is discussed on the basis of present market conditions, possible future zinc market scenarios, and the probability of increased secondary zinc recovery. In primary zinc production, the energy-saving potential for the alkaline process was estimated to be greater than 10%, even when significantly larger electrolysis current densities than those required for the sulfate process are used. The principal technical advantages of the alkaline process are that it can handle low-grade, high-iron-content or oxidized ores (like most of those found in the US) in a more cost- and energy-efficient manner than can the sulfate process. Additionally, in the electrowinning operation, the alkaline process should be technically superior because a dendritic or sponge deposit is formed that is amenable to automated collection without interruption of the electrolysis. Also, use of the higher current densities would result in significant capital cost reductions. Alkaline-based electrolytic recovery processes were considered for the recycling of zinc from smelter baghouse dusts and from the potential source of nickel/zinc electric-vehicle batteries. In all comparisons, an alkaline process was shown to be technically superior and, particularly for the baghouse dusts, energetically and economically superior to alternatively proposed recovery methods based on sulfate electrolysis. It is concluded that the alkaline zinc method is an important alternative technology to the conventional acid zinc process. (WHK)

  12. Preliminary report on a study to estimate flood volumes of small rural streams in Ohio; methods, site selection, and data base

    USGS Publications Warehouse

    Sherwood, J.M.

    1985-01-01

    In 1981, the U.S. Geological Survey, in cooperation with the Ohio Department of Transportation and the Federal Highway Administration, began a 7-year flood-volume study of small rural basins in Ohio. This report summarizes the methods of study and describes reconnaissance and site-selection procedures, locations and characteristics of the stations, instrumentation, and methods of collecting and storing data. The first phase of this study involved an intensive field reconnaissance of about 7,000 sites, of which 32 basins were selected for detailed analysis. Drainage areas for the basins varied from 0.13 to 6.45 square miles, and main-channel slopes ranged from 7.6 to 276 feet per mile. Five years of 5-minute rainfall-runoff data will be collected for each study site. These data will be used to calibrate and verify a rainfall-runoff model for each basin. The calibrated model will be used in conjunction with 80 years of National Weather Service 5-minute precipitation data to synthesize a representative 80-year streamflow record at each site. A Log-Pearson Type III frequency distribution will be applied to each record to define the magnitudes and frequencies of flood volumes at each site. These data will be used to develop regionalized multiple-regression models for estimating flood-volume magnitudes and frequencies at small rural ungaged sites in Ohio. The report also summarizes rainfall-runoff data collected from July 1981 through September 1983, but does not interpret the data. An average of eleven event periods per site was monitored, with maximum 5-minute rainfall intensities varying from 0.02 to 0.067 inches and maximum peak discharges varying from 1 to 1,130 cubic feet per second.

  13. Cyber surveillance for flood disasters.

    PubMed

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-01-01

    Regional heavy rainfall is usually caused by the influence of extreme weather conditions. Instant heavy rainfall often results in the flooding of rivers and the neighboring low-lying areas, which is responsible for a large number of casualties and considerable property loss. The existing precipitation forecast systems mostly focus on the analysis and forecast of large-scale areas but do not provide precise instant automatic monitoring and alert feedback for individual river areas and sections. Therefore, in this paper, we propose an easy method to automatically monitor the flood object of a specific area, based on the currently widely used remote cyber surveillance systems and image processing methods, in order to obtain instant flooding and waterlogging event feedback. The intrusion detection mode of these surveillance systems is used in this study, wherein a flood is considered a possible invasion object. Through the detection and verification of flood objects, automatic flood risk-level monitoring of specific individual river segments, as well as the automatic urban inundation detection, has become possible. The proposed method can better meet the practical needs of disaster prevention than the method of large-area forecasting. It also has several other advantages, such as flexibility in location selection, no requirement of a standard water-level ruler, and a relatively large field of view, when compared with the traditional water-level measurements using video screens. The results can offer prompt reference for appropriate disaster warning actions in small areas, making them more accurate and effective. PMID:25621609
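
    As a loose illustration of the "flood as intrusion object" idea described above, the sketch below runs a standard background subtractor over a surveillance feed and raises an alert when the changed area inside a monitored river region exceeds a threshold. The video path, region of interest and alert threshold are placeholders; this is not the authors' system.

```python
# Minimal sketch: treat a flood as an "intrusion object" with background subtraction.
import cv2
import numpy as np

cap = cv2.VideoCapture("river_camera.mp4")          # hypothetical surveillance feed
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)
roi = (slice(300, 480), slice(0, 640))              # monitored river section (rows, cols)
alert_fraction = 0.4                                # fraction of ROI that must change

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                  # nonzero where the scene changed
    changed = np.count_nonzero(mask[roi] > 0) / mask[roi].size
    if changed > alert_fraction:
        print("possible flood object detected, changed fraction =", round(changed, 2))
cap.release()
```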

  14. Cyber Surveillance for Flood Disasters

    PubMed Central

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-01-01

    Regional heavy rainfall is usually caused by the influence of extreme weather conditions. Instant heavy rainfall often results in the flooding of rivers and the neighboring low-lying areas, which is responsible for a large number of casualties and considerable property loss. The existing precipitation forecast systems mostly focus on the analysis and forecast of large-scale areas but do not provide precise instant automatic monitoring and alert feedback for individual river areas and sections. Therefore, in this paper, we propose an easy method to automatically monitor the flood object of a specific area, based on the currently widely used remote cyber surveillance systems and image processing methods, in order to obtain instant flooding and waterlogging event feedback. The intrusion detection mode of these surveillance systems is used in this study, wherein a flood is considered a possible invasion object. Through the detection and verification of flood objects, automatic flood risk-level monitoring of specific individual river segments, as well as the automatic urban inundation detection, has become possible. The proposed method can better meet the practical needs of disaster prevention than the method of large-area forecasting. It also has several other advantages, such as flexibility in location selection, no requirement of a standard water-level ruler, and a relatively large field of view, when compared with the traditional water-level measurements using video screens. The results can offer prompt reference for appropriate disaster warning actions in small areas, making them more accurate and effective. PMID:25621609

  15. [Simultaneous Removal of Cd (II) and Phenol by Titanium Dioxide-Titanate Nanotubes Composite Nanomaterial Synthesized Through Alkaline-Acid Hydrothermal Method].

    PubMed

    Lei, Li; Jin, Yin-jia; Wang, Ting; Zhao, Xiao; Yan, You; Liu, Wen

    2015-07-01

    A composite nanomaterial, TiO2/TNTs, was synthesized from TiO2 (P25) through alkaline and acid hydrothermal reactions, and possessed both titanate nanotube (TNTs) and TiO2 phases. It was found that the adsorption kinetics of Cd(II) onto TiO2/TNTs was very fast, and the adsorption could reach equilibrium within 30 min. In addition, the maximum adsorption capacity of Cd(II) was as large as 120.34 mg/g, calculated from the Langmuir isotherm model. The adsorption mechanism of Cd(II) was ion exchange between Cd2+ and Na+/H+ located in the interlayers of the TNTs. However, the adsorption capacity of phenol on TiO2/TNTs was so small that photocatalysis was needed for phenol degradation. In the adsorption-photocatalysis system, the removal efficiencies of Cd(II) and phenol could reach up to 99.6% and 99.7%, respectively. In particular, removal of Cd(II) was attributed to adsorption by the TNTs of the composite nanomaterial, while removal of phenol resulted from the photocatalytic reaction of the TiO2 phase. Moreover, co-existing Cd(II) enhanced the photocatalytic degradation of phenol owing to the enhanced photocatalytic activity of TiO2/TNTs after Cd(II) adsorption. Co-existing Na+ did not show an obvious effect on the co-removal of Cd(II) and phenol by TiO2/TNTs, but adsorption of Cd(II) was inhibited in the presence of Ca2+, as it could compete for the adsorption sites and enhance aggregation of the material. Furthermore, TiO2/TNTs could be efficiently reused after desorption via HNO3 and regeneration via NaOH, and the removal efficiencies of Cd(II) and phenol were still as high as 91.7% and 98.1% even after three cycles. This study proposed a method to synthesize a material with both adsorptive and photocatalytic performance, which is of great importance for the application of nanomaterials in the simultaneous removal of heavy metals and organic pollutants.
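
    A maximum adsorption capacity of the kind quoted above (120.34 mg/g) is typically obtained by fitting equilibrium data to the Langmuir isotherm q = q_max*K*C/(1 + K*C). The sketch below shows such a fit on invented data points, not the study's measurements.

```python
# Minimal sketch of a Langmuir isotherm fit to equilibrium adsorption data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    """Adsorbed amount q (mg/g) vs equilibrium concentration C (mg/L)."""
    return q_max * K * C / (1.0 + K * C)

C_eq = np.array([1.0, 5.0, 10.0, 25.0, 50.0, 100.0])        # mg/L, illustrative
q_eq = np.array([22.0, 62.0, 85.0, 105.0, 113.0, 118.0])    # mg/g, illustrative

(q_max, K), _ = curve_fit(langmuir, C_eq, q_eq, p0=(100.0, 0.1))
print(f"q_max = {q_max:.1f} mg/g, K = {K:.3f} L/mg")
```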

  16. A marked enhancement in the production of a highly alkaline and thermostable pectinase by Bacillus pumilus dcsr1 in submerged fermentation by using statistical methods.

    PubMed

    Sharma, D C; Satyanarayana, T

    2006-03-01

    The production of a highly alkaline and thermostable pectinase of Bacillus pumilus was optimized in submerged fermentation using a Plackett-Burman design and response surface methodology. Three fermentation variables (C:N ratio, K(2)HPO(4), and pH), which were identified by the Plackett-Burman design as significantly affecting pectinase production, were further optimized using response surface methodology with a central composite design (CCD). Overall 34- and 41-fold increases in enzyme production were achieved in shake flasks and a laboratory fermenter, respectively, by optimizing the variables with these statistical approaches. The enzyme was optimally active at pH 10.5 and 50 degrees C, and selectively degraded only the noncellulosic gummy material of ramie (Boehmeria nivea) fibres, causing 10.96% fibre weight loss; therefore, the enzyme could find application in the fibre processing industry. The use of the enzyme in fibre processing reduces the use of alkali and the associated alkalinization of water bodies. PMID:15936940
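
    The response-surface step mentioned above amounts to fitting a second-order polynomial to responses measured at central-composite-design runs. The sketch below builds a coded face-centred CCD for three factors and fits such a model to synthetic yields, purely as an illustration of the technique; it is not the study's actual design or data.

```python
# Minimal sketch: face-centred central composite design plus a quadratic response surface.
import itertools
import numpy as np

# Coded design: 8 factorial corners, 6 axial points, 1 centre point (15 runs, 3 factors).
corners = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.array([[a if i == j else 0.0 for j in range(3)] for i in range(3) for a in (-1, 1)])
design = np.vstack([corners, axial, np.zeros((1, 3))])

def quad_terms(X):
    """Design matrix for a full quadratic model: 1, xi, xi^2, xi*xj."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])

rng = np.random.default_rng(1)
yield_obs = 40 - 3*design[:, 0]**2 - 2*design[:, 1]**2 - 4*design[:, 2]**2 \
            + 2*design[:, 0] + rng.normal(0, 0.5, len(design))   # synthetic responses

coeffs, *_ = np.linalg.lstsq(quad_terms(design), yield_obs, rcond=None)
print("fitted quadratic coefficients:", np.round(coeffs, 2))
```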

  17. Alkaline assisted thermal oil recovery: Kinetic and displacement studies

    SciTech Connect

    Saneie, S.; Yortsos, Y.C.

    1993-06-01

    This report deals with two major issues of chemical-assisted flooding: the interaction of caustic, one of the proposed additives to steam flood, with the reservoir rock, and the displacement of oil by a chemical flood at elevated temperatures. A mathematical model simulating the kinetics of silica dissolution and hydroxyl ion consumption in a typical alkaline flooding environment is first developed. The model is based on the premise that dissolution occurs via hydrolysis of active sites through the formation of an intermediate complex, which is in equilibrium with the silicic acid in solution. Both static (batch) and dynamic (core flood) processes are simulated to examine the sensitivity of caustic consumption and silica dissolution to process parameters, and to determine rates of propagation of pH values. The model provides a quantitative description of the quartz-alkali interaction in terms of pH, salinity, ion exchange properties, temperature, and contact time, which are of significant importance in the design of soluble silicate flooding processes. The modeling of an adiabatic hot waterflood assisted by the simultaneous injection of a chemical additive is next presented. The model is also applicable to hot alkaline flooding under conditions where adsorption of the generated anionic surfactant is negligible and hydroxide adsorption is Langmuirian. The theory of generalized simple waves (coherence) is used to develop solutions for the temperature, concentration, and oil saturation profiles, as well as the oil recovery curves. It is shown that, for Langmuir adsorption kinetics, the chemical resides in the heated region of the reservoir if its injection concentration is below a critical value, and in the unheated region if its concentration exceeds this critical value. Results for a chemical slug injection in a tertiary recovery process indicate that recovery performance is maximized when the chemical resides in the heated region of the reservoir.

  18. Floods, flood control, and bottomland vegetation

    USGS Publications Warehouse

    Friedman, Jonathan M.; Auble, Gregor T.

    2000-01-01

    Bottomland plant communities are typically dominated by the effects of floods. Floods create the surfaces on which plants become established, transport seeds and nutrients, and remove established plants. Floods provide a moisture subsidy that allows development of bottomland forests in arid regions and produce anoxic soils, which can control bottomland plant distribution in humid regions. Repeated flooding produces a mosaic of patches of different age, sediment texture, and inundation duration; this mosaic fosters high species richness.

  19. Surfactant mixing rules applied to surfactant enhanced alkaline flooding

    SciTech Connect

    Taylor, K.C. )

    1992-01-01

    This paper discusses surfactant mixing rules which have been used to describe crude oil/alkali/surfactant phase behavior, using David Lloydminster crude oil and the surfactant Neodol 25-3S. It was found that at a fixed salinity and alkali concentration, a specific mole fraction of synthetic surfactant to petroleum soap was required to produce optimal phase behavior as the water-to-oil ratio varied. This methodology is useful in understanding the relationship between the variables of water-to-oil ratio and synthetic surfactant concentration in phase behavior systems that produce a petroleum soap.

  20. Detailed evaluation of the West Kiehl alkaline-surfactant-polymer field project and its application to mature Minnelusa waterfloods. Technical progress report for the period of April--June, 1994

    SciTech Connect

    Pitts, M.J.

    1994-09-01

    The objective of this study of the West Kiehl is to (1) quantify the incremental oil produced from the West Kiehl alkaline-surfactant-polymer project by classical engineering and numerical simulation techniques, (2) quantify the effect of chemical slug volume on incremental oil in the two swept areas of the field, (3) determine the economics of the application of the alkaline-surfactant-polymer technology, (4) forecast the results of injecting an alkaline-surfactant-polymer solution into mature waterfloods and polymer floods, and (5) provide the basis for independent operators to book additional oil reserves by using the alkaline-surfactant-polymer technology. This report will document the numerical simulation waterflood, polymer flood, alkaline-surfactant flood, and alkaline-surfactant-polymer flood predictions for the West Kiehl and Prairie Creek South fields.

  1. Visual Sensing for Urban Flood Monitoring.

    PubMed

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-08-14

    With the increasing climatic extremes, the frequency and severity of urban flood events have intensified worldwide. In this study, image-based automated monitoring of flood formation and analyses of water level fluctuation were proposed as value-added intelligent sensing applications to turn a passive monitoring camera into a visual sensor. Combined with the proposed visual sensing method, traditional hydrological monitoring cameras have the ability to sense and analyze the local situation of flood events. This can solve the current problem that image-based flood monitoring heavily relies on continuous manned monitoring. Conventional sensing networks can only offer one-dimensional physical parameters measured by gauge sensors, whereas visual sensors can acquire dynamic image information of monitored sites and provide disaster prevention agencies with actual field information for decision-making to relieve flood hazards. The visual sensing method established in this study provides spatiotemporal information that can be used for automated remote analysis for monitoring urban floods. This paper focuses on the determination of flood formation based on image-processing techniques. The experimental results suggest that the visual sensing approach may be a reliable way for determining the water fluctuation and measuring its elevation and flood intrusion with respect to real-world coordinates. The performance of the proposed method has been confirmed; it has the capability to monitor and analyze the flood status, and therefore, it can serve as an active flood warning system.

  2. Visual Sensing for Urban Flood Monitoring.

    PubMed

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-01-01

    With the increasing climatic extremes, the frequency and severity of urban flood events have intensified worldwide. In this study, image-based automated monitoring of flood formation and analyses of water level fluctuation were proposed as value-added intelligent sensing applications to turn a passive monitoring camera into a visual sensor. Combined with the proposed visual sensing method, traditional hydrological monitoring cameras have the ability to sense and analyze the local situation of flood events. This can solve the current problem that image-based flood monitoring heavily relies on continuous manned monitoring. Conventional sensing networks can only offer one-dimensional physical parameters measured by gauge sensors, whereas visual sensors can acquire dynamic image information of monitored sites and provide disaster prevention agencies with actual field information for decision-making to relieve flood hazards. The visual sensing method established in this study provides spatiotemporal information that can be used for automated remote analysis for monitoring urban floods. This paper focuses on the determination of flood formation based on image-processing techniques. The experimental results suggest that the visual sensing approach may be a reliable way for determining the water fluctuation and measuring its elevation and flood intrusion with respect to real-world coordinates. The performance of the proposed method has been confirmed; it has the capability to monitor and analyze the flood status, and therefore, it can serve as an active flood warning system. PMID:26287201

  3. Visual Sensing for Urban Flood Monitoring

    PubMed Central

    Lo, Shi-Wei; Wu, Jyh-Horng; Lin, Fang-Pang; Hsu, Ching-Han

    2015-01-01

    With the increasing climatic extremes, the frequency and severity of urban flood events have intensified worldwide. In this study, image-based automated monitoring of flood formation and analyses of water level fluctuation were proposed as value-added intelligent sensing applications to turn a passive monitoring camera into a visual sensor. Combined with the proposed visual sensing method, traditional hydrological monitoring cameras have the ability to sense and analyze the local situation of flood events. This can solve the current problem that image-based flood monitoring heavily relies on continuous manned monitoring. Conventional sensing networks can only offer one-dimensional physical parameters measured by gauge sensors, whereas visual sensors can acquire dynamic image information of monitored sites and provide disaster prevention agencies with actual field information for decision-making to relieve flood hazards. The visual sensing method established in this study provides spatiotemporal information that can be used for automated remote analysis for monitoring urban floods. This paper focuses on the determination of flood formation based on image-processing techniques. The experimental results suggest that the visual sensing approach may be a reliable way for determining the water fluctuation and measuring its elevation and flood intrusion with respect to real-world coordinates. The performance of the proposed method has been confirmed; it has the capability to monitor and analyze the flood status, and therefore, it can serve as an active flood warning system. PMID:26287201

  4. Hydrologic versus geomorphic drivers of trends in flood hazard

    NASA Astrophysics Data System (ADS)

    Slater, Louise J.; Singer, Michael Bliss; Kirchner, James W.

    2015-01-01

    Flooding is a major hazard to lives and infrastructure, but trends in flood hazard are poorly understood. The capacity of river channels to convey flood flows is typically assumed to be stationary, so changes in flood frequency are thought to be driven primarily by trends in streamflow. We have developed new methods for separately quantifying how trends in both streamflow and channel capacity have affected flood frequency at gauging sites across the United States. Flood frequency was generally nonstationary, with increasing flood hazard at a statistically significant majority of sites. Changes in flood hazard driven by channel capacity were smaller, but more numerous, than those driven by streamflow. Our results demonstrate that accurately quantifying changes in flood hazard requires accounting separately for trends in both streamflow and channel capacity. They also show that channel capacity trends may have unforeseen consequences for flood management and for estimating flood insurance costs.

  5. Investigation on the coprecipitation of transuranium elements from alkaline solutions by the method of appearing reagents. Study of the effects of waste components on decontamination from Np(IV) and Pu(IV)

    SciTech Connect

    Bessonov, A.A.; Budantseva, N.A.; Gelis, A.V.; Nikonov, M.V.; Shilov, V.P.

    1997-09-01

    The third stage of the study on the homogeneous coprecipitation of neptunium and plutonium from alkaline high-level radioactive waste solutions by the Method of Appearing Reagents has been completed. Alkaline radioactive wastes exist at the U.S. Department of Energy Hanford Site. The recent studies investigated the effects of neptunium chemical reductants, plutonium(IV) concentration, and the presence of bulk tank waste solution components on the decontamination from tetravalent neptunium and plutonium achieved by homogeneous coprecipitation. Data on neptunium reduction to its tetravalent state in alkaline solution of different NaOH concentrations are given. Eleven reductants were tested to find those most suited to remove neptunium, through chemical reduction, from alkaline solution by homogeneous coprecipitation. Hydrazine, VOSO{sub 4}, and Na{sub 2}S{sub 2}O{sub 4} were found to be the most effective reductants. The rates of reduction with these reductants were comparable with the kinetics of carrier formation. Solution decontamination factors of about 400 were attained for 10{sup -6}M neptunium. Coprecipitation of plutonium(IV) with carriers obtained as products of thermal hydrolysis, redox transformations, and catalytic decomposition of [Co(NH{sub 3}){sub 6}]{sup 3+}, [Fe(CN){sub 5}NO]{sup 2-}, Cr(NO{sub 3}){sub 3}, KMnO{sub 4}, and Li{sub 4}UO{sub 2}(O{sub 2}){sub 3} was studied and results are described. Under optimum conditions, a 100-fold decrease of plutonium concentration was possible with each of these reagents.

  6. Flooding and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2011

    2011-01-01

    According to the Federal Emergency Management Agency, flooding is the nation's most common natural disaster. Some floods develop slowly during an extended period of rain or in a warming trend following a heavy snow. Flash floods can occur quickly, without any visible sign of rain. Catastrophic floods are associated with burst dams and levees,…

  7. Alkaline quinone flow battery.

    PubMed

    Lin, Kaixiang; Chen, Qing; Gerhardt, Michael R; Tong, Liuchuan; Kim, Sang Bok; Eisenach, Louise; Valle, Alvaro W; Hardee, David; Gordon, Roy G; Aziz, Michael J; Marshak, Michael P

    2015-09-25

    Storage of photovoltaic and wind electricity in batteries could solve the mismatch problem between the intermittent supply of these renewable resources and variable demand. Flow batteries permit more economical long-duration discharge than solid-electrode batteries by using liquid electrolytes stored outside of the battery. We report an alkaline flow battery based on redox-active organic molecules that are composed entirely of Earth-abundant elements and are nontoxic, nonflammable, and safe for use in residential and commercial environments. The battery operates efficiently with high power density near room temperature. These results demonstrate the stability and performance of redox-active organic molecules in alkaline flow batteries, potentially enabling cost-effective stationary storage of renewable energy. PMID:26404834

  8. Alkaline quinone flow battery.

    PubMed

    Lin, Kaixiang; Chen, Qing; Gerhardt, Michael R; Tong, Liuchuan; Kim, Sang Bok; Eisenach, Louise; Valle, Alvaro W; Hardee, David; Gordon, Roy G; Aziz, Michael J; Marshak, Michael P

    2015-09-25

    Storage of photovoltaic and wind electricity in batteries could solve the mismatch problem between the intermittent supply of these renewable resources and variable demand. Flow batteries permit more economical long-duration discharge than solid-electrode batteries by using liquid electrolytes stored outside of the battery. We report an alkaline flow battery based on redox-active organic molecules that are composed entirely of Earth-abundant elements and are nontoxic, nonflammable, and safe for use in residential and commercial environments. The battery operates efficiently with high power density near room temperature. These results demonstrate the stability and performance of redox-active organic molecules in alkaline flow batteries, potentially enabling cost-effective stationary storage of renewable energy.

  9. A physically-based method for predicting peak discharge of floods caused by failure of natural and constructed earthen dams

    USGS Publications Warehouse

    Walder, J.S.; O'Connor, J. E.; Costa, J.E.; ,

    1997-01-01

    We analyse a simple, physically-based model of breach formation in natural and constructed earthen dams to elucidate the principal factors controlling the flood hydrograph at the breach. Formation of the breach, which is assumed trapezoidal in cross-section, is parameterized by the mean rate of downcutting, k, the value of which is constrained by observations. A dimensionless formulation of the model leads to the prediction that the breach hydrograph depends upon lake shape, the ratio r of breach width to depth, the side slope of the breach, and the parameter η = (V/D³)(k/√(gD)), where V = lake volume, D = lake depth, and g is the acceleration due to gravity. Calculations show that peak discharge Qp depends weakly on lake shape, r, and the side slope, but strongly on η, which is the product of a dimensionless lake volume and a dimensionless erosion rate. Qp(η) takes asymptotically distinct forms depending on whether η ≪ 1 or η ≫ 1. Theoretical predictions agree well with data from dam failures for which k could be reasonably estimated. The analysis provides a rapid and in many cases graphical way to estimate plausible values of Qp at the breach.
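
    The controlling parameter described above is simple to evaluate; the sketch below computes η = (V/D³)(k/√(gD)) for an invented lake and downcutting rate, just to make the dimensionless grouping concrete.

```python
# Minimal sketch of the dimensionless breach parameter from the abstract above.
import math

def breach_parameter(V, D, k, g=9.81):
    """Dimensionless parameter eta = (V/D**3) * (k/sqrt(g*D)) controlling the hydrograph."""
    return (V / D**3) * (k / math.sqrt(g * D))

# Example: a 5e6 m^3 lake, 20 m deep, with the breach downcutting at 10 m/h.
eta = breach_parameter(V=5e6, D=20.0, k=10.0 / 3600.0)
print(f"eta = {eta:.3f}")   # -> eta is about 0.12 for this illustrative lake
```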

  10. A uniform technique for flood frequency analysis.

    USGS Publications Warehouse

    Thomas, W.O., Jr.

    1985-01-01

    This uniform technique consisted of fitting the logarithms of annual peak discharges to a Pearson Type III distribution using the method of moments. The objective was to adopt a consistent approach for the estimation of floodflow frequencies that could be used in computing average annual flood losses for project evaluation. In addition, a consistent approach was needed for defining equitable flood-hazard zones as part of the National Flood Insurance Program. -from ASCE Publications Information
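
    The uniform technique summarized above (logarithms of annual peak discharges fitted to a Pearson Type III distribution by the method of moments, i.e. a log-Pearson Type III fit) can be sketched as below; the annual peak series is synthetic, and the sketch omits refinements such as regional skew weighting used in practice.

```python
# Minimal sketch of a log-Pearson Type III fit by the method of moments.
import numpy as np
from scipy.stats import pearson3, skew

peaks_cfs = np.random.default_rng(2).lognormal(mean=7.0, sigma=0.6, size=40)  # annual peaks

logq = np.log10(peaks_cfs)
m, s, g = logq.mean(), logq.std(ddof=1), skew(logq, bias=False)  # moments of the logs

q100_log = pearson3.ppf(0.99, g, loc=m, scale=s)   # 100-year quantile in log10 space
print(f"100-year flood ~ {10**q100_log:,.0f} cfs")
```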

  11. A physically-based method for predicting peak discharge of floods caused by failure of natural and constructed earthen dams

    USGS Publications Warehouse

    Walder, J.S.

    1997-01-01

    We analyse a simple, physically-based model of breach formation in natural and constructed earthen dams to elucidate the principal factors controlling the flood hydrograph at the breach. Formation of the breach, which is assumed trapezoidal in cross-section, is parameterized by the mean rate of downcutting, k, the value of which is constrained by observations. A dimensionless formulation of the model leads to the prediction that the breach hydrograph depends upon lake shape, the ratio r of breach width to depth, the side slope of the breach, and the parameter η = (V/D³)(k/√(gD)), where V = lake volume, D = lake depth, and g is the acceleration due to gravity. Calculations show that peak discharge Qp depends weakly on lake shape, r, and the side slope, but strongly on η, which is the product of a dimensionless lake volume and a dimensionless erosion rate. Qp(η) takes asymptotically distinct forms depending on whether η ≪ 1 or η ≫ 1. Theoretical predictions agree well with data from dam failures for which k could be reasonably estimated. The analysis provides a rapid and in many cases graphical way to estimate plausible values of Qp at the breach.

  12. COUPLING THE ALKALINE-SURFACTANT-POLYMER TECHNOLOGY AND THE GELATION TECHNOLOGY TO MAXIMIZE OIL PRODUCTION

    SciTech Connect

    Malcolm Pitts; Jie Qi; Dan Wilson

    2004-10-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or more efficient areal sweep efficiency for those with high permeability contrast ''thief zones''. The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more oil than waterflooding from swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or those with thief zones because much of injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. A prior fluid-fluid report discussed interaction of different gel chemical compositions and alkaline-surfactant-polymer solutions. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions as in the fluid-fluid analyses. Aluminum-polyacrylamide, flowing gels are not stable to alkaline-surfactant-polymer solutions of either pH 10.5 or 12.9. Chromium acetate-polyacrylamide flowing and rigid flowing gels are stable to subsequent alkaline-surfactant-polymer solution injection. Rigid flowing chromium acetate-polyacrylamide gels maintained permeability reduction better than flowing chromium acetate-polyacrylamide gels. Silicate-polyacrylamide gels are not stable with subsequent injection of either a pH 10.5 or a 12.9 alkaline-surfactant-polymer solution. Neither aluminum citrate-polyacrylamide nor silicate-polyacrylamide gel systems produced significant incremental oil in linear corefloods. Both flowing and rigid flowing chromium acetate-polyacrylamide gels produced incremental oil with the rigid flowing gel producing the greatest amount. Higher oil recovery could have been due to higher differential pressures across cores. None of the gels tested

  13. Drivers of flood damage on event level

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi

    2016-04-01

    Flood risk is dynamic and influenced by many processes related to hazard, exposure and vulnerability. Flood damage increased significantly over the past decades; however, the resulting overall economic loss per event is an aggregated indicator, and it is difficult to attribute causes to this increasing trend. Much has been learned about damaging processes during floods at the micro-scale, e.g. building level. However, little is known about the main factors determining the amount of flood damage on event level. Thus, we analyse and compare paired flood events, i.e. consecutive, similar damaging floods that occurred in the same area. In analogy to 'Paired catchment studies' - a well-established method in hydrology to understand how changes in land use affect streamflow - we will investigate how and why resulting flood damage in a region differed between the first and second consecutive flood events. One example is the 2002 and 2013 floods in the Elbe and Danube catchments in Germany. The 2002 flood caused the highest economic damage (EUR 11600 million) due to a natural hazard event in Germany. Damage was so high due to extreme flood hazard triggered by extreme precipitation and a high number of resulting dyke breaches. Additionally, exposure hotspots like the city of Dresden at the Elbe river as well as some smaller municipalities at the river Mulde (e.g. Grimma, Eilenburg, Bitterfeld, Dessau) were severely impacted. However, affected parties and authorities learned from the extreme flood in 2002, and many governmental flood risk programs and initiatives were launched. Considerable improvements since 2002 occurred on many levels that deal with flood risk reduction and disaster response, in particular in 1) increased flood prevention by improved spatial planning, 2) an increased number of property-level mitigation measures, 3) more effective early warning and improved coordination of disaster response and 4) a more targeted maintenance of flood defence systems and their

  14. Physicochemical methods for enhancing oil recovery from oil fields

    NASA Astrophysics Data System (ADS)

    Altunina, L. K.; Kuvshinov, V. A.

    2007-10-01

    Physicochemical methods for enhancing oil recovery from oil fields that are developed using water flooding and thermal steam treatment are considered. The results of pilot testing of processes based on these methods carried out at West Siberian and Chinese oil fields are analysed. The attention is focused on the processes that make use of surfactant blends and alkaline buffer solutions and thermotropic gel-forming systems.

  15. Flood Resilient Systems and their Application for Flood Resilient Planning

    NASA Astrophysics Data System (ADS)

    Manojlovic, N.; Gabalda, V.; Antanaskovic, D.; Gershovich, I.; Pasche, E.

    2012-04-01

    Following the paradigm shift in flood management from traditional to more integrated approaches, and considering the uncertainties of future development due to drivers such as climate change, one of the main emerging tasks of flood managers becomes the development of (flood) resilient cities. This can be achieved by the application of non-structural flood resilience measures, summarised in the 4As: assistance, alleviation, awareness and avoidance (FIAC, 2007). As part of this strategy, the key aspect of the development of resilient cities - a resilient built environment - can be reached by efficient application of Flood Resilience Technology (FReT) and its meaningful combination into flood resilient systems (FRS). FRS are defined as [an interconnecting network of FReT which facilitates resilience (including both restorative and adaptive capacity) to flooding, addressing physical and social systems and considering different flood typologies] (SMARTeST, http://www.floodresilience.eu/). Applying the system approach (e.g. Zevenbergen, 2008), FRS can be developed at different scales from the building to the city level. Still, a method to define and systematise different FRS across those scales remains a matter of research. Further, the decision on which resilient system is to be applied for the given conditions and given scale is a complex task, calling for the utilisation of decision support tools. This process of decision-making should follow the steps of flood risk assessment (1) and development of a flood resilience plan (2) (Manojlovic et al, 2009). The key problem in (2) is how to match the input parameters that describe the physical and social systems and the flood typology to the appropriate flood resilient system. Additionally, an open issue is how to integrate the advances in FReT and findings on its efficiency into decision support tools. This paper presents a way to define, systematise and make decisions on FRS at different scales of an urban system developed within the 7th FP Project

  16. Flood-frequency characteristics of Wisconsin streams

    USGS Publications Warehouse

    Walker, John F.; Krug, William R.

    2003-01-01

    Flood-frequency characteristics for 312 gaged sites on Wisconsin streams are presented for recurrence intervals of 2 to 100 years using flood-peak data collected through water year 2000. Equations of the relations between flood-frequency and drainage-basin characteristics were developed by multiple-regression analyses. Flood-frequency characteristics for ungaged sites on unregulated, rural streams can be estimated by use of these equations. The state was divided into five areas with similar physiographic characteristics. The most significant basin characteristics are drainage area, main-channel slope, soil permeability, storage, rainfall intensity, and forest cover. The standard error of prediction for the equation for the 100-year flood discharge ranges from 22 to 44 percent in the state. A graphical method for estimating flood-frequency characteristics of regulated streams was developed from the relation of discharge and drainage area. Graphs for the major regulated streams are presented.
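
    The regression step described above is essentially a multiple regression of log-transformed flood quantiles on log-transformed basin characteristics. The sketch below fits such an equation to synthetic gauged-site data and applies it to a hypothetical ungaged basin, using only two of the listed characteristics for brevity; it is not the report's published equation.

```python
# Minimal sketch of a regional flood-frequency regression on basin characteristics.
import numpy as np

rng = np.random.default_rng(3)
area = rng.uniform(5, 500, 40)          # drainage area, square miles (synthetic sites)
slope = rng.uniform(5, 100, 40)         # main-channel slope, ft/mi
q100 = 80 * area**0.75 * slope**0.3 * rng.lognormal(0, 0.15, 40)   # "observed" Q100, cfs

# Fit log10(Q100) = b0 + b1*log10(area) + b2*log10(slope) by least squares.
X = np.column_stack([np.ones(40), np.log10(area), np.log10(slope)])
b, *_ = np.linalg.lstsq(X, np.log10(q100), rcond=None)

ungaged = np.array([1.0, np.log10(25.0), np.log10(40.0)])          # 25 mi^2, 40 ft/mi
print(f"estimated Q100 at ungaged site ~ {10**(ungaged @ b):,.0f} cfs")
```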

  17. Building A Database Of Flood Extension Maps Using Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Roque, D.; Afonso, N.; Fonseca, A. M.; Heleno, S.

    2013-12-01

    Hydraulic flood models can be used to identify the regions prone to floods. In order to achieve reliable information, the models must be calibrated using data from past floods. In this study, a set of optical and Synthetic Aperture Radar (SAR) images is used to obtain flood extension maps in the lower River Tagus, Portugal, from 1992 to 2012. An object-based approach and thresholding operations are used to extract the flood boundaries. While for optical data two thresholding operations are enough, for SAR images successive thresholding procedures are applied over different data types in order to identify flooded regions with distinct characteristics (smooth water, disturbed water and emerged elements). The proposed method allowed the extraction of flood boundaries for 25 flood dates, with 88% of the flood area correctly detected for both the optical and the SAR data.
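
    A minimal sketch of the SAR thresholding idea (smooth open water backscatters weakly, so low-backscatter pixels are flagged as water) is given below; the backscatter array, the permanent-water mask and the -16 dB threshold are illustrative assumptions and do not reproduce the paper's successive, object-based procedure.

```python
# Minimal sketch: threshold a SAR backscatter scene into a flood mask.
import numpy as np

def flood_mask(sigma0_db, permanent_water, threshold_db=-16.0):
    """Boolean flood mask: low-backscatter pixels that are not permanent water."""
    water = sigma0_db < threshold_db
    return water & ~permanent_water

sigma0_db = np.random.default_rng(4).normal(-10.0, 4.0, size=(500, 500))  # fake SAR scene
permanent_water = np.zeros((500, 500), dtype=bool)                        # fake river mask
mask = flood_mask(sigma0_db, permanent_water)
print(f"flooded fraction of scene: {mask.mean():.1%}")
```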

  18. Alkaline galvanic cell

    SciTech Connect

    Inoue, T.; Maeda, Y.; Momose, K.; Wakahata, T.

    1983-10-04

    An alkaline galvanic cell is disclosed including a container serving for a cathode terminal, a sealing plate in the form of a layered clad plate serving for an anode terminal to be fitted into the container, and an insulating packing provided between the sealing plate and container for sealing the cell upon assembly. The cell is provided with a layer of epoxy adduct polyamide amine having amine valence in the range of 50 to 400 and disposed between the innermost copper layer of the sealing plate arranged to be readily amalgamated and the insulating packing so as to serve as a sealing agent or liquid leakage suppression agent.

  19. Alkaline fuel cells applications

    NASA Astrophysics Data System (ADS)

    Kordesch, Karl; Hacker, Viktor; Gsellmann, Josef; Cifrain, Martin; Faleschini, Gottfried; Enzinger, Peter; Fankhauser, Robert; Ortner, Markus; Muhr, Michael; Aronson, Robert R.

    On the world-wide automobile market technical developments are increasingly determined by the dramatic restriction on emissions as well as the regimentation of fuel consumption by legislation. Therefore there is an increasing chance of a completely new technology breakthrough if it offers new opportunities, meeting the requirements of resource preservation and emission restrictions. Fuel cell technology offers the possibility to excel in today's motive power techniques in terms of environmental compatibility, consumer's profit, costs of maintenance and efficiency. The key question is economy. This will be decided by the costs of fuel cell systems if they are to be used as power generators for future electric vehicles. The alkaline hydrogen-air fuel cell system with circulating KOH electrolyte and low-cost catalysed carbon electrodes could be a promising alternative. Based on the experiences of Kordesch [K. Kordesch, Brennstoffbatterien, Springer, Wien, 1984, ISBN 3-387-81819-7; K. Kordesch, City car with H2-air fuel cell and lead-battery, SAE Paper No. 719015, 6th IECEC, 1971], who operated a city car hybrid vehicle on public roads for 3 years in the early 1970s, improved air electrodes plus new variations of the bipolar stack assembly developed in Graz are investigated. Primary fuel choice will be a major issue until such time as cost-effective, on-board hydrogen storage is developed. Ammonia is an interesting option. The whole system, ammonia dissociator plus alkaline fuel cell (AFC), is characterised by a simple design and high efficiency.

  20. COUPLING THE ALKALINE-SURFACTANT-POLYMER TECHNOLOGY AND THE GELATION TECHNOLOGY TO MAXIMIZE OIL PRODUCTION

    SciTech Connect

    Malcolm Pitts; Jie Qui; Dan Wilson; Phil Dowling

    2004-05-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs, or more efficient areal sweep efficiency for those with high permeability contrast ''thief zones''. The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more oil than waterflooding in the swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or those with thief zones because much of the injected solution bypasses the target pore space containing oil. The objective of this work is to investigate whether combining these two technologies could broaden the applicability of alkaline-surfactant-polymer flooding into these reservoirs. Fluid-fluid interactions of different gel chemical compositions and alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9 have been tested. Aluminum-polyacrylamide gels are not stable to alkaline-surfactant-polymer solutions at any pH. Chromium-polyacrylamide gels with polymer to chromium ion ratios of 25 or greater were stable to alkaline-surfactant-polymer solutions if the solution pH was 10.6 or less. When the polymer to chromium ion ratio was 15 or less, chromium-polyacrylamide gels were stable to alkaline-surfactant-polymer solutions with pH values up to 12.9. Chromium-xanthan gum gels were stable to alkaline-surfactant-polymer solutions with pH values of 12.9 at the polymer to chromium ion ratios tested. Silicate-polyacrylamide, resorcinol-formaldehyde, and sulfomethylated resorcinol-formaldehyde gels were also stable to alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Iron-polyacrylamide gels were immediately destroyed when contacted with any of the alkaline-surfactant-polymer solutions with pH values of 9.2 to 12.9.

  1. Flood frequency analyses with annual and partial flood series

    NASA Astrophysics Data System (ADS)

    Bezak, N.; Brilly, M.; Sraj, M.

    2012-04-01

    The objectives of the study were (1) to analyse the influence of the time scale of the data on the results, (2) to analyse the relations between discharge, volume and duration of flood waves of the Sava River at Litija (Slovenia), (3) to perform flood frequency analyses of peak discharges with annual and partial data series and compare the results, and (4) to explore the influence of the threshold value in the POT method. Calculations and analyses were made for the period 1953-2010. Daily scale data sets (also considering local maxima) were used. The flood frequency analyses were based on annual and partial data series. The differences between daily and hourly time scale data sets were explored: daily and hourly hydrographs were compared, and the differences were found to be adequately small. Daily time series with included maxima were the logical choice because of the length of the daily series and because the hourly series were not continuous due to gauging equipment failures. An important objective of the study was to analyse the relationship between discharge, volume and duration of flood waves. Baseflow was separated from continuous daily discharge measurements on simple and complex hydrographs using a simple graphical method with three points. Many different coefficients, such as the base flow index, were calculated, and different combinations of correlation coefficients of the wave components were examined. Annual maximum series were used to study the relationship between wave components. Flood frequency analyses were made with annual maximum series and partial duration series. The log-normal distribution, Pearson type 3 distribution, log-Pearson type 3 distribution, Gumbel distribution, exponential distribution, GEV distribution and GL distribution were used for the annual maximum series. A simple linear transformation equation was used to determine the design discharge, and the procedure proposed in the Flood Estimation Handbook was used with the GEV and GL distributions
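
    Since the comparison above hinges on how the annual maximum series and the partial duration (peaks-over-threshold, POT) series are built from the daily record, the sketch below constructs both from a synthetic daily discharge series; the threshold choice and the 7-day independence criterion are illustrative assumptions only, not the study's settings.

```python
# Minimal sketch: build an annual-maximum series and a POT series from daily discharge.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
idx = pd.date_range("1953-01-01", "2010-12-31", freq="D")
q = pd.Series(rng.gamma(2.0, 60.0, len(idx)), index=idx)   # synthetic daily discharge

ams = q.groupby(q.index.year).max()                        # annual maximum series

def pot_series(series, threshold, min_gap_days=7):
    """Peaks over threshold, keeping only peaks separated by at least min_gap_days."""
    above = series[series > threshold].sort_values(ascending=False)
    peaks = []
    for t, v in above.items():
        if all(abs((t - p).days) >= min_gap_days for p, _ in peaks):
            peaks.append((t, v))
    return pd.Series(dict(peaks)).sort_index()

pot = pot_series(q, threshold=np.quantile(q, 0.99))
print(len(ams), "annual maxima;", len(pot), "POT peaks")
```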

  2. Enzymatic methods for the determination of pollution in seawater using salt resistant alkaline phosphatase from eggs of the sea urchin Strongylocentrotus intermedius.

    PubMed

    Menzorova, Natalie I; Seitkalieva, Alexandra V; Rasskazov, Valery A

    2014-02-15

    A new salt-resistant alkaline phosphatase from eggs of the sea urchin Strongylocentrotus intermedius (StAP) has been shown to have the unique property of hydrolyzing substrate in seawater without loss of enzymatic activity. The enzyme has a pH optimum at 8.0-8.5. Model experiments showed that copper, zinc, cadmium and lead added to seawater or a standard buffer mixture completely inhibit the enzyme activity at concentrations of 15-150 μg/l. The sensitivity of StAP to the presence in seawater of pesticides, detergents and oil products appears to be considerably lower. Samples of seawater taken from aquatic areas of the Troitsy Bay of the Peter the Great Bay, Japan Sea, have been shown to inhibit the enzyme activity; the same was shown for samples of fresh waters. The phosphatase inhibition assay developed proved to be highly sensitive and technically easy to use, allowing a great number of samples to be tested.

  3. Frequency analyses for recent regional floods in the United States

    USGS Publications Warehouse

    Melcher, Nick B.; Martinez, Patsy G.

    1996-01-01

    During 1993-95, significant floods that resulted in record-high river stages, loss of life, and significant property damage occurred in the United States. The floods were caused by unique global weather patterns that produced large amounts of rain over large areas. Standard methods for flood-frequency analyses may not adequately consider the probability of recurrence of these global weather patterns.

  4. Characterization of remarkable floods in France, a transdisciplinary approach applied on generalized floods of January 1910

    NASA Astrophysics Data System (ADS)

    Boudou, Martin; Lang, Michel; Vinet, Freddy; Coeur, Denis

    2014-05-01

    The January 1910 flood is one of these remarkable floods. This event is foremost known for its aftermath on the Seine basin, where the flood remains the largest recorded in Paris since 1658. However, its impacts also extended to the eastern regions of France (Martin, 2001). To demonstrate the interest of the evaluation grid, we propose a detailed analysis of the 1910 river flood incorporating historical documentation. The approach focuses on eastern France, where the flood remains the highest recorded for several rivers but has often been neglected by scientists in favour of the Paris flood. Through transdisciplinary research based on the evaluation grid method, we describe the January 1910 flood event and define why it can be considered a remarkable flood for these regions.

  5. Silica in alkaline brines

    USGS Publications Warehouse

    Jones, B.F.; Rettig, S.L.; Eugster, H.P.

    1967-01-01

    Analysis of sodium carbonate-bicarbonate brines from closed basins in volcanic terranes of Oregon and Kenya reveals silica contents of up to 2700 parts per million at pH values higher than 10. These high concentrations of SiO2 can be attributed to reaction of waters with silicates and subsequent evaporative concentration accompanied by a rise in pH. Supersaturation with respect to amorphous silica may occur and persist for brines that are out of contact with silicate muds and undersaturated with respect to trona; correlation of SiO2 with concentration of Na and total CO2 supports this interpretation. Addition of more dilute waters to alkaline brines may lower the pH and cause inorganic precipitation of substantial amounts of silica.

  6. Bifunctional alkaline oxygen electrodes

    NASA Technical Reports Server (NTRS)

    Swette, L.; Kackley, N.; Mccatty, S. A.

    1991-01-01

    The authors describe the identification and testing of electrocatalysts and supports for the positive electrode of moderate-temperature, single-unit, rechargeable alkaline fuel cells. Recent work on Na(x)Pt3O4, a potential bifunctional catalyst, is described, as well as the application of novel approaches to the development of more efficient bifunctional electrode structures. The three dual-character electrodes considered here showed similar superior performance; the Pt/RhO2 and Rh/RhO2 electrodes showed slightly better performance than the Pt/IrO2 electrode. It is concluded that Na(x)Pt3O4 continues to be a promising bifunctional oxygen electrode catalyst but requires further investigation and development.

  7. Immunohistochemical detection of disease-associated prion protein in the intestine of cattle naturally affected with bovine spongiform encephalopathy by using an alkaline-based chemical antigen retrieval method.

    PubMed

    Okada, Hiroyuki; Iwamaru, Yoshihumi; Imamura, Morikazu; Masujin, Kentaro; Yokoyama, Takashi; Mohri, Shirou

    2010-11-01

    An alkaline-based chemical antigen retrieval pretreatment step was used to enhance immunolabeling of disease-associated prion protein (PrP(Sc)) in formalin-fixed and paraffin-embedded tissue sections from cattle naturally affected with bovine spongiform encephalopathy (BSE). The modified chemical method used in this study amplified the PrP(Sc) signal by unmasking PrP(Sc) compared with the normal cellular prion protein. In addition, this method reduced nonspecific background immunolabeling that resulted from the destruction of the residual normal cellular form of prion protein, and reduced the treatment time compared with the usual autoclave pretreatment step. Immunolabeled PrP(Sc) was thereby clearly detected in the myenteric plexus of the ileum in naturally occurring BSE cattle.

  8. COUPLING THE ALKALINE-SURFACTANT-POLYMER TECHNOLOGY AND THE GELATION TECHNOLOGY TO MAXIMIZE OIL PRODUCTION

    SciTech Connect

    Malcolm Pitts; Jie Qi; Dan Wilson; David Stewart; Bill Jones

    2005-04-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or more efficient areal sweep efficiency for those with high permeability contrast "thief zones". The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more oil than waterflooding from swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or those with thief zones because much of injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. A prior fluid-fluid report discussed interaction of different gel chemical compositions and alkaline-surfactant-polymer solutions. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions as in the fluid-fluid analyses. Aluminum-polyacrylamide flowing gels are not stable to alkaline-surfactant-polymer solutions of either pH 10.5 or 12.9. Chromium acetate-polyacrylamide flowing and rigid flowing gels are stable to subsequent alkaline-surfactant-polymer solution injection. Rigid flowing chromium acetate-polyacrylamide gels maintained permeability reduction better than flowing chromium acetate-polyacrylamide gels. Silicate-polyacrylamide gels are not stable with subsequent injection of either a pH 10.5 or a 12.9 alkaline-surfactant-polymer solution. Chromium acetate-xanthan gum rigid gels are not stable to subsequent alkaline-surfactant-polymer solution injection. Resorcinol-formaldehyde gels were stable to subsequent alkaline-surfactant-polymer solution injection. When evaluated in a dual core configuration, injected fluid flows into the core with the greatest effective permeability to the injected fluid. The same gel stability trends to subsequent

  9. Severe Flooding in India

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Floods devastated parts of eastern India along the Brahmaputra River in June 2000. In some tributaries of the Brahmaputra, the water reached more than 5 meters (16.5 feet) above flood stage. At least 40 residents died, and the flood waters destroyed a bridge linking the region to the rest of India. High water also threatened endangered rhinos in Kaziranga National Park. Flooded areas are shown in red in the above image. The map was derived from Advanced Very High Resolution Radiometer (AVHRR) data taken on June 15, 2000. For more information on observing floods with satellites, see "Using Satellites to Keep our Head above Water" and the Dartmouth Flood Observatory. Image by the Dartmouth Flood Observatory.

  10. [Salt-alkaline tolerance of sorghum germplasm at seedling stage].

    PubMed

    Gao, Jian-Ming; Xia, Bu-Xian; Yuan, Qing-Hua; Luo, Feng; Han, Yun; Gui, Zhi; Pei, Zhong-You; Sun, Shou-Jun

    2012-05-01

    A sand culture experiment with Hoagland solution plus NaCl and Na2CO3 was conducted to study the responses of sorghum seedlings to salt-alkaline stress. An assessment method for identifying the salt-alkaline tolerance of sorghum at the seedling stage was established, and the salt-alkaline tolerance of 66 sorghum genotypes was evaluated. At salt concentrations of 8.0-12.5 g x L(-1), there was a great difference in salt-alkaline tolerance between the tolerant genotype 'TS-185' and the susceptible 'Tx-622B', suggesting that this range of salt concentrations is appropriate for evaluating the salt-alkaline tolerance of sorghum at the seedling stage. At salt concentrations of 10.0 and 12.5 g x L(-1), there were significant differences in relative livability, relative fresh mass, and relative height among the 66 genotypes, indicating a great difference in salt-alkaline tolerance among these genotypes. The genotype 'Sanchisan' was highly tolerant, 16 genotypes such as 'MN-2735' were tolerant, 32 genotypes such as 'EARLY HONEY' were moderately tolerant, 16 genotypes such as 'Tx-622B' were susceptible, and genotype 'MN-4588' was highly susceptible to salt-alkaline stress. Most of the sorghum genotypes belonging to Sudangrasses possessed a high salt-alkaline tolerance, while the genotypes belonging to maintainer lines showed the opposite tendency. PMID:22919841

  11. Hydrologic versus geomorphic drivers of trends in flood hazard

    NASA Astrophysics Data System (ADS)

    Slater, Louise J.; Bliss Singer, Michael; Kirchner, James W.

    2016-04-01

    Flooding is a major threat to lives and infrastructure, yet trends in flood hazard are poorly understood. The capacity of river channels to convey flood flows is typically assumed to be stationary, so changes in flood frequency are thought to be driven primarily by trends in streamflow. However, changes in channel capacity will also modify flood hazard, even if the flow frequency distribution does not change. We developed new methods for separately quantifying how trends in both streamflow and channel capacity have affected flood frequency at gauging sites across the United States. Using daily discharge records and manual field measurements of channel cross-sectional geometry for USGS gauging stations that have defined flood stages (water levels), we present novel methods for measuring long-term trends in channel capacity of gauged rivers, and for quantifying how they affect overbank flood frequency. We apply these methods to 401 U.S. rivers and detect measurable trends in flood hazard linked to changes in channel capacity and/or the frequency of high flows. Flood frequency is generally nonstationary across these 401 U.S. rivers, with increasing flood hazard at a statistically significant majority of sites. Changes in flood hazard driven by channel capacity are smaller, but more numerous, than those driven by streamflow, with a slight tendency to compensate for streamflow changes. Our results demonstrate that accurately quantifying changes in flood hazard requires accounting separately for trends in both streamflow and channel capacity, or using water levels directly. They also show that channel capacity trends may have unforeseen consequences for flood management and for estimating flood insurance costs. Slater, L. J., M. B. Singer, and J. W. Kirchner (2015), Hydrologic versus geomorphic drivers of trends in flood hazard, Geophys. Res. Lett., 42, 370-376, doi:10.1002/2014GL062482.
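
    As a rough illustration only (not the method of Slater et al. 2015), the sketch below applies a generic Mann-Kendall-style rank-correlation test to check an annual flood series for a monotonic trend; the years and synthetic peak values are hypothetical.

        # Generic check for a monotonic trend in an annual flood series
        # (illustrative; not the channel-capacity analysis of the cited paper).
        import numpy as np
        from scipy.stats import kendalltau

        years = np.arange(1980, 2015)
        peaks = 500 + 2.0 * (years - 1980) + np.random.default_rng(0).normal(0, 40, years.size)

        tau, p_value = kendalltau(years, peaks)   # Mann-Kendall-style rank correlation
        print(f"Kendall tau = {tau:.2f}, p = {p_value:.3f}")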

  12. Social media for disaster response during floods

    NASA Astrophysics Data System (ADS)

    Eilander, D.; van de Vries, C.; Baart, F.; van Swol, R.; Wagemaker, J.; van Loenen, A.

    2015-12-01

    During floods it is difficult to obtain real-time, accurate information about the extent and severity of the hazard. This information is very important for disaster risk reduction management and crisis relief organizations. Currently, real-time information is derived from a few sources such as field reports, traffic cameras, satellite images and aerial images. However, getting a real-time and accurate picture of the situation on the ground remains difficult. At the same time, people affected by natural hazards increasingly share their observations and their needs through digital media. Unlike conventional monitoring systems, Twitter data contains a relatively large number of real-time ground truth observations representing both physical hazard characteristics and hazard impacts. In the city of Jakarta, Indonesia, the intensity of unique flood-related tweets peaked at almost 900 tweets per minute during floods in early 2015. Flood events around the world in 2014/2015 yielded large numbers of flood-related tweets: from the Philippines (85,000) to Pakistan (82,000) to South Korea (50,000) to Detroit (20,000). The challenge is to filter useful content out of this cloud of data, validate these observations and convert them to readily usable information. In Jakarta, flood-related tweets often contain information about the flood depth. In a pilot we showed that this type of information can be used for real-time mapping of the flood extent by plotting these observations on a Digital Elevation Model. Uncertainties in the observations were taken into account by assigning each observation a probability indicating its likelihood of being correct, based on statistical analysis of the total population of tweets. The resulting flood maps proved to be correct for about 75% of the neighborhoods in Jakarta. Further cross-validation of flood-related tweets against (hydro-)meteorological data is likely to improve the skill of the method.
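
    A minimal sketch (not part of the original abstract, and a simplified flat-water-surface approximation rather than the probabilistic mapping described above) of turning point depth observations into a flood extent on a gridded elevation model; the tiny arrays stand in for a real DEM and real observations.

        # Map point water-depth observations onto a DEM: take the median observed
        # water-surface elevation and flag cells below it as flooded (toy example).
        import numpy as np

        dem = np.array([[2.0, 2.2, 2.5],
                        [1.8, 1.9, 2.4],
                        [1.5, 1.7, 2.3]])          # ground elevation (m)

        # observations: (row, col, reported depth in m)
        obs = [(2, 0, 0.6), (1, 1, 0.4), (2, 1, 0.5)]
        water_surface = np.median([dem[r, c] + d for r, c, d in obs])

        flooded = dem < water_surface              # boolean flood-extent mask
        depth = np.where(flooded, water_surface - dem, 0.0)
        print(flooded)
        print(np.round(depth, 2))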

  13. Detailed evaluation of the West Kiehl alkaline-surfactant-polymer field project and its application to mature Minnelusa waterfloods. Technical progress report, July--September, 1994

    SciTech Connect

    Pitts, M.J.

    1994-12-31

    The objectives are to (1) quantify the incremental oil produced from the West Kiehl alkaline-surfactant-polymer project by classical engineering and numerical simulation techniques, (2) quantify the effect of chemical slug volume on incremental oil in the two swept areas of the field, (3) determine the economics of applying the alkaline-surfactant-polymer technology, (4) forecast the results of injecting an alkaline-surfactant-polymer solution into mature waterfloods and polymer floods, and (5) provide the basis for independent operators to book additional oil reserves by using the alkaline-surfactant-polymer technology. A geological study of 72 Minnelusa fields surrounding the West Kiehl is complete. Of the 72 fields, 35 were studied in detail, and from these 35 fields, Prairie Creek South and Simpson Ranch were selected for numerical simulation as representative of Minnelusa waterfloods and polymer floods, respectively. This report documents the numerical simulation waterflood, polymer flood, alkaline-surfactant flood and alkaline-surfactant-polymer flood predictions for the West Kiehl, Simpson Ranch and Prairie Creek South fields.

  14. Alkaline and ultrasound assisted alkaline pretreatment for intensification of delignification process from sustainable raw-material.

    PubMed

    Subhedar, Preeti B; Gogate, Parag R

    2014-01-01

    Alkaline and ultrasound-assisted alkaline pretreatment under mild operating conditions have been investigated for intensification of delignification. The effect of NaOH concentration, biomass loading, temperature, ultrasonic power and duty cycle on the delignification has been studied. Most favorable conditions for only alkaline pretreatment were alkali concentration of 1.75 N, solid loading of 0.8% (w/v), temperature of 353 K and pretreatment time of 6 h and under these conditions, 40.2% delignification was obtained. In case of ultrasound-assisted alkaline approach, most favorable conditions obtained were alkali concentration of 1N, paper loading of 0.5% (w/v), sonication power of 100 W, duty cycle of 80% and pretreatment time of 70 min and the delignification obtained in ultrasound-assisted alkaline approach under these conditions was 80%. The material samples were characterized by FTIR, SEM, XRD and TGA technique. The lignin was recovered from solution by precipitation method and was characterized by FTIR, GPC and TGA technique.

  15. Evaluation of methyl methanesulfonate, 2,6-diaminotoluene and 5-fluorouracil: Part of the Japanese center for the validation of alternative methods (JaCVAM) international validation study of the in vivo rat alkaline comet assay.

    PubMed

    Plappert-Helbig, Ulla; Junker-Walker, Ursula; Martus, Hans-Joerg

    2015-07-01

    As a part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiative international validation study of the in vivo rat alkaline comet assay (comet assay), we examined methyl methanesulfonate, 2,6-diaminotoluene, and 5-fluorouracil under coded test conditions. Rats were treated orally with the maximum tolerated dose (MTD) and two additional descending doses of the respective compounds. In the MMS treated groups liver and stomach showed significantly elevated DNA damage at each dose level and a significant dose-response relationship. 2,6-diaminotoluene induced significantly elevated DNA damage in the liver at each dose and a statistically significant dose-response relationship whereas no DNA damage was obtained in the stomach. 5-fluorouracil did not induce DNA damage in either liver or stomach.

  16. Alkaline battery, separator therefore

    NASA Technical Reports Server (NTRS)

    Schmidt, George F. (Inventor)

    1980-01-01

    An improved battery separator for alkaline battery cells has low resistance to electrolyte ion transfer and high resistance to electrode ion transfer. The separator is formed by applying an improved coating to an electrolyte absorber. The absorber, preferably, is a flexible, fibrous, and porous substrate that is resistant to strong alkali and oxidation. The coating composition includes an admixture of a polymeric binder, a hydrolyzable polymeric ester and inert fillers. The coating composition is substantially free of reactive fillers and plasticizers commonly employed as porosity promoting agents in separator coatings. When the separator is immersed in electrolyte, the polymeric ester of the film coating reacts with the electrolyte forming a salt and an alcohol. The alcohol goes into solution with the electrolyte while the salt imbibes electrolyte into the coating composition. When the salt is formed, it expands the polymeric chains of the binder to provide a film coating substantially permeable to electrolyte ion transfer but relatively impermeable to electrode ion transfer during use.

  17. Fabrication of free-standing NiCo2O4 nanoarrays via a facile modified hydrothermal synthesis method and their applications for lithium ion batteries and high-rate alkaline batteries

    SciTech Connect

    Zheng, Qingyun; Zhang, Xiangyang; Shen, Youming

    2015-03-15

    Graphical abstract: Hydrothermally synthesized NiCo2O4 nanoflake arrays exhibit a porous structure and high capacity as well as good cycling life for lithium ion batteries and alkaline batteries. Highlights: Self-supported NiCo2O4 nanoflake arrays are prepared by a hydrothermal method; the NiCo2O4 nanoflake arrays show high capacity and good cycling life; the porous nanoflake array structure is favorable for fast ion/electron transfer. Abstract: Self-supported NiCo2O4 nanoflake arrays on nickel foam are prepared by a facile hydrothermal method. The obtained NiCo2O4 nanoflakes with thicknesses of ~25 nm grow vertically to the nickel foam substrate and form an interconnected porous network with pore diameters of 50–500 nm. As an anode material for LIBs, the NiCo2O4 nanoflake arrays show a high initial coulombic efficiency of 76%, as well as good cycling stability with a capacity of 880 mAh g⁻¹ at 0.5 A g⁻¹, and 523 mAh g⁻¹ at 1.5 A g⁻¹ after 50 cycles. As the cathode of alkaline batteries, a high capacity of 95 mAh g⁻¹ is achieved at 2 A g⁻¹ and 94% retention is maintained after 10,000 cycles. The superior electrochemical performance is mainly due to the unique nanoflake array structure with large surface area and shorter diffusion length for mass and charge transport.

  18. Quantifying peak discharges for historical floods

    USGS Publications Warehouse

    Cook, J.L.

    1987-01-01

    It is usually advantageous to use information regarding historical floods, if available, to define the flood-frequency relation for a stream. Peak stages can sometimes be determined for outstanding floods that occurred many years ago before systematic gaging of streams began. In the United States, this information is usually not available for more than 100-200 years, but in countries with long cultural histories, such as China, historical flood data are available at some sites as far back as 2,000 years or more. It is important in flood studies to be able to assign a maximum discharge rate and an associated error range to the historical flood. This paper describes the significant characteristics and uncertainties of four commonly used methods for estimating the peak discharge of a flood. These methods are: (1) rating curve (stage-discharge relation) extension; (2) slope conveyance; (3) slope area; and (4) step backwater. Logarithmic extensions of rating curves are based on theoretical plotting techniques that result in straight line extensions provided that channel shape and roughness do not change significantly. The slope-conveyance and slope-area methods are based on the Manning equation, which requires specific data on channel size, shape and roughness, as well as the water-surface slope for one or more cross-sections in a relatively straight reach of channel. The slope-conveyance method is used primarily for shaping and extending rating curves, whereas the slope-area method is used for specific floods. The step-backwater method, also based on the Manning equation, requires more cross-section data than the slope-area method, but has a water-surface profile convergence characteristic that negates the need for known or estimated water-surface slope. Uncertainties in calculating peak discharge for historical floods may be quite large. Various investigations have shown that errors in calculating peak discharges by the slope-area method under ideal conditions for
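
    A minimal sketch (not part of the original report) of the Manning-equation calculation that underlies the slope-conveyance and slope-area methods described above; the channel geometry, roughness and slope values are hypothetical.

        # Peak discharge from surveyed channel geometry via the Manning equation
        # Q = (1/n) * A * R^(2/3) * S^(1/2)   (SI units; values are hypothetical)
        def manning_discharge(area_m2, wetted_perimeter_m, slope, n):
            """Return discharge (m3/s) for one cross-section."""
            hydraulic_radius = area_m2 / wetted_perimeter_m
            return (1.0 / n) * area_m2 * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

        # Example cross-section reconstructed from high-water marks
        q_peak = manning_discharge(area_m2=240.0, wetted_perimeter_m=95.0,
                                   slope=0.0012, n=0.035)
        print(f"estimated peak discharge: {q_peak:.0f} m3/s")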

  19. The fate of added alkalinity in model scenarios of ocean alkalinization

    NASA Astrophysics Data System (ADS)

    Ferrer González, Miriam; Ilyina, Tatiana

    2014-05-01

    The deliberate large-scale manipulation of the Earth's climate (geo-engineering) has been proposed to mitigate climate change and ocean acidification. Whilst the mitigation potential of these technologies may sound promising, they may also pose many environmental risks. Our research aims at exploring the ocean-based carbon dioxide removal method of alkalinity enhancement. We study its mitigation potential to reduce atmospheric CO2 and counteract the consequences of ocean acidification, as well as its risks and unintended consequences. In order to tackle these questions, different scenarios are implemented in the state-of-the-art Earth system model of the Max Planck Institute for Meteorology. The model configuration is based on the 5th phase of the Coupled Model Intercomparison Project following a high-CO2 future climate change scenario, RCP8.5 (in which radiative forcing rises to 8.5 W/m² in 2100). Two different scenarios are performed in which alkalinity is artificially added globally uniformly in the upper ocean. In the first scenario, alkalinity is increased as a pulse by doubling natural values in the first 12 meters. In the second scenario we add alkalinity into the same ocean layer such that the atmospheric CO2 concentration is reduced from RCP8.5 to RCP4.5 levels (with a radiative forcing of 4.5 W/m² in 2100). We investigate the fate of the added alkalinity in these two scenarios and compare the differences in alkalinity budgets. In order to increase oceanic CO2 uptake from the atmosphere, enhanced alkalinity has to stay in the upper ocean. Once the alkalinity is added, it will become part of the biogeochemical cycles and it will be distributed with the ocean currents. Therefore, we are particularly interested in the residence time of the added alkalinity at the surface. Variations in CO2 partial pressure, seawater pH and saturation state of carbonate minerals produced in the implemented scenarios will be presented. Collateral changes in ocean biogeochemistry and

  20. Alkaline chemistry of transuranium elements and technetium and the treatment of alkaline radioactive wastes

    SciTech Connect

    Delegard, C.H.; Peretrukhin, V.F.; Shilov, V.P.; Pikaev, A.K.

    1995-05-01

    The goal of this survey is to generalize the known data on the fundamental physical-chemical properties of TRUs and Tc and on methods for their isolation, and to provide recommendations that will be useful for partitioning them from alkaline high-level wastes.

  1. Floods in mountain environments: A synthesis

    NASA Astrophysics Data System (ADS)

    Stoffel, Markus; Wyżga, Bartłomiej; Marston, Richard A.

    2016-11-01

    of mountain rivers, but morphological changes of rivers can also affect hydrological properties of floods and the associated risk for societies. This paper provides a review of research in the field of floods in mountain environments and puts the papers of this special issue dedicated to the same topic into context. It also provides insight into innovative studies, methods, or emerging aspects of the relations between environmental changes, geomorphic processes, and the occurrence of floods in mountain rivers.

  2. Floods in the Skagit River basin, Washington

    USGS Publications Warehouse

    Stewart, James E.; Bodhaine, George Lawrence

    1961-01-01

    According to Indian tradition, floods of unusually great magnitude harassed the Skagit River basin about 1815 and 1856. The heights of these floods were not recorded at the time; so they are called historical floods. Since the arrival of white men about 1863, a number of large and damaging floods have been witnessed and recorded. Data concerning and verifying the early floods, including those of 1815 and 1856, were collected prior to 1923 by James E. Stewart. He talked with many of the early settlers in the valley who had listened to Indians tell about the terrible floods. Some of these settlers had referenced the maximum stages of floods they had witnessed by cutting notches at or measuring to high-water marks on trees. In order to verify flood stages Stewart spent many weeks finding and levelling to high-water marks such as drift deposits, sand layers in coves, and silt in the bark of certain types of trees. Gaging stations have been in operation at various locations on the Skagit River and its tributaries since 1909, so recorded peak stages are available at certain sites for floods occurring since that date. All peak discharge data available for both historical and recorded floods have been listed in this report. The types of floods as to winter and summer, the duration of peaks, and the effect of reservoirs are discussed. In 1899 Sterling Dam was constructed at the head of Gages Slough near Sedro Woolley. This was the beginning of major diking in the lower reaches of the Skagit River. Maps included in the report show the location of most of the dike failures that have occurred during the last 73 years and the area probably inundated by major floods. The damage resulting from certain floods is briefly discussed. The report is concluded with a brief discussion of the U.S. Geological Survey method of computing flood-frequency curves as applied to the Skagit River basin. The treatment of single-station records and a means of combining these records for expressing

  3. Alabama district flood plan

    USGS Publications Warehouse

    Hedgecock, T. Scott; Pearman, J. Leroy; Stricklin, Victor E.

    2002-01-01

    The purpose of this flood plan is to outline and record advance planning for flood emergencies, so that all personnel will know the general plan and have a ready reference for necessary information. This will ensure that during any flood event, regardless of the extent or magnitude, the resources of the District can be mobilized into a maximum data collection operation with a minimum of effort.

  4. Alkaline earth cation extraction from acid solution

    DOEpatents

    Dietz, Mark; Horwitz, E. Philip

    2003-01-01

    An extractant medium for extracting alkaline earth cations from an aqueous acidic sample solution is described, as are a method and apparatus for using the same. The separation medium is free of diluent, free-flowing and particulate, and comprises a crown ether that is a 4,4'(5')[C4-C8-alkylcyclohexano]18-crown-6 dispersed on an inert substrate material.

  5. Net alkalinity and net acidity 2: Practical considerations

    USGS Publications Warehouse

    Kirby, C.S.; Cravotta, C.A.

    2005-01-01

    The pH, alkalinity, and acidity of mine drainage and associated waters can be misinterpreted because of the chemical instability of samples and possible misunderstandings of standard analytical method results. Synthetic and field samples of mine drainage having various initial pH values and concentrations of dissolved metals and alkalinity were titrated by several methods, and the results were compared to alkalinity and acidity calculated based on dissolved solutes. The pH, alkalinity, and acidity were compared between fresh, unoxidized and aged, oxidized samples. Data for Pennsylvania coal mine drainage indicates that the pH of fresh samples was predominantly acidic (pH 2.5-4) or near neutral (pH 6-7); approximately 25% of the samples had pH values between 5 and 6. Following oxidation, no samples had pH values between 5 and 6. The Standard Method Alkalinity titration is constrained to yield values >0. Most calculated and measured alkalinities for samples with positive alkalinities were in close agreement. However, for low-pH samples, the calculated alkalinity can be negative due to negative contributions by dissolved metals that may oxidize and hydrolyze. The Standard Method hot peroxide treatment titration for acidity determination (Hot Acidity) accurately indicates the potential for pH to decrease to acidic values after complete degassing of CO2 and oxidation of Fe and Mn, and it indicates either the excess alkalinity or that required for neutralization of the sample. The Hot Acidity directly measures net acidity (= -net alkalinity). Samples that had near-neutral pH after oxidation had negative Hot Acidity; samples that had pH < 6.3 after oxidation had positive Hot Acidity. Samples with similar pH values before oxidation had dissimilar Hot Acidities due to variations in their alkalinities and dissolved Fe, Mn, and Al concentrations. Hot Acidity was approximately equal to net acidity calculated based on initial pH and dissolved concentrations of Fe, Mn, and Al minus the
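
    A minimal sketch (not part of the original paper) of the kind of calculated net acidity described above, assuming dissolved Fe is treated as ferrous, concentrations are in mg/L, and alkalinity is expressed as mg/L CaCO3; the equation form follows the conventional charge-based acidity calculation and the sample values are illustrative.

        # Calculated acidity (mg/L as CaCO3) from pH and dissolved Fe, Mn, Al,
        # assuming Fe is ferrous; net acidity = calculated acidity - alkalinity.
        def calculated_acidity(ph, fe_mg_l, mn_mg_l, al_mg_l):
            meq_per_l = (2 * fe_mg_l / 55.85 +      # Fe(II)
                         2 * mn_mg_l / 54.94 +      # Mn(II)
                         3 * al_mg_l / 26.98 +      # Al(III)
                         1000 * 10 ** (-ph))        # H+ (mmol/L == meq/L)
            return 50.0 * meq_per_l                 # 50 mg CaCO3 per meq

        net_acidity = calculated_acidity(ph=3.1, fe_mg_l=40.0, mn_mg_l=5.0,
                                         al_mg_l=12.0) - 0.0   # alkalinity ~ 0 at low pH
        print(f"net acidity: {net_acidity:.0f} mg/L as CaCO3")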

  6. Past and present floods in South Moravia

    NASA Astrophysics Data System (ADS)

    Brázdil, Rudolf; Chromá, Kateřina; Řezníčková, Ladislava; Valášek, Hubert; Dolák, Lukáš; Stachoň, Zdeněk; Soukalová, Eva; Dobrovolný, Petr

    2015-04-01

    Floods represent the most destructive natural phenomena in the Czech Republic, often causing great material damage or loss of human life. Systematic instrumental measurements of water levels in Moravia (the eastern part of the Czech Republic) started mainly in the 1880s-1890s, while for discharges it was in the 1910s-1920s. Different types of documentary evidence allow our knowledge of floods to be extended prior to the instrumental period. The paper presents long-term flood chronologies for four South Moravian rivers: the Jihlava, the Svratka, the Dyje and the Morava. Different documentary data are used to extract floods, among which taxation records are of particular importance. Since the mid-17th century, damage to property and land (fields, meadows, pastures or gardens) entitled farmers and landowners to request a tax relief. Related documents of this administrative process, kept mainly in the Moravian Land Archives in Brno, allow detailed information about floods and their impacts to be obtained. Selection of floods in the instrumental period is based on calculation of the N-year return period of peak water levels and/or peak discharges for selected hydrological stations on the corresponding rivers (with a return period of two years and more). The final flood chronologies combine floods derived from both documentary data and hydrological measurements. Despite greater inter-decadal variability, periods of higher flood frequency are c. 1821-1850 and 1921-1950 for all four rivers, and also 1891-1900 for the Dyje and Morava rivers. Flood frequency fluctuations are further compared with other Central European rivers. Uncertainties in the created chronologies with respect to the data and methods used for compilation of the long-term series, and to anthropogenic changes in the river catchments, are discussed. The study is a part of the research project "Hydrometeorological extremes in Southern Moravia derived from documentary evidence" supported by the Grant Agency of the Czech Republic, reg. no. 13-19831S.

  7. Analysis of the flood extent extraction model and the natural flood influencing factors: A GIS-based and remote sensing analysis

    NASA Astrophysics Data System (ADS)

    Lawal, D. U.; Matori, A. N.; Yusuf, K. W.; Hashim, A. M.; Balogun, A. L.

    2014-02-01

    Serious floods hit the State of Perlis in 2005, 2010 and 2011. Perlis is situated in the northern part of Peninsular Malaysia. The floods caused great damage to property and human lives. Various methods have been used in an attempt to reduce flood risk and damage as far as possible by identifying flood-vulnerable zones. The purpose of this paper is to develop a flood extent extraction model based on the Minimum Distance Algorithm and to overlay it with the natural flood-influencing factors considered herein in order to examine the effect of each factor on flood generation. A GIS spatial database was created from a geological map, a SPOT satellite image, and the topographical map. An attribute database was also created from field investigations and reports of historical flood areas in the study area. The results show a strong correlation between the flood extent extraction model and the flood factors.
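
    A minimal sketch (not part of the original paper) of a minimum-distance-to-means classification of image pixels into flooded and non-flooded classes, the general idea behind a Minimum Distance Algorithm; the band values and class means are hypothetical.

        # Minimum distance classification: assign each pixel to the class whose
        # mean spectral vector is nearest (hypothetical class means and pixels).
        import numpy as np

        class_means = {                       # mean band values from training areas
            "water":     np.array([30.0, 20.0, 10.0]),
            "non-water": np.array([80.0, 90.0, 70.0]),
        }

        def classify(pixels):
            """pixels: (n, bands) array -> list of class labels."""
            labels = []
            for p in pixels:
                label = min(class_means, key=lambda c: np.linalg.norm(p - class_means[c]))
                labels.append(label)
            return labels

        pixels = np.array([[28.0, 22.0, 12.0], [75.0, 88.0, 65.0]])
        print(classify(pixels))   # -> ['water', 'non-water']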

  8. Root responses to flooding.

    PubMed

    Sauter, Margret

    2013-06-01

    Soil water-logging and submergence pose a severe threat to plants. Roots are most prone to flooding and the first to suffer from oxygen shortage. Roots are vital for plant function, however, and maintenance of a functional root system upon flooding is essential. Flooding-resistant plants possess a number of adaptations that help maintain oxygen supply to the root. Plants are also capable of initiating organogenesis to replace their original root system with adventitious roots if oxygen supply becomes impossible. This review summarizes current findings on root development and de novo root genesis in response to flooding.

  9. RASOR flood modelling

    NASA Astrophysics Data System (ADS)

    Beckers, Joost; Buckman, Lora; Bachmann, Daniel; Visser, Martijn; Tollenaar, Daniel; Vatvani, Deepak; Kramer, Nienke; Goorden, Neeltje

    2015-04-01

    Decision making in disaster management requires fast access to reliable and relevant information. We believe that online information and services will become increasingly important in disaster management. Within the EU FP7 project RASOR (Rapid Risk Assessment and Spatialisation of Risk) an online platform is being developed for rapid multi-hazard risk analyses to support disaster management anywhere in the world. The platform will provide access to a plethora of GIS data that are relevant to risk assessment. It will also enable the user to run numerical flood models to simulate historical and newly defined flooding scenarios. The results of these models are maps of flood extent, flood depths and flow velocities. The RASOR platform will enable users to overlay historical event flood maps with observations and Earth Observation (EO) imagery to fill in gaps and assess the accuracy of the flood models. New flooding scenarios can be defined by the user and simulated to investigate the potential impact of future floods. A series of flood models have been developed within RASOR for selected case study areas around the globe that are subject to very different flood hazards:
    • The city of Bandung in Indonesia, which is prone to fluvial flooding induced by heavy rainfall. The flood hazard is exacerbated by land subsidence.
    • The port of Cilacap on the south coast of Java, subject to tsunami hazard from submarine earthquakes in the Sunda trench.
    • The area south of the city of Rotterdam in the Netherlands, prone to coastal and/or riverine flooding.
    • The island of Santorini in Greece, which is subject to tsunamis induced by landslides.
    Flood models have been developed for each of these case studies using mostly EO data, augmented by local data where necessary. Particular use was made of the new TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement) product from the German Aerospace centre (DLR) and EADS Astrium. The presentation will describe the flood models and the

  10. Runoff models and flood frequency statistics for design flood estimation in Austria - Do they tell a consistent story?

    NASA Astrophysics Data System (ADS)

    Rogger, M.; Kohl, B.; Pirkl, H.; Viglione, A.; Komma, J.; Kirnbauer, R.; Merz, R.; Blöschl, G.

    2012-08-01

    Design floods for a given location at a stream can be estimated by a number of approaches including flood frequency statistics and the design storm method. If applied to the same catchment the two methods often yield quite different results. The aim of this paper is to contribute to understanding the reasons for these differences. A case study is performed for 10 alpine catchments in Tyrol, Austria, where the 100-year floods are estimated by (a) flood frequency statistics and (b) an event based runoff model. To identify the sources of the differences between the two methods, the 100-year floods are also estimated by (c) Monte Carlo simulations using a continuous runoff model. The results show that, in most catchments, the event based model gives larger flood estimates than flood frequency statistics. The reasons for the differences depend on the catchment characteristics and the different rainfall inputs that were applied. For catchments with a high storage capacity the Monte Carlo simulations indicate a step change in the flood frequency curve when a storage threshold is exceeded, which is not captured by flood frequency statistics. Flood frequency statistics therefore tends to underestimate the floods in these catchments. For catchments with a low storage capacity or significant surface runoff, no step change occurs, but in three catchments the design storms used were larger than those read from the IDF (intensity duration frequency) curve, leading to an overestimation of the design floods. Finally, the correct representation of flood-dominating runoff components was also shown to influence design flood results. Geologic information on the catchments was essential for identifying the reasons for the mismatch of the flood estimates.

  11. Technique for estimating depth of floods in Tennessee

    USGS Publications Warehouse

    Gamble, C.R.

    1983-01-01

    Estimates of flood depths are needed for design of roadways across flood plains and for other types of construction along streams. Equations for estimating flood depths in Tennessee were derived using data for 150 gaging stations. The equations are based on drainage basin size and can be used to estimate depths of the 10-year and 100-year floods for four hydrologic areas. A method also was developed for estimating depth of floods having recurrence intervals between 10 and 100 years. Standard errors range from 22 to 30 percent for the 10-year depth equations and from 23 to 30 percent for the 100-year depth equations. (USGS)
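
    A minimal sketch (not part of the original report) of the general form such drainage-area-based depth equations take; the coefficients and exponents below are hypothetical placeholders, not the published Tennessee values.

        # Flood depth as a power function of drainage area, d = a * A**b
        # (coefficients are hypothetical placeholders, not the published values).
        def flood_depth_ft(drainage_area_sq_mi, a, b):
            return a * drainage_area_sq_mi ** b

        # Hypothetical coefficients for the 10-year and 100-year floods
        d10 = flood_depth_ft(150.0, a=3.2, b=0.25)
        d100 = flood_depth_ft(150.0, a=4.1, b=0.26)
        print(f"10-yr depth ~ {d10:.1f} ft, 100-yr depth ~ {d100:.1f} ft")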

  12. Methods for estimating magnitude and frequency of floods in Arizona, developed with unregulated and rural peak-flow data through water year 2010

    USGS Publications Warehouse

    Paretti, Nicholas V.; Kennedy, Jeffrey R.; Turney, Lovina A.; Veilleux, Andrea G.

    2014-01-01

    The regional regression equations were integrated into the U.S. Geological Survey’s StreamStats program. The StreamStats program is a national map-based web application that allows the public to easily access published flood frequency and basin characteristic statistics. The interactive web application allows a user to select a point within a watershed (gaged or ungaged) and retrieve flood-frequency estimates derived from the current regional regression equations and geographic information system data within the selected basin. StreamStats provides users with an efficient and accurate means for retrieving the most up to date flood frequency and basin characteristic data. StreamStats is intended to provide consistent statistics, minimize user error, and reduce the need for large datasets and costly geographic information system software.

  13. Uncertainty in surface water flood risk modelling

    NASA Astrophysics Data System (ADS)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    uniform flow formulae (Manning's Equation) to direct flow over the model domain, sourcing water from the channel or sea so as to provide a detailed representation of river and coastal flood risk. The initial development step was to include spatially-distributed rainfall as a new source term within the model domain. This required optimisation to improve computational efficiency, given the ubiquity of 'wet' cells early on in the simulation. Collaboration with UK water companies has provided detailed drainage information, and from this a simplified representation of the drainage system has been included in the model via the inclusion of sinks and sources of water from the drainage network. This approach has clear advantages relative to a fully coupled method both in terms of reduced input data requirements and computational overhead. Further, given the difficulties associated with obtaining drainage information over large areas, tests were conducted to evaluate uncertainties associated with excluding drainage information and the impact that this has upon flood model predictions. This information can be used, for example, to inform insurance underwriting strategies and loss estimation as well as for emergency response and planning purposes. The Flowroute surface-water flood risk platform enables efficient mapping of areas sensitive to flooding from high-intensity rainfall events due to topography and drainage infrastructure. As such, the technology has widespread potential for use as a risk mapping tool by the UK Environment Agency, European Member States, water authorities, local governments and the insurance industry. Keywords: Surface water flooding, Model Uncertainty, Insurance Underwriting, Flood inundation modelling, Risk mapping.

  14. Feedback on flood risk management

    NASA Astrophysics Data System (ADS)

    Moreau, K.; Roumagnac, A.

    2009-09-01

    are responsible for transmitting meteorological alerts and for rescue actions. By combining geo-information from space technology, communication, meteorology, hydraulics and hydrology, Predict Services assists local communities in their mission of protecting and informing citizens about flood problems, and helps companies limit and avoid operating losses caused by floods. The initiative, developed by BRL and EADS Astrium in association with Météo France, has been deployed and is operating in cities in the south of France, notably Montpellier, and also at the catchment scale (BRL is a regional development company, a public-private partnership controlled by the local governments of the Languedoc-Roussillon Region). The initiative has to be coordinated with state services to ensure continuity and coherence of information. It is developed in dialogue with state services such as Météo France, the Ministry of the Interior, the Ministry of Ecology and Sustainable Development, the Regional Directorate for the Environment (DIREN), the central service for hydrometeorology and flood forecasting support (SCHAPI) and the flood forecasting services (SPC). It has been operating successfully for 5 years with 300 southern cities from the south-west to the south-east of France, notably Montpellier and Sommières, well known for its flood problems on the Vidourle river, where no human losses occurred and where the economic impacts were minimized. Currently deployed in cities in the south of France, this initiative is to be extended nationally and soon internationally. Thanks to the efficiency of its method, the initiative is also being developed in partnership with insurance companies involved in prevention actions. After more than 100 events observed and analysed in the south of France, the experience gained has allowed Predict Services to better anticipate and manage such phenomena. The presentation will expose

  15. Real-time flood extent maps based on social media

    NASA Astrophysics Data System (ADS)

    Eilander, Dirk; van Loenen, Arnejan; Roskam, Ruud; Wagemaker, Jurjen

    2015-04-01

    During a flood event it is often difficult to get accurate information about the flood extent and the people affected. This information is very important for disaster risk reduction management and crisis relief organizations. In the post-flood phase, information about the flood extent is needed for damage estimation and calibrating hydrodynamic models. Currently, flood extent maps are derived from a few sources such as satellite images, aerial images and post-flooding flood marks. However, getting accurate real-time or maximum flood extent maps remains difficult. With the rise of social media, we now have a new source of information with large numbers of observations. In the city of Jakarta, Indonesia, the intensity of unique flood-related tweets peaked at 8 tweets per second during floods in early 2014. A fair number of these tweets also contain observations of water depth and location. Our hypothesis is that, based on the large numbers of tweets, it is possible to generate real-time flood extent maps. In this study we use tweets from the city of Jakarta, Indonesia, to generate these flood extent maps. The data-mining procedure looks for tweets with a mention of 'banjir', the Bahasa Indonesia word for flood. It then removes modified and retweeted messages in order to keep unique tweets only. Since tweets are not always sent directly from the location of observation, the geotag in the tweets is unreliable. We therefore extract location information using mentions of names of neighborhoods and points of interest. Finally, where encountered, a mention of a length measure is extracted as water depth. Tweets containing a location reference and a water level are considered to be flood observations. The strength of this method is that it can easily be extended to other regions and languages. Based on the intensity of tweets in Jakarta during a flood event we can provide a rough estimate of the flood extent. To provide more accurate flood extent
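
    A minimal sketch (not part of the original abstract) of the kind of keyword filtering, retweet removal and depth extraction described above; the regular expression, example tweets and neighborhood list are hypothetical.

        # Filter flood tweets, drop retweets, and extract a water-depth mention
        # (hypothetical examples; real tweets and gazetteers would be needed).
        import re

        tweets = [
            "Banjir 50 cm di Kemang, hati-hati",
            "RT @user: Banjir 50 cm di Kemang, hati-hati",
            "Macet parah di Sudirman",
        ]
        neighborhoods = ["Kemang", "Sudirman"]
        depth_re = re.compile(r"(\d+(?:[.,]\d+)?)\s*(cm|m)\b", re.IGNORECASE)

        observations = []
        for t in tweets:
            if "banjir" not in t.lower() or t.startswith("RT "):
                continue                                  # keep unique flood tweets only
            place = next((n for n in neighborhoods if n.lower() in t.lower()), None)
            m = depth_re.search(t)
            if place and m:
                value, unit = float(m.group(1).replace(",", ".")), m.group(2).lower()
                depth_m = value / 100 if unit == "cm" else value
                observations.append({"place": place, "depth_m": depth_m})

        print(observations)   # -> [{'place': 'Kemang', 'depth_m': 0.5}]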

  16. Improving Gas Flooding Efficiency

    SciTech Connect

    Reid Grigg; Robert Svec; Zheng Zeng; Alexander Mikhalin; Yi Lin; Guoqiang Yin; Solomon Ampir; Rashid Kassim

    2008-03-31

    This study focuses on laboratory studies with related analytical and numerical models, as well as work with operators for field tests, to enhance our understanding of and capabilities for more efficient enhanced oil recovery (EOR). Much of the work has been performed at reservoir conditions. This includes a bubble chamber and several core flood apparatus developed or modified to measure interfacial tension (IFT), critical micelle concentration (CMC), foam durability, surfactant sorption at reservoir conditions, and pressure and temperature effects on foam systems. Carbon dioxide and N2 systems have been considered, under both miscible and immiscible conditions. The injection of CO2 into brine-saturated sandstone and carbonate core results in brine saturation reduction in the range of 62 to 82% brine in the tests presented in this paper. In each test, over 90% of the reduction occurred with less than 0.5 PV of CO2 injected, with very little additional brine production after 0.5 PV of CO2 injected. Adsorption of all considered surfactants is a significant problem. Most of the effect is reversible, but the amount required for foaming is large in terms of volume and cost for all considered surfactants. Some foams increase resistance to a value beyond what is practical in the reservoir. Sandstone, limestone, and dolomite core samples were tested. Dissolution of reservoir rock and/or cement, especially carbonates, under the acid conditions of CO2 injection is a potential problem in CO2 injection into geological formations. Another potential change in reservoir injectivity and productivity will be the precipitation of dissolved carbonates as the brine flows and pressure decreases. The results of this report provide methods for determining surfactant sorption and can be used to aid in the determination of surfactant requirements for reservoir use in a CO2-foam flood for mobility control. It also provides data to be used to determine rock permeability

  17. Probability plotting position formulas for flood records with historical information

    USGS Publications Warehouse

    Hirsch, R.M.

    1987-01-01

    For purposes of evaluating fitted flood frequency distributions or for purposes of estimating distributions directly from plots of flood peaks versus exceedance probabilities (either by subjective or objective techniques), one needs a probability plotting position formula which can be applied to all of the flood data available: both systematic and historic floods. Some of the formulas in use are simply extensions of existing formulas (such as Hazen and Weibull) used on systematic flood records. New plotting position formulas proposed by Hirsch and Stedinger (1986) and in this paper are based on a recognition that the flood data arises from partially censored sampling of the flood record. The theoretical appropriateness, bias in probability and bias in discharge of the various plotting position formulas are considered. The methods are compared in terms of their effects on flood frequency estimation when an objective curve-fitting method of estimation is employed. Consideration is also given to the correct interpretation of the historical record length and the effect of incorrectly assuming that record length equals the time since the first known historical flood. This assumption is employed in many flood frequency studies and may result in a substantial bias in estimated design flood magnitudes. ?? 1987.
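
    A minimal sketch (not part of the original paper) of the conventional Weibull and Hazen plotting positions for a purely systematic record; the historically weighted formulas of Hirsch and Stedinger are not reproduced here, and the peak values are illustrative.

        # Weibull and Hazen plotting positions for ranked annual peaks
        # (systematic record only; historical-flood weighting not shown).
        import numpy as np

        peaks = np.array([410., 530., 365., 620., 480., 700., 455.])   # m3/s, illustrative
        order = np.argsort(peaks)[::-1]                                 # rank largest first
        n = peaks.size
        for rank, idx in enumerate(order, start=1):
            weibull = rank / (n + 1)          # exceedance probability, Weibull
            hazen = (rank - 0.5) / n          # exceedance probability, Hazen
            print(f"Q={peaks[idx]:6.0f}  Weibull={weibull:.3f}  Hazen={hazen:.3f}")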

  18. The chemistry and element fluxes of the July 2011 Múlakvísl and Kaldakvísl glacial floods, Iceland

    NASA Astrophysics Data System (ADS)

    Galeczka, Iwona; Oelkers, Eric H.; Gislason, Sigurdur R.

    2014-05-01

    Glacial floods, called 'jökulhlaups', are common in Iceland and are of interest to geologists for several reasons. Firstly, the origin of the heat source - a subglacial volcanic eruption and/or subglacial geothermal activity - determines the potential environmental impact of the floods. For example, if the heat is sourced by a volcanic eruption, acid gas input might lead to acidic flood waters and toxic metal release from the host rock. In contrast, geothermal heat melts the ice slowly, allowing long-term fluid-rock interaction to neutralize the flood waters, limiting their toxicity. The chemical composition of the flood waters is often the only indicator of the flood-triggering mechanism in volcanic and geothermal areas. As such, river water chemistry monitoring might be an effective method to predict an upcoming volcanic eruption. Secondly, glacial floods may play an important role in the global cycle of elements. Due to the high discharge during these events, flood waters can transport large amounts of particulate material. This particulate material has a large surface area, making it especially reactive once it arrives in estuaries. Slow dissolution of particulate material releases micro- and macronutrients which could enhance primary productivity along the coast and in lakes. In July 2011, two ~2000 m3/s glacial floods from the Icelandic Mýrdalsjökull and Vatnajökull glaciers emerged into the Múlakvísl and Kaldakvísl rivers, respectively. Water samples collected during both floods had neutral to alkaline pH and conductivity from 100 to 900 μS/cm. The total dissolved inorganic carbon (DIC), present mostly as HCO3-, was ~9 mmol/kg during the flood peak in the Múlakvísl river but stabilized at around 1 mmol/kg; a similar trend was observed in the Kaldakvísl river. Concentrations of most dissolved elements in the flood waters were comparable to those commonly observed in these rivers. The concentration of suspended particulate material, however, increased dramatically

  19. Controlled synthesis of La1−xSrxCrO3 nanoparticles by hydrothermal method with nonionic surfactant and their ORR activity in alkaline medium

    SciTech Connect

    Choi, Bo Hyun; Park, Shin-Ae; Park, Bong Kyu; Chun, Ho Hwan; Kim, Yong-Tae

    2013-10-15

    Graphical abstract: We demonstrate that Sr-doped LaCrO3 nanoparticles were successfully prepared by the hydrothermal synthesis method using the nonionic surfactant Triton X-100, and that La1−xSrxCrO3 is applicable to oxygen reduction reaction (ORR) electrocatalysis in an alkaline medium. Compared with nanoparticles synthesized by the coprecipitation method, they showed enhanced ORR activity. Highlights: Sr-doped LaCrO3 nanoparticles were successfully prepared by the hydrothermal method using the nonionic surfactant; homogeneously shaped and sized Sr-doped LaCrO3 nanoparticles were readily obtained; compared with nanoparticles synthesized by the coprecipitation method, they showed enhanced ORR activity; the main origin was revealed to be the decreased particle size due to the nonionic surfactant. Abstract: Sr-doped LaCrO3 nanoparticles were prepared by the hydrothermal method with the nonionic surfactant Triton X-100, followed by heat treatment at 1000 °C for 10 h. The obtained perovskite nanoparticles had a smaller particle size (about 100 nm) and a more uniform size distribution than those synthesized by the conventional coprecipitation method. On the other hand, materials simulation identified that the electronic structure change caused by Sr doping was negligible, because the initially unfilled eg-band was not affected by the p-type doping. Finally, the perovskite nanoparticles synthesized by the hydrothermal method showed much higher ORR activity, by over 200% at 0.8 V vs. RHE, than those synthesized by the coprecipitation method.

  20. Coupling the Alkaline-Surfactant-Polymer Technology and The Gelation Technology to Maximize Oil Production

    SciTech Connect

    Malcolm Pitts; Jie Qi; Dan Wilson; David Stewart; Bill Jones

    2005-10-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or more efficient areal sweep efficiency for those with high permeability contrast "thief zones". The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more oil than waterflooding from swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or those with thief zones because much of injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. A prior fluid-fluid report discussed interaction of different gel chemical compositions and alkaline-surfactant-polymer solutions. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions as in the fluid-fluid analyses. Aluminum-polyacrylamide flowing gels are not stable to alkaline-surfactant-polymer solutions of either pH 10.5 or 12.9. Chromium acetate-polyacrylamide flowing and rigid flowing gels are stable to subsequent alkaline-surfactant-polymer solution injection. Rigid flowing chromium acetate-polyacrylamide gels maintained permeability reduction better than flowing chromium acetate-polyacrylamide gels. Silicate-polyacrylamide gels are not stable with subsequent injection of either a pH 10.5 or a 12.9 alkaline-surfactant-polymer solution. Chromium acetate-xanthan gum rigid gels are not stable to subsequent alkaline-surfactant-polymer solution injection. Resorcinol-formaldehyde gels were stable to subsequent alkaline-surfactant-polymer solution injection. When evaluated in a dual core configuration, injected fluid flows into the core with the greatest effective permeability to the injected fluid. The same gel stability trends to subsequent

  1. The Spokane flood controversy

    NASA Technical Reports Server (NTRS)

    Baker, V. R.

    1978-01-01

    An enormous plexus of proglacial channels eroded into the loess and basalt of the Columbia Plateau, eastern Washington, is studied. This channeled scabland contained erosional and depositional features that were unique among fluvial phenomena. Documentation of the field relationships of the region explains the landforms as the product of a relatively brief but enormous flood, the so-called Spokane flood.

  2. Glacier generated floods

    USGS Publications Warehouse

    Walder, J.S.; Fountain, A.G.; ,

    1997-01-01

    Destructive floods result from drainage of glacier-dammed lakes and sudden release of water stored within glaciers. There is a good basis - both empirical and theoretical - for predicting the magnitude of floods from ice-dammed lakes, although some aspects of flood initiation need to be better understood. In contrast, an understanding of floods resulting from release of internally stored water remains elusive, owing to lack of knowledge of how and where water is stored and to inadequate understanding of the complex physics of the temporally and spatially variable subglacial drainage system.

  3. Continental Flood Basalts

    NASA Astrophysics Data System (ADS)

    Continental flood basalts have been receiving considerable scientific attention lately. Recent publications have focused on several particular flood-basalt provinces (Brito-Arctic, Karoo, Paraná, Deccan, and Columbia Plateau), and much attention has been given to the proposed connection between flood-basalt volcanism, bolide impacts, and mass extinctions. The editor of Continental Flood Basalts, J. D. Macdougall, conceived the book to assemble in a single volume, from a vast and scattered literature, an overview of each major post-Cambrian flood-basalt province. Continental Flood Basalts has 10 chapters; nine treat individual flood-basalt provinces, and a summary chapter compares and contrasts continental flood basalts and mid-oceanic ridge basalts. Specifically, the chapters address the Columbia River basalt, the northwest United States including the Columbia River basalt, the Ethiopian Province, the North Atlantic Tertiary Province, the Deccan Traps, the Paraná Basin, the Karoo Province, the Siberian Platform, and Cenozoic basaltic rocks in eastern China. Each chapter is written by one or more individuals with an extensive background in the province.

  4. Discover Floods Educators Guide

    ERIC Educational Resources Information Center

    Project WET Foundation, 2009

    2009-01-01

    Now available as a Download! This valuable resource helps educators teach students about both the risks and benefits of flooding through a series of engaging, hands-on activities. Acknowledging the different roles that floods play in both natural and urban communities, the book helps young people gain a global understanding of this common--and…

  5. Urban flood risk warning under rapid urbanization.

    PubMed

    Chen, Yangbo; Zhou, Haolan; Zhang, Hui; Du, Guoming; Zhou, Jinhui

    2015-05-01

    In the past decades, China has undergone rapid urbanization: the nation's urban population reached 50% in 2000 and is still steadily increasing. Rapid urbanization in China has an adverse impact on urban hydrological processes, particularly by increasing urban flood risk and causing serious urban flooding losses. Urban flooding also increases health risks, such as outbreaks of epidemic disease, pollution of drinking water, and damage to the living environment. In highly urbanized areas, non-engineering measures such as flood risk warning are the main way of managing urban flood risk. Because no mature method or pilot study for urban flood risk warning exists, the purpose of this study is to propose an urban flood risk warning method for rapidly urbanizing Chinese cities. The paper first presents an urban flood forecasting model that produces an urban flood inundation index used for flood risk warning. The model has five modules. The drainage-system and grid-dividing module divides the city terrain into drainage systems according to its first-order river system and delineates each drainage system into grids based on the spatial structure using an irregular gridding technique; the precipitation assimilation module assimilates precipitation for every grid as the model input, using either radar-based precipitation estimates or values interpolated from rain gauges; the runoff production module classifies the surface into pervious and impervious areas and employs different methods to calculate the runoff from each; and the surface runoff routing module routes the surface runoff and determines the inundation index. Routing on surface grids is calculated with a two-dimensional shallow-water unsteady flow algorithm, and routing in land channels and special channels with a one-dimensional unsteady flow algorithm. The paper then proposes the urban flood risk warning method that is called DPSIR model based
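
    The runoff production step described above splits each grid cell into pervious and impervious fractions and treats them differently. The minimal sketch below illustrates that idea only; the simple infiltration-capacity loss used for the pervious fraction and all parameter values are assumptions for illustration, not the model's actual formulation.

        import numpy as np

        def grid_runoff(rain_mm, impervious_frac, infiltration_capacity_mm):
            """Per-cell runoff depth (mm): all rain on the impervious fraction runs
            off, while only rain in excess of a simple infiltration capacity runs
            off from the pervious fraction."""
            rain = np.asarray(rain_mm, dtype=float)
            imp = np.clip(np.asarray(impervious_frac, dtype=float), 0.0, 1.0)
            f = np.asarray(infiltration_capacity_mm, dtype=float)
            runoff_impervious = rain * imp                           # no losses on paved area
            runoff_pervious = np.maximum(rain - f, 0.0) * (1 - imp)  # excess over infiltration
            return runoff_impervious + runoff_pervious

        # Three hypothetical cells with increasing imperviousness
        print(grid_runoff([30, 30, 30], [0.1, 0.5, 0.9], [20, 20, 20]))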

  6. Magnitude and frequency of floods in Alabama

    USGS Publications Warehouse

    Olin, D.A.

    1985-01-01

    Methods are presented to estimate flood magnitude for selected recurrence intervals for urban and rural streams with drainage areas from 1 to 22,000 square miles. Seven hydrologic areas were delineated and regression equations were developed for six areas. Hydrologic data could not be regionalized for the seventh area. Drainage area was the only independent variable used in the equations for five hydrologic areas. Drainage area and a storage factor were used in the equations for the other area. One hydrologic area, located in the central part of the State, has flood runoffs two to four times greater than the other areas. It is recommended that the rural equations be used for estimates of flood magnitudes for both urban and rural streams in the hydrologic area. Rivers with drainage areas greater than 1,500 square miles could not be regionalized. Estimating methods for these rivers are shown graphically. Maximum flood magnitudes versus drainage area also are presented. (USGS)
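
    Regional regression equations of this kind usually take a power-law form in drainage area; the sketch below shows the generic computation. The coefficients are placeholders for illustration, not the published Alabama values.

        def regional_flood_estimate(drainage_area_sq_mi, a, b):
            """Generic power-law regional regression Q_T = a * A**b, where A is
            drainage area (square miles) and a, b are regression coefficients for
            a chosen recurrence interval. Coefficients here are placeholders,
            not the published Alabama values."""
            return a * drainage_area_sq_mi ** b

        # Hypothetical coefficients for a 100-year flood, for illustration only
        print(regional_flood_estimate(250.0, a=890.0, b=0.72))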

  8. SIMULATION OF FLOOD HYDROGRAPHS FOR GEORGIA STREAMS.

    USGS Publications Warehouse

    Inman, E.J.; Armbruster, J.T.

    1986-01-01

    Flood hydrographs are needed for the design of many highway drainage structures and embankments. A method for simulating these flood hydrographs at urban and rural ungauged sites in Georgia is presented. The O'Donnell method was used to compute unit hydrographs from 355 flood events from 80 stations. An average unit hydrograph and an average lag time were computed for each station. These average unit hydrographs were transformed to unit hydrographs having durations of one-fourth, one-third, one-half, and three-fourths lag time and then reduced to dimensionless terms by dividing the time by lag time and the discharge by peak discharge. Hydrographs were simulated for these 355 flood events and their widths were compared with the widths of the observed hydrographs at 50 and 75 percent of peak flow. For simulating hydrographs at sites larger than 500 mi², the U.S. Geological Survey computer model CONROUT can be used.
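
    The dimensionless transformation described above, dividing the time base by the lag time and the ordinates by the peak discharge, is shown in the short sketch below; the unit-hydrograph ordinates and lag time are hypothetical values, not data from the Georgia stations.

        import numpy as np

        def dimensionless_hydrograph(t_hr, q_cfs, lag_hr):
            """Convert a unit hydrograph to dimensionless form by dividing
            time by the basin lag time and discharge by the peak discharge."""
            t = np.asarray(t_hr, dtype=float)
            q = np.asarray(q_cfs, dtype=float)
            return t / lag_hr, q / q.max()

        # Hypothetical 1-hour unit hydrograph ordinates and lag time
        t_ratio, q_ratio = dimensionless_hydrograph(
            t_hr=[0, 1, 2, 3, 4, 5, 6],
            q_cfs=[0, 120, 300, 210, 120, 50, 0],
            lag_hr=2.5)
        print(list(zip(t_ratio.round(2), q_ratio.round(2))))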

  9. Assessment of flood risk in Tokyo metropolitan area

    NASA Astrophysics Data System (ADS)

    Hirano, J.; Dairaku, K.

    2013-12-01

    Flooding is one of the most significant natural hazards in Japan, and the Tokyo metropolitan area has been affected by several large flood disasters. Investigating potential flood risk in the Tokyo metropolitan area is therefore important for developing adaptation strategies for future climate change. We aim to develop a method for evaluating flood risk in the Tokyo metropolitan area by considering the effects of historical land use and land cover change, socio-economic change, and climatic change. The Ministry of Land, Infrastructure, Transport and Tourism in Japan publishes 'Statistics of Flood', which contains data on flood causes, the number of damaged houses, the area of wetted surface, and the total amount of damage for each flood at the small-municipality level. Using these data, we estimated damage from inundation inside the levees for each prefecture based on a statistical method. On the basis of the estimated damage, we developed flood risk curves for the Tokyo metropolitan area, representing the relationship between damage and the exceedance probability of flooding for the period 1976-2008 for each prefecture. Based on these flood risk curves, we evaluated potential flood risk in the Tokyo metropolitan area and examined the causes of regional differences in flood risk. The analysis identified high flood risk in Tokyo and Saitama prefectures, whereas flood risk was relatively low in Ibaraki and Chiba prefectures. These regional differences can be attributed to the spatial distribution of total property value and the ratio of damaged housing units in each prefecture. We also evaluated the influence of climate change on potential flood risk by considering variations in precipitation amount and intensity in the Tokyo metropolitan area. The results show that our methodology can evaluate the potential impact of precipitation change on flood risk with high accuracy.
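
    A flood risk curve of the kind described (damage plotted against exceedance probability) can be constructed by ranking annual damage totals and assigning empirical plotting-position probabilities. The sketch below uses the Weibull plotting position and synthetic damage figures; it is an illustration of the general construction, not the statistical method used in the study.

        import numpy as np

        def risk_curve(annual_damages):
            """Empirical flood risk curve: sort annual damage totals in descending
            order and assign Weibull plotting-position exceedance probabilities
            p = rank / (n + 1)."""
            d = np.sort(np.asarray(annual_damages, dtype=float))[::-1]
            p_exceed = np.arange(1, d.size + 1) / (d.size + 1)
            return p_exceed, d

        # Synthetic annual damage series (arbitrary currency units)
        p, dmg = risk_curve([1.2, 0.4, 8.5, 0.0, 2.1, 0.7, 15.3, 3.3])
        for pi, di in zip(p, dmg):
            print(f"P(exceedance) = {pi:.2f}  damage = {di}")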

  10. Re-assessing the flood risk in Scotland.

    PubMed

    Black, Andrew R; Burns, John C

    2002-07-22

    This paper presents a review of changes in flood risk estimation on Scottish rivers resulting from re-analysis of flood records or from the application of new methods. The review arises at a time when flood damages have received recent prominence through the occurrence of a number of extreme floods in Scotland, and when the possible impacts of climate change on flood risk are receiving considerable attention. An analysis of the nine longest available peaks-over-threshold (POT) flood series for Scottish rivers reveals that, for thresholds yielding two events per year on average, annual POT frequencies on western rivers have increased in the 1980s/1990s to maximum recorded values, while in the east, values were highest in the 1950s/1960s. These results support the results of flood modelling work based on rainfall and temperature records from the 1870s, which indicate that, in western catchments, annual POT frequencies in the 1980s/1990s are unprecedented. No general trends in flood magnitude series were found, but an unexpected cluster of extreme floods is identified as having occurred since 1988, resulting in eight of Scotland's 16 largest gauged rivers producing their maximum recorded flows since then. These shifts are related to recent increases in the dominance of westerly airflows, share similarities with the results of climate change modelling, and collectively point to increases in flood risk in many parts of Scotland. The paper also reviews advances in flood risk estimation arising from the publication of the UK Flood Estimation Handbook, developments in the collection and use of historic flood estimation and the production of maps of 100-year flood areal extent. Finally the challenges in flood risk estimation posed by climate change are examined, particularly in relation to the assumption of stationarity.
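
    The peaks-over-threshold analysis described above hinges on selecting a threshold that yields a target average number of events per year and then counting exceedances per year. The sketch below shows that selection and counting on synthetic data; declustering of dependent peaks, which a real analysis would require, is omitted for brevity.

        import numpy as np

        def pot_series(years, peaks, events_per_year=2.0):
            """Choose a threshold so the record yields about `events_per_year`
            peaks per year on average, then count threshold exceedances per year.
            Declustering (independence of successive peaks) is not handled here."""
            years = np.asarray(years)
            peaks = np.asarray(peaks, dtype=float)
            n_years = int(years.max() - years.min() + 1)
            n_events = min(int(round(events_per_year * n_years)), peaks.size)
            threshold = np.sort(peaks)[::-1][n_events - 1]   # n_events-th largest peak
            counts = {int(y): int(((years == y) & (peaks >= threshold)).sum())
                      for y in np.unique(years)}
            return threshold, counts

        # Synthetic record: five candidate peaks per year, 1980-2009
        rng = np.random.default_rng(0)
        yrs = np.repeat(np.arange(1980, 2010), 5)
        flows = rng.gamma(2.0, 50.0, size=yrs.size)
        thr, annual_counts = pot_series(yrs, flows)
        print(round(thr, 1), annual_counts[1980], annual_counts[2009])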

  11. Flood inundation map library, Fort Kent, Maine

    USGS Publications Warehouse

    Lombard, Pamela J.

    2012-01-01

    Severe flooding occurred in northern Maine from April 28 to May 1, 2008, and damage was extensive in the town of Fort Kent (Lombard, 2010). Aroostook County was declared a Federal disaster area on May 9, 2008. The extent of flooding on both the Fish and St. John Rivers during this event showed that the current Federal Emergency Management Agency (FEMA) Flood Insurance Study (FIS) and Flood Insurance Rate Map (FIRM) (Federal Emergency Management Agency, 1979) were out of date. The U.S. Geological Survey (USGS) conducted a study to develop a flood inundation map library showing the areas and depths for a range of flood stages from bankfull to the flood of record for Fort Kent to complement an updated FIS (Federal Emergency Management Agency, in press). Hydrologic analyses that support the maps include computer models with and without the levee and with various depths of backwater on the Fish River. This fact sheet describes the methods used to develop the maps and describes how the maps can be accessed.

  12. An initial abstraction and constant loss model, and methods for estimating unit hydrographs, peak streamflows, and flood volumes for urban basins in Missouri

    USGS Publications Warehouse

    Huizinga, Richard J.

    2014-01-01

    The rainfall-runoff pairs from the storm-specific GUH analysis were further analyzed against various basin and rainfall characteristics to develop equations to estimate the peak streamflow and flood volume based on a quantity of rainfall on the basin.
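
    An initial abstraction and constant-loss model of the kind named in the title removes an initial depth of rainfall and then a constant loss rate from each interval. A minimal sketch follows, with hypothetical parameter values rather than the Missouri calibrations.

        import numpy as np

        def excess_rainfall(rain_mm, initial_abstraction_mm, constant_loss_mm):
            """Initial abstraction and constant-loss model: no runoff is produced
            until the initial abstraction is satisfied, after which a constant
            loss is subtracted from the rainfall in each interval."""
            remaining_ia = initial_abstraction_mm
            excess = []
            for r in rain_mm:
                absorbed = min(r, remaining_ia)
                remaining_ia -= absorbed
                excess.append(max(r - absorbed - constant_loss_mm, 0.0))
            return np.array(excess)

        # Hypothetical hyetograph (mm per interval) and loss parameters
        print(excess_rainfall([5, 10, 20, 15, 5],
                              initial_abstraction_mm=12.0, constant_loss_mm=3.0))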

  13. Flood management: prediction of microbial contamination in large-scale floods in urban environments.

    PubMed

    Taylor, Jonathon; Lai, Ka Man; Davies, Mike; Clifton, David; Ridley, Ian; Biddulph, Phillip

    2011-07-01

    With a changing climate and increased urbanisation, the occurrence and the impact of flooding is expected to increase significantly. Floods can bring pathogens into homes and cause lingering damp and microbial growth in buildings, with the level of growth and persistence dependent on the volume and chemical and biological content of the flood water, the properties of the contaminating microbes, and the surrounding environmental conditions, including the restoration time and methods, the heat and moisture transport properties of the envelope design, and the ability of the construction material to sustain the microbial growth. The public health risk will depend on the interaction of these complex processes and the vulnerability and susceptibility of occupants in the affected areas. After the 2007 floods in the UK, the Pitt review noted that there is lack of relevant scientific evidence and consistency with regard to the management and treatment of flooded homes, which not only put the local population at risk but also caused unnecessary delays in the restoration effort. Understanding the drying behaviour of flooded buildings in the UK building stock under different scenarios, and the ability of microbial contaminants to grow, persist, and produce toxins within these buildings can help inform recovery efforts. To contribute to future flood management, this paper proposes the use of building simulations and biological models to predict the risk of microbial contamination in typical UK buildings. We review the state of the art with regard to biological contamination following flooding, relevant building simulation, simulation-linked microbial modelling, and current practical considerations in flood remediation. Using the city of London as an example, a methodology is proposed that uses GIS as a platform to integrate drying models and microbial risk models with the local building stock and flood models. The integrated tool will help local governments, health authorities

  14. Feedback on flood risk management

    NASA Astrophysics Data System (ADS)

    Moreau, K.; Roumagnac, A.

    2009-09-01

    space technology, communication, meteorology, hydraulics and hydrology, Predict-services assists local communities in their mission of protecting and informing citizens about flood problems and helps companies limit or eliminate operating losses caused by floods. The initiative, developed by BRL and EADS Astrium in association with Météo France, has been deployed and is operating in cities of southern France, notably Montpellier, and also at the catchment scale (BRL is a regional development company, a public-private partnership controlled by the local governments of the Languedoc-Roussillon region). The initiative has to be coordinated with state services to ensure continuity and coherence of information. It is being developed in dialogue with state services such as Météo France, the Ministry of the Interior, the Ministry of Ecology and Sustainable Development, the Regional Directorate of the Environment (DIREN), the central service for hydrometeorology and flood forecasting support (SCHAPI), and the flood forecasting services (SPC). It has been operating successfully for 5 years in 300 southern cities from the south-west to the south-east of France, notably Montpellier and Sommières, known for its flood problems on the Vidourle river, where no lives were lost and economic impacts were minimized. Currently deployed in cities of southern France, the initiative is to be extended nationally and soon internationally. Thanks to the efficiency of its method, it is also being developed in partnership with insurance companies involved in prevention actions. The presentation will report feedback from this initiative and lessons learned.

  15. Coupling the Alkaline-Surfactant-Polymer Technology and The Gelation Technology to Maximize Oil Production

    SciTech Connect

    Malcolm Pitts; Jie Qi; Dan Wilson; Phil Dowling; David Stewart; Bill Jones

    2005-12-01

    Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or reservoirs with different sand lenses with high permeability contrast. The field proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more crude oil than waterflooding from swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or reservoirs with high permeability contrast zones because much of injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. Fluid-fluid interaction with different gel chemical compositions and alkaline-surfactant-polymer solution with pH values ranging from 9.2 to 12.9 have been tested. Aluminum-polyacrylamide gels are not stable to alkaline-surfactant-polymer solutions at any pH. Chromium-polyacrylamide gels with polymer to chromium ion ratios of 25 or greater were stable to alkaline-surfactant-polymer solutions if solution pH was 10.6 or less. When the polymer to chromium ion was 15 or less, chromium-polyacrylamide gels were stable to alkaline-surfactant-polymer solutions with pH values up to 12.9. Chromium-xanthan gum gels were stable to alkaline-surfactant-polymer solutions with pH values of 12.9 at the polymer to chromium ion ratios tested. Silicate-polyacrylamide, resorcinol-formaldehyde, and sulfomethylated resorcinol-formaldehyde gels were also stable to alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Iron-polyacrylamide gels were immediately destroyed when contacted with any of the alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions as in

  16. Necessity of Flood Early Warning Systems in India

    NASA Astrophysics Data System (ADS)

    Kurian, C.; Natesan, U.; Durga Rao, K. H. V.

    2014-12-01

    India is one of the most flood-prone countries in the world. The National Flood Commission has reported that 400,000 km² is prone to floods, constituting about twelve percent of the country's geographical area. Despite the recurrence of floods, India still does not have a proper flood warning system, which can probably be attributed to the lack of personnel trained in advanced techniques. Frequent flood hazards result in damage to livelihoods, infrastructure, and public utilities. India has the potential to develop an early warning system, since it is one of the few countries where satellite-based inputs are regularly used for monitoring and mitigating floods. However, modelling flood extent is difficult due to the complexity of hydraulic and hydrologic processes during flood events, and numerical simulation methods have been reported to represent these processes effectively. Progress in computational resources, data collection, and the development of several numerical codes has enhanced the use of hydrodynamic modelling approaches to simulate flood extent in floodplains. In this study, an attempt is made to simulate flooding in one of the sub-basins of the Godavari River in India using hydrodynamic modelling techniques. The modelling environment includes MIKE software, which simulates the water depth at every grid cell of the study area. The runoff contribution from the catchment was calculated with the NAM (Nedbør-Afstrømnings) rainfall-runoff model. With the hydrodynamic modelling approach, the accuracy of discharge and water level computations is improved compared with conventional methods. The results of the study are promising for developing effective flood management plans in the basin. Similar studies could be taken up in other flood-prone areas of the country for continuous modernisation of flood forecasting techniques and early warning systems and for strengthening decision support systems, which will help policy makers in developing management

  17. Distillation Column Flooding Predictor

    SciTech Connect

    George E. Dzyacky

    2010-11-23

    The Flooding Predictor™ is a patented advanced control technology proven in research at the Separations Research Program, University of Texas at Austin, to increase distillation column throughput by over 6%, while also increasing energy efficiency by 10%. The research was conducted under a U. S. Department of Energy Cooperative Agreement awarded to George Dzyacky of 2ndpoint, LLC. The Flooding Predictor™ works by detecting the incipient flood point and controlling the column closer to its actual hydraulic limit than historical practices have allowed. Further, the technology uses existing column instrumentation, meaning no additional refining infrastructure is required. Refiners often push distillation columns to maximize throughput, improve separation, or simply to achieve day-to-day optimization. Attempting to achieve such operating objectives is a tricky undertaking that can result in flooding. Operators and advanced control strategies alike rely on the conventional use of delta-pressure instrumentation to approximate the column’s approach to flood. But column delta-pressure is more an inference of the column’s approach to flood than it is an actual measurement of it. As a consequence, delta pressure limits are established conservatively in order to operate in a regime where the column is never expected to flood. As a result, there is much “left on the table” when operating in such a regime, i.e. the capacity difference between controlling the column to an upper delta-pressure limit and controlling it to the actual hydraulic limit. The Flooding Predictor™, an innovative pattern recognition technology, controls columns at their actual hydraulic limit, which research shows leads to a throughput increase of over 6%. Controlling closer to the hydraulic limit also permits operation in a sweet spot of increased energy-efficiency. In this region of increased column loading, the Flooding Predictor is able to exploit the benefits of higher liquid

  18. 78 FR 52954 - Final Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... SECURITY Federal Emergency Management Agency Final Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Final notice. SUMMARY: Flood hazard determinations, which may include additions or modifications of Base Flood Elevations (BFEs), base flood depths, Special Flood Hazard...

  19. 78 FR 52953 - Final Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... SECURITY Federal Emergency Management Agency Final Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Final Notice. SUMMARY: Flood hazard determinations, which may include additions or modifications of Base Flood Elevations (BFEs), base flood depths, Special Flood Hazard...

  20. 78 FR 5820 - Final Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... SECURITY Federal Emergency Management Agency Final Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Final Notice. SUMMARY: Flood hazard determinations, which may include additions or modifications of Base Flood Elevations (BFEs), base flood depths, Special Flood Hazard...

  1. 78 FR 5821 - Final Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... SECURITY Federal Emergency Management Agency Final Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Final Notice. SUMMARY: Flood hazard determinations, which may include additions or modifications of Base Flood Elevations (BFEs), base flood depths, Special Flood Hazard...

  2. 78 FR 21143 - Final Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-09

    ... SECURITY Federal Emergency Management Agency Final Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Final notice. SUMMARY: Flood hazard determinations, which may include additions or modifications of Base Flood Elevations (BFEs), base flood depths, Special Flood Hazard...

  3. Validation of a Global Hydrodynamic Flood Inundation Model

    NASA Astrophysics Data System (ADS)

    Bates, P. D.; Smith, A.; Sampson, C. C.; Alfieri, L.; Neal, J. C.

    2014-12-01

    In this work we present first validation results for a hyper-resolution global flood inundation model. We use a true hydrodynamic model (LISFLOOD-FP) to simulate flood inundation at 1 km resolution globally and then use downscaling algorithms to determine flood extent and depth at 90 m spatial resolution. Terrain data are taken from a custom version of the SRTM data set that has been processed specifically for hydrodynamic modelling. Return periods of flood flows along the entire global river network are determined using: (1) empirical relationships between catchment characteristics and index flood magnitude in different hydroclimatic zones derived from global runoff data; and (2) an index flood growth curve, also empirically derived. Bankfull return period flow is then used to set channel width and depth, and flood defence impacts are modelled using empirical relationships between GDP, urbanization and defence standard of protection. The results of these simulations are global flood hazard maps for a number of different return period events from 1 in 5 to 1 in 1000 years. We compare these predictions to flood hazard maps developed by national government agencies in the UK and Germany using similar methods but employing detailed local data, and to observed flood extent at a number of sites including St. Louis, USA and Bangkok in Thailand. Results show that global flood hazard models can have considerable skill given careful treatment to overcome errors in the publicly available data that are used as their input.
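
    The regionalisation described above combines an index flood estimated from catchment characteristics with a dimensionless growth curve. The sketch below shows that combination with an assumed GEV-shaped growth curve; the index flood value and growth-curve parameters are placeholders, not those of the global model.

        from scipy.stats import genextreme

        def design_flow(index_flood, return_period_yr, shape, loc, scale):
            """Return-period flow as the index flood multiplied by a dimensionless
            growth factor read from a GEV-shaped growth curve (placeholder
            parameters, not those of the global model)."""
            p_non_exceed = 1.0 - 1.0 / return_period_yr
            growth_factor = genextreme.ppf(p_non_exceed, shape, loc=loc, scale=scale)
            return index_flood * growth_factor

        # Hypothetical index flood of 850 m3/s and an assumed growth curve
        for T in (5, 100, 1000):
            print(T, round(design_flow(850.0, T, shape=-0.1, loc=1.0, scale=0.35), 1))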

  4. Analysis of flood-rich and flood-poor periods across Germany

    NASA Astrophysics Data System (ADS)

    Merz, Bruno; Viet Dung, Nguyen; Vorogushyn, Sergiy

    2016-04-01

    It has often been suggested that flood occurrence is clustered in flood-rich and flood-poor periods. We test this suggestion for 68 catchments across Germany for the common period 1932-2005. For assessing the robustness of the results, we use three methods to derive the significance of temporal clustering. Clustering is assessed for different thresholds and time scales to understand whether it changes with flood severity and time scale. The majority of catchments show temporal clustering at the 5 % significance level for low thresholds and time scales of one to a few years. However, clustering decreases substantially with increasing threshold and time scale. We hypothesize that flood clustering in Germany is mainly caused by catchment memory effects along with intra- to inter-annual climate variability, and that decadal climate variability plays a minor role.
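
    One simple way to quantify temporal clustering of annual peaks-over-threshold counts is a dispersion (variance-to-mean) index, which is near 1 for a homogeneous Poisson process and above 1 when events cluster into flood-rich and flood-poor periods. The sketch below illustrates this; it is one of several possible tests and not necessarily the method used in the study.

        import numpy as np

        def dispersion_index(annual_counts):
            """Variance-to-mean ratio of annual peaks-over-threshold counts.
            Values near 1 are consistent with a homogeneous Poisson process;
            values well above 1 indicate clustering into flood-rich and
            flood-poor periods."""
            c = np.asarray(annual_counts, dtype=float)
            return c.var(ddof=1) / c.mean()

        # Synthetic example: a flood-rich run embedded in a quieter record
        counts = [1, 2, 1, 0, 2, 5, 6, 4, 5, 1, 0, 1, 2, 1, 1]
        print(round(dispersion_index(counts), 2))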

  5. Development of a simple method for the determination of lead in lipstick using alkaline solubilization and graphite furnace atomic absorption spectrometry.

    PubMed

    Soares, Aline Rodrigues; Nascentes, Clésia Cristina

    2013-02-15

    A simple method was developed for determining the total lead content in lipstick samples by graphite furnace atomic absorption spectrometry (GFAAS) after treatment with tetramethylammonium hydroxide (TMAH). Multivariate optimization was used to establish the optimal conditions of sample preparation. The graphite furnace heating program was optimized through pyrolysis and atomization curves. An aliquot containing approximately 50 mg of the sample was mixed with TMAH and heated in a water bath at 60°C for 60 min. Using Nb as the permanent modifier and Pd as the chemical modifier, the optimal temperatures were 900°C and 1800°C for pyrolysis and atomization, respectively. Under optimum conditions, the working range was from 1.73 to 50.0 μg L(-1), with detection and quantification limits of 0.20 and 0.34 μg g(-1), respectively. The precision was evaluated under conditions of repeatability and intermediate precision and showed standard deviations of 2.37%-4.61% and 4.93%-9.75%, respectively. The % recovery ranged from 96.2% to 109%, and no significant differences were found between the results obtained using the proposed method and the microwave decomposition method for real samples. Lead was detected in 21 tested lipstick samples; the lead content in these samples ranged from 0.27 to 4.54 μg g(-1).

  7. Design considerations and construction techniques for successive alkalinity producing systems

    SciTech Connect

    Skovran, G.A.; Clouser, C.R.

    1998-12-31

    Successive Alkalinity Producing Systems (SAPS) have been utilized for several years for the passive treatment of acid mine drainage. The SAPS technology is an effective method for inducing alkalinity to neutralize acid mine water and promote the precipitation of contaminating metals. Several design considerations and construction techniques are important for proper system function and longevity. This paper discusses SAPS design, water collection and introduction to the SAPS, SAPS hydraulics, construction, operation and maintenance, and safety; these factors were found to be critical to obtaining maximum alkalinity at several SAPS treatment sites in southwestern Pennsylvania. Taking care to incorporate these factors into future SAPS will aid effective treatment, reduce maintenance costs, and maximize the long-term effectiveness of successive alkalinity producing systems.

  8. Flood Hazard Assessment for the Savannah River Site

    SciTech Connect

    Chen, K.F.

    2000-08-15

    A method was developed to determine probabilistic flood elevation curves for certain Savannah River Site (SRS) facilities. This paper presents the method used to determine the probabilistic flood elevation curve for F-Area due to runoff from the Upper Three Runs basin. Department of Energy (DOE) Order 420.1, Facility Safety, outlines the requirements for Natural Phenomena Hazard (NPH) mitigation for new and existing DOE facilities; the NPH considered in this paper is flooding. The facility-specific probabilistic flood hazard curve defines, as a function of water elevation, the annual probability of occurrence (or the return period in years). Based on facility-specific probabilistic flood hazard curves and the nature of facility operations (e.g., involving hazardous or radioactive materials), facility managers can design permanent or temporary devices to prevent the propagation of floods on site and develop emergency preparedness plans to mitigate the consequences of floods. The flood hazard curves for the SRS F-Area due to flooding in the Upper Three Runs basin are presented in this paper.

  9. Evaluation of design flood estimates with respect to sample size

    NASA Astrophysics Data System (ADS)

    Kobierska, Florian; Engeland, Kolbjorn

    2016-04-01

    Estimation of design floods forms the basis for hazard management related to flood risk and is a legal obligation when building infrastructure such as dams, bridges and roads close to water bodies. Flood inundation maps used for land use planning are also produced based on design flood estimates. In Norway, the current guidelines for design flood estimates give recommendations on which data, probability distribution, and method to use depending on the length of the local record. If less than 30 years of local data is available, an index flood approach is recommended, where the local observations are used for estimating the index flood and regional data are used for estimating the growth curve. For 30-50 years of data, a 2-parameter distribution is recommended, and for more than 50 years of data, a 3-parameter distribution should be used. Many countries have national guidelines for flood frequency estimation, and recommended distributions include the log-Pearson type III, generalized logistic and generalized extreme value distributions. For estimating distribution parameters, ordinary moments, L-moments, maximum likelihood and Bayesian methods are used. The aim of this study is to re-evaluate the guidelines for local flood frequency estimation. In particular, we wanted to answer the following questions: (i) Which distribution gives the best fit to the data? (ii) Which estimation method provides the best fit to the data? (iii) Does the answer to (i) and (ii) depend on local data availability? To answer these questions we set up a test bench for local flood frequency analysis using data-based cross-validation methods. The criteria were based on indices describing the stability and reliability of design flood estimates. Stability is used as a criterion since design flood estimates should not depend excessively on the data sample. The reliability indices describe the degree to which design flood predictions can be trusted.
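
    For a record long enough to justify a three-parameter distribution, the design flood can be obtained by fitting, for example, a generalized extreme value (GEV) distribution to the annual maxima and reading off the quantile for the target return period. The sketch below does this with maximum likelihood via scipy on a synthetic record; the distribution choice, fitting method and data are illustrative, not the guideline's prescription.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)
        # Synthetic annual-maximum series standing in for a >50-year local record
        amax = rng.gumbel(loc=300.0, scale=80.0, size=60)

        # Fit a three-parameter GEV distribution by maximum likelihood
        shape, loc, scale = genextreme.fit(amax)

        # Design flood for a 200-year return period (annual exceedance probability 0.005)
        q200 = genextreme.ppf(1.0 - 1.0 / 200.0, shape, loc=loc, scale=scale)
        print(f"Estimated 200-year flood: {q200:.0f}")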

  10. Comparison of methods and optimisation of the analysis of fumonisins B₁ and B₂ in masa flour, an alkaline cooked corn product.

    PubMed

    De Girolamo, A; Pascale, M; Visconti, A

    2011-05-01

    A comparison study of different extraction and clean-up procedures for the liquid chromatographic analysis of fumonisins B(1) (FB(1)) and B(2) (FB(2)) in corn masa flour was performed. The procedures included extraction (heat or room temperature) with acidic conditions or EDTA-containing solvents, and clean-up by immunoaffinity or C18 solid-phase extraction columns. Thereafter an analytical method was optimised using extraction with an acidic mixture of methanol-acetonitrile-citrate/phosphate buffer, clean-up through the immunoaffinity column and determination of fumonisins by liquid chromatography with automated pre-column derivatisation with o-phthaldialdehyde reagent. Recovery experiments performed on yellow, white and blue masa flours at spiking levels of 400, 800 and 1200 µg kg(-1) FB(1) and of 100, 200 and 300 µg kg(-1) FB(2) gave overall mean recoveries of 99% (±6%) for FB(1) and 88% (±6%) for FB(2). Good recoveries (higher than 90% for both FB(1) and FB(2)) were also obtained with corn tortilla chips. The limits of quantification of the method (signal-to-noise ratio of 10) were 25 µg kg(-1) for FB(1) and 17 µg kg(-1) for FB(2). The method was tested on different commercial corn masa flours as well as on white and yellow corn tortilla chips, showing fumonisin contamination levels (FB(1) + FB(2)) up to 1800 µg kg(-1) (FB(1) + FB(2)) in masa flour and 960 µg kg(-1) in tortilla chips. Over 30% of masa flours originating from Mexico exceeded the European Union maximum permitted level.

  12. Flood insurance in Canada: implications for flood management and residential vulnerability to flood hazards.

    PubMed

    Oulahen, Greg

    2015-03-01

    Insurance coverage of damage caused by overland flooding is currently not available to Canadian homeowners. As flood disaster losses and water damage claims both trend upward, insurers in Canada are considering offering residential flood coverage in order to properly underwrite the risk and extend their business. If private flood insurance is introduced in Canada, it will have implications for the current regime of public flood management and for residential vulnerability to flood hazards. This paper engages many of the competing issues surrounding the privatization of flood risk by addressing questions about whether flood insurance can be an effective tool in limiting exposure to the hazard and how it would exacerbate already unequal vulnerability. A case study investigates willingness to pay for flood insurance among residents in Metro Vancouver and how attitudes about insurance relate to other factors that determine residential vulnerability to flood hazards. Findings indicate that demand for flood insurance is part of a complex, dialectical set of determinants of vulnerability.

  13. Flood Insurance in Canada: Implications for Flood Management and Residential Vulnerability to Flood Hazards

    NASA Astrophysics Data System (ADS)

    Oulahen, Greg

    2015-03-01

    Insurance coverage of damage caused by overland flooding is currently not available to Canadian homeowners. As flood disaster losses and water damage claims both trend upward, insurers in Canada are considering offering residential flood coverage in order to properly underwrite the risk and extend their business. If private flood insurance is introduced in Canada, it will have implications for the current regime of public flood management and for residential vulnerability to flood hazards. This paper engages many of the competing issues surrounding the privatization of flood risk by addressing questions about whether flood insurance can be an effective tool in limiting exposure to the hazard and how it would exacerbate already unequal vulnerability. A case study investigates willingness to pay for flood insurance among residents in Metro Vancouver and how attitudes about insurance relate to other factors that determine residential vulnerability to flood hazards. Findings indicate that demand for flood insurance is part of a complex, dialectical set of determinants of vulnerability.

  14. Net alkalinity and net acidity 1: Theoretical considerations

    USGS Publications Warehouse

    Kirby, C.S.; Cravotta, C.A.

    2005-01-01

    Net acidity and net alkalinity are widely used, poorly defined, and commonly misunderstood parameters for the characterization of mine drainage. The authors explain theoretical expressions of 3 types of alkalinity (caustic, phenolphthalein, and total) and acidity (mineral, CO2, and total). Except for rarely-invoked negative alkalinity, theoretically defined total alkalinity is closely analogous to measured alkalinity and presents few practical interpretation problems. Theoretically defined "CO2-acidity" is closely related to most standard titration methods with an endpoint pH of 8.3 used for determining acidity in mine drainage, but it is unfortunately named because CO2 is intentionally driven off during titration of mine-drainage samples. Using the proton condition/mass-action approach and employing graphs to illustrate speciation with changes in pH, the authors explore the concept of principal components and how to assign acidity contributions to aqueous species commonly present in mine drainage. Acidity is defined in mine drainage based on aqueous speciation at the sample pH and on the capacity of these species to undergo hydrolysis to pH 8.3. Application of this definition shows that the computed acidity in mg L^-1 as CaCO3 (based on pH and analytical concentrations of dissolved Fe(II), Fe(III), Mn, and Al in mg L^-1), acidity_calculated = 50 {1000(10^-pH) + [2(Fe(II)) + 3(Fe(III))]/56 + 2(Mn)/55 + 3(Al)/27}, underestimates contributions from HSO4^- and H^+, but overestimates the acidity due to Fe^3+ and Al^3+. However, these errors tend to approximately cancel each other. It is demonstrated that "net alkalinity" is a valid mathematical construction based on theoretical definitions of alkalinity and acidity. Further, it is shown that, for most mine-drainage solutions, a useful net alkalinity value can be derived from: (1) alkalinity and acidity values based on aqueous speciation, (2) measured alkalinity minus calculated acidity, or (3) taking the negative of the value obtained in a
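
    The calculated-acidity expression quoted above can be applied directly; the sketch below implements it as written, with a hypothetical mine-drainage sample for illustration.

        def calculated_acidity_caco3(ph, fe2_mgL, fe3_mgL, mn_mgL, al_mgL):
            """Calculated acidity (mg/L as CaCO3) from pH and dissolved Fe(II),
            Fe(III), Mn and Al (mg/L), following the expression in the abstract:
            acidity = 50*(1000*10**-pH + (2*FeII + 3*FeIII)/56 + 2*Mn/55 + 3*Al/27)."""
            return 50.0 * (1000.0 * 10.0 ** (-ph)
                           + (2.0 * fe2_mgL + 3.0 * fe3_mgL) / 56.0
                           + 2.0 * mn_mgL / 55.0
                           + 3.0 * al_mgL / 27.0)

        # Hypothetical mine-drainage sample
        print(round(calculated_acidity_caco3(ph=3.2, fe2_mgL=40.0, fe3_mgL=10.0,
                                             mn_mgL=5.0, al_mgL=8.0), 1))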

  15. Nogales flood detention study

    USGS Publications Warehouse

    Norman, Laura M.; Levick, Lainie; Guertin, D. Phillip; Callegary, James; Guadarrama, Jesus Quintanar; Anaya, Claudia Zulema Gil; Prichard, Andrea; Gray, Floyd; Castellanos, Edgar; Tepezano, Edgar; Huth, Hans; Vandervoet, Prescott; Rodriguez, Saul; Nunez, Jose; Atwood, Donald; Granillo, Gilberto Patricio Olivero; Ceballos, Francisco Octavio Gastellum

    2010-01-01

    Flooding in Ambos Nogales often exceeds the capacity of the channel and adjacent land areas, endangering many people. The Nogales Wash is being studied to prevent future flood disasters and detention features are being installed in tributaries of the wash. This paper describes the application of the KINEROS2 model and efforts to understand the capacity of these detention features under various flood and urbanization scenarios. Results depict a reduction in peak flow for the 10-year, 1-hour event based on current land use in tributaries with detention features. However, model results also demonstrate that larger storm events and increasing urbanization will put a strain on the features and limit their effectiveness.

  16. Paleohydrologic techniques used to define the spatial occurrence of floods

    USGS Publications Warehouse

    Jarrett, R.D.

    1990-01-01

    Defining the cause and spatial characteristics of floods may be difficult because of limited streamflow and precipitation data. New paleohydrologic techniques that incorporate information from geomorphic, sedimentologic, and botanic studies provide important supplemental information to define homogeneous hydrologic regions. These techniques also help to define the spatial structure of rainstorms and floods and improve regional flood-frequency estimates. The occurrence and the non-occurrence of paleohydrologic evidence of floods, such as flood bars, alluvial fans, and tree scars, provide valuable hydrologic information. The paleohydrologic research to define the spatial characteristics of floods improves the understanding of flood hydrometeorology. This research was used to define the areal extent and contributing drainage area of flash floods in Colorado. Also, paleohydrologic evidence was used to define the spatial boundaries for the Colorado foothills region in terms of the meteorologic cause of flooding and elevation. In general, above 2300 m, peak flows are caused by snowmelt. Below 2300 m, peak flows primarily are caused by rainfall. The foothills region has an upper elevation limit of about 2300 m and a lower elevation limit of about 1500 m. Regional flood-frequency estimates that incorporate the paleohydrologic information indicate that the Big Thompson River flash flood of 1976 had a recurrence interval of approximately 10,000 years. This contrasts markedly with 100 to 300 years determined by using conventional hydrologic analyses. Flood-discharge estimates based on rainfall-runoff methods in the foothills of Colorado result in larger values than those estimated with regional flood-frequency relations, which are based on long-term streamflow data. Preliminary hydrologic and paleohydrologic research indicates that intense rainfall does not occur at higher elevations in other Rocky Mountain states and that the highest elevations for rainfall-producing floods

  17. Development of flood index by characterisation of flood hydrographs

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Biswa; Suman, Asadusjjaman

    2015-04-01

    In recent years the world has experienced deaths, large-scale displacement of people, billions of euros of economic damage, mental stress and ecosystem impacts due to flooding. Global changes (climate change, population and economic growth, and urbanisation) are exacerbating the severity of flooding. The 2010 floods in Pakistan and the 2011 floods in Australia and Thailand demonstrate the need for concerted action in the face of global societal and environmental changes to strengthen resilience against flooding. Due to climatological characteristics there are catchments where flood forecasting may have a relatively limited role and flood event management may have to be relied upon. For example, in flash flood catchments, which are often tiny and ungauged, flood event management often depends on approximate prediction tools such as flash flood guidance (FFG). There are also catchments fed largely by flood waters coming from upstream catchments that are ungauged, or where, owing to data-sharing issues in transboundary basins, the flow of information from the upstream catchment is limited. Hydrological and hydraulic modelling of these downstream catchments will never be sufficient to provide the required forecasting lead time, and alternative tools to support flood event management will be required. In FFG, and similar approaches, the primary aim is to provide guidance by synthesising historical data. We follow a similar approach, characterising past flood hydrographs to determine a flood index (FI) that varies in space and time with flood magnitude and its propagation. By studying the variation of the index, pockets of high flood risk requiring attention can be earmarked beforehand. This approach can be very useful in flood risk management of catchments where information about hydro-meteorological variables is inadequate for any forecasting system. This paper presents the development of the FI and its application to several catchments, including in Kentucky in the USA

  18. Clean method for the synthesis of reduced graphene oxide-supported PtPd alloys with high electrocatalytic activity for ethanol oxidation in alkaline medium.

    PubMed

    Ren, Fangfang; Wang, Huiwen; Zhai, Chunyang; Zhu, Mingshan; Yue, Ruirui; Du, Yukou; Yang, Ping; Xu, Jingkun; Lu, Wensheng

    2014-03-12

    In this article, a clean method for the synthesis of PtPd/reduced graphene oxide (RGO) catalysts with different Pt/Pd ratios is reported in which no additional components such as external energy (e.g., high temperature or high pressure), surfactants, or stabilizing agents are required. The obtained catalysts were characterized by X-ray diffraction (XRD), transmission electron microscopy (TEM), high-resolution transmission electron microscopy (HRTEM), Raman spectroscopy, X-ray photoelectron spectroscopy (XPS), induced coupled plasma atomic emission spectroscopy (ICP-AES), and electrochemical measurements. The HRTEM measurements showed that all of the metallic nanoparticles (NPs) exhibited well-defined crystalline structures. The composition of these Pt-Pd/RGO catalysts can be easily controlled by adjusting the molar ratio of the Pt and Pd precursors. Both cyclic voltammetry (CV) and chronoamperometry (CA) results demonstrate that bimetallic PtPd catalysts have superior catalytic activity for the ethanol oxidation reaction compared to the monometallic Pt or Pd catalyst, with the best performance found with the PtPd (1:3)/RGO catalyst. The present study may open a new approach for the synthesis of PtPd alloy catalysts, which is expected to have promising applications in fuel cells.

  19. Evaluation of 4,4'-diaminodiphenyl ether in the rat comet assay: Part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiative international validation study of in vivo rat alkaline comet assay.

    PubMed

    Priestley, Catherine C; Walker, Joanne S; O'Donovan, Michael R; Doherty, Ann T

    2015-07-01

    As a part of the Japanese Center for the Validation of Alternative Methods (JaCVAM)-initiative international validation study of the in vivo rat alkaline comet assay, 4,4'-diaminodiphenyl ether (DPE), a known rodent genotoxic carcinogen, was tested in this laboratory. Sprague Dawley rats (7-9 weeks of age) were given three oral doses of DPE, 24 and 21 h apart and liver or stomach sampled 3h after the final dose. Under the conditions of the test, no increases in DNA damage in liver and stomach were observed with DPE (up to 200 mg/kg/day). A dose-dependent decrease in DNA migration, compared to vehicle controls, was noted for DPE in rat stomach. Further analysis is required to elucidate fully whether this decrease is a consequence of the mode of action or due to the toxicity of DPE. What is perhaps surprising is the inability of the comet assay to detect a known rat genotoxic carcinogen in liver. Further investigation is needed to clarify whether this apparent lack of response results from limited tissue exposure or metabolic differences between species. This finding highlights a need for careful consideration of study design when evaluating assay performance as a measure of in vivo genotoxicity.

  20. Estimating Non-stationary Flood Risk in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Yu, X.; Cohn, T. A.; Stedinger, J. R.

    2015-12-01

    Flood risk is usually described by a probability distribution for annual maximum streamflow that is assumed not to change with time. Federal, state and local governments in the United States are demanding guidance on flood frequency estimates that account for climate change. If a trend exists in a peak flow series, ignoring it could result in large quantile estimator bias, while trying to estimate a trend will increase the flood quantile estimator's variance. Thus the issue is: what bias-variance tradeoff should we accept? This paper discusses approaches to flood frequency analysis (FFA) when flood series have trends. GCMs describe how annual runoff might vary over sub-continental scales, but this information is nearly useless for FFA in small watersheds. An LP3 Monte Carlo analysis and a re-sampling study of 100-year flood estimation (25- and 50-year projections) compare the performance of five methods: (1) FFA as prescribed in national guidelines (Bulletin 17B), which assumes the flood series is stationary and follows a log-Pearson type III (LP3) distribution; (2) fitting an LP3 distribution with time-varying parameters that include future trends in mean and perhaps variance, where the slopes are assumed known; (3) fitting an LP3 distribution with time-varying parameters that capture future trends in mean and perhaps variance, where the slopes are estimated from the annual peak flow series; (4) employing only the most recent 30 years of flood records to fit an LP3 distribution; and (5) applying a safety factor to the 100-year flood estimator (e.g., a 25% increase). The 100-year flood estimator of method 2 has the smallest log-space mean squared error, though it is unlikely that the true trend would be known. Method 3 is only recommended over method 1 for large trends (≥ 0.5% per year). The 100-year flood estimators of methods 1, 4, and 5 often have poor accuracy. Clearly, flood risk assessment will be a challenge in an uncertain world.
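
    Methods 1-3 above rest on fitting a log-Pearson type III (LP3) distribution to the log-transformed annual peaks, optionally with a shifted mean to represent a trend projected to a future year. A minimal method-of-moments sketch is given below; the synthetic data and the value of the mean shift are assumptions for illustration, not results from the study.

        import numpy as np
        from scipy.stats import pearson3, skew

        def lp3_quantile(peaks, aep, mean_shift=0.0):
            """Fit a log-Pearson type III distribution to annual peaks by the
            method of moments on log10 flows and return the quantile for annual
            exceedance probability `aep`. `mean_shift` adds an assumed trend-induced
            shift (log10 units) to the mean, as in a time-varying-parameter fit."""
            logq = np.log10(np.asarray(peaks, dtype=float))
            m = logq.mean() + mean_shift
            s = logq.std(ddof=1)
            g = skew(logq, bias=False)
            return 10.0 ** pearson3.ppf(1.0 - aep, g, loc=m, scale=s)

        rng = np.random.default_rng(1)
        peaks = 10 ** rng.normal(2.5, 0.25, size=80)   # synthetic annual peak flows
        print(round(lp3_quantile(peaks, aep=0.01)))    # stationary 100-year flood
        print(round(lp3_quantile(peaks, aep=0.01, mean_shift=0.05)))  # with an assumed trend shift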

  1. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    SciTech Connect

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.

  2. Support vector regression for real-time flood stage forecasting

    NASA Astrophysics Data System (ADS)

    Yu, Pao-Shan; Chen, Shien-Tsung; Chang, I.-Fan

    2006-09-01

    Flood forecasting is an important non-structural approach for flood mitigation. The flood stage is chosen as the variable to be forecasted because it is practically useful in flood forecasting. The support vector machine, a novel artificial intelligence-based method developed from statistical learning theory, is adopted herein to establish a real-time stage forecasting model. The lags associated with the input variables are determined by applying the hydrological concept of the time of response, and a two-step grid search method is applied to find the optimal parameters and thus overcome the difficulties in constructing the learning machine. Two model structures used to perform multiple-hour-ahead stage forecasts are developed. Validation results from flood events in the Lan-Yang River, Taiwan, revealed that the proposed models can effectively forecast the flood stage one to six hours ahead. Moreover, a sensitivity analysis was conducted on the lags associated with the input variables.
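    As an illustration of this type of model, the sketch below builds lagged stage and rainfall predictors and tunes an RBF support vector regressor with a coarse-then-fine grid search. It is a minimal sketch under assumed data and lag choices, not the authors' model; the synthetic series, the lag of three steps, and the parameter grids are placeholders.

```python
# Minimal sketch: SVR stage forecaster with a two-step (coarse, then fine) grid search.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

def make_lagged(stage, rain, lags=3, lead=1):
    """Build [stage(t-lags+1..t), rain(t-lags+1..t)] -> stage(t+lead) samples."""
    X, y = [], []
    for t in range(lags - 1, len(stage) - lead):
        X.append(np.r_[stage[t - lags + 1:t + 1], rain[t - lags + 1:t + 1]])
        y.append(stage[t + lead])
    return np.array(X), np.array(y)

# synthetic hourly rainfall/stage series standing in for a real gauge record
rng = np.random.default_rng(0)
rain = rng.gamma(2.0, 2.0, 500)
stage = np.convolve(rain, np.exp(-np.arange(12) / 4.0), mode="full")[:500]

X, y = make_lagged(stage, rain, lags=3, lead=1)
cv = TimeSeriesSplit(n_splits=4)

# step 1: coarse grid over C and epsilon
coarse = GridSearchCV(SVR(kernel="rbf"),
                      {"C": [1, 10, 100], "epsilon": [0.01, 0.1, 1.0]}, cv=cv)
coarse.fit(X, y)

# step 2: finer grid centred on the coarse optimum
C0, e0 = coarse.best_params_["C"], coarse.best_params_["epsilon"]
fine = GridSearchCV(SVR(kernel="rbf"),
                    {"C": [C0 / 2, C0, C0 * 2], "epsilon": [e0 / 2, e0, e0 * 2]},
                    cv=cv)
fine.fit(X, y)
print("best parameters:", fine.best_params_)
```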

  3. Defining the hundred year flood: A Bayesian approach for using historic data to reduce uncertainty in flood frequency estimates

    NASA Astrophysics Data System (ADS)

    Parkes, Brandon; Demeritt, David

    2016-09-01

    This paper describes a Bayesian statistical model for estimating flood frequency by combining uncertain annual maximum (AMAX) data from a river gauge with estimates of flood peak discharge from various historic sources that predate the period of instrument records. Such historic flood records promise to expand the time series data needed for reducing the uncertainty in return period estimates for extreme events, but the heterogeneity and uncertainty of historic records make them difficult to use alongside the Flood Estimation Handbook and other standard methods for generating flood frequency curves from gauge data. Using the flow of the River Eden in Carlisle, Cumbria, UK as a case study, this paper develops a Bayesian model for combining historic flood estimates since 1800 with gauge data since 1967 to estimate the probability of low frequency flood events for the area, taking account of uncertainty in the discharge estimates. Results show a reduction in 95% confidence intervals of roughly 50% for annual exceedance probabilities of less than 0.0133 (return periods over 75 years) compared to standard flood frequency estimation methods using solely systematic data. Sensitivity analysis shows the model is sensitive to two model parameters, both of which concern the historic (pre-systematic) period of the time series. This highlights the importance of adequate consideration of historic channel and floodplain changes, or of possible bias in estimates of historic flood discharges. The next steps required to roll out this Bayesian approach for operational flood frequency estimation at other sites are also discussed.
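    The gain from historic information can be illustrated with a deliberately simplified example. The sketch below is not the paper's model: it uses a Gumbel distribution, a flat prior on a coarse parameter grid, and treats the historic period as censored information (a count of floods known to have exceeded a perception threshold), whereas the paper handles uncertain historic discharge estimates explicitly. All numbers are assumptions.

```python
# Minimal sketch: gauge AMAX likelihood combined with censored historic flood counts.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
amax = stats.gumbel_r.rvs(loc=300.0, scale=80.0, size=45, random_state=rng)  # gauged record
h_years = 160          # length of the pre-gauge historic period (years)
h_exceed = 3           # floods known to have exceeded the perception threshold
threshold = 600.0      # perception threshold (m3/s), assumed known

locs = np.linspace(200, 450, 120)
scales = np.linspace(40, 160, 120)
logpost = np.empty((locs.size, scales.size))
for i, mu in enumerate(locs):
    for j, beta in enumerate(scales):
        ll = stats.gumbel_r.logpdf(amax, loc=mu, scale=beta).sum()
        p_exc = stats.gumbel_r.sf(threshold, loc=mu, scale=beta)
        ll += stats.binom.logpmf(h_exceed, h_years, p_exc)   # censored historic info
        logpost[i, j] = ll                                   # flat prior

post = np.exp(logpost - logpost.max())
post /= post.sum()

# posterior mean of the 100-year flood (annual exceedance probability 0.01)
q100 = stats.gumbel_r.isf(0.01, loc=locs[:, None], scale=scales[None, :])
print("posterior-mean 100-year flood:", (post * q100).sum())
```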

  4. Distillation Column Flooding Predictor

    SciTech Connect

    2002-02-01

    This factsheet describes a research project whose goal is to develop the flooding predictor, an advanced process control strategy, into a universally useable tool that will maximize the separation yield of a distillation column.

  5. Alkaline Waterflooding Demonstration Project, Ranger Zone, Long Beach Unit, Wilmington Field, California. Fourth annual report, June 1979-May 1980. Volume 3. Appendices II-XVII

    SciTech Connect

    Carmichael, J.D.

    1981-03-01

    Volume 3 contains Appendices II through XVII: mixing instructions for sodium orthosilicate; oil displacement studies using THUMS C-331 crude oil and extracted reservoir core material from well B-110; clay mineral analysis of B-827-A cores; sieve analysis of 4 Fo sand samples from B-110-IA and 4 Fo sand samples from B-827-A; core record; delayed secondary caustic consumption tests; long-term alkaline consumption in reservoir sands; demulsification study for THUMS Long Beach Company, Island White; operating plans and instructions for DOE injection demonstration project, alkaline injection; caustic pilot-produced water test graphs; well test irregularities (6/1/79-5/31/80); alkaline flood pump changes (6/1/79-5/31/80); monthly DOE pilot chemical waterflood injection reports (preflush injection, alkaline-salt injection, and alkaline injection without salt); and caustic safety procedures-alkaline chemicals.

  6. Real Time Monitoring of Flooding from Microwave Satellite Observations

    NASA Technical Reports Server (NTRS)

    Galantowicz, John F.; Frey, Herb (Technical Monitor)

    2002-01-01

    We have developed a new method for making high-resolution flood extent maps (e.g., at the 30-100 m scale of digital elevation models) in real-time from low-resolution (20-70 km) passive microwave observations. The method builds a "flood-potential" database from elevations and historic flood imagery and uses it to create a flood-extent map consistent with the observed open water fraction. Microwave radiometric measurements are useful for flood monitoring because they sense surface water in clear-or-cloudy conditions and can provide more timely data (e.g., compared to radars) from relatively wide swath widths and an increasing number of available platforms (DMSP, ADEOS-II, Terra, NPOESS, GPM). The chief disadvantages for flood mapping are the radiometers' low resolution and the need for local calibration of the relationship between radiances and open-water fraction. We present our method for transforming microwave sensor-scale open water fraction estimates into high-resolution flood extent maps and describe 30-day flood map sequences generated during a retrospective study of the 1993 Great Midwest Flood. We discuss the method's potential improvement through as yet unimplemented algorithm enhancements and expected advancements in microwave radiometry (e.g., improved resolution and atmospheric correction).
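    The downscaling step described above can be sketched as a simple ranking-and-filling exercise. The block below is a minimal illustration, not the operational algorithm: the flood-potential surface is a placeholder derived from elevation alone, whereas the actual method also folds in historic flood imagery and local calibration.

```python
# Minimal sketch: fill the most flood-prone fine cells until the coarse-scale
# open-water fraction observed by the radiometer is reproduced.
import numpy as np

def downscale_flood(fraction_observed, flood_potential):
    """fraction_observed: open-water fraction for one radiometer footprint.
    flood_potential: 2-D array ranking how readily each fine cell floods."""
    n_cells = flood_potential.size
    n_wet = int(round(fraction_observed * n_cells))
    order = np.argsort(flood_potential.ravel())[::-1]   # most flood-prone first
    flood = np.zeros(n_cells, dtype=bool)
    flood[order[:n_wet]] = True
    return flood.reshape(flood_potential.shape)

# illustrative 100x100 footprint: potential decreases with elevation above the channel
elev = np.add.outer(np.linspace(0, 10, 100), np.linspace(0, 5, 100))
potential = -elev
flood_map = downscale_flood(0.18, potential)
print("wet fraction reproduced:", flood_map.mean())
```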

  7. Flood Bypass Capacity Optimization

    NASA Astrophysics Data System (ADS)

    Siclari, A.; Hui, R.; Lund, J. R.

    2015-12-01

    Large river flows can damage adjacent flood-prone areas by exceeding river channel and levee capacities. Particularly large floods are difficult to contain within leveed river banks alone. Flood bypasses can often reduce flood risk efficiently: excess river flow is diverted over a weir into a bypass, where it incurs much less damage and cost. Additional benefits of bypasses include ecosystem protection, agriculture, groundwater recharge and recreation. Constructing or expanding a bypass incurs costs for land purchases, easements, and levee setbacks. Accounting for such benefits and costs, this study develops a simple mathematical model for optimizing flood bypass capacity using benefit-cost and risk analysis. Application to the Yolo Bypass, an existing bypass along the Sacramento River in California, estimates the optimal capacity that economically reduces flood damage and increases various benefits, especially for agriculture. Land availability is likely to limit bypass expansion; compensation for landowners could relax such limitations. Other economic values could affect the optimal results, as shown by sensitivity analysis on major parameters. By including land geography in the model, locations of promising capacity expansions can be identified.
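    The benefit-cost logic can be sketched in a few lines. The following is a minimal, illustrative model and not the study's: it assumes a lognormal annual-peak flow distribution, a linear damage rate on flow exceeding channel plus bypass capacity, and a constant annualized land cost per unit of capacity, then searches for the capacity that minimizes total expected annual cost.

```python
# Minimal sketch: choose a bypass capacity minimizing land cost plus residual flood damage.
import numpy as np
from scipy import stats, optimize

peak = stats.lognorm(s=0.5, scale=2000.0)   # assumed annual-peak flow distribution (m3/s)
levee_capacity = 3000.0                     # flow the leveed channel passes safely (m3/s)
damage_rate = 5.0e4                         # damage per m3/s of overflow ($)
land_cost = 2.0e3                           # annualized cost per m3/s of bypass capacity ($)

def expected_total_cost(bypass_cap):
    """Annualized land cost plus expected damage from flow exceeding channel + bypass."""
    q = np.linspace(levee_capacity + bypass_cap, 20000.0, 2000)
    overflow = q - (levee_capacity + bypass_cap)
    dq = q[1] - q[0]
    expected_damage = np.sum(damage_rate * overflow * peak.pdf(q)) * dq
    return land_cost * bypass_cap + expected_damage

res = optimize.minimize_scalar(expected_total_cost, bounds=(0, 10000), method="bounded")
print(f"economically optimal bypass capacity: {res.x:,.0f} m3/s")
```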

  8. Framework for probabilistic flood risk assessment in an Alpine region

    NASA Astrophysics Data System (ADS)

    Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2014-05-01

    Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, and especially for local authorities and insurance companies, in order to estimate possible flood losses. Therefore a framework for assessing flood risk has been developed and is introduced in the presented contribution. Flood risk is thereby defined as the combination of the probability of flood events and of potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of floods. The potential flood damages are determined in the course of a vulnerability assessment, wherein the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework which takes the probability of flood events and their spatio-temporal characteristics into account. Usually the probability of flooding is determined by means of recurrence intervals for an entire catchment without any spatial variation, which may lead to a misinterpretation of the flood risk. Within the presented framework the probabilistic flood risk assessment is based on the analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004). It is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on the modelling of multivariate extremes considering the spatial dependence structure of flood events. The inputs for this approach are time series derived from river gauging stations. In a next step the

  9. An automated approach to flood mapping

    NASA Astrophysics Data System (ADS)

    Sun, Weihua; Mckeown, Donald M.; Messinger, David W.

    2012-10-01

    Heavy rain from Tropical Storm Lee resulted in a major flood event for the southern tier of New York State in early September 2011, causing evacuation of approximately 20,000 people in and around the city of Binghamton. In support of the New York State Office of Emergency Management, a high resolution multispectral airborne sensor (WASP) developed by RIT was deployed over the flooded area to collect aerial images. One of the key benefits of these images is that they support flood inundation area mapping. However, these images require a significant amount of storage space, and the inundation mapping process is conventionally carried out by manual digitization. In this paper, we design an automated approach for flood inundation mapping from the WASP airborne images. This method employs the Spectral Angle Mapper (SAM) on color RGB or multispectral aerial images to extract a binary flood map; it then uses a set of morphological processing steps and a boundary vectorization technique to convert the binary map into a shapefile. This technique is relatively fast and only requires the operator to select one pixel on the image. The generated shapefile is much smaller than the original image and can be imported into most GIS software packages. This enables critical flood information to be shared with and by disaster response managers very rapidly, even over cellular phone networks.
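    The core of the approach, classifying pixels by their spectral angle to a single operator-selected water pixel and cleaning the result morphologically, can be sketched as follows. This is a minimal sketch, not RIT's WASP pipeline; the 4-band synthetic scene, the 0.1 rad threshold, and the clean-up parameters are assumptions.

```python
# Minimal sketch: Spectral Angle Mapper flood detection from one seed pixel.
import numpy as np
from scipy import ndimage

def spectral_angle(image, reference):
    """Angle (radians) between each pixel spectrum and a reference spectrum.
    image: (rows, cols, bands); reference: (bands,)."""
    dot = np.tensordot(image, reference, axes=([2], [0]))
    norms = np.linalg.norm(image, axis=2) * np.linalg.norm(reference)
    return np.arccos(np.clip(dot / norms, -1.0, 1.0))

def map_flood(image, seed_rc, angle_threshold=0.1):
    reference = image[seed_rc]                            # operator-picked water pixel
    water = spectral_angle(image, reference) < angle_threshold
    water = ndimage.binary_opening(water, iterations=2)   # remove speckle
    water = ndimage.binary_closing(water, iterations=2)   # fill small gaps
    return water

# illustrative 4-band scene with a spectrally uniform "flooded" block
rng = np.random.default_rng(3)
scene = rng.uniform(0.2, 0.8, (200, 200, 4))
water_spectrum = np.array([0.06, 0.05, 0.04, 0.02])
scene[60:140, 50:170] = water_spectrum * rng.uniform(0.9, 1.1, (80, 120, 1))

mask = map_flood(scene, seed_rc=(100, 100))
print("flooded pixels:", int(mask.sum()))
```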

  10. Preparing for floods: flood forecasting and early warning

    NASA Astrophysics Data System (ADS)

    Cloke, Hannah

    2016-04-01

    Flood forecasting and early warning has continued to stride ahead in strengthening the preparedness phases of disaster risk management, saving lives and property and reducing the overall impact of severe flood events. For example, continental and global scale flood forecasting systems such as the European Flood Awareness System and the Global Flood Awareness System provide early information about upcoming floods in real time to various decisionmakers. Studies have found that there are monetary benefits to implementing these early flood warning systems, and with the science also in place to provide evidence of benefit and hydrometeorological institutional outlooks warming to the use of probabilistic forecasts, the uptake over the last decade has been rapid and sustained. However, there are many further challenges that lie ahead to improve the science supporting flood early warning and to ensure that appropriate decisions are made to maximise flood preparedness.

  11. Development of evaluation method of flood risk in Tokyo metropolitan area

    NASA Astrophysics Data System (ADS)

    Hirano, J.; Dairaku, K.

    2012-12-01

    Flood is one of the most significant natural hazards in Japan. In particular, the Tokyo metropolitan area has been affected by several large flood disasters. Investigating potential flood risk in the Tokyo metropolitan area is important for the development of a climate change adaptation strategy. We aim to develop a method for evaluating flood risk in the Tokyo metropolitan area by considering the effects of historical land use and land cover change, socio-economic change, and climatic change. The Ministry of Land, Infrastructure, Transport and Tourism in Japan publishes "Statistics of Flood", which contains data on flood causes, the number of damaged houses, the area of wetted surface, and the total amount of damage for each flood at the small-municipality level. Based on these flood data, we constructed a flood database system for the Tokyo metropolitan area for the period from 1961 to 2008 using ArcGIS software. From these data, we created a flood risk curve representing the relationship between damage and the exceedance probability of floods for the period 1976-2008. Based on the flood risk curve, we aim to evaluate potential flood risk in the Tokyo metropolitan area and to clarify the causes of regional differences in flood risk by considering the effects of socio-economic change and climate change.
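    The flood risk curve itself is a simple construct: rank the damage observations and pair them with empirical annual exceedance probabilities. The sketch below uses synthetic annual damage totals rather than the Statistics of Flood database, and a Weibull plotting position; both choices are illustrative assumptions.

```python
# Minimal sketch: empirical flood risk curve (damage vs. annual exceedance probability).
import numpy as np

rng = np.random.default_rng(5)
years = 33                                                       # e.g. 1976-2008
annual_damage = rng.lognormal(mean=8.0, sigma=1.2, size=years)   # total damage per year

ranked = np.sort(annual_damage)[::-1]                    # largest damage first
exceed_prob = np.arange(1, years + 1) / (years + 1)      # Weibull plotting position

for p, d in list(zip(exceed_prob, ranked))[:5]:
    print(f"annual exceedance prob. {p:.3f}  damage >= {d:,.0f}")
```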

  12. Nucleotide sequences encoding a thermostable alkaline protease

    DOEpatents

    Wilson, David B.; Lao, Guifang

    1998-01-01

    Nucleotide sequences, derived from a thermophilic actinomycete microorganism, which encode a thermostable alkaline protease are disclosed. Also disclosed are variants of the nucleotide sequences which encode a polypeptide having thermostable alkaline proteolytic activity. Recombinant thermostable alkaline protease or recombinant polypeptide may be obtained by culturing in a medium a host cell genetically engineered to contain and express a nucleotide sequence according to the present invention, and recovering the recombinant thermostable alkaline protease or recombinant polypeptide from the culture medium.

  13. Nucleotide sequences encoding a thermostable alkaline protease

    DOEpatents

    Wilson, D.B.; Lao, G.

    1998-01-06

    Nucleotide sequences, derived from a thermophilic actinomycete microorganism, which encode a thermostable alkaline protease are disclosed. Also disclosed are variants of the nucleotide sequences which encode a polypeptide having thermostable alkaline proteolytic activity. Recombinant thermostable alkaline protease or recombinant polypeptide may be obtained by culturing in a medium a host cell genetically engineered to contain and express a nucleotide sequence according to the present invention, and recovering the recombinant thermostable alkaline protease or recombinant polypeptide from the culture medium. 3 figs.

  14. Validation of a global hydrodynamic flood inundation model against high resolution observation data of urban flooding

    NASA Astrophysics Data System (ADS)

    Bates, Paul; Sampson, Chris; Smith, Andy; Neal, Jeff

    2015-04-01

    In this work we present further validation results for a hyper-resolution global flood inundation model. We use a true hydrodynamic model with highly efficient numerical algorithms (LISFLOOD-FP) to simulate flood inundation at ~1 km resolution globally, and then use downscaling algorithms to determine flood extent and water depth at 3 seconds of arc spatial resolution (~90 m at the equator). The global model has ~150 million cells and requires ~180 hours of CPU time for a 10 year simulation period. Terrain data are taken from a custom version of the SRTM data set that has been processed specifically for hydrodynamic modelling. Return periods of flood flows along the entire global river network are determined using: (1) empirical relationships between catchment characteristics and index flood magnitude in different hydroclimatic zones derived from global runoff data; and (2) an index flood growth curve, also empirically derived. Bankfull return period flow is then used to set channel width and depth, and flood defence impacts are modelled using empirical relationships between GDP, urbanization and defence standard of protection. The results of these simulations are global flood hazard maps for a number of different return period events from 1 in 5 to 1 in 1000 years. This method has already been shown to compare well with return period flood hazard maps derived from models built with high resolution and accuracy local data (Sampson et al., submitted), yet the output from the global flood model has not yet been compared to real flood observations. Whilst the spatial resolution of the global model is high given the size of the model domain, ~1 km resolution is still coarse compared to the models typically used to simulate urban flooding and the data typically used to validate them (~25 m or less). Comparison of the global model to real-world observations of urban flooding therefore represents an exceptionally stringent test of model skill. In this paper we therefore
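    The index-flood construction used to assign return-period flows along the network can be illustrated with a toy calculation. The regression coefficients and the Gumbel-type growth curve below are placeholders, not the relationships fitted in the study; the catchment area and rainfall values are likewise assumed.

```python
# Minimal sketch: return-period flow = index flood (from catchment regression)
# multiplied by a dimensionless regional growth factor.
import numpy as np

def index_flood(area_km2, mean_annual_rain_mm):
    """Hypothetical regional regression; coefficients are placeholders."""
    return 0.8 * area_km2 ** 0.75 * (mean_annual_rain_mm / 1000.0) ** 1.2

def growth_factor(return_period_years, gumbel_loc=0.85, gumbel_scale=0.25):
    """Hypothetical dimensionless growth curve with a Gumbel form."""
    y = -np.log(-np.log(1.0 - 1.0 / return_period_years))
    return gumbel_loc + gumbel_scale * y

for T in (5, 100, 1000):
    q = index_flood(2500.0, 1400.0) * growth_factor(T)
    print(f"{T:>4}-year flow: {q:,.0f} m3/s")
```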

  15. Historical flood information and their utility in contemporary flood risk assessment

    NASA Astrophysics Data System (ADS)

    Macdonald, Neil; Kjeldsen, Thomas; Lang, Michel; Mediero, Luis

    2014-05-01

    The COST Action ES0901 on 'European procedures for flood frequency estimation' has initiated the collection of information on how historical flood records are incorporated into flood frequency analysis across Europe; the survey also examines the methods, practices and extent of historical information in each country, with notable variability across Europe. Currently, flood frequency estimation is most commonly based on systematic instrumental data collected by a variety of station authorities and bodies across Europe; these stations are of various form and complexity, depending on the level of data accuracy required. A well-known consequence of extrapolation from short series is the high level of uncertainty associated with estimates of design floods with large return periods. Given that the average record length is typically in the range 20-40 years, hydrologists have attempted to reduce the uncertainty by extending available records, bringing flood data from before the beginning of systematic flow recording into the analysis in the form of historical and palaeoflood data. This is included in development and planning regulations in some EU countries and is identified within the EU Floods Directive (2007/60/EC).

  16. Alkalinity Enrichment Enhances Net Calcification of a Coral Reef Flat

    NASA Astrophysics Data System (ADS)

    Albright, R.; Caldeira, K.

    2015-12-01

    Ocean acidification is projected to shift reefs from a state of net accretion to one of net dissolution sometime this century. While retrospective studies show large-scale changes in coral calcification over the last several decades, it is not possible to unequivocally link these results to ocean acidification due to confounding factors of temperature and other environmental parameters. Here, we quantified the calcification response of a coral reef flat to alkalinity enrichment to test whether reef calcification increases when ocean chemistry is restored to near pre-industrial conditions. We used sodium hydroxide (NaOH) to increase the total alkalinity of seawater flowing over a reef flat, with the aim of increasing carbonate ion concentrations [CO32-] and the aragonite saturation state (Ωarag) to values that would have been attained under pre-industrial atmospheric pCO2 levels. We developed a dual tracer regression method to estimate alkalinity uptake (i.e., calcification) in response to alkalinity enrichment. This approach uses the change in ratios between a non-conservative tracer (alkalinity) and a conservative tracer (a non-reactive dye, Rhodamine WT) to assess the fraction of added alkalinity that is taken up by the reef as a result of an induced increase in calcification rate. Using this method, we estimate that an average of 17.3% ± 2.3% of the added alkalinity was taken up by the reef community. In providing results from the first seawater chemistry manipulation experiment performed on a natural coral reef community (without artificial confinement), we demonstrate that, upon increase of [CO32-] and Ωarag to near pre-industrial values, reef calcification increases. Thus, we conclude that the impacts of ocean acidification are already being felt by coral reefs. This work is the culmination of years of work in the Caldeira lab at the Carnegie Institution for Science, involving many people including Jack Silverman, Kenny Schneider, and Jana Maclaren.
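    The dual tracer regression reduces to a slope comparison. The sketch below is not the reef data analysis: it generates synthetic downstream samples, regresses the alkalinity anomaly on the conservative dye concentration, and reads the uptake fraction from the shortfall of that slope relative to the injected alkalinity-to-dye ratio; all concentrations, the injected ratio, and the noise level are assumed values.

```python
# Minimal sketch: dual tracer estimate of the fraction of added alkalinity taken up.
import numpy as np

rng = np.random.default_rng(7)
injected_ratio = 50.0            # µmol/kg alkalinity per µg/L dye at the injection point

dye = rng.uniform(0.5, 5.0, 40)                       # downstream dye samples (µg/L)
true_uptake = 0.17                                     # assumed uptake fraction
alk_anomaly = (1 - true_uptake) * injected_ratio * dye + rng.normal(0, 5, 40)

slope = np.polyfit(dye, alk_anomaly, 1)[0]             # observed alkalinity:dye slope
uptake_fraction = 1.0 - slope / injected_ratio
print(f"estimated alkalinity uptake: {uptake_fraction:.1%}")
```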

  17. Method for estimating potential wetland extent by utilizing streamflow statistics and flood-inundation mapping techniques: Pilot study for land along the Wabash River near Terre Haute, Indiana

    USGS Publications Warehouse

    Kim, Moon H.; Ritz, Christian T.; Arvin, Donald V.

    2012-01-01

    Potential wetland extents were estimated for a 14-mile reach of the Wabash River near Terre Haute, Indiana. This pilot study was completed by the U.S. Geological Survey in cooperation with the U.S. Department of Agriculture, Natural Resources Conservation Service (NRCS). The study showed that potential wetland extents can be estimated by analyzing streamflow statistics with the available streamgage data, calculating the approximate water-surface elevation along the river, and generating maps by use of flood-inundation mapping techniques. Planning successful restorations for Wetland Reserve Program (WRP) easements requires a determination of areas that show evidence of being in a zone prone to sustained or frequent flooding. Zone determinations of this type are used by WRP planners to define the actively inundated area and make decisions on restoration-practice installation. According to WRP planning guidelines, a site needs to show evidence of being in an "inundation zone" that is prone to sustained or frequent flooding for a period of 7 consecutive days at least once every 2 years on average in order to meet the planning criteria for determining a wetland for a restoration in agricultural land. By calculating the annual highest 7-consecutive-day mean discharge with a 2-year recurrence interval (7MQ2) at a streamgage on the basis of available streamflow data, one can determine the water-surface elevation corresponding to the calculated flow that defines the estimated inundation zone along the river. By using the estimated water-surface elevation ("inundation elevation") along the river, an approximate extent of potential wetland for a restoration in agricultural land can be mapped. As part of the pilot study, a set of maps representing the estimated potential wetland extents was generated in a geographic information system (GIS) application by combining (1) a digital water-surface plane representing the surface of inundation elevation that sloped in the downstream
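    The 7MQ2 statistic described above can be computed in a few lines once daily flows are available. The sketch below uses a synthetic daily record rather than the Wabash River streamgage data, and approximates the 2-year recurrence value as the median of the annual 7-day-maximum series (a 50 percent annual exceedance probability); a formal frequency analysis would fit a distribution instead.

```python
# Minimal sketch: annual highest 7-consecutive-day mean discharge, 2-year recurrence (7MQ2).
import numpy as np

rng = np.random.default_rng(11)
n_years, n_days = 30, 365
daily_q = rng.lognormal(mean=5.0, sigma=0.8, size=(n_years, n_days))   # daily discharge (cfs)

kernel = np.ones(7) / 7.0
annual_7day_max = np.array([np.convolve(year, kernel, mode="valid").max()
                            for year in daily_q])

q7_2yr = np.median(annual_7day_max)   # 2-year recurrence ~ median of the annual series
print(f"7MQ2 is approximately {q7_2yr:,.0f} cfs")
```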

  18. Optimization of rainfall thresholds for a flood warning system to Taiwan urban areas during storm events

    NASA Astrophysics Data System (ADS)

    Liao, Hao-Yu; Pan, Tsung-Yi; Su, Ming-Daw; Hsieh, Ming-Chang; Tan, Yih-Chi

    2016-04-01

    Flooding is one of the most damaging disasters worldwide. Because of extreme weather change, flood damage has become greater than before. In recent years, Taiwan has frequently suffered flood damage from excessive rainfall induced by extreme weather events such as typhoons. It is therefore necessary to build an effective flood warning system to reduce flood damage. The operational flood warning system in Taiwan is based on rainfall thresholds: when cumulative rainfall exceeds a threshold, the system alerts the local government of the regions where a flood disaster is expected. Based on these alerts, governments have more time to prepare for a flood disaster before it happens. Although the Taiwanese government has a preliminary flood warning system, the system still lacks a theoretical background, and its alert accuracy is therefore limited. It is thus important to develop effective rainfall thresholds that can predict flood disasters successfully. This research aims to improve the accuracy of the system through statistical methods, so that when the accumulated rainfall reaches the alert value a warning message can be announced early enough for the government to deal with the flood damage that may follow. Data-driven and statistical methods applied to extreme events are adopted to calculate the optimal rainfall thresholds. The results of this study could be applied to enhance the forecasting accuracy of rainfall thresholds and reduce the risk of floods.
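    One common data-driven way to set such a threshold is to scan candidate values and keep the one that maximizes a categorical skill score against recorded flood outcomes. The sketch below is illustrative and not the system described here: the synthetic events, the flood outcomes, and the use of the critical success index are all assumptions.

```python
# Minimal sketch: pick the cumulative-rainfall threshold maximizing the critical success index.
import numpy as np

rng = np.random.default_rng(13)
n_events = 300
rain_24h = rng.gamma(2.0, 40.0, n_events)                 # event rainfall totals (mm)
# synthetic flood outcomes: probability of flooding rises with rainfall
flooded = rng.random(n_events) < 1.0 / (1.0 + np.exp(-(rain_24h - 180.0) / 25.0))

def csi(threshold):
    warn = rain_24h >= threshold
    hits = np.sum(warn & flooded)
    misses = np.sum(~warn & flooded)
    false_alarms = np.sum(warn & ~flooded)
    denom = hits + misses + false_alarms
    return hits / denom if denom else 0.0

candidates = np.arange(50, 400, 5)
best = max(candidates, key=csi)
print(f"optimal warning threshold: {best} mm (CSI = {csi(best):.2f})")
```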

  19. Flood Hazard in Barpeta District, Assam: Environmental Perspectives

    NASA Astrophysics Data System (ADS)

    Talukdar, Naba Kumar

    The study deals with various aspects of flood hazard in Barpeta district of Assam, Northeast India. It is broadly confined to three basic themes: general perspectives, environmental perspectives and flood hazard mitigation. The first theme includes the study of flow characteristics of the major rivers of the district during the rainy season and the zoning of flood prone areas. The second theme deals with some environmental aspects of floods in the district, such as river water quality during floods and the effects of floods on soil quality, human health and socioeconomic losses. The flood mitigation study includes discussion of measures adopted for flood mitigation in the district and suggested management strategies. The study covers a wide range of data generated from both primary and secondary sources. Primary data on relevant parameters of soil and water are generated by using proper sampling procedures and standard laboratory methods. Suitable graphical and statistical methods have been used to analyze and interpret different kinds of data. All the relevant data and surveyed information on the perspective of the flood plain dwellers of the district are integrated in formulating flood management strategies. The Barpeta district of Assam covers an area of 3245 sq. km, comprising 4.2% of the total area of the state. The district has a fascinating, diversified landscape sloping from north to south, which includes highlands covered by forests, plain fertile lands suitable for agricultural activities, and low lying areas containing water bodies and swamps. Flood is a perennial problem and all kinds of common flood damage prevail in the district. Floods cause large-scale damage to the socio-economic life of the people as well as, to a certain extent, to the ecology and environment of the district. The rivers Manas, Beki, Pahumara and Kaldia and their tributaries, which emerge from the Eastern Himalaya, create flood havoc in the district. During the monsoon period, these rivers are

  20. Slope-Area Computation Program Graphical User Interface 1.0—A Preprocessing and Postprocessing Tool for Estimating Peak Flood Discharge Using the Slope-Area Method

    USGS Publications Warehouse

    Bradley, D. Nathan

    2012-01-01

    The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a “direct” measurement of discharge during the flood, where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows for logistical or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a cross-platform “wrapper” application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement. The SAC GUI reads survey data
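    The computation at the heart of the slope-area method can be illustrated with a single-reach simplification (the SAC program itself balances energy between multiple cross sections and applies velocity-head corrections). The geometry, roughness, and high-water-mark elevations below are assumed values, and the water-surface slope stands in for the energy slope.

```python
# Minimal sketch: single-reach slope-area discharge estimate via Manning's equation.
def manning_discharge(area_ft2, wetted_perimeter_ft, energy_slope, n):
    """Q = (1.486/n) * A * R^(2/3) * S^(1/2), US customary units."""
    hydraulic_radius = area_ft2 / wetted_perimeter_ft
    return (1.486 / n) * area_ft2 * hydraulic_radius ** (2.0 / 3.0) * energy_slope ** 0.5

# high-water marks surveyed at two cross sections 800 ft apart (assumed elevations)
hwm_upstream_ft, hwm_downstream_ft, reach_length_ft = 512.4, 511.1, 800.0
water_surface_slope = (hwm_upstream_ft - hwm_downstream_ft) / reach_length_ft

# surveyed geometry at flood stage and an estimated roughness (assumed values)
area, perimeter, roughness = 1450.0, 240.0, 0.035

q_peak = manning_discharge(area, perimeter, water_surface_slope, roughness)
print(f"estimated peak discharge: {q_peak:,.0f} cfs")
```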

  1. Crowdsourcing detailed flood data

    NASA Astrophysics Data System (ADS)

    Walliman, Nicholas; Ogden, Ray; Amouzad*, Shahrzhad

    2015-04-01

    Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities, which in addition to loss of lives include loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced, it is least reliable in urban and physically complex geographies, where the need for precise estimation is often most acute. Crowdsourced data on actual flood events is a potentially critical component here, allowing improved accuracy and identifying the effects of local landscape and topography, where the height of a simple kerb or a discontinuity in a boundary wall can have profound importance. Mobile 'App' based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow-up calls to get more information through structured scripts for each strand. Through this, local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. This paper will describe this pioneering approach that will develop flood event data in support of systems that will advance existing approaches such as those developed in the UK

  2. Flood hazard assessment for the Savannah River Site

    SciTech Connect

    Chen, K.F.

    2000-01-18

    A method was developed to determine probabilistic flood elevation curves for certain Savannah River Site (SRS) facilities. This paper presents the method used to determine the probabilistic flood elevation curve for F-Area due to runoff from the Upper Three Runs basin. Department of Energy (DOE) Order 420.1, Facility Safety, outlines the requirements for Natural Phenomena Hazard (NPH) mitigation for new and existing DOE facilities. The NPH considered in this paper is flooding. The facility-specific probabilistic flood hazard curve defines, as a function of water elevation, the annual probability of occurrence or the return period in years. Based on facility-specific probabilistic flood hazard curves and the nature of facility operations (e.g., involving hazardous or radioactive materials), facility managers can design permanent or temporary devices to prevent the propagation of floods on site, and develop emergency preparedness plans to mitigate the consequences of floods. The flood hazard curves for the SRS F-Area due to flooding in the Upper Three Runs basin are presented in this paper.

  3. Floods in Colorado

    USGS Publications Warehouse

    Follansbee, Robert; Sawyer, Leon R.

    1948-01-01

    The first records of floods in Colorado antedated the settlement of the State by about 30 years. These were records of floods on the Arkansas and Republican Rivers in 1826. Other floods noted by traders, hunters and emigrants, some of whom were on their way to the Far West, occurred in 1844 on the Arkansas River, and by inference on the South Platte River. Other early floods were those on the Purgatoire, the Lower Arkansas, and the San Juan Rivers about 1859. The most serious flood since settlement began was that on the Arkansas River during June 1921, which caused the loss of about 100 lives and an estimated property loss of $19,000,000. Many floods of lesser magnitude have occurred, and some of these have caused loss of life and very considerable property damage. Topography is the chief factor in determining the location of storms and resulting floods. These occur most frequently on the eastern slope of the Front Range. In the mountains farther west precipitation is insufficient to cause floods except during periods of melting snow, in June. In the southwestern part of the State, where precipitation during periods of melting snow is insufficient to cause floods, the severest floods yet experienced resulted from heavy rains in September 1909 and October 1911. In the eastern foothills region, usually below an altitude of about 7,500 feet and extending for a distance of about 50 miles east of the mountains, is a zone subject to rainfalls of great intensity known as cloudbursts. These cloudbursts are of short duration and are confined to very small areas. At times the intensity is so great as to make breathing difficult for those exposed to a storm. The areas of intense rainfall are so small that Weather Bureau precipitation stations have not been located in them. Local residents, being cloudburst conscious, frequently measure the rainfall in receptacles in their yards, and such records constitute the only source of information regarding the intensity. A flood

  4. Development of alkaline fuel cells.

    SciTech Connect

    Hibbs, Michael R.; Jenkins, Janelle E.; Alam, Todd Michael; Janarthanan, Rajeswari; Horan, James L.; Caire, Benjamin R.; Ziegler, Zachary C.; Herring, Andrew M.; Yang, Yuan; Zuo, Xiaobing; Robson, Michael H.; Artyushkova, Kateryna; Patterson, Wendy; Atanassov, Plamen Borissov

    2013-09-01

    This project focuses on the development and demonstration of anion exchange membrane (AEM) fuel cells for portable power applications. Novel polymeric anion exchange membranes and ionomers with high chemical stabilities were prepared and characterized by researchers at Sandia National Laboratories. Durable, non-precious metal catalysts were prepared by Dr. Plamen Atanassov's research group at the University of New Mexico by utilizing an aerosol-based process to prepare templated nano-structures. Dr. Andy Herring's group at the Colorado School of Mines combined all of these materials to fabricate and test membrane electrode assemblies for single cell testing in a methanol-fueled alkaline system. The highest power density achieved in this study was 54 mW/cm2, which was 90% of the project target and the highest reported power density for a direct methanol alkaline fuel cell.

  5. Alkaline Comet Assay for Assessing DNA Damage in Individual Cells.

    PubMed

    Pu, Xinzhu; Wang, Zemin; Klaunig, James E

    2015-08-06

    Single-cell gel electrophoresis, commonly called a comet assay, is a simple and sensitive method for assessing DNA damage at the single-cell level. It is an important technique in genetic toxicological studies. The comet assay performed under alkaline conditions (pH >13) is considered the optimal version for identifying agents with genotoxic activity. The alkaline comet assay is capable of detecting DNA double-strand breaks, single-strand breaks, alkali-labile sites, DNA-DNA/DNA-protein cross-linking, and incomplete excision repair sites. The inclusion of digestion with lesion-specific DNA repair enzymes in the procedure allows the detection of various DNA base alterations, such as oxidative base damage. This unit describes alkaline comet assay procedures for assessing DNA strand breaks and oxidative base alterations. These methods can be applied to a variety of cells from in vitro and in vivo experiments, as well as human studies.

  6. Chemical Method to Improve CO2 Flooding Sweep Efficiency for Oil Recovery Using SPI-CO2 Gels

    SciTech Connect

    Burns, Lyle D.

    2009-04-14

    The problem in CO2 flooding lies with its higher mobility, which causes low conformance or sweep efficiency. This is an issue in oilfield applications where an injected fluid or gas used to mobilize and produce the oil in a marginal field has substantially higher mobility (a function of viscosity, density and relative permeability) relative to the crude oil, promoting fingering and early breakthrough. Conformance is particularly critical in CO2 oilfield floods, where the end result is less oil recovered and substantially higher costs related to the CO2. The SPI-CO2 (hereafter called “SPI”) gel system is a unique silicate based gel system that offers a technically effective solution to the conformance problem with CO2 floods. This SPI gel system remains a low viscosity fluid until an external initiator (CO2) triggers gelation. This is a clear improvement over current technologies, where the gels set up as a function of time regardless of where they are placed in the reservoir; in those current systems, the internal initiator is included in the injected fluid for water shut-off applications. In this new research effort, the CO2 is an external initiator contacted after placement of the SPI gel solution. In the proper water-wet reservoir environment, this concept ensures that the SPI gel sets up in the precise high permeability path followed by the CO2, therefore improving sweep efficiency to a greater degree than conventional systems. In addition, the final SPI product in commercial quantities is expected to be lower cost than the competing systems. This Phase I research effort provided “proof of concept” that SPI gels possess strength and may be formed in a sand pack, reducing the permeability to brine and CO2 flow. This SPI technology is a natural extension of prior R&D and the Phase I effort that together show a high potential for success in a Phase II follow-on project. Carbon dioxide (CO2) is a major by-product of

  7. Estimation of flood losses to agricultural crops using remote sensing

    NASA Astrophysics Data System (ADS)

    Tapia-Silva, Felipe-Omar; Itzerott, Sibylle; Foerster, Saskia; Kuhlmann, Bernd; Kreibich, Heidi

    2011-01-01

    The estimation of flood damage is an important component of risk-oriented flood design, risk mapping, financial appraisals and comparative risk analyses. However, research on flood loss modelling, especially in the agricultural sector, has not yet gained much attention. Agricultural losses strongly depend on the crops affected, which need to be predicted accurately. Therefore, three different methods to predict flood-affected crops using remote sensing and ancillary data were developed, applied and validated. These methods are: (a) a hierarchical classification based on standard curves of spectral response using satellite images, (b) disaggregation of crop statistics using a Monte Carlo simulation and probabilities of crops to be cultivated on specific soils and (c) analysis of crop rotation with data mining Net Bayesian Classifiers (NBC) using soil data and crop data derived from a multi-year satellite image analysis. A flood loss estimation model for crops was applied and validated in flood detention areas (polders) at the Havel River (Untere Havelniederung) in Germany. The polders were used for temporary storage of flood water during the extreme flood event in August 2002. The flood loss to crops during the extreme flood event in August 2002 was estimated based on the results of the three crop prediction methods. The loss estimates were then compared with official loss data for validation purposes. The analysis of crop rotation with NBC obtained the best result, with 66% of crops correctly classified. The accuracy of the other methods reached 34% with identification using Normalized Difference Vegetation Index (NDVI) standard curves and 19% using disaggregation of crop statistics. The results were confirmed by evaluating the loss estimation procedure, in which the damage model using affected crops estimated by NBC showed the smallest overall deviation (1%) when compared to the official losses. Remote sensing offers various possibilities for the improvement of

  8. Dynamic Flood Vulnerability Mapping with Google Earth Engine

    NASA Astrophysics Data System (ADS)

    Tellman, B.; Kuhn, C.; Max, S. A.; Sullivan, J.

    2015-12-01

    Satellites capture the rate and character of environmental change from local to global levels, yet integrating these changes into flood exposure models can be cost or time prohibitive. We explore an approach to global flood modeling by leveraging satellite data with computing power in Google Earth Engine to dynamically map flood hazards. Our research harnesses satellite imagery in two main ways: first to generate a globally consistent flood inundation layer and second to dynamically model flood vulnerability. Accurate and relevant hazard maps rely on high quality observation data. Advances in publicly available spatial, spectral, and radar data together with cloud computing allow us to improve existing efforts to develop a comprehensive flood extent database to support model training and calibration. This talk will demonstrate the classification results of algorithms developed in Earth Engine designed to detect flood events by combining observations from MODIS, Landsat 8, and Sentinel-1. Our method to derive flood footprints increases the number, resolution, and precision of spatial observations for flood events both in the US, recorded in the NCDC (National Climatic Data Center) storm events database, and globally, as recorded events from the Colorado Flood Observatory database. This improved dataset can then be used to train machine learning models that relate spatial temporal flood observations to satellite derived spatial temporal predictor variables such as precipitation, antecedent soil moisture, and impervious surface. This modeling approach allows us to rapidly update models with each new flood observation, providing near real time vulnerability maps. We will share the water detection algorithms used with each satellite and discuss flood detection results with examples from Bihar, India and the state of New York. We will also demonstrate how these flood observations are used to train machine learning models and estimate flood exposure. The final stage of
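    A small example of the kind of radar water-detection step described in this talk can be written with the Earth Engine Python API. The sketch below is illustrative and not the authors' algorithm: the area of interest, the date window, the -16 dB backscatter threshold, and the 30 m reduction scale are all assumptions, and the snippet requires an authenticated Earth Engine account to run.

```python
# Minimal sketch: flag open water in Sentinel-1 VV backscatter with a simple dB threshold.
import ee

ee.Initialize()

aoi = ee.Geometry.Rectangle([85.5, 25.5, 86.5, 26.5])   # illustrative box in Bihar, India

s1 = (ee.ImageCollection("COPERNICUS/S1_GRD")
      .filterBounds(aoi)
      .filterDate("2015-08-01", "2015-08-15")
      .filter(ee.Filter.eq("instrumentMode", "IW"))
      .select("VV"))

# smooth backscatter drops sharply over open water; threshold the median composite
flood = s1.median().lt(-16).selfMask().clip(aoi)

# report the flooded area (km^2) within the area of interest
area_km2 = (flood.multiply(ee.Image.pixelArea()).divide(1e6)
            .reduceRegion(ee.Reducer.sum(), aoi, scale=30).getInfo())
print(area_km2)
```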

  9. Assessment of flash flood warning procedures

    NASA Astrophysics Data System (ADS)

    Johnson, Lynn E.

    2000-01-01

    Assessment of four alternate flash flood warning procedures was conducted to ascertain their suitability for forecast operations using radar-rainfall imagery. The procedures include (1) areal mean basin effective rainfall, (2) unit hydrograph, (3) time-area, and (4) 2-D numerical modeling. The Buffalo Creek flash flood of July 12, 1996, was used as a case study for application of each of the procedures. A significant feature of the Buffalo Creek event was a forest fire that occurred a few months before the flood and significantly affected watershed runoff characteristics. Objectives were to assess the applicability of the procedures for watersheds having spatial and temporal scale similarities to Buffalo Creek, to compare their technical characteristics, and to consider forecaster usability. Geographic information system techniques for hydrologic database development and flash flood potential computations are illustrated. Generalizations of the case study results are offered relative to their suitability for flash flood forecasting operations. Although all four methods have relative advantages, their application to the Buffalo Creek event resulted in mixed performance. Failure of any method was due primarily to uncertainties of the land surface response (i.e., burn area imperviousness). Results underscore the need for model calibration; a difficult requirement for real-time forecasting.

  10. Alkaline oxide conversion coatings for aluminum alloys

    SciTech Connect

    Buchheit, R.G.

    1996-02-01

    Three related conversion coating methods are described that are based on film formation which occurs when aluminum alloys are exposed to alkaline Li salt solutions. Representative examples of the processing methods, resulting coating structure, composition and morphology are presented. The corrosion resistance of these coatings to aerated 0.5 M NaCl solution has been evaluated as a function of total processing time using electrochemical impedance spectroscopy (EIS). This evaluation shows that excellent corrosion resistance can be uniformly achieved using no more than 20 minutes of process time for 6061-T6. Using current methods a minimum of 80 minutes of process time is required to get marginally acceptable corrosion resistance for 2024-T3. Longer processing times are required to achieve uniformly good corrosion resistance.

  11. Benchmarking an operational procedure for rapid flood mapping and risk assessment in Europe

    NASA Astrophysics Data System (ADS)

    Dottori, Francesco; Salamon, Peter; Kalas, Milan; Bianchi, Alessandra; Feyen, Luc

    2016-04-01

    The development of real-time methods for rapid flood mapping and risk assessment is crucial to improve emergency response and mitigate flood impacts. This work describes the benchmarking of an operational procedure for rapid flood risk assessment based on the flood predictions issued by the European Flood Awareness System (EFAS). The daily forecasts produced for the major European river networks are translated into event-based flood hazard maps using a large map catalogue derived from high-resolution hydrodynamic simulations, based on the hydro-meteorological dataset of EFAS. Flood hazard maps are then combined with exposure and vulnerability information, and the impacts of the forecasted flood events are evaluated in near real-time in terms of flood prone areas, potential economic damage, and affected population, infrastructure and cities. Extensive testing of the operational procedure is carried out using the catastrophic floods of May 2014 in Bosnia-Herzegovina, Croatia and Serbia. The reliability of the flood mapping methodology is tested against satellite-derived flood footprints, while ground-based estimates of economic damage and affected population are compared against modelled estimates. We evaluated the skill of flood hazard and risk estimations derived from EFAS flood forecasts with different lead times and combinations. The assessment includes a comparison of several alternative approaches to produce and present the information content, in order to meet the requests of EFAS users. The tests provided good results and showed the potential of the developed real-time operational procedure for helping emergency response and management.

  12. 78 FR 5822 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  13. 77 FR 18846 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  14. 77 FR 18844 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  15. 77 FR 18841 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  16. 78 FR 5826 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  17. 78 FR 5824 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  18. 78 FR 49278 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-13

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  19. 78 FR 21143 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-09

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  20. 77 FR 18839 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  1. 77 FR 18842 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  2. 77 FR 18835 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  3. 78 FR 49277 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-13

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  4. 77 FR 18837 - Proposed Flood Hazard Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... SECURITY Federal Emergency Management Agency Proposed Flood Hazard Determinations AGENCY: Federal Emergency Management Agency, DHS. ACTION: Notice. SUMMARY: Comments are requested on proposed flood hazard determinations, which may include additions or modifications of any Base Flood Elevation (BFE), base flood...

  5. The Global Flood Model

    NASA Astrophysics Data System (ADS)

    Williams, P.; Huddelston, M.; Michel, G.; Thompson, S.; Heynert, K.; Pickering, C.; Abbott Donnelly, I.; Fewtrell, T.; Galy, H.; Sperna Weiland, F.; Winsemius, H.; Weerts, A.; Nixon, S.; Davies, P.; Schiferli, D.

    2012-04-01

    Recently, a Global Flood Model (GFM) initiative has been proposed by Willis, UK Met Office, Esri, Deltares and IBM. The idea is to create a global community platform that enables better understanding of the complexities of flood risk assessment to better support the decisions, education and communication needed to mitigate flood risk. The GFM will provide tools for assessing the risk of floods, for devising mitigation strategies such as land-use changes and infrastructure improvements, and for enabling effective pre- and post-flood event response. The GFM combines humanitarian and commercial motives. It will benefit: - The public, seeking to preserve personal safety and property; - State and local governments, seeking to safeguard economic activity, and improve resilience; - NGOs, similarly seeking to respond proactively to flood events; - The insurance sector, seeking to understand and price flood risk; - Large corporations, seeking to protect global operations and supply chains. The GFM is an integrated and transparent set of modules, each composed of models and data. For each module, there are two core elements: a live "reference version" (a worked example) and a framework of specifications, which will allow development of alternative versions. In the future, users will be able to work with the reference version or substitute their own models and data. If these meet the specification for the relevant module, they will interoperate with the rest of the GFM. Some "crowd-sourced" modules could even be accredited and published to the wider GFM community. Our intent is to build on existing public, private and academic work, improve local adoption, and stimulate the development of multiple - but compatible - alternatives, so strengthening mankind's ability to manage flood impacts. The GFM is being developed and managed by a non-profit organization created for the purpose. The business model will be inspired from open source software (eg Linux): - for non-profit usage

  6. Flood of support.

    PubMed

    Musgrave, Shonagh

A year on from the torrential floods that struck Cumbria, many people are still unable to return to their homes. A team of therapists is helping people to cope with the stress and frustration. The January 2005 floods came four years after foot and mouth disease hit Cumbria. The region depends on agriculture, hill walking and tourism for revenue. Therapists offer residents a choice of eight different complementary therapies. Users of the service say the therapies reduce stress and help them relax. PMID:16629105

  7. Information Communication using Knowledge Engine on Flood Issues

    NASA Astrophysics Data System (ADS)

    Demir, I.; Krajewski, W. F.

    2012-04-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to and visualization of flood inundation maps, real-time flood conditions, flood forecasts both short-term and seasonal, and other flood-related data for communities in Iowa. The system is designed for use by general public, often people with no domain knowledge and poor general science background. To improve effective communication with such audience, we have introduced a new way in IFIS to get information on flood related issues - instead of by navigating within hundreds of features and interfaces of the information system and web-based sources-- by providing dynamic computations based on a collection of built-in data, analysis, and methods. The IFIS Knowledge Engine connects to distributed sources of real-time stream gauges, and in-house data sources, analysis and visualization tools to answer questions grouped into several categories. Users will be able to provide input based on the query within the categories of rainfall, flood conditions, forecast, inundation maps, flood risk and data sensors. Our goal is the systematization of knowledge on flood related issues, and to provide a single source for definitive answers to factual queries. Long-term goal of this knowledge engine is to make all flood related knowledge easily accessible to everyone, and provide educational geoinformatics tool. The future implementation of the system will be able to accept free-form input and voice recognition capabilities within browser and mobile applications. We intend to deliver increasing capabilities for the system over the coming releases of IFIS. This presentation provides an overview of our Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses the future plans for providing knowledge on flood related issues and resources.

  8. POISON SPIDER FIELD CHEMICAL FLOOD PROJECT, WYOMING

    SciTech Connect

    Douglas Arnell; Malcolm Pitts; Jie Qi

    2004-11-01

-rock compatibility, polymer injectivity, dynamic chemical retention by rock, and recommended injected polymer concentration. Average initial oil saturation was 0.796 Vp. Produced water injection recovered 53% OOIP leaving an average residual oil saturation of 0.375 Vp. Poison Spider rock was strongly water-wet with a mobility ratio for produced water displacing the 280 cp crude oil of 8.6. Core was not sensitive to either alkali or surfactant injection. Injectivity increased 60 to 80% with alkali plus surfactant injection. Low and medium molecular weight polyacrylamide polymers (Flopaam 3330S and Flopaam 3430S) dissolved in either an alkaline-surfactant solution or softened produced water injected and flowed through Poison Spider rock. Recommended injected polyacrylamide concentration is 2,100 mg/L for both polymers for a unit mobility ratio. Radial corefloods were performed to evaluate oil recovery efficiency of different chemical solutions. Waterflood oil recovery averaged 46.4% OOIP and alkaline-surfactant-polymer flood oil recovery averaged an additional 18.1% OOIP for a total of 64.6% OOIP. Oil cut increased from 2% to a peak of 23.5% with injection of 1.5 wt% Na2CO3 plus 0.05 wt% Petrostep B-100 plus 0.05 wt% Stepantan AS1216 plus 2100 mg/L Flopaam 3430S. Additional study might determine the impact on oil recovery of a lower polymer concentration. An alkaline-surfactant-polymer flood field implementation outline report was written.
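
    The unfavorable waterflood behavior summarized above follows from the endpoint mobility ratio between injected water and the 280 cp crude. As a quick illustration (not the Poison Spider laboratory procedure), the ratio can be computed from endpoint relative permeabilities and viscosities; the permeability and viscosity values below are hypothetical placeholders.

```python
def endpoint_mobility_ratio(krw_end, kro_end, mu_water_cp, mu_oil_cp):
    """Endpoint mobility ratio M = (krw/mu_w) / (kro/mu_o).

    M > 1 means the displacing water is more mobile than the oil,
    which promotes viscous fingering and poorer sweep efficiency.
    """
    return (krw_end / mu_water_cp) / (kro_end / mu_oil_cp)

# Hypothetical endpoint values for illustration only (not measured Poison Spider data).
print(endpoint_mobility_ratio(krw_end=0.08, kro_end=0.75, mu_water_cp=0.9, mu_oil_cp=280.0))
```

    Adding polymer lowers the water-phase mobility, which is the lever behind the recommended 2,100 mg/L polyacrylamide concentration for a unit mobility ratio.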

  9. Evaluation of mixed surfactants for improved chemical flooding

    SciTech Connect

    Llave, F.M.; French, T.R.; Lorenz, P.B.

    1993-02-01

Phase behavior studies were conducted using combinations of a primary surfactant component and several ethoxylated surfactants. The objective of the study is to evaluate combinations of surfactants, anionic-nonionic and anionic-anionic mixtures, that would yield favorable phase behavior and solubilization capacity. The dependence of the solution behavior on the additive surfactant structure, surfactant type, oil, surfactant proportion, salinity, HLB, and temperature was observed. The results showed that the ethoxylated surfactants can improve the solution behavior of the overall system. The increase in optimum salinity range of these solutions corresponded to an increase in the degree of ethoxylation of additive surfactant, up to a certain limit. The nonionic surfactant additives yielded much higher salinities compared to the results from the ethoxylated anionics tested. The proportion of surfactant component in solution was critical in achieving a balance between the solubilization capacity and the enhancement in the system's salinity tolerance. Some combinations of these types of surfactants showed improved solution behavior with favorable solubilization capacity. The phase inversion temperature (PIT) method has been shown to be a relatively fast method for screening candidate surfactant systems. Comparisons were made using both the conventional salinity scan and the PIT method on selected chemical systems. The results showed good agreement between the salinity regions determined using both methods. A difference in the dependence of optimal salinity on HLB was observed for the different nonionics tested. The linear alkyl alcohol ethoxylates exhibited a behavior distinct from the dialkyl phenols at similar HLB levels with and without the primary sulfonate component in the solution. Other experiments performed at NIPER have shown that surfactant-enhanced alkaline flooding has good potential for the recovery of oil from Naval Petroleum Reserve Number 3 (NPR No. 3).

  10. Evaluation of mixed surfactants for improved chemical flooding

    SciTech Connect

    Llave, F.M.; French, T.R.; Lorenz, P.B.

    1993-02-01

    Phase behavior studies were conducted using combinations of a primary surfactant component and several ethoxylated surfactants. The objective of the study is to evaluate combinations of surfactants, anionic-nonionic and anionic-anionic mixtures, that would yield favorable phase behavior and solubilization capacity. The dependence of the solution behavior on the additive surfactant structure, surfactant type, oil, surfactant proportion, salinity, HLB, and temperature was observed. The results showed that the ethoxylated surfactants can improve the solution behavior of the overall system. The increase in optimum salinity range of these solutions corresponded to an increase in the degree of ethoxylation of additive surfactant, up to a certain limit. The nonionic surfactant additives yielded much higher salinities compared to the results from the ethoxylated anionics tested. The proportion of surfactant component in solution was critical in achieving a balance between the solubilization capacity and the enhancement in the system's salinity tolerance. Some combinations of these types of surfactants showed improved solution behavior with favorable solubilization capacity. The phase inversion temperature (PIT) method has been shown to be a relatively fast method for screening candidate surfactant systems. Comparisons were made using both the conventional salinity scan and the PIT method on selected chemical systems. The results showed good agreement between the salinity regions determined using both methods. A difference in the dependence of optimal salinity on HLB was observed for the different nonionics tested. The linear alkyl alcohol ethoxylates exhibited a behavior distinct from the dialkyl phenols at similar HLB levels with and without the primary sulfonate component in the solution. Other experiments performed at NIPER have shown that surfactant-enhanced alkaline flooding has good potential for the recovery of oil from Naval Petroleum Reserve Number 3 (NPR No. 3).

  11. Epic Flooding in Georgia, 2009

    USGS Publications Warehouse

    Gotvald, Anthony J.; McCallum, Brian E.

    2010-01-01

    Metropolitan Atlanta-September 2009 Floods * The epic floods experienced in the Atlanta area in September 2009 were extremely rare. Eighteen streamgages in the Metropolitan Atlanta area had flood magnitudes much greater than the estimated 0.2-percent (500-year) annual exceedance probability. * The Federal Emergency Management Agency (FEMA) reported that 23 counties in Georgia were declared disaster areas due to this flood and that 16,981 homes and 3,482 businesses were affected by floodwaters. Ten lives were lost in the flood. The total estimated damages exceed $193 million (H.E. Longenecker, Federal Emergency Management Agency, written commun., November 2009). * On Sweetwater Creek near Austell, Ga., just north of Interstate 20, the peak stage was more than 6 feet higher than the estimated peak stage of the 0.2-percent (500-year) flood. Flood magnitudes in Cobb County on Sweetwater, Butler, and Powder Springs Creeks greatly exceeded the estimated 0.2-percent (500-year) floods for these streams. * In Douglas County, the Dog River at Ga. Highway 5 near Fairplay had a peak stage nearly 20 feet higher than the estimated peak stage of the 0.2-percent (500-year) flood. * On the Chattahoochee River, the U.S. Geological Survey (USGS) gage at Vinings reached the highest level recorded in the past 81 years. Gwinnett, De Kalb, Fulton, and Rockdale Counties also had record flooding. South Georgia March and April 2009 Floods * The March and April 2009 floods in South Georgia were smaller in magnitude than the September floods but still caused significant damage. * No lives were lost in this flood. Approximately $60 million in public infrastructure damage occurred to roads, culverts, bridges and a water treatment facility (Joseph T. McKinney, Federal Emergency Management Agency, written commun., July 2009). * Flow at the Satilla River near Waycross, exceeded the 0.5-percent (200-year) flood. Flows at seven other stations in South Georgia exceeded the 1-percent (100-year) flood.

  12. Multivariate pluvial flood damage models

    SciTech Connect

    Van Ootegem, Luc; Verhofstadt, Elsy; Van Herck, Kristine; Creten, Tom

    2015-09-15

Depth–damage-functions, relating the monetary flood damage to the depth of the inundation, are commonly used in the case of fluvial floods (floods caused by a river overflowing). We construct four multivariate damage models for pluvial floods (caused by extreme rainfall) by differentiating on the one hand between ground floor floods and basement floods and on the other hand between damage to residential buildings and damage to housing contents. We not only take into account the effect of flood depth on damage, but also incorporate the effects of non-hazard indicators (building characteristics, behavioural indicators and socio-economic variables). By using a Tobit-estimation technique on identified victims of pluvial floods in Flanders (Belgium), we take into account the effect of cases of reported zero damage. Our results show that the flood depth is an important predictor of damage, but with a diverging impact between ground floor floods and basement floods. Also non-hazard indicators are important. For example, being aware of the risk just before the water enters the building reduces content damage considerably, underlining the importance of warning systems and policy in this case of pluvial floods. - Highlights: • Prediction of damage of pluvial floods also uses non-hazard information. • We include ‘no damage cases’ using a Tobit model. • The effect of flood depth on damage is stronger for ground floor than for basement floods. • Non-hazard indicators are especially important for content damage. • Potential gain of policies that increase awareness of flood risks.
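
    For readers unfamiliar with the Tobit approach mentioned above, the sketch below shows a minimal zero-censored regression fitted by maximum likelihood with SciPy; the variable names and synthetic data are hypothetical, and this is not the authors' estimation code.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def tobit_negloglik(params, X, y):
    """Negative log-likelihood of a Tobit model left-censored at zero."""
    beta, log_sigma = params[:-1], params[-1]
    sigma = np.exp(log_sigma)
    xb = X @ beta
    zero = y <= 0.0
    ll = np.empty_like(y)
    ll[zero] = norm.logcdf(-xb[zero] / sigma)                                # reported zero damage
    ll[~zero] = norm.logpdf((y[~zero] - xb[~zero]) / sigma) - np.log(sigma)  # positive damage
    return -ll.sum()

# Synthetic example: damage explained by flood depth and one non-hazard indicator.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.uniform(0, 1.5, n), rng.integers(0, 2, n)])
latent = X @ np.array([-0.5, 2.0, -0.8]) + rng.normal(0, 1.0, n)
y = np.clip(latent, 0.0, None)   # zeros correspond to the censored "no damage" cases

start = np.zeros(X.shape[1] + 1)
fit = minimize(tobit_negloglik, start, args=(X, y), method="BFGS")
print(fit.x[:-1], np.exp(fit.x[-1]))   # coefficients and residual sigma
```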

  13. Field measurement of alkalinity and pH

    USGS Publications Warehouse

    Barnes, Ivan

    1964-01-01

The behavior of electrometric pH equipment under field conditions departs from the behavior predicted from Nernst's law. The response is a linear function of pH, and hence measured pH values may be corrected to true pH if the instrument is calibrated with two reference solutions for each measurement. Alkalinity titrations may also be made in terms of true pH. Standard methods, such as colorimetric titrations, were rejected as unreliable or too cumbersome for rapid field use. The true pH of the end point of the alkalinity titration as a function of temperature, ionic strength, and total alkalinity has been calculated. Total alkalinity in potable waters is the most important factor influencing the end point pH, which varies from 5.38 (0 °C, 5 ppm (parts per million) HCO3-) to 4.32 (300 ppm HCO3-, 35 °C), for the ranges of variables considered. With proper precautions, the pH may be determined to ±0.02 pH and the alkalinity to ±0.6 ppm HCO3- for many naturally occurring bodies of fresh water.
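
    The correction described above (the measured response is linear in pH, so two reference buffers fix the line) can be written as a small helper; the buffer readings in the example are illustrative assumptions, not values from the report.

```python
def true_ph(measured, buffer_low, buffer_high):
    """Correct a field pH reading using two reference buffer measurements.

    buffer_low / buffer_high: (true_pH, measured_pH) pairs for the two standards.
    Assumes the electrode response is linear in pH, as observed in the report.
    """
    (t1, m1), (t2, m2) = buffer_low, buffer_high
    slope = (t2 - t1) / (m2 - m1)
    return t1 + slope * (measured - m1)

# Example: buffers of true pH 4.01 and 7.00 read 4.10 and 6.90 on the meter.
print(round(true_ph(5.50, (4.01, 4.10), (7.00, 6.90)), 2))
```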

  14. Reduction of nitrobenzene with alkaline ascorbic acid: Kinetics and pathways.

    PubMed

    Liang, Chenju; Lin, Ya-Ting; Shiu, Jia-Wei

    2016-01-25

Alkaline ascorbic acid (AA) exhibits the potential to reductively degrade nitrobenzene (NB), which is the simplest of the nitroaromatic compounds. The nitro group (NO2(-)) of NB has a +III oxidation state of the N atom and tends to gain electrons. The effect of alkaline pH ranging from 9 to 13 was initially assessed and the results demonstrated that the solution pH, when approaching or above the pKa2 of AA (11.79), would increase reductive electron transfer to NB. The rate equation for the reactions between NB and AA at pH 12 can be described as r = (0.89 ± 0.11) × 10^(-4) mM^(1-(a+b)) h^(-1) × [NB]^a × [AA]^b, with a = 1.35 ± 0.10 and b = 0.89 ± 0.01. The GC/MS analytical method identified nitrosobenzene, azoxybenzene, and azobenzene as NB reduction intermediates, and aniline (AN) as a final product. These experimental results indicate that the alkaline AA reduction of NB to AN mainly proceeds via the direct route, consisting of a series of two-electron or four-electron transfers, and the condensation reaction plays a minor route. Preliminary evaluation of the remediation of spiked NB contaminated soils revealed that maintenance of alkaline pH and a higher water to soil ratio are essential for a successful alkaline AA application.
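
    The fitted power-law rate expression above can be evaluated directly; the concentrations in the example are arbitrary illustrative values, and the reported uncertainties on the fitted constants are omitted.

```python
def nb_reduction_rate(nb_mM, aa_mM, k=0.89e-4, a=1.35, b=0.89):
    """Rate r (mM/h) of nitrobenzene reduction at pH 12: r = k * [NB]^a * [AA]^b."""
    return k * nb_mM**a * aa_mM**b

print(nb_reduction_rate(nb_mM=0.5, aa_mM=50.0))  # arbitrary example concentrations
```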

  15. Autonomous in situ measurements of seawater alkalinity.

    PubMed

    Spaulding, Reggie S; DeGrandpre, Michael D; Beck, James C; Hart, Robert D; Peterson, Brittany; De Carlo, Eric H; Drupp, Patrick S; Hammar, Terry R

    2014-08-19

    Total alkalinity (AT) is an important parameter for describing the marine inorganic carbon system and understanding the effects of atmospheric CO2 on the oceans. Measurements of AT are limited, however, because of the laborious process of collecting and analyzing samples. In this work we evaluate the performance of an autonomous instrument for high temporal resolution measurements of seawater AT. The Submersible Autonomous Moored Instrument for alkalinity (SAMI-alk) uses a novel tracer monitored titration method where a colorimetric pH indicator quantifies both pH and relative volumes of sample and titrant, circumventing the need for gravimetric or volumetric measurements. The SAMI-alk performance was validated in the laboratory and in situ during two field studies. Overall in situ accuracy was -2.2 ± 13.1 μmol kg(-1) (n = 86), on the basis of comparison to discrete samples. Precision on duplicate analyses of a carbonate standard was ±4.7 μmol kg(-1) (n = 22). This prototype instrument can measure in situ AT hourly for one month, limited by consumption of reagent and standard solutions.
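
    The SAMI-alk tracer-monitored titration is specific to the instrument, but as background the sketch below shows a conventional Gran-type estimate of total alkalinity from a standard acid titration. This is an assumption-laden illustration of the underlying chemistry, not the SAMI-alk algorithm, and the titration points are hypothetical.

```python
import numpy as np

def gran_alkalinity(sample_ml, acid_molarity, added_ml, ph):
    """Estimate total alkalinity (mol/L) by Gran extrapolation.

    added_ml, ph : titrant volumes and measured pH past the carbonate endpoint.
    The Gran function F = (V0 + v) * 10**(-pH) is linear in v; its zero crossing
    gives the equivalence volume Ve, and AT = C_acid * Ve / V0.
    """
    v = np.asarray(added_ml, dtype=float)
    f = (sample_ml + v) * 10.0 ** (-np.asarray(ph, dtype=float))
    slope, intercept = np.polyfit(v, f, 1)
    ve = -intercept / slope
    return acid_molarity * ve / sample_ml

# Illustrative titration points in the acidic region (hypothetical numbers).
print(gran_alkalinity(100.0, 0.1, [2.4, 2.6, 2.8, 3.0], [3.71, 3.41, 3.23, 3.11]))
```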

  16. Advanced inorganic separators for alkaline batteries

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W. (Inventor)

    1982-01-01

    A flexible, porous battery separator comprising a coating applied to a porous, flexible substrate is described. The coating comprises: (1) a thermoplastic rubber-based resin which is insoluble and unreactive in the alkaline electrolyte; (2) a polar organic plasticizer which is reactive with the alkaline electrolyte to produce a reaction product which contains a hydroxyl group and/or a carboxylic acid group; and (3) a mixture of polar particulate filler materials which are unreactive with the electrolyte, the mixture comprising at least one first filler material having a surface area of greater than 25 meters sq/gram, at least one second filler material having a surface area of 10 to 25 sq meters/gram, wherein the volume of the mixture of filler materials is less than 45% of the total volume of the fillers and the binder, the filler surface area per gram of binder is about 20 to 60 sq meters/gram, and the amount of plasticizer is sufficient to coat each filler particle. A method of forming the battery separator is also described.

  17. Autonomous in situ measurements of seawater alkalinity.

    PubMed

    Spaulding, Reggie S; DeGrandpre, Michael D; Beck, James C; Hart, Robert D; Peterson, Brittany; De Carlo, Eric H; Drupp, Patrick S; Hammar, Terry R

    2014-08-19

    Total alkalinity (AT) is an important parameter for describing the marine inorganic carbon system and understanding the effects of atmospheric CO2 on the oceans. Measurements of AT are limited, however, because of the laborious process of collecting and analyzing samples. In this work we evaluate the performance of an autonomous instrument for high temporal resolution measurements of seawater AT. The Submersible Autonomous Moored Instrument for alkalinity (SAMI-alk) uses a novel tracer monitored titration method where a colorimetric pH indicator quantifies both pH and relative volumes of sample and titrant, circumventing the need for gravimetric or volumetric measurements. The SAMI-alk performance was validated in the laboratory and in situ during two field studies. Overall in situ accuracy was -2.2 ± 13.1 μmol kg(-1) (n = 86), on the basis of comparison to discrete samples. Precision on duplicate analyses of a carbonate standard was ±4.7 μmol kg(-1) (n = 22). This prototype instrument can measure in situ AT hourly for one month, limited by consumption of reagent and standard solutions. PMID:25051401

  18. Comparison of flood regionalisation techniques in Lower Saxony.

    NASA Astrophysics Data System (ADS)

    Plötner, Stefan; Haberlandt, Uwe

    2016-04-01

    The index-flood method has become the standard method for peak flow regionalisation of given return periods at ungauged basins. Moreover grouping stations into regions of homogeneous flood characteristics increases the sample size and thus reduces the uncertainty of estimated peak flows even at gauged basins. At this context, this study investigates the performance of the index-flood method with regards to other regionalisation techniques and evaluates the influence of station density and data quality on the performance of the index-flood method. For this purpose 338 runoff stations in Lower Saxony with observed monthly peak flows and record lengths of annual peak flows between 10 and 75 years are analysed. Catchment descriptors of topography, soil, vegetation and climate are derived to group them into homogeneous regions. The regions are separated using 5 classification methods with 2 to 40 classes for selected catchment descriptors. The most suitable catchment descriptors are selected by their impact on classifying the mean annual peak flow and the variance of annual peak flows using random forest. Muliple linear regression, ordinary and external drift kriging, the standard and an extended index-flood method are compared with the at-site estimation as reference using cross-validation. Three station scenarios based on e.g. record length, known station specific experience and hydrological catchment complexity are used to evaluate the influence of station density and quality on the performance of the index-flood method. The results show the applicability of the index-flood method in Lower Saxony and the benefit of using regional samples for more robust estimations. Combining the index-flood method and geostatistics can improve the estimation of peak flows. The performance of the index-flood method is affected by the used sample respectively the selection of stations.
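
    As a reminder of the mechanics of the standard index-flood method discussed above, the sketch below pools standardized annual maxima into one regional growth curve and rescales it by a site's index flood. The GEV distribution choice and all numbers are illustrative assumptions, not the configuration used in the Lower Saxony study.

```python
import numpy as np
from scipy.stats import genextreme

def regional_growth_curve(maxima_by_site):
    """Fit a GEV growth curve to annual maxima standardized by each site's index flood."""
    pooled = np.concatenate([np.asarray(q) / np.mean(q) for q in maxima_by_site])
    return genextreme.fit(pooled)          # (shape, loc, scale)

def index_flood_quantile(index_flood, growth_params, return_period):
    """Peak flow for a return period: index flood times the regional growth factor."""
    c, loc, scale = growth_params
    growth_factor = genextreme.ppf(1.0 - 1.0 / return_period, c, loc=loc, scale=scale)
    return index_flood * growth_factor

# Toy example with three "gauged" sites and an ungauged site whose index flood
# (mean annual peak) was estimated from catchment descriptors, e.g. by regression.
rng = np.random.default_rng(1)
sites = [genextreme.rvs(-0.1, loc=1.0, scale=0.3, size=40, random_state=rng) * m
         for m in (55.0, 120.0, 310.0)]
params = regional_growth_curve(sites)
print(index_flood_quantile(index_flood=80.0, growth_params=params, return_period=100))
```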

  19. Evaluation of flood hazard maps in print and web mapping services as information tools in flood risk communication

    NASA Astrophysics Data System (ADS)

    Hagemeier-Klose, M.; Wagner, K.

    2009-04-01

    Flood risk communication with the general public and the population at risk is getting increasingly important for flood risk management, especially as a precautionary measure. This is also underlined by the EU Flood Directive. The flood related authorities therefore have to develop adjusted information tools which meet the demands of different user groups. This article presents the formative evaluation of flood hazard maps and web mapping services according to the specific requirements and needs of the general public using the dynamic-transactional approach as a theoretical framework. The evaluation was done by a mixture of different methods; an analysis of existing tools, a creative workshop with experts and laymen and an online survey. The currently existing flood hazard maps or web mapping services or web GIS still lack a good balance between simplicity and complexity with adequate readability and usability for the public. Well designed and associative maps (e.g. using blue colours for water depths) which can be compared with past local flood events and which can create empathy in viewers, can help to raise awareness, to heighten the activity and knowledge level or can lead to further information seeking. Concerning web mapping services, a linkage between general flood information like flood extents of different scenarios and corresponding water depths and real time information like gauge levels is an important demand by users. Gauge levels of these scenarios are easier to understand than the scientifically correct return periods or annualities. The recently developed Bavarian web mapping service tries to integrate these requirements.

  20. Hydrologic Flood Routing.

    ERIC Educational Resources Information Center

    Heggen, Richard J.

    1982-01-01

    Discusses a short classroom-based BASIC program which routes stream flow through a system of channels and reservoirs. The program is suitable for analyses of open channel conveyance systems, flood detention reservoirs, and combinations of the two. (Author/JN)
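
    The ERIC record does not reproduce the BASIC listing, but a comparable hydrologic routing step can be sketched in a few lines. The sketch below uses the common Muskingum channel-routing scheme with made-up parameter values, purely as an illustration of what such a classroom program computes; it is not the program described in the record.

```python
def muskingum_route(inflow, k_hr, x, dt_hr, outflow0=None):
    """Route an inflow hydrograph through a channel reach (Muskingum method).

    k_hr : storage time constant, x : weighting factor (0-0.5), dt_hr : time step.
    """
    d = 2.0 * k_hr * (1.0 - x) + dt_hr
    c0 = (dt_hr - 2.0 * k_hr * x) / d
    c1 = (dt_hr + 2.0 * k_hr * x) / d
    c2 = (2.0 * k_hr * (1.0 - x) - dt_hr) / d          # c0 + c1 + c2 == 1
    out = [inflow[0] if outflow0 is None else outflow0]
    for i_prev, i_now in zip(inflow[:-1], inflow[1:]):
        out.append(c0 * i_now + c1 * i_prev + c2 * out[-1])
    return out

# Hypothetical storm hydrograph (m^3/s) routed through a reach with K = 2 h, x = 0.2.
print(muskingum_route([10, 40, 80, 60, 35, 20, 12, 10], k_hr=2.0, x=0.2, dt_hr=1.0))
```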

  1. After the Flood

    ERIC Educational Resources Information Center

    Stanistreet, Paul

    2007-01-01

When floodwater swept through the McVities biscuit factory in Carlisle in January 2005, few were confident that it would reopen. The factory, in the Caldewgate area of the city, was one of the first casualties of the flood, as water, nine feet deep in places, coursed through the food preparation areas, destroying equipment and covering everything in…

  2. Flooding on Elbe River

    NASA Technical Reports Server (NTRS)

    2002-01-01

Heavy rains in Central Europe over the past few weeks have led to some of the worst flooding the region has witnessed in more than a century. The floods have killed more than 100 people in Germany, Russia, Austria, Hungary, and the Czech Republic and have led to as much as $20 billion in damage. This false-color image of the Elbe River and its tributaries was taken on August 20, 2002, by the Moderate Resolution Imaging Spectroradiometer (MODIS), flying aboard NASA's Terra satellite. The floodwaters that inundated Dresden, Germany, earlier this week have moved north. As can be seen, the river resembles a fairly large lake in the center of the image just south of the town of Wittenberg. Flooding was also bad further downriver in the towns of Magdeburg and Hitzacker. Roughly 20,000 people were evacuated from their homes in northern Germany. Fifty thousand troops, border police, and technical assistance workers were called in to combat the floods along with 100,000 volunteers. The floodwaters are not expected to badly affect Hamburg, which sits on the mouth of the river on the North Sea. Credit: Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC

  3. The Stanford Flood.

    ERIC Educational Resources Information Center

    Leighton, Philip D.

    1979-01-01

    Describes, from the flood to the start of freeze-drying operations, the preservation efforts of Stanford University regarding books damaged by water in the Green Library in November 1978. Planning, action, and mopping-up activities are chronicled, and 20 suggestions are offered as guidance in future similar situations. (JD)

  4. Polyvinyl alcohol membranes as alkaline battery separators

    NASA Technical Reports Server (NTRS)

    Sheibley, D. W.; Gonzalez-Sanabria, O.; Manzo, M. A.

    1982-01-01

Polyvinyl alcohol (PVA) cross-linked with aldehyde reagents yields membranes that demonstrate properties that make them suitable for use as alkaline battery separators. Film properties can be controlled by the choice of cross-linker, cross-link density and the method of cross-linking. Three methods of cross-linking and their effects on film properties are discussed. Film properties can also be modified by using a copolymer of vinyl alcohol and acrylic acid as the base for the separator and cross-linking it similarly to the PVA. Fillers can be incorporated into the films to further modify film properties. Results of separator screening tests and cell tests for several variations of PVA films are discussed.

  5. Rethinking the relationship between flood risk perception and flood management.

    PubMed

    Birkholz, S; Muro, M; Jeffrey, P; Smith, H M

    2014-04-15

    Although flood risk perceptions and their concomitant motivations for behaviour have long been recognised as significant features of community resilience in the face of flooding events, there has, for some time now, been a poorly appreciated fissure in the accompanying literature. Specifically, rationalist and constructivist paradigms in the broader domain of risk perception provide different (though not always conflicting) contexts for interpreting evidence and developing theory. This contribution reviews the major constructs that have been applied to understanding flood risk perceptions and contextualises these within broader conceptual developments around risk perception theory and contemporary thinking around flood risk management. We argue that there is a need to re-examine and re-invigorate flood risk perception research, in a manner that is comprehensively underpinned by more constructivist thinking around flood risk management as well as by developments in broader risk perception research. We draw attention to an historical over-emphasis on the cognitive perceptions of those at risk to the detriment of a richer understanding of a wider range of flood risk perceptions such as those of policy-makers or of tax-payers who live outside flood affected areas as well as the linkages between these perspectives and protective measures such as state-supported flood insurance schemes. Conclusions challenge existing understandings of the relationship between risk perception and flood management, particularly where the latter relates to communication strategies and the extent to which those at risk from flooding feel responsible for taking protective actions.

  6. Alkaline fuel cell performance investigation

    NASA Technical Reports Server (NTRS)

    Martin, R. E.; Manzo, M. A.

    1988-01-01

An exploratory experimental fuel cell test program was conducted to investigate the performance characteristics of alkaline laboratory research electrodes. The objective of this work was to establish the effect of temperature, pressure, and concentration upon performance and evaluate candidate cathode configurations having the potential for improved performance. The performance characterization tests provided data to empirically establish the effect of temperature, pressure, and concentration upon performance for cell temperatures up to 300 F and reactant pressures up to 200 psia. Evaluation of five gold alloy cathode catalysts revealed that three doped gold alloys had more than two times the surface areas of reference cathodes and therefore offered the best potential for improved performance.

  7. Alkaline fuel cell performance investigation

    NASA Technical Reports Server (NTRS)

    Martin, R. E.; Manzo, M. A.

    1988-01-01

    An exploratory experimental fuel cell test program was conducted to investigate the performance characteristics of alkaline laboratory research electrodes. The objective of this work was to establish the effect of temperature, pressure, and concentration upon performance and evaluate candidate cathode configurations having the potential for improved performance. The performance characterization tests provided data to empirically establish the effect of temperature, pressure, and concentration upon performance for cell temperatures up to 300 F and reactant pressures up to 200 psia. Evaluation of five gold alloy cathode catalysts revealed that three doped gold alloys had more than two times the surface areas of reference cathodes and therefore offered the best potential for improved performance.

  8. Alkalinity and hardness: Critical but elusive concepts in aquaculture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Total alkalinity and total hardness are familiar variables to those involved in aquatic animal production. Aquaculturists – both scientists and practitioners alike – tend to have some understanding of the two variables and of methods for adjusting their concentrations. The chemistry and the biolog...

  9. Agent- and Cloud-Supported Geospatial Service Aggregation for Flood Response

    NASA Astrophysics Data System (ADS)

    Tan, X.; Di, L.; Deng, M.; Chen, A.; Sun, Z.; Huang, C.; Shao, Y.; Ye, X.

    2015-07-01

Flooding has caused serious losses in China over the past two decades; therefore, responding to and mitigating the impact of flooding is a task of critical importance. The traditional flood response process is usually very time-consuming and labor-intensive. The Service-Oriented Architecture (SOA)-based flood response method has low efficiency because of the large volume of geospatial data that must be transferred, and it cannot meet the real-time requirement of a rapid response to flooding. This paper presents an Agent- and Cloud-supported geospatial service aggregation to obtain a more efficient geospatial service system for flood response. The architecture of this method is designed and deployed on the Cloud environment, and the flood response prototype system is built on the Amazon AWS Cloud to demonstrate that the proposed method can avoid transferring large volumes of geospatial data or Big Spatial Data. Consequently, this method is able to achieve better performance than the SOA-based method.

  10. Fuel cell flooding detection and correction

    DOEpatents

    DiPierno Bosco, Andrew; Fronk, Matthew Howard

    2000-08-15

Method and apparatus for monitoring H2-O2 PEM fuel cells to detect and correct flooding. The pressure drop across a given H2 or O2 flow field is monitored and compared to predetermined thresholds of unacceptability. If the pressure drop exceeds a threshold of unacceptability, corrective measures are automatically initiated.
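
    In spirit, the monitoring step described above amounts to comparing a measured flow-field pressure drop with a calibrated limit. The sketch below is only a schematic illustration with hypothetical names and threshold values, not the patented control logic.

```python
def flooding_suspected(delta_p_kpa, current_a, limit_kpa_per_a=0.12):
    """Flag possible flooding when the pressure drop across a flow field is too high.

    The acceptable pressure drop is assumed (hypothetically) to scale with load current;
    real thresholds of unacceptability would come from stack characterization data.
    """
    return delta_p_kpa > limit_kpa_per_a * current_a

if flooding_suspected(delta_p_kpa=9.5, current_a=60.0):
    print("flooding suspected: initiate corrective measures (e.g. raise cathode air flow)")
```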

  11. An algorithm for computing moments-based flood quantile estimates when historical flood information is available

    USGS Publications Warehouse

    Cohn, T.A.; Lane, W.L.; Baier, W.G.

    1997-01-01

    This paper presents the expected moments algorithm (EMA), a simple and efficient method for incorporating historical and paleoflood information into flood frequency studies. EMA can utilize three types of at-site flood information: systematic stream gage record: information about the magnitude of historical floods; and knowledge of the number of years in the historical period when no large flood occurred. EMA employs an iterative procedure to compute method-of-moments parameter estimates. Initial parameter estimates are calculated from systematic stream gage data. These moments are then updated by including the measured historical peaks and the expected moments, given the previously estimated parameters of the below-threshold floods from the historical period. The updated moments result in new parameter estimates, and the last two steps are repeated until the algorithm converges. Monte Carlo simulations compare EMA, Bulletin 17B's [United States Water Resources Council, 1982] historically weighted moments adjustment, and maximum likelihood estimators when fitting the three parameters of the log-Pearson type III distribution. These simulations demonstrate that EMA is more efficient than the Bulletin 17B method, and that it is nearly as efficient as maximum likelihood estimation (MLE). The experiments also suggest that EMA has two advantages over MLE when dealing with the log-Pearson type III distribution: It appears that EMA estimates always exist and that they are unique, although neither result has been proven. EMA can be used with binomial or interval-censored data and with any distributional family amenable to method-of-moments estimation.
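
    The full EMA of Cohn et al. works with the log-Pearson type III distribution. As a much simpler illustration of the same iterative idea (not the authors' algorithm or Bulletin 17B), the sketch below fits a two-parameter lognormal, replacing the unknown below-threshold historical years by their conditional expected moments at each iteration.

```python
import numpy as np
from scipy.stats import norm

def ema_lognormal(systematic_logq, historic_logq, n_below, log_threshold,
                  tol=1e-9, max_iter=500):
    """Simplified expected-moments iteration for a lognormal flood model.

    systematic_logq : log annual peaks from the systematic gage record
    historic_logq   : log peaks of documented historical floods (above the threshold)
    n_below         : historical-period years known to have stayed below the threshold
    """
    obs = np.concatenate([systematic_logq, historic_logq])
    n_total = obs.size + n_below
    mu = np.mean(systematic_logq)
    sigma = np.std(systematic_logq, ddof=1)
    for _ in range(max_iter):
        a = (log_threshold - mu) / sigma
        lam = norm.pdf(a) / norm.cdf(a)
        e1 = mu - sigma * lam                         # E[X | X < threshold]
        var_t = sigma**2 * (1.0 - a * lam - lam**2)
        e2 = var_t + e1**2                            # E[X^2 | X < threshold]
        m1 = (obs.sum() + n_below * e1) / n_total
        m2 = (np.sum(obs**2) + n_below * e2) / n_total
        mu_new, sigma_new = m1, np.sqrt(max(m2 - m1**2, 1e-12))
        converged = abs(mu_new - mu) < tol and abs(sigma_new - sigma) < tol
        mu, sigma = mu_new, sigma_new
        if converged:
            break
    return mu, sigma

# Hypothetical example: 30 years of gaged log peaks, 2 documented historical floods,
# and 70 further historical years known only to have stayed below the threshold.
rng = np.random.default_rng(2)
gaged = rng.normal(6.0, 0.5, 30)
mu, sigma = ema_lognormal(gaged, historic_logq=np.array([7.4, 7.1]),
                          n_below=70, log_threshold=7.0)
print(mu, sigma)
```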

  12. Alkaline Water and Longevity: A Murine Study.

    PubMed

    Magro, Massimiliano; Corain, Livio; Ferro, Silvia; Baratella, Davide; Bonaiuto, Emanuela; Terzo, Milo; Corraducci, Vittorino; Salmaso, Luigi; Vianello, Fabio

    2016-01-01

    The biological effect of alkaline water consumption is object of controversy. The present paper presents a 3-year survival study on a population of 150 mice, and the data were analyzed with accelerated failure time (AFT) model. Starting from the second year of life, nonparametric survival plots suggest that mice watered with alkaline water showed a better survival than control mice. Interestingly, statistical analysis revealed that alkaline water provides higher longevity in terms of "deceleration aging factor" as it increases the survival functions when compared with control group; namely, animals belonging to the population treated with alkaline water resulted in a longer lifespan. Histological examination of mice kidneys, intestine, heart, liver, and brain revealed that no significant differences emerged among the three groups indicating that no specific pathology resulted correlated with the consumption of alkaline water. These results provide an informative and quantitative summary of survival data as a function of watering with alkaline water of long-lived mouse models.
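
    For readers who want to reproduce the flavour of the analysis, an accelerated failure time (AFT) model can be fitted with the lifelines package; the column names and toy data below are hypothetical, and this is not the authors' dataset or code.

```python
import pandas as pd
from lifelines import WeibullAFTFitter

# Toy survival data (days); 'observed' = 1 if death observed, 0 if censored at study end.
df = pd.DataFrame({
    "lifespan_days":  [640, 720, 815, 905, 980, 700, 760, 850, 1005, 1095],
    "observed":       [1,   1,   1,   1,   0,   1,   1,   1,   1,    0],
    "alkaline_water": [0,   0,   0,   0,   0,   1,   1,   1,   1,    1],
})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="lifespan_days", event_col="observed")
# In an AFT model, exp(coef) > 1 for 'alkaline_water' would correspond to a
# "deceleration aging factor" of the kind reported above (longer expected lifespans).
print(aft.summary)
```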

  13. Floods and Societies: Dynamic Modeling

    NASA Astrophysics Data System (ADS)

    Di Baldassarre, G.; Viglione, A.; Carr, G.; Kuil, L., Jr.; Brandimarte, L.; Bloeschl, G.

    2014-12-01

    There is growing concern that future flood losses and fatalities might increase significantly in many regions of the world because of rapid urbanization in deltas and floodplains, in addition to sea level rise and climate change. To better anticipate long-term trajectories of future flood risk, there is a need to treat floodplains and deltas as fully coupled human-physical systems. Here we propose a novel approach to explore the long-term behavior emerging from the mutual interactions and feedbacks between physical and social systems. The implementation of our modeling framework shows that green societies, which cope with flooding by resettling out of floodplains, are more resilient to increasing flood frequency than technological societies, which deal with flooding by building levees. Also, we show that when coupled dynamics are accounted for, flood-poor periods could (paradoxically) be more dangerous than flood-rich periods.

  14. Alkaline detergent recycling via ultrafiltration

    SciTech Connect

    Steffani, C.; Meltzer, M.

    1995-06-01

    The metal finishing industry uses alkaline cleaners and detergents to remove oils and dirt from manufactured parts, often before they are painted or plated. The use of these cleaners has grown because environmental regulations are phasing out ozone depleting substances and placing restrictions on the use and disposal of many hazardous solvents. Lawrence Livermore National Laboratory is examining ultrafiltration as a cleaning approach that reclaims the cleaning solutions and minimizes wastes. The ultrafiltration membrane is made from sheets of polymerized organic film. The sheets are rolled onto a supporting frame and installed in a tube. Spent cleaning solution is pumped into a filter chamber and filtered through the membrane that captures oils and dirt and allows water and detergent to pass. The membrane is monitored and when pressure builds from oil and dirt, an automatic system cleans the surface to maintain solution flow and filtration quality. The results show that the ultrafiltration does not disturb the detergent concentration or alkalinity but removed almost all the oils and dirt leaving the solution in condition to be reused.

  15. Grace DAKASEP alkaline battery separator

    NASA Technical Reports Server (NTRS)

    Giovannoni, R. T.; Lundquist, J. T.; Choi, W. M.

    1987-01-01

    The Grace DAKASEP separator was originally developed as a wicking layer for nickel-zinc alkaline batteries. The DAKASEP is a filled non-woven separator which is flexible and heat sealable. Through modification of formulation and processing variables, products with a variety of properties can be produced. Variations of DAKASEP were tested in Ni-H2, Ni-Zn, Ni-Cd, and primary alkaline batteries with good results. The properties of DAKASEP which are optimized for Hg-Zn primary batteries are shown in tabular form. This separator has high tensile strength, 12 micron average pore size, relatively low porosity at 46-48 percent, and consequently moderately high resistivity. Versions were produced with greater than 70 percent porosity and resistivities in 33 wt percent KOH as low as 3 ohm cm. Performance data for Hg-Zn E-1 size cells containing DAKASEP with the properties shown in tabular form, are more reproducible than data obtained with a competitive polypropylene non-woven separator. In addition, utilization of active material is in general considerably improved.

  16. Flood inundation modelling in data-poor areas: a case study

    NASA Astrophysics Data System (ADS)

    Yan, Kun; Di Baldassarre, Giuliano; Pappenberger, Florian; Solomatine, Dimitri

    2014-05-01

    One of the main obstacles in mapping flood hazard in data scarce areas is the difficulty in estimating the design flood, i.e. river discharge corresponding to a given return period. This exercise can be carried out via regionalization techniques, which are based on flood data of regions with similar hydro-climatic conditions, or physically based model cascades. In this context, we compared the flood extents maps derived for a river reach of the Blue Nile following two alternative methods: i) regional envelope curve (REC), whereby design floods (e.g. 1-in-20 and 1-in-100 year flood peaks) are derived from African envelope curves (Padi et al., 2011) and physical model cascade (PMC), whereby design floods are calculated from the physical model chain of the European Centre for Medium Range Weather Forecasts (ECMWF, Pappenberger et al., 2012). The two design flood estimates are then used as input of a 2D hydraulic model LISFLOOD-FP and the simulated flood extents are quantitatively evaluated by comparing to a reference flood extent model, which uses design floods estimated from in situ data. The results show the complexity in assessing flood hazard in data scarce area as PMC largely overestimates, while REC underestimates, the flood extents.
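
    A regional envelope curve of the kind used for the REC estimate is typically a power law in drainage area whose coefficient is set so that no observed flood exceeds it. The exponent and flood records below are purely illustrative assumptions, not the African envelope curves of Padi et al. (2011).

```python
import numpy as np

def fit_envelope(areas_km2, peaks_m3s, exponent=0.5):
    """Return Q_env(A) = a * A**b with b fixed and a chosen to bound all observed peaks."""
    a = np.max(np.asarray(peaks_m3s, dtype=float) /
               np.asarray(areas_km2, dtype=float) ** exponent)
    return lambda area_km2: a * area_km2 ** exponent

# Hypothetical regional flood records (drainage area in km^2, peak discharge in m^3/s).
q_env = fit_envelope([250, 1200, 5400, 22000], [310, 900, 2600, 7400], exponent=0.5)
print(q_env(10000.0))   # envelope (upper-bound) peak for a 10,000 km^2 catchment
```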

  17. Dynamics of flood water infiltration and ground water recharge in hyperarid desert.

    PubMed

Dahan, Ofer; Tatarsky, Boaz; Enzel, Yehouda; Kulls, Christoph; Seely, Mary; Benito, Gerardo

    2008-01-01

    A study on flood water infiltration and ground water recharge of a shallow alluvial aquifer was conducted in the hyperarid section of the Kuiseb River, Namibia. The study site was selected to represent a typical desert ephemeral river. An instrumental setup allowed, for the first time, continuous monitoring of infiltration during a flood event through the channel bed and the entire vadose zone. The monitoring system included flexible time domain reflectometry probes that were designed to measure the temporal variation in vadose zone water content and instruments to concurrently measure the levels of flood and ground water. A sequence of five individual floods was monitored during the rainy season in early summer 2006. These newly generated data served to elucidate the dynamics of flood water infiltration. Each flood initiated an infiltration event which was expressed in wetting of the vadose zone followed by a measurable rise in the water table. The data enabled a direct calculation of the infiltration fluxes by various independent methods. The floods varied in their stages, peaks, and initial water contents. However, all floods produced very similar flux rates, suggesting that the recharge rates are less affected by the flood stages but rather controlled by flow duration and available aquifer storage under it. Large floods flood the stream channel terraces and promote the larger transmission losses. These, however, make only a negligible contribution to the recharge of the ground water. It is the flood duration within the active streambed, which may increase with flood magnitude that is important to the recharge process.

  18. Magnitude and frequency of floods in Alaska south of the Yukon River

    USGS Publications Warehouse

    Berwick, Vernon Kenneth; Childers, Joseph M.; Kuentzel, M.A.

    1964-01-01

    This report presents a method for evaluating the magnitude and frequency of floods on the basis of the analysis of flood records. One composite frequency curve is applied to the entire study region. This curve relates floods of various magnitudes at any site within the region to probable recurrence intervals (from 1.1 to 50 years) for those floods. Flood magnitudes are reduced to dimensionless form by expressing them as a ratio to mean annual flood. Magnitudes of mean annual floods vary with the flood-producing characteristics of stream basins. On the basis of the limited data available, drainage-area size is found to be the only significant factor affecting the magnitude of the mean annual flood. Trial and error groupings of gaging-station records show that the region can be split into three hydrologic areas: one curve defines the relation within each area between mean annual flood and drainage area. These three curves in combination with the composite flood-frequency curve permit, for natural-flow conditions at any site, the determination of flood magnitude for a given recurrence interval, or the determination of recurrence interval for a flood of known magnitude.

  19. The design of alkaline fuel cells

    NASA Astrophysics Data System (ADS)

    Strasser, K.

    1990-01-01

    Alkaline fuel cells recently developed have yielded satisfactory operation even in the cases of their use of mobile and matrix-type electrolytes; the advantages of realistic operation have been demonstrated by a major West German manufacturer's 100 kW alkaline fuel cell apparatus, which was operated in the role of an air-independent propulsion system. Development has begun for a spacecraft alkaline fuel cell of the matrix-electrolyte configuration.

  20. Flood marks of the 1813 flood in the Central Europe

    NASA Astrophysics Data System (ADS)

    Miklanek, Pavol; Pekárová, Pavla; Halmová, Dana; Pramuk, Branislav; Bačová Mitková, Veronika

    2014-05-01

    In August 2013, 200 years have passed since the greatest and most destructive floods known in the Slovak river basins. The flood affected almost the entire territory of Slovakia, northeastern Moravia, south of Poland. River basins of Váh (Orava, Kysuca), Poprad, Nitra, Hron, Torysa, Hornád, upper and middle Vistula, Odra have been most affected. The aim of this paper is to map the flood marks documenting this catastrophic flood in Slovakia. Flood marks and registrations on the 1813 flood in the Váh river basin are characterized by great diversity and are written in Bernolák modification of Slovak, in Latin, German and Hungarian. Their descriptions are stored in municipal chronicles and Slovak and Hungarian state archives. The flood in 1813 devastated the entire Váh valley, as well as its tributaries. Following flood marks were known in the Vah river basin: Dolná Lehota village in the Orava river basin, historical map from 1817 covering the Sučany village and showing three different cross-sections of the Váh river during the 1813 flood, flood mark in the city of Trenčín, Flood mark in the gate of the Brunovce mansion, cross preserved at the old linden tree at Drahovce, and some records in written documents, e.g. Cifer village. The second part of the study deals with flood marks mapping in the Hron, Hnilec and Poprad River basins, and Vistula River basin in Krakow. On the basis of literary documents and the actual measurement, we summarize the peak flow rates achieved during the floods in 1813 in the profile Hron: Banská Bystrica. According to recent situation the 1813 flood peak was approximately by 1.22 m higher, than the flood in 1974. Also in the Poprad basin is the August 1813 flood referred as the most devastating flood in last 400 years. The position of the flood mark is known, but the building was unfortunately removed later. The water level in 1813 was much higher than the water level during the recent flood in June 2010. In Cracow the water level

  1. Data expansion: the potential of grey literature for understanding floods

    NASA Astrophysics Data System (ADS)

    Uhlemann, S.; Bertelmann, R.; Merz, B.

    2013-03-01

    Sophisticated methods have been developed and become standard in analysing floods as well as for assessing flood risk. However, increasingly critique of the current standards and scientific practice can be found both in the flood hydrology community as well as in the risk community who argue that the considerable amount of information already available on natural disasters has not been adequately deployed and brought to effective use. We describe this phenomenon as a failure to synthesize knowledge that results from barriers and ignorance in awareness, use and management of the entire spectrum of relevant content, that is, data, information and knowledge. In this paper we argue that the scientific community in flood risk research ignores event-specific analysis and documentations as another source of data. We present results from a systematic search that includes an intensive study on sources and ways of information dissemination of flood-relevant publications. We obtain 186 documents that contain information on the sources, pathways, receptors and/or consequences for any of the 40 strongest trans-basin floods in Germany in the period 1952-2002. This study therefore provides the most comprehensive metadata collection of flood documentations for the considered geographical space and period. A total of 87.5% of all events have been documented, and especially the most severe floods have received extensive coverage. Only 30% of the material has been produced in the scientific/academic environment, and the majority of all documents (about 80%) can be considered grey literature (i.e. literature not controlled by commercial publishers). Therefore, ignoring grey sources in flood research also means ignoring the largest part of knowledge available on single flood events (in Germany). Further, the results of this study underpin the rapid changes in information dissemination of flood event literature over the last decade. We discuss the options and obstacles of incorporating

  2. Data expansion: the potential of grey literature for understanding floods

    NASA Astrophysics Data System (ADS)

    Uhlemann, S.; Bertelmann, R.; Merz, B.

    2012-09-01

    Sophisticated methods have been developed and become standard in analysing floods as well as for assessing the flood risk. However, increasingly critique of the current standards and scientific practice can be found both in the flood hydrology community as well as in the risk community who argue that the considerable amount of information already available on natural disasters has not been adequately deployed and brought to effective use. We describe this phenomenon as a failure to synthesize knowledge that results from barriers and ignorance in awareness, use and management of the entire spectrum of relevant content, that is, data, information and knowledge. In this paper we argue that the scientific community in flood risk research ignores event specific analysis and documentations as another source of data. We present results from a systematic search that includes an intensive study on sources and ways of information dissemination of flood relevant publications. We obtain 183 documents that contain information on the sources, pathways, receptors and/or consequences for any of the 40 strongest trans-basin floods in Germany in the period 1952-2002. This study therefore provides the most comprehensive meta-data collection of flood documentations for the considered geographical space and period. 87.5% of all events have been documented and especially the most severe floods have received extensive coverage. Only 30% of the material has been produced in the scientific/academic environment and the majority of all documents (about 80%) can be considered grey literature. Therefore, ignoring grey sources in flood research also means ignoring the largest part of knowledge available on single flood events (in Germany). Further, the results of this study underpin the rapid changes in information dissemination of flood event literature over the last decade. We discuss the options and obstacles of incorporating this data in the knowledge building process in the light of the

  3. Coupling the Alkaline-Surfactant-Polymer Technology and the Gelation Technology to Maximize Oil Production

    SciTech Connect

    Malcolm Pitts; Jie Qi; Dan Wilson; Phil Dowling; David Stewart; Bill Jones

    2005-12-01

Gelation technologies have been developed to provide more efficient vertical sweep efficiencies for flooding naturally fractured oil reservoirs or reservoirs with different sand lenses with high permeability contrast. The field-proven alkaline-surfactant-polymer technology economically recovers 15% to 25% OOIP more crude oil than waterflooding from swept pore space of an oil reservoir. However, alkaline-surfactant-polymer technology is not amenable to naturally fractured reservoirs or reservoirs with high permeability contrast zones because much of injected solution bypasses target pore space containing oil. This work investigates whether combining these two technologies could broaden applicability of alkaline-surfactant-polymer flooding into these reservoirs. Fluid-fluid interaction with different gel chemical compositions and alkaline-surfactant-polymer solution with pH values ranging from 9.2 to 12.9 have been tested. Aluminum-polyacrylamide gels are not stable to alkaline-surfactant-polymer solutions at any pH. Chromium-polyacrylamide gels with polymer to chromium ion ratios of 25 or greater were stable to alkaline-surfactant-polymer solutions if solution pH was 10.6 or less. When the polymer to chromium ion ratio was 15 or less, chromium-polyacrylamide gels were stable to alkaline-surfactant-polymer solutions with pH values up to 12.9. Chromium-xanthan gum gels were stable to alkaline-surfactant-polymer solutions with pH values of 12.9 at the polymer to chromium ion ratios tested. Silicate-polyacrylamide, resorcinol-formaldehyde, and sulfomethylated resorcinol-formaldehyde gels were also stable to alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Iron-polyacrylamide gels were immediately destroyed when contacted with any of the alkaline-surfactant-polymer solutions with pH values ranging from 9.2 to 12.9. Gel solutions under dynamic conditions of linear corefloods showed similar stability to alkaline-surfactant-polymer solutions as in

  4. Alkaline and alkaline earth metal phosphate halides and phosphors

    SciTech Connect

    Lyons, Robert Joseph; Setlur, Anant Achyut; Cleaver, Robert John

    2012-11-13

Compounds, phosphor materials and apparatus related to the nacaphite family of materials are presented. Potassium- and rubidium-based nacaphite family compounds and phosphors, designed by doping divalent rare earth elements into the sites of alkaline earth metals in the nacaphite material families, are described. An apparatus comprising the phosphors based on the nacaphite family materials is presented herein. The compounds presented are of the formula A2B(1-y)RyPO4X, where the elements A, B, R, X and suffix y are defined such that A is potassium, rubidium, or a combination of potassium and rubidium and B is calcium, strontium, barium, or a combination of any of calcium, strontium and barium. X is fluorine, chlorine, or a combination of fluorine and chlorine, R is europium, samarium, ytterbium, or a combination of any of europium, samarium, and ytterbium, and y ranges from 0 to about 0.1.

  5. Extreme flood estimations on a small alpine catchment in Switzerland, the case study of Limmerboden

    NASA Astrophysics Data System (ADS)

    Zeimetz, F.; García-Hernández, J.; Schleiss, A. J.

    2015-06-01

In this paper, a case study on the estimation of extreme floods is described. The watershed chosen for the analysis is the catchment of the Limmernboden dam situated in Switzerland. Statistical methods and the simulation-based "Probable Maximum Precipitation - Probable Maximum Flood" (PMP-PMF) approach are applied for the estimation of the safety flood according to the Swiss flood directives. The results of both approaches are compared in order to determine the discrepancies between them; notably, the PMP-PMF method does not always overestimate the flood.

  6. Development of an alkaline fuel cell subsystem

    NASA Technical Reports Server (NTRS)

    1987-01-01

    A two task program was initiated to develop advanced fuel cell components which could be assembled into an alkaline power section for the Space Station Prototype (SSP) fuel cell subsystem. The first task was to establish a preliminary SSP power section design to be representative of the 200 cell Space Station power section. The second task was to conduct tooling and fabrication trials and fabrication of selected cell stack components. A lightweight, reliable cell stack design suitable for the SSP regenerative fuel cell power plant was completed. The design meets NASA's preliminary requirements for future multikilowatt Space Station missions. Cell stack component fabrication and tooling trials demonstrated cell components of the SSP stack design of the 1.0 sq ft area can be manufactured using techniques and methods previously evaluated and developed.

  7. Nest sites and conservation of endangered Interior Least Terns Sterna antillarum athalassos on an alkaline flat in the south-central Great Plains (USA)

    USGS Publications Warehouse

    Winton, Brian R.; Leslie, David M.

    2003-01-01

    We monitored nest sites of endangered Interior Least Terns on a 5 095 ha alkaline flat in north-central Oklahoma, USA. After nest loss, Least Terns commonly renested and experienced 30% apparent nest success in 1995-1996 (n = 233 nests). Nest success and predation differed by location on the alkaline flat in 1995 and overall, but nest success and flooding did not differ by microhabitat type. Predation was highest at nests ??? 5 cm from debris (driftwood/hay) in 1995. No differences in nesting success, flooding, or predation were observed on comparing nests inside and outside electrified enclosures. Coyotes and Striped Skunks were confirmed nest predators, and Ring-billed Gulls were suspected nest predators. We identified one location on the alkaline flat of about 1 000 ha with consistently lower nest losses attributable to flooding and predation and the highest hatching success compared with other parts of the alkaline flat; it was typified by open ground and bisected by several creeks. Management activities that minimize flooding and predation in this area could further enhance nest success and theoretically increase overall productivity of this population of Least Terns. However, the efficacy of electrified enclosures and nest-site enhancements, as currently undertaken, is questionable because of considerable annual variation in use by and protection of Least Terns.

  8. Flood insurance in Canada: implications for flood management and residential vulnerability to flood hazards.

    PubMed

    Oulahen, Greg

    2015-03-01

    Insurance coverage of damage caused by overland flooding is currently not available to Canadian homeowners. As flood disaster losses and water damage claims both trend upward, insurers in Canada are considering offering residential flood coverage in order to properly underwrite the risk and extend their business. If private flood insurance is introduced in Canada, it will have implications for the current regime of public flood management and for residential vulnerability to flood hazards. This paper engages many of the competing issues surrounding the privatization of flood risk by addressing questions about whether flood insurance can be an effective tool in limiting exposure to the hazard and how it would exacerbate already unequal vulnerability. A case study investigates willingness to pay for flood insurance among residents in Metro Vancouver and how attitudes about insurance relate to other factors that determine residential vulnerability to flood hazards. Findings indicate that demand for flood insurance is part of a complex, dialectical set of determinants of vulnerability. PMID:25526847

  9. Techniques for estimating flood hydrographs for ungaged urban watersheds

    SciTech Connect

    Stricker, V.A.; Sauer, V.B.

    1982-04-01

    The Clark Method, modified slightly, was used to develop a synthetic dimensionless hydrograph that can be used to estimate flood hydrographs for ungaged urban watersheds. Application of the technique results in a typical (average) flood hydrograph for a given peak discharge. Input necessary to apply the technique is an estimate of basin lagtime and the recurrence interval peak discharge. Equations for this purpose were obtained from a recent nationwide study on flood frequency in urban watersheds. A regression equation was developed which relates flood volumes to drainage area size, basin lagtime, and peak discharge. This equation is useful where storage of floodwater may be a part of design or flood prevention. 6 refs., 17 figs., 5 tabs.
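
    The application of such a dimensionless hydrograph amounts to scaling its time axis by the basin lagtime and its discharge axis by the estimated peak discharge, then integrating for volume. A minimal sketch in Python follows; the ordinates, lagtime and peak discharge are hypothetical placeholders, not values from this study.

        # Scale a dimensionless hydrograph (hypothetical ordinates) into a
        # site-specific flood hydrograph using basin lagtime and peak discharge.

        # Dimensionless hydrograph: time/lagtime vs. discharge/peak (illustrative values only)
        t_ratio = [0.0, 0.25, 0.5, 0.75, 1.0, 1.25, 1.5, 2.0, 2.5, 3.0]
        q_ratio = [0.0, 0.12, 0.43, 0.83, 1.0, 0.84, 0.55, 0.21, 0.07, 0.0]

        lagtime_hr = 2.4      # basin lagtime from a regional regression (assumed)
        peak_cfs = 3500.0     # e.g. the 100-year peak discharge (assumed)

        time_hr = [r * lagtime_hr for r in t_ratio]
        flow_cfs = [r * peak_cfs for r in q_ratio]

        # Approximate flood volume by trapezoidal integration (cubic feet)
        volume_cf = sum(0.5 * (flow_cfs[i] + flow_cfs[i + 1])
                        * (time_hr[i + 1] - time_hr[i]) * 3600.0
                        for i in range(len(time_hr) - 1))

        for t, q in zip(time_hr, flow_cfs):
            print(f"t = {t:5.2f} h   Q = {q:8.1f} cfs")
        print(f"approximate volume: {volume_cf:,.0f} cubic feet")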

  10. Generalized flood-frequency estimates for urban areas in Missouri

    USGS Publications Warehouse

    Gann, Ector Eugene

    1971-01-01

    A method is presented for estimating flood-frequency information for urban areas in Missouri. Flood-frequency relations are presented which provide an estimate of the flood-peak discharge for floods with recurrence intervals from 2.33 to 100 years for basins with various degrees of existing or projected urban development. Drainage area sizes for which the relations are applicable range from 0.1 to 50 square miles. These generalized relations will be useful to the urban planner and designer until more comprehensive studies are completed for the individual urban areas within the State. The relations will also be of use in the definition of flood-hazard areas in Missouri.

  11. Flooding in Central China

    NASA Technical Reports Server (NTRS)

    2002-01-01

    During the summer of 2002, frequent, heavy rains gave rise to floods and landslides throughout China that have killed over 1,000 people and affected millions. This false-color image of the western Yangtze River and Dongting Lake in central China was acquired on August 21, 2002, by the Moderate-resolution Imaging Spectroradiometer (MODIS), flying aboard NASA's Terra spacecraft. The latest flooding crisis in China centers on Dongting Lake in the center of the image. Heavy rains have caused it to swell over its banks and swamp lakefront towns in the province of Hunan. As of August 23, 2002, more than 250,000 people have been evacuated, and over one million people have been brought in to fortify the dikes around the lake. Normally the lake would appear much smaller and more defined in the MODIS image. Credit: Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC.

  12. Flooding tolerance in halophytes.

    PubMed

    Colmer, Timothy D; Flowers, Timothy J

    2008-01-01

    Flooding is a common environmental variable with salinity. Submerged organs can suffer from O(2) deprivation and the resulting energy deficits can compromise ion transport processes essential for salinity tolerance. Tolerance of soil waterlogging in halophytes, as in glycophytes, is often associated with the production of adventitious roots containing aerenchyma, and the resultant internal O(2) supply. For some species, shallow rooting in aerobic upper soil layers appears to be the key to survival on frequently flooded soils, although little is known of the anoxia tolerance in halophytes. Halophytic species that inhabit waterlogged substrates are able to regulate their shoot ion concentrations in spite of the hypoxic (or anoxic) medium in which they are rooted, this being in stark contrast with most other plants which suffer when salinity and waterlogging occur in combination. Very few studies have addressed the consequences of submergence of the shoots by saline water; these have, however, demonstrated tolerance of temporary submergence in some halophytes. PMID:18482227

  13. Mapping a flood before it happens

    USGS Publications Warehouse

    Jones, Joseph L.

    2004-01-01

    What's missing from flood forecasts? Maps—The only maps generally available today are maps used for planning. They are maps of theoretical floods, not maps of flooding forecast for an approaching storm. The U.S. Geological Survey (USGS) and the National Weather Service (NWS) have developed a way to bring flood forecasting and flood mapping together, producing flood maps for tomorrow's flood today...and getting them on the Internet in time for those in harm's way to react.

  14. Cerberus Flood Features

    NASA Technical Reports Server (NTRS)

    2005-01-01

    16 October 2005 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows streamlined landforms carved by catastrophic floods that occurred in the eastern Cerberus region, some time in the distant martian past.

    Location near: 15.1°N, 193.5°W Image width: 3 km (1.9 mi) Illumination from: lower left Season: Northern Autumn

  15. Intelligent Real-Time Reservoir Operation for Flood Control

    NASA Astrophysics Data System (ADS)

    Chang, L.; Hsu, H.

    2008-12-01

    Real-time flood control of a multi-purpose reservoir should both decrease the flood peak stage downstream and store floodwaters for future use during typhoon seasons. It is a continuous, real-time decision-making process based on relevant operating rules, policy and water laws, in addition to the immediate rainfall and hydrological information; however, it is difficult to transfer the accumulated experience of senior operators. The main purpose of this study is to establish an automatic reservoir flood-control model that achieves these operational goals during flood periods. In this study, we propose an intelligent reservoir operating methodology for real-time flood control. First, a genetic algorithm is used to search for optimal solutions, which can be considered as extracting the knowledge of reservoir operation strategies. Then, the adaptive network-based fuzzy inference system (ANFIS), which uses a hybrid learning procedure for extracting knowledge in the form of fuzzy if-then rules, is used to learn the input-output patterns and then to estimate the optimal flood operation. The Shihmen reservoir in Northern Taiwan was used as a case study, and 26 typhoon events were investigated by the proposed method. The results demonstrate that the proposed control model performs much better than the original reservoir operation in all 26 flood events, effectively decreasing the peak flood stage downstream while storing floodwaters for future use.
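
    The knowledge-extraction step described above, searching for good release schedules with a genetic algorithm, can be illustrated with the toy sketch below. The inflow hydrograph, reservoir constants, penalty weights and GA settings are all invented for illustration; this is not the Shihmen operating model or the authors' GA-ANFIS implementation.

        import random

        # Toy inflow hydrograph (m3/s) and reservoir constants -- illustrative only
        INFLOW = [200, 600, 1500, 2600, 3200, 2800, 2000, 1200, 700, 400, 300, 250]
        DT = 3600.0                    # time step (s)
        S0 = 190e6                     # initial storage (m3)
        S_MAX = 220e6                  # storage capacity (m3)
        R_MAX = 3000.0                 # maximum release (m3/s)

        def simulate(releases):
            """Route the flood through the reservoir; return peak release and penalty."""
            storage, peak, penalty = S0, 0.0, 0.0
            for q_in, r in zip(INFLOW, releases):
                storage += (q_in - r) * DT
                if storage > S_MAX:            # overtopping -> penalized
                    penalty += storage - S_MAX
                    storage = S_MAX
                if storage < 0:                # cannot release water that is not there
                    penalty += -storage
                    storage = 0.0
                peak = max(peak, r)
            return peak, penalty

        def fitness(releases):
            peak, penalty = simulate(releases)
            return peak + 1e-4 * penalty       # minimize peak release, penalize violations

        def random_schedule():
            return [random.uniform(0.0, R_MAX) for _ in INFLOW]

        def crossover(a, b):
            cut = random.randrange(1, len(a))
            return a[:cut] + b[cut:]

        def mutate(sched, rate=0.1):
            return [min(R_MAX, max(0.0, r + random.gauss(0, 150)))
                    if random.random() < rate else r for r in sched]

        population = [random_schedule() for _ in range(60)]
        for generation in range(200):
            population.sort(key=fitness)
            parents = population[:20]                       # elitist selection
            children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                        for _ in range(40)]
            population = parents + children

        best = min(population, key=fitness)
        print("peak release (m3/s):", round(simulate(best)[0], 1))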

  16. Study of Beijiang catchment flash-flood forecasting model

    NASA Astrophysics Data System (ADS)

    Chen, Y.; Li, J.; Huang, S.; Dong, Y.

    2015-05-01

    Beijiang catchment is a small catchment in southern China located in the centre of the storm areas of the Pearl River Basin. Flash flooding in Beijiang catchment is a frequently observed disaster that causes direct damage to people and their property. Flood forecasting is the most effective method for mitigating flash floods; the goal of this paper is to develop a flash-flood forecasting model for Beijiang catchment. The catchment property data, including DEM, land cover types and soil types, which are used for model construction and parameter determination, were downloaded freely from public websites. Based on the Liuxihe Model, a physically based distributed hydrological model, a model for flash flood forecasting of Beijiang catchment is set up. The model derives its parameters from the terrain properties and further optimizes them against observed flood processes, which improves model performance. The model is validated with a few observed floods that occurred in recent years, and the results show that the model is reliable and promising for flash flood forecasting.

  17. The credibility challenge for global fluvial flood risk analysis

    NASA Astrophysics Data System (ADS)

    Trigg, M. A.; Birch, C. E.; Neal, J. C.; Bates, P. D.; Smith, A.; Sampson, C. C.; Yamazaki, D.; Hirabayashi, Y.; Pappenberger, F.; Dutra, E.; Ward, P. J.; Winsemius, H. C.; Salamon, P.; Dottori, F.; Rudari, R.; Kappes, M. S.; Simpson, A. L.; Hadzilacos, G.; Fewtrell, T. J.

    2016-09-01

    Quantifying flood hazard is an essential component of resilience planning, emergency response, and mitigation, including insurance. Traditionally undertaken at catchment and national scales, recently, efforts have intensified to estimate flood risk globally to better allow consistent and equitable decision making. Global flood hazard models are now a practical reality, thanks to improvements in numerical algorithms, global datasets, computing power, and coupled modelling frameworks. Outputs of these models are vital for consistent quantification of global flood risk and in projecting the impacts of climate change. However, the urgency of these tasks means that outputs are being used as soon as they are made available and before such methods have been adequately tested. To address this, we compare multi-probability flood hazard maps for Africa from six global models and show wide variation in their flood hazard, economic loss and exposed population estimates, which has serious implications for model credibility. While there is around 30%-40% agreement in flood extent, our results show that even at continental scales, there are significant differences in hazard magnitude and spatial pattern between models, notably in deltas, arid/semi-arid zones and wetlands. This study is an important step towards a better understanding of modelling global flood hazard, which is urgently required for both current risk and climate change projections.

  18. Identification of flood-rich and flood-poor periods in flood series

    NASA Astrophysics Data System (ADS)

    Mediero, Luis; Santillán, David; Garrote, Luis

    2015-04-01

    Recently, a general concern about non-stationarity of flood series has arisen, as changes in catchment response can be driven by several factors, such as climatic and land-use changes. Several studies to detect trends in flood series at either national or trans-national scales have been conducted. Trends are usually detected by the Mann-Kendall test. However, the results of this test depend on the starting and ending year of the series, which can lead to different results depending on the period considered. The results can be conditioned by flood-poor and flood-rich periods located at the beginning or end of the series. A methodology to identify statistically significant flood-rich and flood-poor periods is developed, based on the comparison between the expected sampling variability of floods when stationarity is assumed and the observed variability of floods in a given series. The methodology is applied to a set of long series of annual maximum floods, peaks over threshold and counts of annual occurrences in peaks-over-threshold series observed in Spain in the period 1942-2009. Mediero et al. (2014) found a general decreasing trend in flood series in some parts of Spain that could be caused by a flood-rich period observed in 1950-1970, placed at the beginning of the flood series. The results of this study support the findings of Mediero et al. (2014), as a flood-rich period in 1950-1970 was identified in most of the selected sites. References: Mediero, L., Santillán, D., Garrote, L., Granados, A. Detection and attribution of trends in magnitude, frequency and timing of floods in Spain, Journal of Hydrology, 517, 1072-1088, 2014.
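
    The comparison between expected sampling variability under stationarity and the observed series can be illustrated, in spirit, by counting threshold exceedances in moving windows and checking them against Monte Carlo limits for a stationary record; the exact test used by the authors may differ, and the series below is synthetic rather than the Spanish data.

        import random

        random.seed(1)
        # Synthetic annual maximum floods (m3/s) for years 1942-2009 -- not the Spanish data
        floods = [random.gauss(300, 80) for _ in range(68)]
        floods[8:29] = [q + 120 for q in floods[8:29]]       # inject a wetter spell to illustrate detection

        threshold = sorted(floods)[int(0.75 * len(floods))]  # upper-quartile threshold
        exceed = [1 if q > threshold else 0 for q in floods]
        p = sum(exceed) / len(exceed)                        # overall exceedance probability
        window = 21                                          # window length in years

        def window_counts(series):
            return [sum(series[i:i + window]) for i in range(len(series) - window + 1)]

        # Sampling variability of window counts under stationarity (Monte Carlo)
        sims = []
        for _ in range(5000):
            sim = [1 if random.random() < p else 0 for _ in exceed]
            sims.extend(window_counts(sim))
        sims.sort()
        lo, hi = sims[int(0.025 * len(sims))], sims[int(0.975 * len(sims))]

        for i, c in enumerate(window_counts(exceed)):
            if c > hi:
                print(f"flood-rich period: {1942 + i}-{1942 + i + window - 1} ({c} exceedances)")
            elif c < lo:
                print(f"flood-poor period: {1942 + i}-{1942 + i + window - 1} ({c} exceedances)")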

  19. Estimating flood discharge using witness movies in post-flood hydrological surveys

    NASA Astrophysics Data System (ADS)

    Le Coz, Jérôme; Hauet, Alexandre; Le Boursicaud, Raphaël; Pénard, Lionel; Bonnifait, Laurent; Dramais, Guillaume; Thollet, Fabien; Braud, Isabelle

    2015-04-01

    The estimation of streamflow rates based on post-flood surveys is of paramount importance for the investigation of extreme hydrological events. Major uncertainties usually arise from the absence of information on the flow velocities and from the limited spatio-temporal resolution of such surveys. Nowadays, after each flood occurring in populated areas, home movies taken from bridges, river banks or even drones are shared by witnesses through Internet platforms like YouTube. Provided that some topography data and additional information are collected, image-based velocimetry techniques can be applied to some of these movies in order to estimate flood discharges. As a contribution to recent post-flood surveys conducted in France, we developed and applied a method for estimating velocities and discharges based on the Large Scale Particle Image Velocimetry (LSPIV) technique. Since the seminal work of Fujita et al. (1998), LSPIV applications to river flows have been reported by a number of authors and LSPIV can now be considered a mature technique. However, its application to non-professional movies taken by flood witnesses remains challenging and required some practical developments. The steps to apply LSPIV analysis to a flood home movie are as follows: (i) select a video of interest; (ii) contact the author for agreement and extra information; (iii) conduct a field topography campaign to georeference Ground Control Points (GCPs), water level and cross-sectional profiles; (iv) preprocess the video before LSPIV analysis: correct lens distortion, align the images, etc.; (v) orthorectify the images to correct perspective effects and know the physical size of pixels; (vi) proceed with the LSPIV analysis to compute the surface velocity field; and (vii) compute discharge according to a user-defined velocity coefficient. Two case studies in French mountainous rivers during extreme floods are presented. The movies were collected on YouTube and field topography
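
    Step (vii), computing discharge from the LSPIV surface-velocity field and a surveyed cross-section, can be sketched as below using a mid-section summation; the velocity coefficient of 0.85 is a commonly used default, and all survey values are illustrative, not taken from the French case studies.

        # Discharge from LSPIV surface velocities: Q = k * sum(v_surface_i * a_i)
        # over the verticals of a surveyed cross-section (mid-section method).

        # Station positions across the channel (m), bed elevations (m) and
        # LSPIV surface velocities (m/s) -- illustrative values only
        stations = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0]
        bed_elev = [3.0, 1.8, 1.2, 1.0, 1.3, 2.0, 3.0]
        v_surface = [0.0, 1.2, 1.8, 2.1, 1.7, 1.1, 0.0]
        water_level = 3.0          # surveyed water surface elevation (m)
        k_velocity = 0.85          # depth-averaged / surface velocity coefficient (user-defined)

        discharge = 0.0
        for i in range(len(stations)):
            depth = max(0.0, water_level - bed_elev[i])
            # width attributed to this vertical (mid-section rule)
            left = stations[i] - stations[i - 1] if i > 0 else 0.0
            right = stations[i + 1] - stations[i] if i < len(stations) - 1 else 0.0
            width = 0.5 * (left + right)
            discharge += k_velocity * v_surface[i] * depth * width

        print(f"estimated discharge: {discharge:.1f} m3/s")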

  1. Proper estimation of hydrological parameters from flood forecasting aspects

    NASA Astrophysics Data System (ADS)

    Miyamoto, Mamoru; Matsumoto, Kazuhiro; Tsuda, Morimasa; Yamakage, Yuzuru; Iwami, Yoichi; Yanami, Hitoshi; Anai, Hirokazu

    2016-04-01

    The hydrological parameters of a flood forecasting model are normally calibrated against the entire hydrograph of past flood events by means of an error assessment function such as mean square error or relative error. However, specific parts of a hydrograph, i.e., the maximum discharge and the rising limb, are particularly important for practical flood forecasting, in the sense that underestimation may lead to a more dangerous situation due to delays in flood prevention and evacuation activities. We conducted numerical experiments to find the most appropriate parameter set for practical flood forecasting without underestimation, in order to develop an error assessment method for calibration suited to flood forecasting. A distributed hydrological model developed at the Public Works Research Institute (PWRI) in Japan was applied to fifteen past floods in the Gokase River basin of 1,820 km2 in Japan. The model, with gridded two-layer tanks for the entire target river basin, included hydrological parameters such as hydraulic conductivity, surface roughness and runoff coefficient, which were set according to land-use and soil-type distributions. Global data sets, e.g., Global Map and the Digital Soil Map of the World (DSMW), were employed as input data for elevation, land use and soil type. The values of fourteen parameters were evenly sampled, with 10,001 parameter sets determined by Latin Hypercube Sampling within the search range of each parameter. Although the best reproduced case showed a high Nash-Sutcliffe Efficiency of 0.9 for all flood events, the maximum discharge was underestimated in many flood cases. Therefore, two conditions, non-underestimation of the maximum discharge and of the rising limb of the hydrograph, were added to the calibration as flood forecasting criteria. The cases with non-underestimation of the maximum discharge and rising limb of the hydrograph also showed a high Nash-Sutcliffe Efficiency of 0.9 except in two flood cases
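
    A minimal sketch of the calibration idea described above: draw parameter sets with a Latin hypercube, score each with the Nash-Sutcliffe Efficiency, and retain only sets that do not underestimate the observed peak. The hydrological model here is a stand-in toy function, and the parameter names and ranges are invented, not those of the PWRI model.

        import random

        random.seed(0)
        OBS = [5, 12, 30, 65, 90, 70, 45, 25, 14, 8]      # observed discharge (toy series)

        def toy_model(params, forcing=OBS):
            """Stand-in for a distributed hydrological model: scales and smooths the input."""
            gain, smooth = params["runoff_coef"], params["smoothing"]
            out, prev = [], 0.0
            for q in forcing:
                prev = smooth * prev + (1 - smooth) * gain * q
                out.append(prev)
            return out

        def nse(sim, obs):
            """Nash-Sutcliffe Efficiency."""
            mean_obs = sum(obs) / len(obs)
            num = sum((s - o) ** 2 for s, o in zip(sim, obs))
            den = sum((o - mean_obs) ** 2 for o in obs)
            return 1.0 - num / den

        def latin_hypercube(ranges, n):
            """n stratified samples per parameter, randomly paired across parameters."""
            columns = {}
            for name, (lo, hi) in ranges.items():
                strata = [lo + (hi - lo) * (i + random.random()) / n for i in range(n)]
                random.shuffle(strata)
                columns[name] = strata
            return [{name: columns[name][i] for name in ranges} for i in range(n)]

        ranges = {"runoff_coef": (0.5, 1.5), "smoothing": (0.0, 0.9)}
        best = None
        for params in latin_hypercube(ranges, 1000):       # 1000 sets for speed (study used 10,001)
            sim = toy_model(params)
            if max(sim) < max(OBS):                        # flood-forecasting criterion: no peak underestimation
                continue
            score = nse(sim, OBS)
            if best is None or score > best[0]:
                best = (score, params)

        print("best NSE without peak underestimation:", best)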

  2. A technique for estimating flood heights on small streams in the city of Charlotte and Mecklenburg County, North Carolina

    USGS Publications Warehouse

    Eddins, William H.; Jackson, N.M., Jr.

    1980-01-01

    A method for estimating the height reached by floods having recurrence intervals of 10, 20, and 100 years is defined for unregulated streams in Charlotte and Mecklenburg County draining areas of less than 1.0 square mile. Flood heights, defined as the vertical distance between the streambed at riffles and the floodwater surface, can be used to estimate flood elevations on small streams where flood profiles and flood inundation maps are not available. An illustrative example is given of how the method can be used with streambed elevation data and topographic maps to estimate flood elevations and delineate inundated areas.

  3. Flood Risk Assessments of Architectural Heritage - Case of Changgyeonggung Palace

    NASA Astrophysics Data System (ADS)

    Lee, Hyosang; Kim, Ji-sung; Lee, Ho-jin

    2014-05-01

    The risk of natural disasters such as floods and earthquakes has increased due to recent extreme weather events. Therefore, the necessity of a risk management system to protect architectural properties, a cultural heritage of humanity, from natural disasters has been consistently felt. Solutions for managing flood risk focusing on architectural heritage are suggested and applied to protect Changgyeonggung Palace, a major palace heritage in Seoul. Probable rainfall scenarios for risk assessment (frequency: 100 years, 200 years, and 500 years), a probable maximum precipitation (PMP) scenario, and a previous rainfall event (from July 26th to 28th, 2011) are used in hydrologic and hydraulic models (HEC-HMS, SWMM) to estimate flood volumes for the area covering Changgyeonggung Palace. These flood volumes make it possible to identify inundation risks based on GIS models and to assess the flood risk of individual architectural heritage. The results of assessing such risk are used to establish a disaster risk management system that managers of architectural properties can utilize. According to the results of assessing the flood risk of Changgyeonggung Palace, inundation occurs near the outlets of Changgyeonggung Palace and along sections of the river channel for all flood risk scenarios, but the inundation risk of major architectural properties was estimated to be low. The methods for assessing flood risk of architectural heritage proposed in this study, and the risk management system for Changgyeonggung Palace using these methods, provide thorough solutions for flood risk management, and the possibility of using these solutions seems high. A comprehensive management system for architectural heritage will be established in the future through a review of diverse disaster factors.

  4. Swiss Re Global Flood Hazard Zones: Know your flood risk

    NASA Astrophysics Data System (ADS)

    Vinukollu, R. K.; Castaldi, A.; Mehlhorn, J.

    2012-12-01

    Floods, among all natural disasters, have a great damage potential. On a global basis, there is strong evidence of an increase in the number of people affected and in economic losses due to floods. For example, global insured flood losses have increased by 12% every year since 1970, and this is expected to increase further with growing exposure in the high-risk areas close to rivers and coastlines. Recently, the insurance industry has been surprised by the large extent of losses, because most countries lack reliable hazard information. One example has been the 2011 Thailand floods, where millions of people were affected and the total economic losses were 30 billion USD. In order to assess flood risk across different regions and countries, the flood team at Swiss Re produced global maps of flood zones based on a geomorphologic regression approach, developed in house and patented. Input data for the study were obtained from NASA's Shuttle Radar Topography Mission (SRTM) elevation data, the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) Global Digital Elevation Model (GDEM) and HydroSHEDS. The underlying assumptions of the approach are that naturally flowing rivers shape their channel and flood plain according to basin-inherent forces and characteristics, and that the flood water extent strongly depends on the shape of the flood plain. On the basis of the catchment characteristics, the model finally calculates the probability of a location being flooded or not for a defined return period, which in the current study was set to 100 years. The data are produced at a 90-m resolution for latitudes 60S to 60N. This global product is now used in the insurance industry to inspect, inform and/or insure the flood risk across the world.

  5. Flood Deposition Analysis of Northern California's Eel River (Flood- DANCER)

    NASA Astrophysics Data System (ADS)

    Ahlgren, S.; Bauman, P. D.; Dillon, R. J.; Gallagher, N.; Jamison, M. E.; King, A.; Lee, J.; Siwicke, K. A.; Harris, C. K.; Wheatcroft, R. A.; Borgeld, J. C.; Goldthwait, S. A.

    2006-12-01

    Characterizing and quantifying the fate of river-borne sediment is critical to our understanding of sediment supply and erosion in impacted coastal areas. Strata deposited in coastal zones provide an invaluable record of recent and historical environmental events. The Eel River in northern California has one of the highest sediment yields of any North American river and has preserved evidence of the impact of recent flood events. Previous research has documented sediment deposits associated with Eel River flood events in January 1995, March 1995, and January 1997. These deposits were found north of the river mouth on the mid shelf in water depths from 50-100 m. Sediment strata were up to 5-10 cm thick and were composed of fine to very fine grained silts and clays. Until recently, no model had been able to correctly reproduce the sediment deposits associated with these floods. In 2005, Harris et al. developed a model that accurately represents the volume and location of the flood deposit associated with the January 1997 event. However, rigorous assessment of the predictive capability of this model requires that a new flood of the Eel River be used as a test case. During the winter of 2005-06 the Eel River rose above flood stage, reaching a discharge similar to the flood of January 1995, which resulted in flood sedimentation on the Eel River shelf. A flood-related deposit 1-5 cm thick was found in water depths of 60-90 m, approximately 20-35 km north of the river mouth. Flood deposits were recognized in box cores collected in the months following the flood. As in previously studied events, flood-related strata near the sediment surface were recognized in core x-radiographs and resistivity and porosity profiles, and were composed of fine to very fine grained silts and clays. In addition, surface flood sediments were associated with lower concentrations of benthic foraminifera compared with deeper sediments. The January 2006 flood deposit was similar in thickness to the

  6. Flood of October 1986 at Seward, Alaska

    USGS Publications Warehouse

    Jones, S.H.; Zenone, Chester

    1988-01-01

    Broad areas along the lower Resurrection River and Salmon Creek, as well as the surfaces of several adjacent alluvial fans in the Seward area, were flooded as a result of the intensive rainstorm of October 9-11, 1986. Severe erosion took place through the steep-gradient mountain canyons and near the apex of the fans, while rock and debris were deposited on the distal parts of the fans. In Godwin, Lost, Box Canyon, Japanese, and Spruce Creek basins, and perhaps others, landslides or debris avalanches dammed the streams temporarily. Subsequent failure or overtopping of these dams led to 'surge-release' flooding; the peak discharge of such a flood at Spruce Creek was 13,600 cu ft/sec, four times as great as any previously known maximum discharge from the basin and 2.5 times as great as the runoff rate from the debris dam. Flood discharges were determined indirectly, using the slope-area method, at ten high-gradient reaches on nine streams. Computed peak discharges for several small basins were the largest since records began in 1963. The largest rainfall-runoff rate unaffected by surge-release was 1,020 cu ft per sec per sq mi at Rudolph Creek, which has a drainage area of 1.00 sq mi. The 15.05 inches of rain that fell in one 24-hour period during the storm was assigned a recurrence interval of 100 years or greater. The length of the streamflow record available for most Seward area streams, 25 years or less, is inadequate to reliably define flood frequency relations for recurrence intervals as great as 100 years. However, the slope-area determined discharge of Spruce Creek above the debris avalanche indicates a recurrence interval of 100 years or greater. In addition, conventional flood-frequency analysis techniques are not applicable to peak discharges that are affected by surge-release phenomena. Large, damaging floods have repeatedly caused major damage in the Seward area, and the potential for catastrophic, debris-laden floods is an ever-present threat to areas
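
    The slope-area method mentioned above estimates peak discharge indirectly from surveyed high-water marks using Manning's equation; the USGS procedure averages conveyance over several cross-sections, but a single-section sketch with hypothetical survey values conveys the idea.

        # Slope-area estimate of peak discharge from high-water marks (single section).
        # Q = (1.486 / n) * A * R^(2/3) * S^(1/2)   (US customary units)

        n = 0.045                    # Manning roughness for a steep gravel/boulder channel (assumed)
        area_sqft = 420.0            # flow area below the high-water line (assumed survey value)
        wetted_perimeter_ft = 95.0   # assumed survey value
        slope = 0.012                # water-surface slope between high-water marks (assumed)

        hydraulic_radius = area_sqft / wetted_perimeter_ft
        discharge_cfs = (1.486 / n) * area_sqft * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

        print(f"hydraulic radius: {hydraulic_radius:.2f} ft")
        print(f"estimated peak discharge: {discharge_cfs:,.0f} cfs")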

  7. Alkaline Phosphatase, Soluble Extracellular Adenine Nucleotides, and Adenosine Production after Infant Cardiopulmonary Bypass

    PubMed Central

    Davidson, Jesse A.; Urban, Tracy; Tong, Suhong; Twite, Mark; Woodruff, Alan

    2016-01-01

    Rationale Decreased alkaline phosphatase activity after infant cardiac surgery is associated with increased post-operative cardiovascular support requirements. In adults undergoing coronary artery bypass grafting, alkaline phosphatase infusion may reduce inflammation. Mechanisms underlying these effects have not been explored but may include decreased conversion of extracellular adenine nucleotides to adenosine. Objectives 1) Evaluate the association between alkaline phosphatase activity and serum conversion of adenosine monophosphate to adenosine after infant cardiac surgery; 2) assess if inhibition/supplementation of serum alkaline phosphatase modulates this conversion. Methods and Research Pre/post-bypass serum samples were obtained from 75 infants <4 months of age. Serum conversion of 13C5-adenosine monophosphate to 13C5-adenosine was assessed with/without selective inhibition of alkaline phosphatase and CD73. Low and high concentration 13C5-adenosine monophosphate (simulating normal/stress concentrations) were used. Effects of alkaline phosphatase supplementation on adenosine monophosphate clearance were also assessed. Changes in serum alkaline phosphatase activity were strongly correlated with changes in 13C5-adenosine production with or without CD73 inhibition (r = 0.83; p<0.0001). Serum with low alkaline phosphatase activity (≤80 U/L) generated significantly less 13C5-adenosine, particularly in the presence of high concentration 13C5-adenosine monophosphate (10.4μmol/L vs 12.9μmol/L; p = 0.0004). Inhibition of alkaline phosphatase led to a marked decrease in 13C5-adenosine production (11.9μmol/L vs 2.7μmol/L; p<0.0001). Supplementation with physiologic dose human tissue non-specific alkaline phosphatase or high dose bovine intestinal alkaline phosphatase doubled 13C5-adenosine monophosphate conversion to 13C5-adenosine (p<0.0001). Conclusions Alkaline phosphatase represents the primary serum ectonucleotidase after infant cardiac surgery and low post

  8. The role of Natural Flood Management in managing floods in large scale basins during extreme events

    NASA Astrophysics Data System (ADS)

    Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David

    2016-04-01

    There is a strong evidence base showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and flood impact downstream. However, the ability to target zones of high runoff production, and the extent to which we can manage flood risk using nature-based flood management solutions, are less well known. A move to planting more trees and having less intensively farmed landscapes is part of natural flood management (NFM) solutions, and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large basin should they be deployed, and how does the flow propagate to any point downstream? Generally, how much intervention is needed, and will it compromise food production systems? If we are observing record levels of rainfall and flow, for example during Storm Desmond in December 2015 in the North West of England, what other flood management options are really needed to complement our traditional defences in large basins for the future? In this paper we will show examples of NFM interventions in the UK that have had an impact at local-scale sites. We will demonstrate the impact of interventions at the local scale, the sub-catchment (meso) scale and finally at the large scale. The tools used include observations, process-based models and more generalised Flood Impact Models. Issues of synchronisation and the design level of protection will be debated. By reworking observed rainfall and discharge (runoff) for observed extreme events in the River Eden and River Tyne during Storm Desmond, we will show how much flood protection is needed in large-scale basins. The research will thus pose a number of key questions as to how floods may have to be managed in large-scale basins in the future. We will seek to support a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water

  9. Flood Extent Mapping for Namibia Using Change Detection and Thresholding with SAR

    NASA Technical Reports Server (NTRS)

    Long, Stephanie; Fatoyinbo, Temilola E.; Policelli, Frederick

    2014-01-01

    A new method for flood detection, change detection and thresholding (CDAT), was used with synthetic aperture radar (SAR) imagery to delineate the extent of flooding for the Chobe floodplain in the Caprivi region of Namibia. This region experiences annual seasonal flooding and has seen a recent renewal of severe flooding after a long dry period in the 1990s. Flooding in this area has caused loss of life and livelihoods for the surrounding communities and has caught the attention of disaster relief agencies. There is a need for flood extent mapping techniques that can be used to process images quickly, providing near real-time flooding information to relief agencies. ENVISAT/ASAR and Radarsat-2 images were acquired for several flooding seasons from February 2008 to March 2013. The CDAT method was used to determine flooding from these images and includes the use of image subtraction, decision-based classification with threshold values, and segmentation of SAR images. The total extent of flooding determined for 2009, 2011 and 2012 was about 542 km2, 720 km2, and 673 km2, respectively. Pixels determined to be flooded in vegetation were typically <0.5% of the entire scene, with the exception of 2009, where the detection of flooding in vegetation was much greater (almost one third of the total flooded area). The time to maximum flooding for the 2013 flood season was determined to be about 27 days. Landsat water classification was used to compare the results from the new CDAT with SAR method; the results show good spatial agreement with Landsat scenes.
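
    The core of the CDAT approach, differencing a flood scene against a dry-season reference and then thresholding, can be sketched with NumPy as below; the threshold, pixel size and synthetic scenes are placeholders, not the values or imagery used for the Chobe floodplain.

        import numpy as np

        def flood_mask(reference_db, flood_db, threshold_db=-3.0):
            """Flag pixels as flooded where backscatter drops sharply relative to the
            dry-season reference image (smooth open water darkens SAR backscatter)."""
            difference = flood_db - reference_db          # image subtraction (dB)
            return difference < threshold_db              # decision by threshold

        # Synthetic 100 x 100 backscatter images in dB (placeholders for calibrated SAR scenes)
        rng = np.random.default_rng(0)
        reference = rng.normal(-8.0, 1.0, (100, 100))
        flood = reference.copy()
        flood[40:70, 20:80] -= 6.0                        # a patch of open flood water

        mask = flood_mask(reference, flood)
        pixel_area_km2 = (30.0 / 1000.0) ** 2             # assume 30 m pixels
        print(f"flooded area: {mask.sum() * pixel_area_km2:.2f} km2")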

  10. Flooding and Mental Health: A Systematic Mapping Review

    PubMed Central

    Fernandez, Ana; Black, John; Jones, Mairwen; Wilson, Leigh; Salvador-Carulla, Luis; Astell-Burt, Thomas; Black, Deborah

    2015-01-01

    Background Floods are the most common type of global natural disaster. Floods have a negative impact on mental health. Comprehensive evaluation and review of the literature are lacking. Objective To systematically map and review available scientific evidence on mental health impacts of floods caused by extended periods of heavy rain in river catchments. Methods We performed a systematic mapping review of published scientific literature in five languages for mixed studies on floods and mental health. PUBMED and Web of Science were searched to identify all relevant articles from 1994 to May 2014 (no restrictions). Results The electronic search strategy identified 1331 potentially relevant papers. Finally, 83 papers met the inclusion criteria. Four broad areas are identified: i) the main mental health disorders: post-traumatic stress disorder, depression and anxiety; ii) the factors associated with mental health among those affected by floods; iii) the narratives associated with flooding, which focus on the long-term impacts of flooding on mental health as a consequence of secondary stressors; and iv) the management actions identified. The quantitative and qualitative studies have consistent findings. However, very few studies have used mixed methods to quantify the size of the mental health burden as well as to explore in-depth narratives. Methodological limitations include control of potential confounders and short-term follow up. Limitations Floods following extreme events were excluded from our review. Conclusions Although the level of exposure to floods has been systematically associated with mental health problems, the paucity of longitudinal studies and lack of confounding controls precludes strong conclusions. Implications We recommend that future research in this area include mixed-method studies that are purposefully designed, using more rigorous methods. Studies should also focus on vulnerable groups and include analyses of policy and practical

  11. Flood Hazards - A National Threat

    USGS Publications Warehouse

    ,

    2006-01-01

    In the late summer of 2005, the remarkable flooding brought by Hurricane Katrina, which caused more than $200 billion in losses, constituted the costliest natural disaster in U.S. history. However, even in typical years, flooding causes billions of dollars in damage and threatens lives and property in every State. Natural processes, such as hurricanes, weather systems, and snowmelt, can cause floods. Failure of levees and dams and inadequate drainage in urban areas can also result in flooding. On average, floods kill about 140 people each year and cause $6 billion in property damage. Although loss of life to floods during the past half-century has declined, mostly because of improved warning systems, economic losses have continued to rise due to increased urbanization and coastal development.

  12. Intestinal alkaline phosphatase to treat necrotizing enterocolitis

    PubMed Central

    Biesterveld, Ben E.; Koehler, Shannon M.; Heinzerling, Nathan P.; Rentea, Rebecca M.; Fredrich, Katherine; Welak, Scott R.; Gourlay, David M.

    2015-01-01

    Background Intestinal alkaline phosphatase (IAP) activity is decreased in necrotizing enterocolitis (NEC), and IAP supplementation prevents NEC development. It is not known if IAP given after NEC onset can reverse the course of the disease. We hypothesized that enteral IAP given after NEC induction would not reverse intestinal injury. Materials and methods NEC was induced in Sprague–Dawley pups by preterm delivery followed by formula feedings with lipopolysaccharide (LPS) and hypoxia exposure, continued for up to 4 d. IAP was added to feeds on day 2 until the pups were sacrificed on day 4. NEC severity was scored based on hematoxylin and eosin-stained terminal ileum sections, and AP activity was measured using a colorimetric assay. IAP and interleukin-6 expression were measured using real-time polymerase chain reaction. Results NEC pups' alkaline phosphatase (AP) activity was decreased to 0.18 U/mg compared with controls at 0.57 U/mg (P < 0.01). Discontinuation of LPS and hypoxia after 2 d increased AP activity to 0.36 U/mg (P < 0.01). IAP supplementation in matched groups did not impact total AP activity or expression. Discontinuing LPS and hypoxia after NEC onset improved intestinal injury scores to 1.14 compared with continued stressors (score 2.25; P < 0.01). IAP supplementation decreased interleukin-6 expression two-fold (P < 0.05), though it did not reverse NEC intestinal damage (P = 0.5). Conclusions This is the first work to demonstrate that removing the source of NEC improves intestinal damage and increases AP activity. When used as a rescue treatment, IAP decreased intestinal inflammation though did not impact injury, making it likely that IAP is best used preventatively in those neonates at risk. PMID:25840489

  13. Alkaline pH sensor molecules.

    PubMed

    Murayama, Takashi; Maruyama, Ichiro N

    2015-11-01

    Animals can survive only within a narrow pH range. This requires continual monitoring of environmental and body-fluid pH. Although a variety of acidic pH sensor molecules have been reported, alkaline pH sensor function is not well understood. This Review describes neuronal alkaline pH sensors, grouped according to whether they monitor extracellular or intracellular alkaline pH. Extracellular sensors include the receptor-type guanylyl cyclase, the insulin receptor-related receptor, ligand-gated Cl- channels, connexin hemichannels, two-pore-domain K+ channels, and transient receptor potential (TRP) channels. Intracellular sensors include TRP channels and gap junction channels. Identification of molecular mechanisms underlying alkaline pH sensing is crucial for understanding how animals respond to environmental alkaline pH and how body-fluid pH is maintained within a narrow range.

  14. Generating precipitation ensembles for flood alert and risk management

    NASA Astrophysics Data System (ADS)

    Caseri, Angelica; Javelle, Pierre; Ramos, Maria-Helena; Leblois, Etienne

    2015-04-01

    Floods represent one of the major natural disasters that are often responsible for fatalities and economic losses. Flood warning systems are needed to anticipate the arrival of severe events and mitigate their impacts. Flood alerts are particularly important for risk management and response in the nowcasting of flash floods. In this case, precipitation fields observed in real time play a crucial role and observational uncertainties must be taken into account. In this study, we investigate the potential of a framework which combines a geostatistical conditional simulation method that considers information from precipitation radar and rain gauges, and a distributed rainfall-runoff model to generate an ensemble of precipitation fields and produce probabilistic flood alert maps. We adapted the simulation method proposed by Leblois and Creutin (2013), based on the Turning Band Method (TBM) and a conditional simulation approach, to consider the temporal and spatial characteristics of radar data and rain gauge measurements altogether and generate precipitation ensembles. The AIGA system developed by Irstea and Météo-France for predicting flash floods in the French Mediterranean region (Javelle et al., 2014) was used to transform the generated precipitation ensembles into ensembles of discharge at the outlet of the studied catchments. Finally, discharge ensembles were translated into maps providing information on the probability of exceeding a given flood threshold. A total of 19 events that occurred between 2009 and 2013 in the Var region (southeastern France), a region prone to flash floods, was used to illustrate the approach. Results show that the proposed method is able to simulate an ensemble of realistic precipitation fields and capture peak flows of flash floods. This was shown to be particularly useful at ungauged catchments, where uncertainties on the evaluation of flood peaks are high. The results obtained also show that the approach developed can be used to
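
    The final step described above, turning discharge ensembles into maps of the probability of exceeding a flood threshold, reduces to counting ensemble members above the threshold at each location. A minimal sketch with made-up ensemble values (not AIGA output) follows.

        import numpy as np

        def exceedance_probability(discharge_ensemble, threshold):
            """Fraction of ensemble members exceeding the flood threshold.

            discharge_ensemble: array of shape (n_members, n_catchments)
            threshold: array of shape (n_catchments,), e.g. a warning discharge per outlet
            """
            return (discharge_ensemble > threshold).mean(axis=0)

        rng = np.random.default_rng(42)
        n_members, n_catchments = 50, 4
        # Synthetic peak discharges (m3/s) per ensemble member and catchment (illustrative)
        ensemble = rng.lognormal(mean=np.log([80, 150, 40, 300]), sigma=0.35,
                                 size=(n_members, n_catchments))
        warning_threshold = np.array([100.0, 160.0, 60.0, 280.0])

        for i, p in enumerate(exceedance_probability(ensemble, warning_threshold)):
            print(f"catchment {i}: P(Q > threshold) = {p:.2f}")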

  15. 7 CFR 1788.3 - Flood insurance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Flood Insurance Program (see 44 CFR part 59 et seq.) provides for a standard flood insurance policy... 7 Agriculture 12 2010-01-01 2010-01-01 false Flood insurance. 1788.3 Section 1788.3 Agriculture... Insurance Requirements § 1788.3 Flood insurance. (a) Borrowers shall purchase and maintain flood...

  16. 7 CFR 1788.3 - Flood insurance.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Flood Insurance Program (see 44 CFR part 59 et seq.) provides for a standard flood insurance policy... 7 Agriculture 12 2014-01-01 2013-01-01 true Flood insurance. 1788.3 Section 1788.3 Agriculture... Insurance Requirements § 1788.3 Flood insurance. (a) Borrowers shall purchase and maintain flood...

  17. 7 CFR 1788.3 - Flood insurance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Flood Insurance Program (see 44 CFR part 59 et seq.) provides for a standard flood insurance policy... 7 Agriculture 12 2011-01-01 2011-01-01 false Flood insurance. 1788.3 Section 1788.3 Agriculture... Insurance Requirements § 1788.3 Flood insurance. (a) Borrowers shall purchase and maintain flood...

  18. 7 CFR 1788.3 - Flood insurance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Flood Insurance Program (see 44 CFR part 59 et seq.) provides for a standard flood insurance policy... 7 Agriculture 12 2012-01-01 2012-01-01 false Flood insurance. 1788.3 Section 1788.3 Agriculture... Insurance Requirements § 1788.3 Flood insurance. (a) Borrowers shall purchase and maintain flood...

  19. 7 CFR 1788.3 - Flood insurance.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Flood Insurance Program (see 44 CFR part 59 et seq.) provides for a standard flood insurance policy... 7 Agriculture 12 2013-01-01 2013-01-01 false Flood insurance. 1788.3 Section 1788.3 Agriculture... Insurance Requirements § 1788.3 Flood insurance. (a) Borrowers shall purchase and maintain flood...

  20. Estimation of phosphorus flux in rivers during flooding.

    PubMed

    Chen, Yen-Chang; Liu, Jih-Hung; Kuo, Jan-Tai; Lin, Cheng-Fang

    2013-07-01

    Reservoirs in Taiwan are inundated with nutrients that result in algal growth and thus reservoir eutrophication. Controlling the phosphorus load has always been the most crucial issue for maintaining reservoir water quality. Numerous agricultural activities, especially the production of tea in riparian areas, are conducted in watersheds in Taiwan. Nutrients from such activities, including phosphorus, are typically flushed into rivers during flooding, when over 90% of the yearly total amount of phosphorus enters reservoirs. Excessive or enhanced soil erosion from rainstorms can dramatically increase the river sediment load and the amount of particulate phosphorus flushed into rivers. When flow rates are high, particulate phosphorus is the dominant form of phosphorus, but sediment and discharge measurements are difficult during flooding, which makes estimating phosphorus flux in rivers difficult. This study determines the total amount of phosphorus transported by measuring flood discharge and phosphorus levels during flooding. Changes in particulate phosphorus, dissolved phosphorus, and their adsorption behavior during a 24-h period are analyzed, owing to the fact that the time for particulate phosphorus adsorption and desorption to approach equilibrium is about 16 h. Adsorption and desorption of phosphorus on suspended solids eroded from the reservoir watershed and carried by the river can be summarily described using the Langmuir isotherm. A method for estimating the phosphorus flux in the Daiyujay Creek during Typhoon Bilis in 2006 is presented in this study. Both sediment and phosphorus are affected by the drastic discharge changes during flooding. Water quality data were collected during two flood events, a flood on June 9, 2006 and Typhoon Bilis, and show that the concentrations of suspended solids and total phosphorus during floods are much higher than at normal stages. Therefore, the drastic changes of total phosphorus, particulate phosphorus, and dissolved phosphorus in
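
    Two pieces of the computation described above can be sketched directly: the Langmuir isotherm for phosphorus adsorbed on suspended solids, and the flood-event flux as the time integral of discharge times concentration. All coefficients and hydrograph values below are hypothetical, not data from the Daiyujay Creek study.

        def langmuir_adsorbed(c_dissolved, q_max, k_l):
            """Langmuir isotherm: adsorbed P per unit mass of sediment (mg/g)
            as a function of dissolved P concentration (mg/L)."""
            return q_max * k_l * c_dissolved / (1.0 + k_l * c_dissolved)

        def event_flux_kg(discharge_m3s, total_p_mgL, dt_s=3600.0):
            """Total phosphorus flux over a flood event: sum(Q * C * dt), in kg.
            mg/L equals g/m3, so Q[m3/s] * C[g/m3] * dt[s] gives grams."""
            return sum(q * c * dt_s for q, c in zip(discharge_m3s, total_p_mgL)) / 1000.0

        # Hypothetical hourly flood hydrograph and total-P concentrations
        discharge = [20, 80, 250, 600, 900, 700, 400, 180, 90, 40]              # m3/s
        total_p = [0.05, 0.12, 0.35, 0.80, 1.10, 0.90, 0.55, 0.30, 0.15, 0.08]  # mg/L

        print(f"event P flux: {event_flux_kg(discharge, total_p):,.0f} kg")
        print(f"adsorbed P at C = 0.5 mg/L: {langmuir_adsorbed(0.5, q_max=1.2, k_l=3.0):.3f} mg/g")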

  1. Flood Water Level Mapping and Prediction Due to Dam Failures

    NASA Astrophysics Data System (ADS)

    Musa, S.; Adnan, M. S.; Ahmad, N. A.; Ayob, S.

    2016-07-01

    Sembrong dam has undergone overflow failure. Flooding has been reported to hit the town, covering an area extending to Parit Raja, in the district of Batu Pahat. This study aims to identify the areas that would be affected by flooding in the event of a failure of Sembrong Dam, Kluang, Johor at the maximum water level. To grasp the extent, flood inundation maps were generated using InfoWorks ICM and GIS software. Using these maps, information such as the depth and extent of flooding in the main flooded areas can be identified. The flood map was created starting with the collection of relevant data, such as measurements of river depth and the maximum flow rate for Sembrong Dam. The data were obtained from the Drainage and Irrigation Department Malaysia, the Department of Survey and Mapping, and HLA Associates Sdn. Bhd. The data were then analyzed using the established InfoWorks ICM method. The flooded areas identified were Sri Lalang, Parit Sagil, Parit Sonto, Sri Paya, Parit Raja, Parit Sempadan, Talang Bunut, Asam Bubok, Tanjung Sembrong, Sungai Rambut and Parit Haji Talib. Flood depths obtained for these areas ranged from 0.5 m up to 1.2 m. In conclusion, the flooded area identified in this study extends from around the town of Ayer Hitam to Parit Raja, a distance of more than 20 km, which may have serious implications for residents of these areas. In future studies, other rivers such as Sungai Batu Pahat should be considered in order to predict and reduce the number of yearly flood victims in this area.

  2. A vulnerability function for Mediterranean flash flood risk assessment

    NASA Astrophysics Data System (ADS)

    Karagiorgos, Konstantinos; Hübl, Johannes; Thaler, Thomas; Fuchs, Sven

    2014-05-01

    Flood risk is a major type of environmental hazard jeopardizing human development, and is usually defined as a functional relation between the hazard, such as the physical and statistical aspects of flooding (e.g. the return period of a certain flow height, the spatial extent of inundation), and the associated vulnerability, i.e. the exposure of people and assets to floods and the susceptibility of the elements at risk to suffer flood damage. The assessment of vulnerability, from the quantitative point of view, expresses vulnerability as the expected degree of loss for a given element at risk as a consequence of a certain event. It ranges on a scale from 0 (no damage) to 1 (complete destruction) and focuses on direct flood loss as estimated by damage or loss functions. A methodology for the development of a vulnerability curve for Mediterranean flash flood risk assessment is presented. This curve is based on a relationship between the intensity of the process and the associated degree of loss of elements at risk. The computation procedure is based on a method combining spatially explicit loss data, data on the value of exposed elements at risk and data on flood intensities at an individual building scale (local scale). The developed methodology is applied to the district of East Attica in Greece, a Mediterranean region influenced by mountain and coastal characteristics of land development. The aim of the study is to provide a valuable tool for local authorities and decision makers, supporting the implementation of flood risk management required by the European Floods Directive, as well as an assessment of the potential costs of future flood events in order to protect individual households.
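
    The vulnerability function described above maps a process intensity (here taken as inundation depth) to a degree of loss bounded between 0 and 1, which is then multiplied by the value of the element at risk. The parametric form and coefficients below are placeholders for illustration, not the fitted East Attica curve.

        import math

        def degree_of_loss(depth_m, scale=1.2):
            """Vulnerability curve: degree of loss in [0, 1] as a bounded,
            increasing function of inundation depth (placeholder functional form)."""
            if depth_m <= 0.0:
                return 0.0
            return 1.0 - math.exp(-(depth_m / scale) ** 2)

        def building_loss(value_eur, depth_m):
            """Direct loss = element value x degree of loss for the given intensity."""
            return value_eur * degree_of_loss(depth_m)

        for depth in (0.0, 0.25, 0.5, 1.0, 2.0):
            print(f"depth {depth:4.2f} m -> degree of loss {degree_of_loss(depth):.2f}, "
                  f"loss for a EUR 200,000 building: {building_loss(200000, depth):,.0f}")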

  3. Dynamic building risk assessment theoretic model for rainstorm-flood utilization ABM and ABS

    NASA Astrophysics Data System (ADS)

    Lai, Wenze; Li, Wenbo; Wang, Hailei; Huang, Yingliang; Wu, Xuelian; Sun, Bingyun

    2015-12-01

    Floods are among the natural disasters causing the greatest losses worldwide. Flood disaster risk must be assessed so that these losses can be reduced, and practical disaster management needs dynamic risk results at the building level. The rainstorm flood disaster system is a typical complex system. From the viewpoint of complex system theory, flood disaster risk is the result of interactions among hazard-affected objects, rainstorm-flood hazard factors, and hazard environments. Agent-based modeling (ABM) is an important tool for complex system modeling. A rainstorm-flood building risk dynamic assessment method (RFBRDAM) based on ABM is proposed in this paper. The internal structures and procedures of the different agents in the proposed method were designed. On the NetLogo platform, the proposed method was implemented to assess changes in building risk during a rainstorm flood disaster in the Huaihe River Basin using agent-based simulation (ABS). The results indicate that the proposed method can dynamically assess building risk over the whole course of a rainstorm flood disaster. The results of this paper provide a new approach for flood disaster building risk dynamic assessment and flood disaster management.

  4. Flood Risk Due to Hurricane Flooding

    NASA Astrophysics Data System (ADS)

    Olivera, Francisco; Hsu, Chih-Hung; Irish, Jennifer

    2015-04-01

    In this study, we evaluated the expected economic losses caused by hurricane inundation. We used surge response functions, which are physics-based dimensionless scaling laws that give surge elevation as a function of the hurricane's parameters (i.e., central pressure, radius, forward speed, approach angle and landfall location) at specified locations along the coast. These locations were close enough to avoid significant changes in surge elevations between consecutive points, and distant enough to minimize calculations. The probability of occurrence of a surge elevation value at a given location was estimated using a joint probability distribution of the hurricane parameters. The surge elevation, at the shoreline, was assumed to project horizontally inland within a polygon of influence. Individual parcel damage was calculated based on flood water depth and damage vs. depth curves available for different building types from the HAZUS computer application developed by the Federal Emergency Management Agency (FEMA). Parcel data, including property value and building type, were obtained from the county appraisal district offices. The expected economic losses were calculated as the sum of the products of the estimated parcel damages and their probability of occurrence for the different storms considered. Anticipated changes for future climate scenarios were considered by accounting for projected hurricane intensification, as indicated by sea surface temperature rise, and sea level rise, which modify the probability distribution of hurricane central pressure and change the baseline of the damage calculation, respectively. Maps of expected economic losses have been developed for Corpus Christi in Texas, Gulfport in Mississippi and Panama City in Florida. Specifically, for Port Aransas, in the Corpus Christi area, it was found that the expected economic losses were in the range of 1% to 4% of the property value for current climate conditions, of 1% to 8% for the 2030's and
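
    The expected-loss computation described above, summing parcel damages weighted by storm probabilities, can be sketched as follows; the parcels, storm set, probabilities and depth-damage curve are invented placeholders, not the HAZUS curves or the joint probability model used in the study.

        # Expected economic loss = sum over storms of P(storm) * sum over parcels of
        # parcel_value * damage_fraction(flood depth at parcel).

        def damage_fraction(depth_m):
            """Placeholder depth-damage curve (piecewise linear, capped at 60% of value)."""
            if depth_m <= 0.0:
                return 0.0
            return min(0.6, 0.2 * depth_m)

        # Hypothetical parcels: (property value in USD, ground elevation in m)
        parcels = [(250_000, 1.0), (180_000, 1.8), (420_000, 2.5), (90_000, 0.6)]

        # Hypothetical surge scenarios: (annual probability, surge elevation in m)
        storms = [(0.05, 1.5), (0.02, 2.2), (0.005, 3.1)]

        expected_annual_loss = 0.0
        for prob, surge in storms:
            storm_loss = sum(value * damage_fraction(surge - elev) for value, elev in parcels)
            expected_annual_loss += prob * storm_loss

        print(f"expected annual loss: ${expected_annual_loss:,.0f}")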

  5. Flood Risk and Flood hazard maps - Visualisation of hydrological risks

    NASA Astrophysics Data System (ADS)

    Spachinger, Karl; Dorner, Wolfgang; Metzka, Rudolf; Serrhini, Kamal; Fuchs, Sven

    2008-11-01

    Hydrological models are an important basis of flood forecasting and early warning systems. They provide significant data on hydrological risks. In combination with other modelling techniques, such as hydrodynamic models, they can be used to assess the extent and impact of hydrological events. The new European Flood Directive requires all member states to evaluate flood risk on a catchment scale, to compile maps of flood hazard and flood risk for prone areas, and to inform at a local level about these risks. Flood hazard and flood risk maps are important tools to communicate flood risk to different target groups. They provide compiled information to relevant public bodies such as water management authorities, municipalities, or civil protection agencies, but also to the broader public. For almost every section of a river basin, run-off and water levels can be defined based on the likelihood of annual recurrence, using a combination of hydrological and hydrodynamic models, supplemented by an analysis of historical records and mappings. In combination with data related to the vulnerability of a region, risk maps can be derived. The project RISKCATCH addressed these issues of hydrological risk and vulnerability assessment, focusing on the flood risk management process. Flood hazard maps and flood risk maps were compiled for Austrian and German test sites, taking into account existing national and international guidelines. These maps were evaluated by eye-tracking using experimental graphic semiology. Sets of small-scale as well as large-scale risk maps were presented to test persons in order to (1) study reading behaviour as well as understanding and (2) deduce the most attractive components that are essential for target-oriented risk communication. A cognitive survey asking for negative and positive aspects and the complexity of each single map complemented the experimental graphic semiology. The results indicate how risk maps can be improved to fit the needs of different user

  6. [Granulocyte alkaline phosphatase--a biomarker of chronic benzene exposure].

    PubMed

    Khristeva, V; Meshkov, T

    1994-01-01

    Cytochemical determination of alkaline phosphatase activity in leucocytes was included in monitoring the cellular population status of the peripheral blood of workers exposed to benzene. This enzyme is accepted as a marker of neutrophilic granulocytes, as cell maturation and antibacterial activity parallel the cytochemical activity of the enzyme. The study included 78 workers from coke-chemical production at the state firm "Kremikovtsi" and 41 workers from the "Benzene" and "Isopropylbenzene" units of the Oil Chemical Plant, Burgas. The benzene concentrations in the air of the working places in all production units are in the range of 5 to 50 mg/m3. The method of L. Kaplow was used for cytochemical determination of alkaline phosphatase activity, and a phosphatase index was calculated. Alkaline phosphatase activity was found to be inhibited to varying degrees in 98.4% of all those examined, and was zero in 46.5% (61 workers). Other deviations were established in a considerably lower percentage of workers: leucocytosis or leucopenia, neutropenia, an increased percentage of band neutrophils, and toxic granules. The results of the investigation of the granulocyte population show that, of all indices, the activity of granulocyte alkaline phosphatase demonstrates most convincingly the early myelotoxic effect of benzene.

  7. The effects of floodplain forest restoration and logjams on flood risk and flood hydrology

    NASA Astrophysics Data System (ADS)

    Dixon, Simon; Sear, David A.; Sykes, Tim; Odoni, Nicholas

    2015-04-01

    Flooding is the most common natural catastrophe, accounting for around half of all natural disaster related deaths and causing economic losses in Europe estimated at over €2bn per year. In addition, flooding is expected to increase in magnitude and frequency with climate change, effectively shortening the return period for a given magnitude of flood. Increasing the height and extent of hard engineered defences in response to increased risk is both unsustainable and undesirable. Alternative approaches to flood mitigation are therefore needed, such as harnessing vegetation processes to slow the passage of flood waves and increase local flood storage. However, our understanding of these effects at the catchment scale is limited. In this presentation we demonstrate the effects of two river restoration approaches upon catchment scale flood hydrology. The addition of large wood to river channels during river restoration projects is a popular method of attempting to improve physical and biological conditions in degraded river systems. Projects utilising large wood can involve the installation of engineered logjams (ELJs), the planting and enhancement of riparian forests, or a combination of both. Altering the wood loading of a channel through installation of ELJs and increasing floodplain surface complexity through encouraging mature woodland can be expected to increase the local hydraulic resistance, increasing the duration of overbank events locally and therefore increasing the travel time of a flood wave through a reach. This reach-scale effect has been documented in models and in the field; however, the impact of these local changes at the catchment scale remains to be illustrated. Furthermore, there is limited knowledge of how the changing successional stages of a restored riparian forest through time may affect its influence on hydromorphic processes. We present results of a novel paired numerical modelling study. We model changes in flood hydrology based on a 98km
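
    The reach-scale mechanism summarised above (higher floodplain roughness slowing overbank flow and lengthening flood-wave travel time) can be illustrated with a minimal sketch based on Manning's equation. The roughness coefficients, hydraulic radius, slope and reach length below are illustrative assumptions, not values from the study.

    def manning_velocity(n, R, S):
        """Mean flow velocity (m/s) from Manning's equation: v = R^(2/3) * S^(1/2) / n."""
        return (R ** (2.0 / 3.0)) * (S ** 0.5) / n

    reach_length = 5000.0   # m, hypothetical reach
    R = 0.8                 # m, assumed hydraulic radius of overbank flow
    S = 0.001               # assumed water-surface slope

    for label, n in [("grassed floodplain", 0.035),
                     ("restored woodland with logjams", 0.12)]:
        v = manning_velocity(n, R, S)
        travel_time_h = reach_length / v / 3600.0
        print(f"{label}: v = {v:.2f} m/s, reach travel time = {travel_time_h:.1f} h")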

  8. Real-time simulation of large-scale floods

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared with those of MIKE21 show the strong performance of the proposed model.
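
    As a rough illustration of the ingredients named above (a Godunov-type finite volume update, approximate Riemann fluxes and a wet/dry threshold), the following is a minimal one-dimensional sketch with an HLL flux on a uniform grid. It is a simplified stand-in, not the authors' two-dimensional unstructured scheme; the grid, thresholds and dam-break initial condition are assumptions.

    import numpy as np

    g = 9.81
    H_DRY = 1e-6          # wet/dry threshold (assumption)

    def hll_flux(hl, hul, hr, hur):
        """HLL numerical flux for the 1-D shallow water equations."""
        ul = hul / hl if hl > H_DRY else 0.0
        ur = hur / hr if hr > H_DRY else 0.0
        cl, cr = np.sqrt(g * max(hl, 0.0)), np.sqrt(g * max(hr, 0.0))
        sl = min(ul - cl, ur - cr)
        sr = max(ul + cl, ur + cr)
        fl = np.array([hul, hul * ul + 0.5 * g * hl * hl])
        fr = np.array([hur, hur * ur + 0.5 * g * hr * hr])
        if sl >= 0.0:
            return fl
        if sr <= 0.0:
            return fr
        ql = np.array([hl, hul])
        qr = np.array([hr, hur])
        return (sr * fl - sl * fr + sl * sr * (qr - ql)) / (sr - sl)

    # Dam-break test on a uniform grid (boundary cells are held fixed for simplicity)
    nx, dx = 200, 1.0
    h = np.where(np.arange(nx) < nx // 2, 2.0, 0.5)   # water depth
    hu = np.zeros(nx)                                  # discharge per unit width
    t, t_end, cfl = 0.0, 10.0, 0.45

    while t < t_end:
        c = np.sqrt(g * np.maximum(h, H_DRY))
        dt = cfl * dx / np.max(np.abs(hu / np.maximum(h, H_DRY)) + c)
        flux = np.array([hll_flux(h[i], hu[i], h[i + 1], hu[i + 1])
                         for i in range(nx - 1)])
        # First-order Godunov update of the interior cells
        h[1:-1] -= dt / dx * (flux[1:, 0] - flux[:-1, 0])
        hu[1:-1] -= dt / dx * (flux[1:, 1] - flux[:-1, 1])
        hu[h < H_DRY] = 0.0   # dry cells carry no momentum
        t += dt

    print("max depth after dam break:", h.max())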

  9. Participatory approaches to understanding practices of flood management across borders

    NASA Astrophysics Data System (ADS)

    Bracken, L. J.; Forrester, J.; Oughton, E. A.; Cinderby, S.; Donaldson, A.; Anness, L.; Passmore, D.

    2012-04-01

    The aim of this paper is to outline and present initial results from a study designed to identify principles of, and practices for, adaptive co-management strategies for resilience to flooding in borderlands using participatory methods. Borderlands are the complex and sometimes undefined spaces existing at the interface of different territories; the concept draws attention towards messy connections and disconnections (Strathern 2004; Sassen 2006). For this project the borderlands concerned are those between professional and lay knowledge, between responsible agencies, and between one nation and another. Research was focused on the River Tweed catchment, located on the Scottish-English border. This catchment is subject to complex environmental designations and rural development regimes that make integrated management of the whole catchment difficult. A multi-method approach was developed using semi-structured interviews, Q methodology and participatory GIS in order to capture wide-ranging practices for managing flooding and the judgements behind these practices, and to 'scale up' participation in the study. Professionals and local experts were involved in the research. The methodology generated a useful set of options for flood management, with research outputs easily understood by key management organisations and the wider public alike. There was wide endorsement of alternative flood management solutions from both managers and local experts. The role of location was particularly important for ensuring communication and data sharing between flood managers from different organisations and a wider range of stakeholders. There were complex issues around scale: both the mismatch between communities and evidence of flooding, and the mismatch between governance and the scale of intervention for natural flood management. The multi-method approach was essential in capturing practice and the complexities around governance of flooding. The involvement of key flood management organisations was

  10. Flood resilience urban territories.

    NASA Astrophysics Data System (ADS)

    Beraud, Hélène; Barroca, Bruno; Hubert, Gilles

    2010-05-01

    The impact of floods on French territory over the last twenty years reveals our lack of preparation for large-scale floods, which can halt companies' activity and services or make housing unavailable for several months. The case of New Orleans should serve as an example: four years after the disaster, the city still has not recovered its dynamism. In France, more than 300 towns are exposed to flooding. Since these towns are the mainspring of territorial development, it is likely that the majority of them could not recover quickly after a large-scale flood. Understanding and improving the resilience of urban territories to floods is therefore a real stake for territorial development. Urban technical networks supply, unify and irrigate all of an urban territory's constituents, and characterising their flood resilience can help in better understanding urban resilience. In this context, waste management during and after floods is crucial. During a flood, the waste management network can become dysfunctional (roads cut, waste storage or treatment installations flooded). How can the mayor meet his obligation to guarantee health and security in his city? After the flood the question is even more problematic: the waste management network presents a real stake for restarting the territory. After a flood, building materials, lopped-off branches, furniture, business stocks, farm stocks, mud, rubble and animal carcasses are wet, mixed, and even polluted by hydrocarbons or toxic substances. The volume of waste can be significant, and the sanitary and environmental risks can be crucial. In view of this situation, managing waste in the post-crisis period raises a real problem. What is to be done with this waste? How should it be collected? Where should it be stored? How should it be processed? Who is responsible? Answering these questions is all the more strategic since this waste is the mark of the disaster. Thus, cleaning will be the first reflex of the population and local actors in order to forget the

  11. Improving Flood Damage Assessment Models in Italy

    NASA Astrophysics Data System (ADS)

    Amadio, M.; Mysiak, J.; Carrera, L.; Koks, E.

    2015-12-01

    The use of Stage-Damage Curve (SDC) models is prevalent in ex-ante assessments of flood risk. To assess the potential damage of a flood event, SDCs describe a relation between water depth and the associated potential economic damage for each land use class. This relation is normally developed and calibrated through site-specific analysis based on ex-post damage observations. In some cases (e.g. Italy) SDCs are transferred from other countries, undermining the accuracy and reliability of simulation results. Against this background, we developed a refined SDC model for Northern Italy, underpinned by damage compensation records from a recent flood event. Our analysis considers both damage to physical assets and production losses from business interruptions. While the former is calculated based on land use information, production losses are measured through the spatial distribution of Gross Value Added (GVA). An additional component of the model assesses crop-specific agricultural losses as a function of flood seasonality. Our results show an overestimation of asset damage from non-calibrated SDC values of up to a factor of 4.5 for the tested land use categories. Furthermore, we estimate that production losses amount to around 6 per cent of the annual GVA. Also, maximum yield losses are less than half of the amount predicted by the standard SDC methods.
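
    The basic SDC step described above (water depth mapped to a damage fraction per land use class and scaled by the exposed asset value) can be sketched as follows. The depth-damage points and asset value are illustrative assumptions, not the calibrated curves from this study.

    import numpy as np

    SDC = {  # depth (m) -> damage fraction, hypothetical curves
        "residential": ([0.0, 0.5, 1.0, 2.0, 4.0], [0.0, 0.15, 0.35, 0.60, 0.85]),
        "industrial":  ([0.0, 0.5, 1.0, 2.0, 4.0], [0.0, 0.10, 0.25, 0.50, 0.75]),
    }

    def damage(depth_m, land_use, asset_value_eur):
        depths, fractions = SDC[land_use]
        frac = np.interp(depth_m, depths, fractions)  # linear interpolation, clipped at the ends
        return frac * asset_value_eur

    print(damage(1.3, "residential", 250_000))  # expected damage for one exposed asset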

  12. Amazon flood wave hydraulics

    NASA Astrophysics Data System (ADS)

    Trigg, Mark A.; Wilson, Matthew D.; Bates, Paul D.; Horritt, Matthew S.; Alsdorf, Douglas E.; Forsberg, Bruce R.; Vega, Maria C.

    2009-07-01

    A bathymetric survey of 575 km of the central Amazon River and one of its tributaries, the Purus, is combined with gauged data to characterise the Amazon flood wave and for hydraulic modelling of the main channel for the period June 1995-March 1997 with the LISFLOOD-FP and HEC-RAS hydraulic models. Our investigations show that the Amazon flood wave is subcritical and diffusive in character and, due to shallow bed slopes, backwater conditions control significant reach lengths and are present at both low and high water states. Comparison of the different models shows that it is necessary to include at least the diffusion term in any model; the RMSE in predicted water elevation at all cross sections introduced by ignoring the acceleration and advection terms is of the order of 0.02-0.03 m. The use of a wide rectangular channel approximation introduces an error of 0.10-0.15 m in the predicted water levels. Reducing the bathymetry to a simple bed slope with a mean cross section only introduces an error of the order of 0.5 m. These results show that, when compared to the mean annual amplitude of the Amazon flood wave of 11-12 m, water levels are relatively insensitive to the bathymetry of the channel model. The implication for remote sensing studies of the central Amazon channel, such as those proposed with the Surface Water and Ocean Topography mission (SWOT), is that even relatively crude assumptions regarding the channel bathymetry will be valid for deriving discharge from the water surface slope of the main channel, as long as the mean channel area is approximately correct.
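
    For reference, the terms being compared above can be read off the standard one-dimensional Saint-Venant momentum balance (textbook form, with assumed notation; not quoted from the paper):

    % 1-D Saint-Venant momentum equation: local acceleration + advection
    % + pressure gradient = gravity (bed slope) - friction
    \frac{\partial Q}{\partial t}
      + \frac{\partial}{\partial x}\!\left(\frac{Q^{2}}{A}\right)
      + gA\,\frac{\partial h}{\partial x}
      = gA\,(S_{0} - S_{f})
    % Diffusive-wave models drop the first two (acceleration and advection)
    % terms; kinematic-wave models additionally assume S_f = S_0.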

  13. Flash flood modelling for ungauged catchments

    NASA Astrophysics Data System (ADS)

    Garambois, P.-A.; Roux, H.; Larnier, K.; Dartus, D.

    2012-04-01

    A flash flood is a very intense and quick hydrologic response of a catchment to rainfall. The phenomenon has a high spatial-temporal variability, like its generating storm, and often hits small catchments (a few km2). Data on about 500 flash floods over the last 50 years collected by Gaume et al. (2009) showed that they can occur everywhere in Europe, and most often in the Mediterranean regions, Alpine regions and continental Europe. Given the small spatial-temporal scales and the high variability of flash floods, their prediction remains a hard exercise, as the necessary data are often scarce. Flash flood prediction on ungauged catchments is one of the challenges of hydrological modelling as defined by Sivapalan et al. (2003). Several studies have been carried out with the MARINE model (Modélisation de l'Anticipation du Ruissellement et des Inondations pour des évèNements Extrêmes) for the Gard region (France) (Roux et al. 2011; Castaings et al. 2009). This physically based, spatially distributed rainfall-runoff model is dedicated to flash flood prediction. The study aims at finding a methodology for flash flood prediction at ungauged locations, in the Cévennes-Vivarais region in particular. The regionalization method is based on multiple calibrations on gauged catchments in order to extract model structures (model + parameter values) for each catchment. Several mathematical methods (multiple regressions, transfer functions, kriging…) will then be tested to calculate a regional parameter set. The study also investigates the usability of additional hydrologic indices at different time scales to constrain model predictions, independently of the model considered. These hydrologic indices gather information on hydrograph shape or catchment dynamics, for instance. Results explaining global catchment behaviour are expected in this way. The spatial-temporal variability of storms is also described through indices and linked with
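
    One regionalisation step mentioned above, transferring calibrated parameter values to ungauged catchments via multiple regression on catchment descriptors, can be sketched as follows. The descriptors, their values and the calibrated parameter are hypothetical, not MARINE calibration results.

    import numpy as np

    # rows = gauged catchments; columns = [area_km2, mean_slope, drainage_density]
    descriptors = np.array([
        [45.0, 0.12, 1.8],
        [120.0, 0.08, 2.1],
        [230.0, 0.15, 1.5],
        [60.0, 0.10, 2.4],
        [310.0, 0.05, 1.9],
    ])
    calibrated_param = np.array([0.31, 0.45, 0.28, 0.52, 0.40])  # e.g. a soil parameter

    # Least-squares fit of param ~ intercept + descriptors
    X = np.column_stack([np.ones(len(descriptors)), descriptors])
    coeffs, *_ = np.linalg.lstsq(X, calibrated_param, rcond=None)

    # Transfer to an ungauged catchment with known descriptors
    ungauged = np.array([1.0, 95.0, 0.11, 2.0])
    print("regionalised parameter estimate:", ungauged @ coeffs)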

  14. Flooding in Central Siberia

    NASA Technical Reports Server (NTRS)

    2002-01-01

    A mixture of snowmelt and ice jams in late May and June of this year caused the Taz River (left) and the Yenisey River (right) in central Siberia to overflow their banks. The flooding can be seen in this image taken on June 11, 2002, by the MODIS (Moderate Resolution Imaging Spectroradiometer) instrument aboard the Terra satellite. Normally, the rivers would resemble thin black lines in MODIS imagery. In the false-color image, sage green and rusty orange are land, and water is black. Clouds are white and pink. Credit: Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC

  15. Tharsis Flood Features

    NASA Technical Reports Server (NTRS)

    2005-01-01

    17 July 2005 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows channels carved by catastrophic floods in the Tharsis region of Mars. This area is located northwest of the volcano, Jovis Tholus, and east of the large martian volcano, Olympus Mons. The terrain is presently mantled with fine dust.

    Location near: 20.8°N, 118.8°W; Image width: 3 km (1.9 mi); Illumination from: lower left; Season: Northern Autumn

  16. Floods in Central China

    NASA Technical Reports Server (NTRS)

    2002-01-01

    This pair of true- and false-color images from the Moderate Resolution Imaging Spectroradiometer (MODIS) shows flooding in central China on July 4, 2002. In the false-color image, vegetation appears orange and water appears dark blue to black. Because of the cloud cover, and because some of the water is laden with sediment, the false-color image provides a clearer picture of where rivers have exceeded their banks and lakes have risen. The river in this image is the Yangtze River, and the large lake is Poyang Hu. Credits: Jacques Descloitres, MODIS Land Rapid Response Team, NASA/GSFC

  17. Uncertainty introduced by flood frequency analysis in the estimation of climate change impacts on flooding

    NASA Astrophysics Data System (ADS)

    Lawrence, Deborah

    2016-04-01

    Potential changes in extreme flooding under a future climate are of much interest in climate change adaptation work, and estimates for high flows with long return periods are often based on an application of flood frequency analysis methods. The uncertainty introduced by this estimation is, however, only rarely considered when assessing changes in flood magnitude. In this study, an ensemble of hydrological projections for each of 115 catchments distributed across Norway is analysed to derive an estimate of the percentage change in the magnitude of the 200-year flood under a future climate. This is the return level used for flood hazard mapping in Norway. The ensemble of projections is based on climate data from 10 EURO-CORDEX GCM/RCM combinations, two bias correction methods (empirical quantile mapping and double gamma function), and 25 alternative parameterisations of the HBV hydrological model. For each hydrological simulation, the annual maximum series is used to estimate the 200-year flood for the reference period, 1971-2000, and a future period, 2071-2100, based on two- and three-parameter GEV distributions. In addition, bootstrap resampling is used to estimate the 95% confidence levels for the extreme value estimates, and this range is incorporated into the ensemble estimates for each catchment. As has been shown in previous work based on earlier climate projections, there are large regional differences in the projected changes in the 200-year flood across Norway, with median ensemble projections ranging from -44% to +56% for the daily-averaged flood magnitude. These differences reflect the relative importance of rainfall vs. snowmelt as the dominant flood generating process in different regions, at differing altitudes and as a function of catchment area, in addition to dominant storm tracks. Variance decomposition is used to assess the relative contributions of the following components to the total spread (given by the 5 to 95% range) in the ensemble for each
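
    The estimation step described above, fitting a GEV distribution to an annual maximum series, reading off the 200-year return level and bootstrapping its confidence interval, can be sketched as follows. The synthetic annual maxima and the number of bootstrap replicates are illustrative assumptions, not data from the Norwegian catchments.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    annual_max = stats.genextreme.rvs(c=-0.1, loc=300.0, scale=80.0,
                                      size=30, random_state=rng)  # 30 years of synthetic annual maxima

    def q200(sample):
        """200-year return level from a three-parameter GEV fit."""
        c, loc, scale = stats.genextreme.fit(sample)
        return stats.genextreme.ppf(1.0 - 1.0 / 200.0, c, loc=loc, scale=scale)

    best = q200(annual_max)

    # Nonparametric bootstrap of the 95% confidence interval
    boot = [q200(rng.choice(annual_max, size=annual_max.size, replace=True))
            for _ in range(500)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"200-year flood: {best:.0f} (95% CI {lo:.0f}-{hi:.0f})")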

  18. Ab Initio Thermochemistry and Elastic Properties of Alkaline Earth Hydrides

    NASA Astrophysics Data System (ADS)

    Hector, Louis, Jr.; Herbst, Jan; Wolf, Walter; Saxe, Paul

    2006-03-01

    In addition to comprising a scientifically interesting class of materials, the binary alkaline earth hydrides are important components of hydrogen sorption/desorption reactions. Of critical importance for predicting the thermodynamic stability of hydrides is the enthalpy of hydride formation, ΔH, which links the temperature and pressure of hydrogen sorption via the van't Hoff relation. We compare LDA and GGA predictions of the heats of formation and elastic properties of alkaline earth metals and their binary hydrides BeH2, MgH2, CaH2, SrH2, and BaH2 using a plane wave density functional method. Phonon calculations using the direct method enabled prediction of the zero point energies of each material and the 0 K and 298 K heats of formation. We also computed the 0 K and 298 K cohesive energies for the alkaline earth metals. Born effective charge tensors were computed via the Berry phase method and enabled prediction of the phonon dispersion curves with LO/TO zone center splittings. It was found that the LO/TO splittings have no effect on the computed zero point energies and heats of formation. The elastic constants were computed with a least squares fitting method using a set of sequentially applied strains to improve the accuracy of each calculation. Comparison of results from the least squares methodology with prior results using the Hartree-Fock method suggests that the former is substantially more accurate for predicting hydride elastic properties.
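
    For context, the van't Hoff relation referred to above, written in its standard form for a metal-hydride equilibrium (notation is assumed, not quoted from the abstract):

    % van't Hoff relation for M + H2 <-> MH2 (plateau pressure of the hydride):
    \ln\!\left(\frac{p_{\mathrm{H_2}}}{p^{\circ}}\right)
      = \frac{\Delta H}{R\,T} - \frac{\Delta S}{R}
    % Delta H, Delta S: enthalpy and entropy of hydride formation per mole of H2;
    % p_H2: equilibrium hydrogen pressure; p°: reference pressure; R: gas constant.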

  19. Detection of flooded urban areas in high resolution Synthetic Aperture Radar images using double scattering

    NASA Astrophysics Data System (ADS)

    Mason, D. C.; Giustarini, L.; Garcia-Pintado, J.; Cloke, H. L.

    2014-05-01

    Flooding is a particular hazard in urban areas worldwide due to the increased risks to life and property in these regions. Synthetic Aperture Radar (SAR) sensors are often used to image flooding because of their all-weather, day-night capability, and they now possess sufficient resolution to image urban flooding. The flood extents extracted from the images may be used for flood relief management and improved urban flood inundation modelling. A difficulty with using SAR for urban flood detection is that, because of its side-looking nature, substantial areas of the urban ground surface may not be visible to the SAR, owing to radar layover and shadow caused by buildings and taller vegetation. This paper investigates whether urban flooding can be detected in layover regions (where flooding may not normally be apparent) using double scattering between the (possibly flooded) ground surface and the walls of adjacent buildings. The method estimates double scattering strengths using a SAR image in conjunction with a high resolution LiDAR (Light Detection and Ranging) height map of the urban area. A SAR simulator is applied to the LiDAR data to generate maps of layover and shadow and to estimate the positions of double scattering curves in the SAR image. Observations of double scattering strengths were compared to the predictions from an electromagnetic scattering model, both for the case of a single image containing flooding and for a change detection case in which the flooded image was compared to an un-flooded image of the same area acquired with the same radar parameters. The method proved successful in detecting double scattering due to flooding in the single-image case, for which flooded double scattering curves were detected with 100% classification accuracy (albeit using a small sample set) and un-flooded curves with 91% classification accuracy. The same measures of success were achieved using change detection between flooded and un-flooded images. Depending on the particular flooding

  20. Comparison of Hydrograph Deconvolutions using Residual Alkalinity, Chloride, and Oxygen 18 as Hydrochemical Tracers

    NASA Astrophysics Data System (ADS)

    Ribolzi, O.; VallèS, V.; Bariac, T.

    1996-04-01

    Hydrograph deconvolution using geochemical tracers is widely used for determining the hydrologic mechanisms occurring in watersheds. However, few chemical parameters can be used as tracers, because their involvement in biogeochemical processes prevents them from behaving in a conservative way. The aim of this study was to combine several geochemically controlled parameters into a single tracer. Residual alkalinity is a combination of several controlled parameters and is conservative in a wide range of natural environments. It was used in this study to quantify the contributions of surface runoff and of groundwater flow during a flood in a Mediterranean watershed underlain by sedimentary rock. A preliminary geochemical study revealed that interactions with calcite, dolomite, and the clay-humus complex controlled calcium and magnesium concentrations as well as carbonate alkalinity (Alk_c), which prevented their use as tracers. Nevertheless, although residual alkalinity (Alk_residual) is a combination of these three parameters (Alk_residual = Alk_c - 2[Ca2+]_T - 2[Mg2+]_T), it provided results that were highly comparable to those obtained using chloride and δ18O. Contrary to most cases in the literature, the contribution of direct runoff was dominant (about 80% at peak discharge). Accuracy estimates, which took into account analytical errors, temporal variations in the isotopic signature of rainfall, and the spatial variability of chemical elements, supported this result and confirmed that residual alkalinity is a useful concept in hydrology.
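
    The deconvolution itself, regardless of which conservative tracer is chosen, reduces to a two-component mixing calculation; a minimal sketch follows, in which the tracer concentrations are illustrative assumptions rather than values from the study.

    # Two-component hydrograph separation with a conservative tracer:
    #   Q_t*C_t = Q_r*C_r + Q_g*C_g  and  Q_t = Q_r + Q_g,
    # so the direct-runoff fraction is (C_t - C_g) / (C_r - C_g).

    def runoff_fraction(c_stream, c_runoff, c_groundwater):
        """Fraction of streamflow contributed by direct (surface) runoff."""
        return (c_stream - c_groundwater) / (c_runoff - c_groundwater)

    # e.g. residual alkalinity (meq/L) of the stream at peak discharge,
    # of rain-derived surface runoff, and of pre-event groundwater (assumed values)
    f_runoff = runoff_fraction(c_stream=0.4, c_runoff=0.1, c_groundwater=1.6)
    print(f"direct runoff contribution at peak: {f_runoff:.0%}")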