Sample records for framework petrophysical quantification

  1. Applying petrophysical models to radar travel time and electrical resistivity tomograms: Resolution-dependent limitations

    USGS Publications Warehouse

    Day-Lewis, F. D.; Singha, K.; Binley, A.M.

    2005-01-01

    Geophysical imaging has traditionally provided qualitative information about geologic structure; however, there is increasing interest in using petrophysical models to convert tomograms to quantitative estimates of hydrogeologic, mechanical, or geochemical parameters of interest (e.g., permeability, porosity, water content, and salinity). Unfortunately, petrophysical estimation based on tomograms is complicated by limited and variable image resolution, which depends on (1) measurement physics (e.g., electrical conduction or electromagnetic wave propagation), (2) parameterization and regularization, (3) measurement error, and (4) spatial variability. We present a framework to predict how core-scale relations between geophysical properties and hydrologic parameters are altered by the inversion, which produces smoothly varying pixel-scale estimates. We refer to this loss of information as "correlation loss." Our approach upscales the core-scale relation to the pixel scale using the model resolution matrix from the inversion, random field averaging, and spatial statistics of the geophysical property. Synthetic examples evaluate the utility of radar travel time tomography (RTT) and electrical-resistivity tomography (ERT) for estimating water content. This work provides (1) a framework to assess tomograms for geologic parameter estimation and (2) insights into the different patterns of correlation loss for ERT and RTT. Whereas ERT generally performs better near boreholes, RTT performs better in the interwell region. Application of petrophysical models to the tomograms in our examples would yield misleading estimates of water content. Although the examples presented illustrate the problem of correlation loss in the context of near-surface geophysical imaging, our results have clear implications for quantitative analysis of tomograms for diverse geoscience applications. Copyright 2005 by the American Geophysical Union.
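
    The "correlation loss" described above can be illustrated with a toy 1-D Python sketch: a smoothing operator standing in for the model resolution matrix blurs the geophysical property, and applying the core-scale petrophysical relation to the blurred tomogram yields estimates with reduced correlation and variance. The linear slowness-water content relation, the field statistics and the Gaussian blur below are illustrative assumptions, not the paper's actual models.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic "core-scale" water content: correlated 1-D random field
      n = 1000
      white = rng.standard_normal(n)
      kernel = np.exp(-0.5 * (np.arange(-25, 26) / 8.0) ** 2)
      theta = 0.20 + 0.05 * np.convolve(white, kernel / kernel.sum(), mode="same")

      # Assumed linear petrophysical relation: radar slowness = a + b * water content
      a, b = 2.0, 8.0
      slowness = a + b * theta

      # Resolution effect approximated by a wider Gaussian smoother (tomographic blur)
      blur = np.exp(-0.5 * (np.arange(-90, 91) / 30.0) ** 2)
      blur /= blur.sum()
      slowness_pixel = np.convolve(slowness, blur, mode="same")

      # Apply the core-scale relation to the pixel-scale (smoothed) tomogram
      theta_est = (slowness_pixel - a) / b

      # Correlation loss: weaker correlation and reduced variance at the pixel scale
      print("corr(true, estimated):", round(np.corrcoef(theta, theta_est)[0, 1], 3))
      print("std ratio (estimated/true):", round(theta_est.std() / theta.std(), 3))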

  2. Joint inversion of geophysical data using petrophysical clustering and facies deformation with the level set technique

    NASA Astrophysics Data System (ADS)

    Revil, A.

    2015-12-01

    Geological expertise and petrophysical relationships can be brought together to provide prior information while inverting multiple geophysical datasets. The merging of such information can result in more realistic solutions for the distribution of the model parameters, reducing ipso facto the non-uniqueness of the inverse problem. We consider two levels of heterogeneity: facies, described by facies boundaries, and heterogeneities inside each facies, determined by a correlogram. In this presentation, we pose the geophysical inverse problem in terms of Gaussian random fields with mean functions controlled by petrophysical relationships and covariance functions controlled by a prior geological cross-section, including the definition of spatial boundaries for the geological facies. The petrophysical relationship problem is formulated as a regression problem upon each facies. The inversion of the geophysical data is performed in a Bayesian framework. We demonstrate the usefulness of this strategy using a first synthetic case for which we perform a joint inversion of gravity and galvanometric resistivity data with the stations located at the ground surface. The joint inversion is used to recover the density and resistivity distributions of the subsurface. In a second step, we consider the possibility that the facies boundaries are deformable and their shapes are inverted as well. We use the level set approach to perform such deformation, preserving the prior topological properties of the facies throughout the inversion. With the help of prior facies petrophysical relationships and the topological characteristics of each facies, we make posterior inferences about multiple geophysical tomograms based on their corresponding geophysical data misfits. The method is applied to a second synthetic case, showing that we can recover the heterogeneities inside the facies, the mean values of the petrophysical properties, and, to some extent, the facies boundaries using the 2D joint inversion of gravity and galvanometric resistivity data. For this 2D synthetic example, we note that the positions of the facies are well recovered except far from the ground surface, where the sensitivity is too low. The figure shows the evolution of the shape of the facies during the inversion, iteration by iteration.

  3. Joint inversion of multiple geophysical and petrophysical data using generalized fuzzy clustering algorithms

    NASA Astrophysics Data System (ADS)

    Sun, Jiajia; Li, Yaoguo

    2017-02-01

    Joint inversion that simultaneously inverts multiple geophysical data sets to recover a common Earth model is increasingly being applied to exploration problems. Petrophysical data can serve as an effective constraint to link different physical property models in such inversions. There are two challenges, among others, associated with the petrophysical approach to joint inversion. One is related to the multimodality of petrophysical data because there often exists more than one relationship between different physical properties in a region of study. The other challenge arises from the fact that petrophysical relationships have different characteristics and can exhibit point, linear, quadratic, or exponential forms in a crossplot. The fuzzy c-means (FCM) clustering technique is effective in tackling the first challenge and has been applied successfully. We focus on the second challenge in this paper and develop a joint inversion method based on variations of the FCM clustering technique. To account for the specific shapes of petrophysical relationships, we introduce several different fuzzy clustering algorithms that are capable of handling different shapes of petrophysical relationships. We present two synthetic and one field data examples and demonstrate that, by choosing appropriate distance measures for the clustering component in the joint inversion algorithm, the proposed joint inversion method provides an effective means of handling common petrophysical situations we encounter in practice. The jointly inverted models have both enhanced structural similarity and increased petrophysical correlation, and better represent the subsurface in the spatial domain and the parameter domain of physical properties.
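
    A minimal fuzzy c-means sketch in Python with a pluggable distance function gives a flavour of the clustering component such a joint inversion builds on; the Euclidean distance used here is the standard FCM choice, and the shape-specific measures of Sun and Li are not reproduced, so the synthetic crossplot and all parameter values are assumptions.

      import numpy as np

      def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, dist=None, seed=0):
          """Basic fuzzy c-means. dist(X, centers) returns an (n_samples, n_clusters)
          distance matrix; swapping it is where shape-specific measures would enter."""
          if dist is None:
              dist = lambda X, C: np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2)
          rng = np.random.default_rng(seed)
          U = rng.random((X.shape[0], n_clusters))
          U /= U.sum(axis=1, keepdims=True)              # fuzzy memberships
          for _ in range(n_iter):
              W = U ** m
              centers = (W.T @ X) / W.sum(axis=0)[:, None]
              D = np.maximum(dist(X, centers), 1e-12)
              inv = D ** (-2.0 / (m - 1.0))
              U = inv / inv.sum(axis=1, keepdims=True)   # standard FCM membership update
          return centers, U

      # Two synthetic physical-property trends in a crossplot (illustrative only)
      rng = np.random.default_rng(1)
      x = rng.uniform(0.0, 1.0, 200)
      trend1 = np.column_stack([x, 0.5 * x + 0.02 * rng.standard_normal(200)])
      trend2 = np.column_stack([x, 0.9 - 0.3 * x + 0.02 * rng.standard_normal(200)])
      centers, U = fuzzy_c_means(np.vstack([trend1, trend2]), n_clusters=2)
      print("cluster centers:\n", centers)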

  4. Modeling dolomitized carbonate-ramp reservoirs: A case study of the Seminole San Andres unit. Part 2 -- Seismic modeling, reservoir geostatistics, and reservoir simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, F.P.; Dai, J.; Kerans, C.

    1998-11-01

    In part 1 of this paper, the authors discussed the rock-fabric/petrophysical classes for dolomitized carbonate-ramp rocks, the effects of rock fabric and pore type on petrophysical properties, petrophysical models for analyzing wireline logs, the critical scales for defining geologic framework, and 3-D geologic modeling. Part 2 focuses on geophysical and engineering characterizations, including seismic modeling, reservoir geostatistics, stochastic modeling, and reservoir simulation. Synthetic seismograms of 30 to 200 Hz were generated to study the level of seismic resolution required to capture the high-frequency geologic features in dolomitized carbonate-ramp reservoirs. Outcrop data were collected to investigate effects of sampling interval and scale-up of block size on geostatistical parameters. Semivariogram analysis of outcrop data showed that the sill of log permeability decreases and the correlation length increases with an increase of horizontal block size. Permeability models were generated using conventional linear interpolation, stochastic realizations without stratigraphic constraints, and stochastic realizations with stratigraphic constraints. Simulations of a fine-scale Lawyer Canyon outcrop model were used to study the factors affecting waterflooding performance. Simulation results show that waterflooding performance depends strongly on the geometry and stacking pattern of the rock-fabric units and on the location of production and injection wells.
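
    The semivariogram analysis mentioned above can be pictured with the short Python sketch below, which computes an experimental semivariogram for a regularly sampled 1-D log-permeability transect; the synthetic field is an assumption and is not the Lawyer Canyon outcrop data.

      import numpy as np

      def semivariogram(z, max_lag):
          """Experimental semivariogram gamma(h) = 0.5 * mean((z[i+h] - z[i])**2)
          for a regularly sampled 1-D transect (lag h in sample units)."""
          lags = np.arange(1, max_lag + 1)
          gamma = np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
          return lags, gamma

      # Synthetic correlated log-permeability transect (illustrative only)
      rng = np.random.default_rng(2)
      logk = np.convolve(rng.standard_normal(2000), np.exp(-np.arange(60) / 15.0), mode="same")
      logk = (logk - logk.mean()) / logk.std()

      lags, gamma = semivariogram(logk, max_lag=100)
      sill = logk.var()
      print("sill (variance):", round(sill, 3))
      print("approx. range (first lag with gamma >= 95% of sill):",
            lags[np.argmax(gamma >= 0.95 * sill)])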

  5. Geological implications and controls on the determination of water saturation in shale gas reservoirs

    NASA Astrophysics Data System (ADS)

    Hartigan, David; Lovell, Mike; Davies, Sarah

    2014-05-01

    A significant challenge to the petrophysical evaluation of shale gas systems can be attributed to the conductivity behaviour of clay minerals and entrained clay-bound waters. This is compounded by centimetre- to sub-millimetre-scale vertical and lateral heterogeneity in formation composition and structure. Despite significant variation in geological, and therefore petrophysical, properties, we routinely rely on conventional resistivity methods for the determination of water saturation (Sw), and hence the free gas saturation (Sg), in gas-bearing mudstones. The application of resistivity-based methods is the subject of continuing debate, and there is often significant uncertainty in both how they are applied and the saturation estimates they produce. This is partly a consequence of the view that "the quantification of the behaviour of shale conductivity....has only limited geological significance" (Rider 1986). As a result, there is a separation between our geological understanding of shale gas systems and the petrophysical rationale and methods employed to evaluate them. In response to this uncertainty, many petrophysicists are moving away from the use of more complex 'shaly-sand' based evaluation techniques and returning to traditional Archie methods for answers. The Archie equation requires various parameter inputs, such as the porosity and saturation exponents (m and n), as well as values for connate fluid resistivity (Rw). Many of these parameters are difficult to determine in shale gas systems, where obtaining a water sample, or carrying out laboratory experiments on recovered core, is often technically impractical. Here we assess the geological implications and controls on variations in pseudo-Archie parameters across two geological formations, using well data spanning multiple basinal settings for a prominent shale gas play in the northern Gulf of Mexico basin. The results of numerical analysis and systematic modification of parameter values to minimise the error between core-derived Sw (Dean-Stark analysis) and computed Sw link sample structure with composition, highlighting some unanticipated impacts of clay minerals on the effective bulk fluid resistivity (Rwe) and thus formation resistivity (Rt). In addition, they highlight simple corrective empirical adaptations that can significantly reduce the error in Sw estimation for some wells. Observed results hint at the possibility of developing a predictive capability in selecting Archie parameter values based on geological facies association and log composition indicators (e.g. Vclay), establishing a link between formation depositional systems and their petrophysical properties in gas-bearing mudstones. Rider, M.H., 1986. The Geological Interpretation of Well Logs, Blackie.
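
    For reference, the Archie relation referred to above can be written Sw = ((a * Rw) / (phi**m * Rt))**(1/n); the short Python sketch below evaluates it for purely illustrative parameter values, not for the formations studied here.

      def archie_sw(rt, rw, phi, a=1.0, m=2.0, n=2.0):
          """Archie water saturation: Sw = ((a * Rw) / (phi**m * Rt)) ** (1/n).
          rt: formation resistivity (ohm-m), rw: connate water resistivity (ohm-m),
          phi: porosity (fraction); a, m, n: tortuosity factor, porosity (cementation)
          exponent and saturation exponent."""
          return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

      # Illustrative values only; in shale gas systems Rw, m and n are poorly constrained
      print(round(archie_sw(rt=40.0, rw=0.05, phi=0.08), 3))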

  6. Impact of petrophysical uncertainty on Bayesian hydrogeophysical inversion and model selection

    NASA Astrophysics Data System (ADS)

    Brunetti, Carlotta; Linde, Niklas

    2018-01-01

    Quantitative hydrogeophysical studies rely heavily on petrophysical relationships that link geophysical properties to hydrogeological properties and state variables. Coupled inversion studies are frequently based on the questionable assumption that these relationships are perfect (i.e., no scatter). Using synthetic examples and crosshole ground-penetrating radar (GPR) data from the South Oyster Bacterial Transport Site in Virginia, USA, we investigate the impact of spatially-correlated petrophysical uncertainty on inferred posterior porosity and hydraulic conductivity distributions and on Bayes factors used in Bayesian model selection. Our study shows that accounting for petrophysical uncertainty in the inversion (I) decreases bias of the inferred variance of hydrogeological subsurface properties, (II) provides more realistic uncertainty assessment and (III) reduces the overconfidence in the ability of geophysical data to falsify conceptual hydrogeological models.

  7. New approaches in the indirect quantification of thermal rock properties in sedimentary basins: the well-log perspective

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Balling, Niels; Förster, Andrea

    2016-04-01

    Numerical temperature models generated for geodynamic studies as well as for geothermal energy solutions depend heavily on rock thermal properties. Best practice for the determination of these parameters is the measurement of rock samples in the laboratory. Given the necessity to enlarge databases of subsurface rock parameters beyond drill-core measurements, an approach for the indirect determination of these parameters is developed, for rocks as well as for geological formations. We present new and universally applicable prediction equations for thermal conductivity, thermal diffusivity and specific heat capacity in sedimentary rocks derived from data provided by standard geophysical well logs. The approach is based on a data set of synthetic sedimentary rocks (clastic rocks, carbonates and evaporites) composed of mineral assemblages with variable contents of 15 major rock-forming minerals and porosities varying between 0 and 30%. Petrophysical properties are assigned to both the rock-forming minerals and the pore-filling fluids. Using multivariate statistics, relationships were then explored between each thermal property and well-logged petrophysical parameters (density, sonic interval transit time, hydrogen index, volume fraction of shale and photoelectric absorption index) on a regression subset of the data (70% of data) (Fuchs et al., 2015). Prediction quality was quantified on the remaining test subset (30% of data). The combination of three to five well-log parameters results in prediction errors on the order of <15% for thermal conductivity and thermal diffusivity, and of <10% for specific heat capacity. Comparison of predicted and benchmark laboratory thermal conductivity from deep boreholes of the Norwegian-Danish Basin, the North German Basin, and the Molasse Basin results in 3 to 5% larger uncertainties with respect to the test data set. With regard to temperature models, the use of calculated TC borehole profiles approximates measured temperature logs with an error of <3°C along a 4-km-deep profile. A benchmark comparison for thermal diffusivity and specific heat capacity is pending. Fuchs, Sven; Balling, Niels; Förster, Andrea (2015): Calculation of thermal conductivity, thermal diffusivity and specific heat capacity of sedimentary rocks using petrophysical well logs, Geophysical Journal International 203, 1977-2000, doi: 10.1093/gji/ggv403
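
    The multivariate-statistics step can be pictured as an ordinary least-squares fit of thermal conductivity against several log responses on a 70% regression subset, validated on the remaining 30%; the synthetic data and linear form in the Python sketch below are assumptions and are not the published regression equations of Fuchs et al. (2015).

      import numpy as np

      rng = np.random.default_rng(3)
      n = 300

      # Synthetic "well-log" predictors: density, sonic transit time, hydrogen index
      rho = rng.uniform(2.0, 2.8, n)        # g/cm3
      dt = rng.uniform(55.0, 120.0, n)      # us/ft
      hi = rng.uniform(0.0, 0.35, n)        # v/v

      # Synthetic thermal conductivity with noise (illustrative relation only)
      tc = 0.8 + 1.2 * rho - 0.01 * dt - 1.5 * hi + 0.1 * rng.standard_normal(n)

      # 70/30 split: fit on the regression subset, score on the test subset
      idx = rng.permutation(n)
      train, test = idx[: int(0.7 * n)], idx[int(0.7 * n):]
      A = np.column_stack([np.ones(n), rho, dt, hi])
      coef, *_ = np.linalg.lstsq(A[train], tc[train], rcond=None)

      rel_err = np.abs(A[test] @ coef - tc[test]) / tc[test]
      print("coefficients:", np.round(coef, 3))
      print("mean relative prediction error: %.1f%%" % (100 * rel_err.mean()))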

  8. Petrophysics of Lower Silurian sandstones and integration with the tectonic-stratigraphic framework, Appalachian basin, United States

    USGS Publications Warehouse

    Castle, J.W.; Byrnes, A.P.

    2005-01-01

    Petrophysical properties were determined for six facies in Lower Silurian sandstones of the Appalachian basin: fluvial, estuarine, upper shoreface, lower shoreface, tidal channel, and tidal flat. Fluvial sandstones have the highest permeability for a given porosity and exhibit a wide range of porosity (2-18%) and permeability (0.002-450 md). With a transition-zone thickness of only 1-6 m (3-20 ft), fluvial sandstones with permeability greater than 5 md have irreducible water saturation (Siw) less than 20%, typical of many gas reservoirs. Upper shoreface sandstones exhibit good reservoir properties with high porosity (10-21%), high permeability (3-250 md), and low Siw (<20%). Lower shoreface sandstones, which are finer grained, have lower porosity (4-12%), lower permeability (0.0007-4 md), thicker transition zones (6-180 m [20-600 ft]), and higher Siw. In the tidal-channel, tidal-flat, and estuarine facies, low porosity (average < 6%), low permeability (average < 0.02 md), and small pore throats result in large transition zones (30-200 m [100-650 ft]) and high water saturations. The most favorable reservoir petrophysical properties and the best estimated production from the Lower Silurian sandstones are associated with fluvial and upper shoreface facies of incised-valley fills, which we interpret to have formed predominantly in areas of structural recesses that evolved from promontories along a collisional margin during the Taconic orogeny. Although the total thickness of the sandstone may not be as great in these areas, reservoir quality is better than in adjacent structural salients, which is attributed to higher energy depositional processes and shallower maximum burial depth in the recesses than in the salients. Copyright 2005. The American Association of Petroleum Geologists. All rights reserved.

  9. On uncertainty quantification in hydrogeology and hydrogeophysics

    NASA Astrophysics Data System (ADS)

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.

  10. Rock formation characterization for carbon dioxide geosequestration: 3D seismic amplitude and coherency anomalies, and seismic petrophysical facies classification, Wellington and Anson-Bates Fields, Kansas, USA

    NASA Astrophysics Data System (ADS)

    Ohl, Derek; Raef, Abdelmoneam

    2014-04-01

    Higher resolution rock formation characterization is of paramount priority, amid growing interest in injecting carbon dioxide, CO2, into subsurface rock formations of depleting/depleted hydrocarbon reservoirs or saline aquifers in order to reduce emissions of greenhouse gases. In this paper, we present a case study for a Mississippian carbonate characterization integrating post-stack seismic attributes, well log porosities, and seismic petrophysical facies classification. We evaluated changes in petrophysical lithofacies and reveal structural facies-controls in the study area. Three cross-plot clusters in a plot of well log porosity and acoustic impedance corroborated a Neural Network petrophysical facies classification, which was based on training and validation utilizing three petrophysically-different wells and three volume seismic attributes, extracted from a time window including the wavelet of the reservoir-top reflection. Reworked lithofacies along small-throw faults have been revealed by comparing coherency and seismic petrophysical facies. The main objective of this study is to put an emphasis on reservoir characterization that is both optimized for and subsequently benefiting from pilot tertiary CO2 carbon geosequestration in a depleting reservoir and also in the deeper saline aquifer of the Arbuckle Group, south central Kansas. The 3D seismic coherency attribute, which we calculated from a window embracing the Mississippian top reflection event, indicated anomalous features that can be interpreted as a change in lithofacies or a faulting effect. Artificial Neural Network (ANN) lithofacies modeling has been used to better understand these subtle features and also to provide petrophysical classes, which will benefit flow-simulation modeling and/or time-lapse seismic monitoring feasibility analysis. This paper emphasizes the need to pay greater attention to small-scale features when embarking upon characterization of a reservoir or saline aquifer for CO2-based carbon geosequestration.
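
    The neural-network facies classification step can be sketched as training a small classifier on attribute vectors labelled at wells and then applying it trace by trace; the scikit-learn MLP, the synthetic attributes and the three-class labels below are assumptions for illustration and are not the network or data used in the study.

      import numpy as np
      from sklearn.neural_network import MLPClassifier
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(4)

      # Synthetic training set: three seismic attributes per sample, three facies classes
      # (stand-ins for attributes extracted around the reservoir-top reflection)
      means = np.array([[0.0, 0.0, 0.0], [1.5, -1.0, 0.5], [-1.0, 1.5, -0.5]])
      X = np.vstack([m + 0.7 * rng.standard_normal((200, 3)) for m in means])
      y = np.repeat([0, 1, 2], 200)         # facies labels from three training wells

      scaler = StandardScaler().fit(X)
      clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
      clf.fit(scaler.transform(X), y)

      # Classify a new attribute vector (e.g., one voxel of the seismic attribute volume)
      sample = np.array([[1.2, -0.8, 0.4]])
      print("predicted facies class:", clf.predict(scaler.transform(sample))[0])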

  11. The main factors controlling petrophysical alteration in hydrothermal systems of the Kuril-Kamchatka island arc

    NASA Astrophysics Data System (ADS)

    Frolova, J.; Ladygin, V.; Rychagov, S.; Shanina, V.; Blyumkina, M.

    2009-04-01

    This report is based on the results of petrophysical studies of a number of hydrothermal systems in the Kuril-Kamchatka island arc (Pauzhetsky, Mutnovsky, Koshelevsky, Essovsky, Ebeko volcano, Oceansky). The mineral composition and pore-space structure of the primary rocks change intensively during hydrothermal processes, resulting in the alteration of petrophysical properties: porosity, density, permeability, hygroscopicity, sonic velocity, elastic moduli, mechanical properties, and thermal and magnetic characteristics. Petrophysical alteration gradually leads to changes in the structure of the hydrothermal system and in its hydrodynamic and temperature regimes. The tendency of petrophysical alteration can differ. In some cases "improvement" of the rocks is observed, i.e., consolidation, hardening, decrease of porosity and permeability, and removal of hygroscopicity. In other cases "deterioration" of the rocks occurs, i.e., formation of secondary porosity and permeability, a decrease of density, strength, and elastic moduli, and the appearance of hygroscopic moisture. The classic example of cardinal petrophysical alteration is the transformation of hard basalts into plastic clays. The opposite example is the transformation of only slightly consolidated porous tuffs into hard, dense secondary quartzite. The character of petrophysical alteration depends on a number of factors, including the peculiarities of the primary rocks, the temperature, pressure and composition of the thermal fluids, the duration of fluid-rock interaction, and the state of the fluid (steam, water, boiling water). The contribution of each factor to the change of volcanic rock properties is considered and analyzed in detail. In particular, the primary rock controls the speed, intensity and character of petrophysical alteration. Factors favorable for alteration are high porosity and permeability, microcracks, weak cementation, glassy texture, and basaltic composition. The Kuril-Kamchatka region is a volcanic island arc, so the host rocks in its hydrothermal systems are mainly volcanic or volcaniclastic types of Neogene-Quaternary age. Volcanic (lava) rocks are dense, with high strength and elastic moduli and low porosity and permeability. The speed of their alteration is low. Basically, volcanic rocks form impermeable horizons in the structure of a hydrothermal system, but sometimes they form fracture-type reservoirs; the origin of the fracturing can be various. Volcaniclastic rocks are characterized by lower physical and mechanical properties and higher porosity and permeability. Due to their high porosity and permeability they are greatly exposed to thermal fluids, so they are altered intensively. Volcaniclastic rocks are the most common host rocks of geothermal reservoirs. Typically they form porous or fracture-porous aquifers, but in some cases they form confining layers. A well-studied example is the Pauzhetskaya hydrothermal system. The main reservoir is composed of highly porous (30-40%) and permeable medium-grained tuffs. The caprock is composed of fine-grained argillized tuffs; they are highly porous, but due to the small pore size the porosity is ineffective for fluid flow and the permeability is low. The temperature and pressure in a hydrothermal system strongly influence rock properties. High-temperature deep fluids (T>200°C) cause a consistent tendency of petrophysical alteration: consolidation, hardening, a decrease of porosity and permeability, and removal of hygroscopic moisture. This petrophysical tendency is observed independently of the composition of the fluids. It is the result of the development of high-temperature secondary minerals, which fill pores and cracks and replace the matrix and phenocrysts. The contacts between grains become strong and dense, and intergranular porosity disappears, which reinforces the cementation of the rock. The petrophysical alterations caused by low-temperature subsurface fluids (T<150°C) are more complex and diverse. Depending on which process prevails (leaching of the rocks, precipitation of secondary minerals in pores and cracks, or replacement of primary minerals by secondary minerals), they can lead to either an increase or a decrease in petrophysical properties. Financial support from RFBR (project 05-07-00118-a)

  12. Small County: Development of a Virtual Environment for Instruction in Geological Characterization of Petroleum Reservoirs

    NASA Astrophysics Data System (ADS)

    Banz, B.; Bohling, G.; Doveton, J.

    2008-12-01

    Traditional programs of geological education continue to be focused primarily on the evaluation of surface or near-surface geology accessed at outcrops and shallow boreholes. However, most students who graduate to careers in geology work almost entirely on subsurface problems, interpreting drilling records and petrophysical logs from exploration and production wells. Thus, college graduates commonly find themselves ill-prepared when they enter the petroleum industry and require specialized training in drilling and petrophysical log interpretation. To aid in this training process, we are developing an environment for interactive instruction in the geological aspects of petroleum reservoir characterization employing a virtual subsurface closely reflecting the geology of the US mid-continent, in the fictional setting of Small County, Kansas. Stochastic simulation techniques are used to generate the subsurface characteristics, including the overall geological structure, distributions of facies, porosity, and fluid saturations, and petrophysical logs. The student then explores this subsurface by siting exploratory wells and examining drilling and petrophysical log records obtained from those wells. We are developing the application using the Eclipse Rich Client Platform, which allows for the rapid development of a platform-agnostic application while providing an immersive graphical interface. The application provides an array of views to enable relevant data display and student interaction. One such view is an interactive map of the county allowing the student to view the locations of existing well bores and select pertinent data overlays, such as a contour map of the elevation of an interval of interest. Additionally, from this view a student may choose the site of a new well. Another view emulates a drilling log, complete with drilling rate plot and iconic representation of examined drill cuttings. From here, students are directed to stipulate subsurface lithology and interval tops as they progress through the drilling operation. Once the interpretation process is complete, the student is guided through an exercise emulating a drill stem test and then is prompted to decide on perforation intervals. The application provides a graphical framework by which the student is guided through well site selection, drilling data interpretation, and well completion or dry-hole abandonment, creating a tight feedback loop by which the student gains an over-arching view of drilling logistics and the subsurface data evaluation process.

  13. Petrophysical evaluation of subterranean formations

    DOEpatents

    Klein, James D; Schoderbek, David A; Mailloux, Jason M

    2013-05-28

    Methods and systems are provided for evaluating petrophysical properties of subterranean formations and comprehensively evaluating hydrate presence through a combination of computer-implemented log modeling and analysis. Certain embodiments include the steps of running a number of logging tools in a wellbore to obtain a variety of wellbore data and logs, and evaluating and modeling the log data to ascertain various petrophysical properties. Examples of suitable logging techniques that may be used in combination with the present invention include, but are not limited to, sonic logs, electrical resistivity logs, gamma ray logs, neutron porosity logs, density logs, NMR logs, or any combination or subset thereof.

  14. Parts-based geophysical inversion with application to water flooding interface detection and geological facies detection

    NASA Astrophysics Data System (ADS)

    Zhang, Junwei

    I built parts-based and manifold-based mathematical learning models for the geophysical inverse problem and applied this approach to two problems. The first is related to the detection of the oil-water encroachment front during the water flooding of an oil reservoir. In this application, I propose a new 4D inversion approach based on the Gauss-Newton method to invert time-lapse cross-well resistance data. The goal of this study is to image the position of the oil-water encroachment front in a heterogeneous clayey sand reservoir. This approach is based on explicitly connecting the change of resistivity to the petrophysical properties controlling the position of the front (porosity and permeability) and to the saturation of the water phase through a petrophysical resistivity model accounting for bulk and surface conductivity contributions and saturation. The distributions of permeability and porosity are also inverted using the time-lapse resistivity data in order to better reconstruct the position of the oil-water encroachment front. In our synthetic test case, we obtain a better position of the front, with the by-products of porosity and permeability inferences near the flow trajectory and close to the wells. The numerical simulations show that the position of the front is recovered well, but the distribution of the recovered porosity and permeability is only fair. A comparison with a commercial code based on a classical Gauss-Newton approach with no information provided by the two-phase flow model fails to recover the position of the front. The new approach could also be used for the time-lapse monitoring of various processes in both geothermal fields and oil and gas reservoirs using a combination of geophysical methods. A paper has been published in Geophysical Journal International on this topic, and I am the first author of this paper. The second application is related to the detection of geological facies boundaries and their deformation to satisfy geophysical data and prior distributions. We pose the geophysical inverse problem in terms of Gaussian random fields with mean functions controlled by petrophysical relationships and covariance functions controlled by a prior geological cross-section, including the definition of spatial boundaries for the geological facies. The petrophysical relationship problem is formulated as a regression problem upon each facies. The inversion is performed in a Bayesian framework. We demonstrate the usefulness of this strategy using a first synthetic case study, performing a joint inversion of gravity and galvanometric resistivity data with the stations all located at the ground surface. The joint inversion is used to recover the density and resistivity distributions of the subsurface. In a second step, we consider the possibility that the facies boundaries are deformable and their shapes are inverted as well. We use the level set approach to deform the facies boundaries while preserving the prior topological properties of the facies throughout the inversion. With the additional help of prior facies petrophysical relationships and the topological characteristics of each facies, we make posterior inferences about multiple geophysical tomograms based on their corresponding geophysical data misfits. The result of the inversion technique is encouraging when applied to a second synthetic case study, showing that we can recover the heterogeneities inside the facies, the mean values of the petrophysical properties, and, to some extent, the facies boundaries. A paper has been submitted to Geophysics on this topic, and I am the first author of this paper. During this thesis, I also worked on the time-lapse inversion of gravity data in collaboration with Marios Karaoulis, and a paper was published in Geophysical Journal International on this topic. I also worked on the time-lapse inversion of cross-well geophysical data (seismic and resistivity) using both a structural approach, named the cross-gradient approach, and a petrophysical approach. A paper was published in Geophysics on this topic.
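
    The Gauss-Newton iteration at the heart of the time-lapse resistivity inversion has the generic damped form dm = (J^T J + lambda*I)^-1 J^T (d - g(m)); the Python sketch below applies that update to a toy exponential forward model, which is an assumption chosen for brevity and is not the reservoir resistivity problem of the thesis.

      import numpy as np

      def forward(m, x):
          """Toy nonlinear forward model d_i = m0 * exp(-m1 * x_i)."""
          return m[0] * np.exp(-m[1] * x)

      def jacobian(m, x):
          return np.column_stack([np.exp(-m[1] * x), -m[0] * x * np.exp(-m[1] * x)])

      rng = np.random.default_rng(5)
      x = np.linspace(0.0, 4.0, 40)
      m_true = np.array([3.0, 0.7])
      d_obs = forward(m_true, x) + 0.02 * rng.standard_normal(x.size)

      m = np.array([2.0, 0.5])              # starting model
      lam = 1e-3                            # damping (regularization) weight
      for _ in range(10):
          r = d_obs - forward(m, x)         # data residual
          J = jacobian(m, x)
          dm = np.linalg.solve(J.T @ J + lam * np.eye(2), J.T @ r)  # damped GN update
          m = m + dm

      print("recovered model:", np.round(m, 3), " true model:", m_true)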

  15. Petrophysical Properties (Density and Magnetization) of Rocks from the Suhbaatar-Ulaanbaatar-Dalandzadgad Geophysical Profile in Mongolia and Their Implications

    PubMed Central

    Gao, Jintian; Gu, Zuowen; Dagva, Baatarkhuu; Tserenpil, Batsaikhan

    2013-01-01

    Petrophysical properties of 585 rock samples from the Suhbaatar-Ulaanbaatar-Dalandzadgad geophysical profile in Mongolia are presented. Based on the rock classifications and tectonic units, petrophysical parameters (bulk density, magnetic susceptibility, intensity of natural remanent magnetization, and Koenigsberger ratio) of these rocks are summarized. Results indicate that (1) significant density contrast of different rocks would result in variable gravity anomalies along the profile; (2) magnetic susceptibility and natural remanent magnetization of all rocks are variable, covering 5-6 orders of magnitude, which would produce a variable induced magnetization and further links to complex magnetic anomalies at the ground surface; (3) the distribution of rocks with different lithologies controls the pattern of lithospheric magnetic anomaly along the profile. The petrophysical database thus provides not only one of the keys to understanding the geological history and structure of the profile, but also essential information for analysis and interpretation of the geophysical (e.g., magnetic and gravity) survey data. PMID:24324382

  16. Petrophysical properties (density and magnetization) of rocks from the Suhbaatar-Ulaanbaatar-Dalandzadgad geophysical profile in Mongolia and their implications.

    PubMed

    Yang, Tao; Gao, Jintian; Gu, Zuowen; Dagva, Baatarkhuu; Tserenpil, Batsaikhan

    2013-01-01

    Petrophysical properties of 585 rock samples from the Suhbaatar-Ulaanbaatar-Dalandzadgad geophysical profile in Mongolia are presented. Based on the rock classifications and tectonic units, petrophysical parameters (bulk density, magnetic susceptibility, intensity of natural remanent magnetization, and Koenigsberger ratio) of these rocks are summarized. Results indicate that (1) significant density contrast of different rocks would result in variable gravity anomalies along the profile; (2) magnetic susceptibility and natural remanent magnetization of all rocks are variable, covering 5-6 orders of magnitude, which would produce a variable induced magnetization and further links to complex magnetic anomalies at the ground surface; (3) the distribution of rocks with different lithologies controls the pattern of lithospheric magnetic anomaly along the profile. The petrophysical database thus provides not only one of the keys to understanding the geological history and structure of the profile, but also essential information for analysis and interpretation of the geophysical (e.g., magnetic and gravity) survey data.
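
    The Koenigsberger ratio reported in these two records is the ratio of remanent to induced magnetization, Qn = NRM / (k * H), with H the geomagnetic field strength in A/m; the sample values in the Python sketch below are illustrative and are not measurements from the Mongolian profile.

      import numpy as np

      MU0 = 4e-7 * np.pi                    # vacuum permeability (T*m/A)

      def koenigsberger_ratio(nrm_am, susceptibility_si, b_field_nt):
          """Qn = NRM / (k * H): NRM in A/m, susceptibility dimensionless (SI),
          geomagnetic field given as B in nT and converted to H = B / mu0 in A/m."""
          h_am = b_field_nt * 1e-9 / MU0
          return nrm_am / (susceptibility_si * h_am)

      # Illustrative basalt-like sample values
      print(round(koenigsberger_ratio(nrm_am=2.0, susceptibility_si=0.02, b_field_nt=55000.0), 2))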

  17. Petrophysical rock properties of the Bazhenov Formation of the South-Eastern part of Kaymysovsky Vault (Tomsk Region)

    NASA Astrophysics Data System (ADS)

    Gorshkov, A. M.; Kudryashova, L. K.; Lee-Van-Khe, O. S.

    2016-09-01

    The article presents the results of studying the petrophysical rock properties of the Bazhenov Formation of the South-Eastern part of Kaymysovsky Vault with the Gas Research Institute (GRI) method. The authors have constructed dependence charts for bulk and grain density, open porosity and matrix permeability vs. depth. The results of studying petrophysical properties with the GRI method, together with core description, have allowed dividing the entire section into three intervals, each of which is characterized by different formation conditions of the Bazhenov Formation rocks. The authors have determined a correlation between the compensated neutron log and the rock density vs. depth chart on the basis of complex well logging and petrophysical section analysis. They have determined a promising interval for producing hydrocarbons from the Bazhenov Formation in the well under study. In addition, they have determined the typical behavior of the compensated neutron and SP logs for this interval. These studies will allow re-interpreting available well logs in order to determine the most promising interval to be involved in Bazhenov Formation development in the Tomsk Region.

  18. Multiscale approach to (micro)porosity quantification in continental spring carbonate facies: Case study from the Cakmak quarry (Denizli, Turkey)

    NASA Astrophysics Data System (ADS)

    De Boever, Eva; Foubert, Anneleen; Oligschlaeger, Dirk; Claes, Steven; Soete, Jeroen; Bertier, Pieter; Özkul, Mehmet; Virgone, Aurélien; Swennen, Rudy

    2016-07-01

    Carbonate spring deposits gained renewed interest as potential contributors to subsurface reservoirs and as continental archives of environmental changes. In contrast to their fabrics, their petrophysical characteristics - and especially the importance of microporosity (<1 µm) - are less understood. This study presents the combination of advanced petrophysical and imaging techniques to investigate the pore network characteristics of three common and widespread spring carbonate facies, as exposed in the Pleistocene Cakmak quarry (Denizli, Turkey): the extended Pond, the dipping crystalline Proximal Slope Facies and the draping Apron and Channel Facies deposits formed by encrustation of biological substrate. Integrating mercury injection capillary pressure, bulk and diffusion Nuclear Magnetic Resonance (NMR), NMR profiling and Brunauer-Emmett-Teller (BET) measurements with microscopy and micro-computer tomography (µ-CT) shows that NMR T2 distributions systematically display a single group of micro-sized pore bodies, making up between 6 and 33% of the pore space (average NMR T2 cut-off value: 62 ms). Micropore bodies are systematically located within cloudy crystal cores of granular and dendritic crystal textures in all facies. The investigated properties therefore do not reveal differences in micropore size or shape with respect to more or less biology-associated facies. The pore network of the travertine facies is distinctive in terms of (i) the percentage of microporosity, (ii) the connectivity of micropores with meso- to macropores, and (iii) the degree of heterogeneity at micro- and macroscale. Results show that an approach involving different NMR experiments provided the most complete view of the 3-D pore network, especially when microporosity and connectivity are of interest.
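
    The microporosity percentages quoted above follow from integrating the NMR T2 distribution below a cut-off (62 ms on average in this study); the synthetic bimodal T2 distribution in the Python sketch below is an assumption used only to show the computation.

      import numpy as np

      # Log-spaced T2 bins (ms) and a synthetic bimodal amplitude distribution
      t2 = np.logspace(-1, 4, 200)
      amp = (np.exp(-0.5 * ((np.log10(t2) - 0.5) / 0.3) ** 2)           # micropore mode ~3 ms
             + 3.0 * np.exp(-0.5 * ((np.log10(t2) - 2.7) / 0.3) ** 2))  # macropore mode ~500 ms

      t2_cutoff = 62.0                      # ms, study-average cut-off
      micro_fraction = amp[t2 < t2_cutoff].sum() / amp.sum()
      print("microporosity fraction of total pore space: %.0f%%" % (100 * micro_fraction))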

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ernest A. Mancini

    The University of Alabama in cooperation with Texas A&M University, McGill University, Longleaf Energy Group, Strago Petroleum Corporation, and Paramount Petroleum Company are undertaking an integrated, interdisciplinary geoscientific and engineering research project. The project is designed to characterize and model reservoir architecture, pore systems and rock-fluid interactions at the pore to field scale in Upper Jurassic Smackover reef and carbonate shoal reservoirs associated with varying degrees of relief on pre-Mesozoic basement paleohighs in the northeastern Gulf of Mexico. The project effort includes the prediction of fluid flow in carbonate reservoirs through reservoir simulation modeling which utilizes geologic reservoir characterization and modeling and the prediction of carbonate reservoir architecture, heterogeneity and quality through seismic imaging. The primary objective of the project is to increase the profitability, producibility and efficiency of recovery of oil from existing and undiscovered Upper Jurassic fields characterized by reef and carbonate shoals associated with pre-Mesozoic basement paleohighs. The principal research effort for Year 2 of the project has been reservoir characterization, 3-D modeling and technology transfer. This effort has included six tasks: (1) the study of rock-fluid interactions, (2) petrophysical and engineering characterization, (3) data integration, (4) 3-D geologic modeling, (5) 3-D reservoir simulation and (6) technology transfer. This work was scheduled for completion in Year 2. Overall, the project work is on schedule. Geoscientific reservoir characterization is essentially completed. The architecture, porosity types and heterogeneity of the reef and shoal reservoirs at Appleton and Vocation Fields have been characterized using geological and geophysical data. The study of rock-fluid interactions is near completion. Observations regarding the diagenetic processes influencing pore system development and heterogeneity in these reef and shoal reservoirs have been made. Petrophysical and engineering property characterization has been essentially completed. Porosity and permeability data at Appleton and Vocation Fields have been analyzed, and well performance analysis has been conducted. Data integration is up to date, in that the geological, geophysical, petrophysical and engineering data collected to date for Appleton and Vocation Fields have been compiled into a fieldwide digital database. 3-D geologic modeling of the structures and reservoirs at Appleton and Vocation Fields has been completed. The model represents an integration of geological, petrophysical and seismic data. 3-D reservoir simulation of the reservoirs at Appleton and Vocation Fields has been completed. The 3-D geologic model served as the framework for the simulations. A technology workshop on reservoir characterization and modeling at Appleton and Vocation Fields was conducted to transfer the results of the project to the petroleum industry.

  20. Integrated ultrasonic and petrographical characterization of carbonate building materials

    NASA Astrophysics Data System (ADS)

    Ligas, Paola; Fais, Silvana; Cuccuru, Francesco

    2014-05-01

    This paper presents the application of non-destructive ultrasonic techniques in evaluating the conservation state and quality of monumental carbonate building materials. Ultrasonic methods are very effective in detecting the elastic characteristics of the materials and thus their mechanical behaviour. They are non-destructive and effective both for site and laboratory tests, though it should be pointed out that ultrasonic data interpretation is extremely complex, since elastic wave velocity heavily depends on moisture, heterogeneity, porosity and other physical properties of the materials. In our study, considering both the nature of the building materials and the constructive types of the investigated monuments, the ultrasonic investigation was carried out in the low-frequency ultrasonic range (24-54 kHz), with the aim of detecting damage and degradation zones and assessing the alterability of the investigated stones by studying the propagation of longitudinal ultrasonic pulses. In fact, alterations in the materials generally cause a decrease in longitudinal pulse velocity. Therefore, starting from longitudinal velocity values, the elasto-mechanical behaviour of the stone materials can be deduced. To this aim, empirical and effective relations between longitudinal velocity and mechanical properties of the rocks can be used, transferring the fundamental concepts of reservoir-rock studies in the framework of hydrocarbon research to the diagnostic process on stone materials. The ultrasonic measurements were performed both in the laboratory and in situ using the Portable Ultrasonic Non-Destructive Digital Indicating Tester (PUNDIT) by C.N.S. Electronics LTD. A number of experimental sessions were carried out choosing different modalities of data acquisition. On the basis of the laboratory results, an in situ ultrasonic survey of significant monuments has been carried out. The ultrasonic measurements were integrated with a petrographical and petrophysical study of the investigated stone materials to correlate their petrographical-petrophysical features with the elastic ones. This integrated study shows that the modifications in the elasto-mechanical and petrographical-petrophysical features of the investigated carbonate materials are the main causes of the reduction in their quality as building materials. The use of the ultrasonic method integrated with information on the petrography and petrophysics of the rocks has been successful in assessing rock quality and in better understanding the alteration processes. Acknowledgments: This work was financially supported by Sardinian Local Administration (RAS - LR 7 August 2007, n.7, Promotion of Scientific Research and Innovation in Sardinia - Italy, Responsible Scientist: S. Fais).
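
    One common way to pass from longitudinal pulse velocity to an elasto-mechanical parameter is the dynamic-modulus relation E_dyn = rho * Vp**2 * (1 + nu) * (1 - 2*nu) / (1 - nu); the density and Poisson ratio in the Python sketch below are illustrative values for a porous carbonate, not measurements from the investigated monuments.

      def dynamic_youngs_modulus(vp_ms, rho_kgm3, poisson):
          """Dynamic Young's modulus from P-wave velocity, in GPa:
          E = rho * Vp**2 * (1 + nu) * (1 - 2*nu) / (1 - nu)."""
          e_pa = rho_kgm3 * vp_ms ** 2 * (1 + poisson) * (1 - 2 * poisson) / (1 - poisson)
          return e_pa / 1e9

      # Illustrative porous-limestone-like values
      print(round(dynamic_youngs_modulus(vp_ms=3500.0, rho_kgm3=2300.0, poisson=0.25), 1), "GPa")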

  1. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low-level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.

  2. An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2014-01-01

    This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics, that can rigorously account for the various sources of uncertainty, is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking, and then, the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.
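
    A heavily simplified sketch of the two prognostic steps (estimate the current state, then propagate samples to a failure threshold to obtain remaining useful life with uncertainty) is given below in Python; the scalar linear-degradation model, the threshold and all numbers are assumptions and are not the framework's actual state-space model of the rover power system.

      import numpy as np

      rng = np.random.default_rng(6)

      # Samples of the current degradation state (e.g., normalized battery capacity),
      # standing in for the posterior obtained from Bayesian tracking
      n_particles = 5000
      state = rng.normal(loc=0.82, scale=0.02, size=n_particles)
      decay_rate = rng.normal(loc=0.004, scale=0.0008, size=n_particles)  # per cycle
      failure_threshold = 0.60

      # Propagate each sample with a linear decay model until it crosses the threshold
      rul = (state - failure_threshold) / np.maximum(decay_rate, 1e-6)

      # Remaining-useful-life uncertainty summarized as percentiles
      p10, p50, p90 = np.percentile(rul, [10, 50, 90])
      print("RUL (cycles): p10=%.0f  p50=%.0f  p90=%.0f" % (p10, p50, p90))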

  3. Optical Methods for Identifying Hard Clay Core Samples During Petrophysical Studies

    NASA Astrophysics Data System (ADS)

    Morev, A. V.; Solovyeva, A. V.; Morev, V. A.

    2018-01-01

    X-ray phase analysis of the general mineralogical composition of core samples from one of the West Siberian fields was performed. Electronic absorption spectra of the clay core samples with an added indicator were studied. The speed and availability of applying the two methods in petrophysical laboratories during sample preparation for standard and special studies were estimated.

  4. Use of petrophysical data for siting of deep geological repository of radioactive waste

    NASA Astrophysics Data System (ADS)

    Petrenko, Liliana; Shestopalov, Vyacheslav

    2017-11-01

    The paper is devoted to analyzing the petrophysical properties and petrographical characteristics of the Volyn region with a view to choosing the least permeable, and thus most suitable, geological formation for radioactive waste disposal. On the basis of petrophysical estimates of granitoid properties, an assessment of permeability has been developed for the petrotypes of the Volyn region. A method of classifying the petrotypes by their relative suitability for radioactive waste disposal was also developed. The study showed the prospects of the Zhytomyr and Korosten types of granitoids as host rocks for radioactive waste disposal. Following the results of investigations performed by Swedish researchers, a comparative analysis of rocks based on age of formation, composition, structural features and selected petrophysical properties of granitoids as host rocks for a radioactive waste repository was performed. A detailed comparison of the data on the granitoids of the Forsmark site in Sweden and the data on the granitoids of the Volyn megablock can be one of the next steps in researching host rocks for the development of the RW disposal system in Ukraine.

  5. Reservoir and Source Rock Identification Based on Geologycal, Geophysics and Petrophysics Analysis Study Case: South Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Anggit Maulana, Hiska; Haris, Abdul

    2018-05-01

    Reservoir and source rock identification has been performed to delineate the reservoir distribution of the Talangakar Formation, South Sumatra Basin. This study is based on integrated geophysical, geological and petrophysical data. The aims of the study are to determine the characteristics of the reservoir and source rock, to differentiate reservoir and source rock within the same Talangakar Formation, and to find out the distribution of the net pay reservoir and source rock layers. The geophysical methods include seismic data interpretation using time and depth structure maps, post-stack inversion, and interval velocities; the geological interpretation includes the analysis of structures and faults; and the petrophysical processing interprets log data from wells penetrating the hydrocarbon-bearing (oil and gas) Talangakar Formation. Based on the seismic interpretation, subsurface mapping was performed on Layer A and Layer I to determine the development of structures in the study area. Based on the geological interpretation, the trap in the study area is an anticline structure trending southwest-northeast, bounded by normal faults to the southwest and southeast of the structure. Based on the petrophysical analysis, the main reservoir in the study area is a layer at a depth of 1,375 m with a thickness of 2 to 8.3 meters.

  6. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    PubMed Central

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent, which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low-level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490

  7. Improved reservoir characterisation using fuzzy logic platform: an integrated petrophysical, seismic structural and poststack inversion study

    NASA Astrophysics Data System (ADS)

    Jafri, Muhammad Kamran; Lashin, Aref; Ibrahim, El-Khedr Hassan; Hassanein, Kamal A.; Al Arifi, Nassir; Naeem, Muhammad

    2017-06-01

    There is a tendency for applying different integrated geophysical approaches for better hydrocarbon reservoir characterisation and interpretation. In this study, petrophysical properties, seismic structural and poststack seismic inversion results are integrated using the fuzzy logic AND operator to characterise the Tensleep Sandstone Formation (TSF) at Powder River Basin (PRB), Wyoming, USA. The TSF was deposited in a coastal plain setting during the Pennsylvanian, and contains cross-bedded sandstone of aeolian origin as the major lithology, alternating with sabkha dolomite/carbonates. Wireline logging datasets from 17 wells are used for the detailed petrophysical evaluation. Three units of the TSF (A-sandstone, B-dolomite and B-sandstone) are targeted and their major rock properties estimated (i.e. shale/clay volume, Vsh; porosity, φEff; permeability, K; fluid saturations, Sw and SH; and bulk volume water, BVW). The B-sandstone zone, with its petrophysical properties of 5-20% effective porosity, 0.10-250 mD permeability and hydrocarbon potential up to 72%, is considered the best reservoir zone among the three studied units. Distributions of the most important petrophysical parameters of the B-sandstone reservoir (Vsh, φEff, K, Sw) are generated as GIS thematic layers. The two-dimensional (2D) and three-dimensional (3D) seismic structural interpretations revealed that the hydrocarbons are entrapped in an anticlinal structure bounded by fault closures in the west of the study area. Poststack acoustic impedance (PSAI) inversion is performed on the 3D seismic data to extract the inverted acoustic impedance (AI) cube. Two attribute slices (inverted AI and seismic amplitude) were extracted at the top of the B-sandstone unit as GIS thematic layers. The reservoir properties and inverted seismic attributes were then integrated using the fuzzy AND operator. Finally, a fuzzy reservoir quality map was produced, and a prospective reservoir area with the best reservoir characteristics is proposed for future exploration. The current study showed that integration of petrophysical, seismic structural and poststack inversion results under a fuzzy logic platform can be used as an effective tool for interpreting multiple reservoir zones.
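
    The fuzzy AND integration step can be sketched as taking the element-wise minimum of membership (suitability) maps derived from each normalized property layer; the ramp membership functions, thresholds and toy grids in the Python sketch below are assumptions for illustration, not the study's calibrated layers.

      import numpy as np

      rng = np.random.default_rng(7)
      shape = (50, 50)                      # toy map grid

      # Toy property layers (stand-ins for Vsh, porosity and log-permeability maps)
      vsh = rng.uniform(0.0, 0.6, shape)
      phi = rng.uniform(0.02, 0.22, shape)
      logk = rng.uniform(-1.0, 2.5, shape)  # log10 permeability (mD)

      def ramp(x, lo, hi):
          """Linear fuzzy membership rising from 0 at lo to 1 at hi."""
          return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

      # Membership maps: high quality = low shale volume, high porosity, high permeability
      mu_vsh = 1.0 - ramp(vsh, 0.1, 0.5)
      mu_phi = ramp(phi, 0.05, 0.20)
      mu_k = ramp(logk, -1.0, 2.0)

      # Fuzzy AND operator: element-wise minimum of the memberships
      quality = np.minimum.reduce([mu_vsh, mu_phi, mu_k])
      print("best-quality cell: %.2f at index %s"
            % (quality.max(), np.unravel_index(quality.argmax(), shape)))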

  8. Attenuation and velocity dispersion in the exploration seismic frequency band

    NASA Astrophysics Data System (ADS)

    Sun, Langqiu

    In an anelastic medium, seismic waves are distorted by attenuation and velocity dispersion, which depend on petrophysical properties of reservoir rocks. The effective attenuation and velocity dispersion are a combination of intrinsic attenuation and apparent attenuation due to scattering, the transmission response, and the data acquisition system. Velocity dispersion is usually neglected in seismic data processing, partly because of insufficient observations in the exploration seismic frequency band. This thesis investigates methods of measuring velocity dispersion in the exploration seismic frequency band and interprets the velocity dispersion data in terms of petrophysical properties. Broadband, uncorrelated vibrator data are suitable for measuring velocity dispersion in the exploration seismic frequency band, and a broad bandwidth optimizes the observability of velocity dispersion. Four methods of measuring velocity dispersion in uncorrelated vibrator VSP data are investigated: the sliding window crosscorrelation (SWCC) method, the instantaneous phase method, the spectral decomposition method, and the cross spectrum method. Among them, the SWCC method is new and has satisfactory robustness, accuracy, and efficiency. Using the SWCC method, velocity dispersion is measured in uncorrelated vibrator VSP data from three areas with different geological settings: the Mallik gas hydrate zone, the McArthur River uranium mines, and the Outokumpu crystalline rocks. The observed velocity dispersion is fitted to a straight line with respect to log frequency for a constant (frequency-independent) Q value, which provides an alternative method for calculating Q. A constant Q value, however, does not link directly to petrophysical properties. A modeling study is therefore carried out for the Mallik and McArthur River data to interpret the velocity dispersion observations in terms of petrophysical properties. Detailed multi-parameter petrophysical reservoir models are built from the well logs, and the model parameters are adjusted by fitting the synthetic data to the observed data. In this way, seismic attenuation and velocity dispersion provide new insight into petrophysical properties at the Mallik and McArthur River sites. Potentially, observations of attenuation and velocity dispersion in the exploration seismic frequency band can improve the deconvolution process for vibrator data, Q-compensation, near-surface analysis, and first-break picking for seismic data.
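
    One way to read the straight-line fit mentioned above is through the standard constant-Q (Kolsky) dispersion relation v(f) ≈ v(f0)[1 + ln(f/f0)/(πQ)], whose slope against log frequency yields Q; the sketch below uses synthetic numbers, not data from the thesis.

    ```python
    import numpy as np

    # Synthetic dispersion data following the constant-Q (Kolsky) relation
    # v(f) = v0 * (1 + ln(f / f0) / (pi * Q)), with noise added.
    f0, v0, Q_true = 30.0, 3000.0, 40.0           # reference frequency (Hz), velocity (m/s), Q
    freqs = np.linspace(10.0, 120.0, 24)          # exploration seismic band
    v_obs = v0 * (1.0 + np.log(freqs / f0) / (np.pi * Q_true))
    v_obs += np.random.normal(0.0, 2.0, freqs.size)

    # Linear fit of velocity against log frequency: slope = v0 / (pi * Q).
    slope, intercept = np.polyfit(np.log(freqs), v_obs, 1)
    v_at_f0 = intercept + slope * np.log(f0)
    Q_est = v_at_f0 / (np.pi * slope)
    print(f"estimated Q = {Q_est:.1f} (true {Q_true})")
    ```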

  9. Continued development of mature field: west Cameron Block 45 field, Gulf of Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brand, S.R.; Fox, J.F.

    Initial acreage acquisition and exploration of the West Cameron Block 45 field, located offshore Louisiana, were based on reconnaissance gravity surveys that revealed an anomalous high across the area. Several phases of development drilling activity have been conducted in the field since its discovery in March 1949. Nearly four decades after initial exploration began, an integrated field study incorporating all available geological, geophysical, petrophysical, and engineering data was undertaken to evaluate the remaining potential of the field. As a result of this study, a detailed structural and stratigraphic framework was developed, the controls on reservoir production performance were established, and additional drillable prospects were delineated.

  10. A computational framework to detect normal and tuberculosis infected lung from H and E-stained whole slide images

    NASA Astrophysics Data System (ADS)

    Niazi, M. Khalid Khan; Beamer, Gillian; Gurcan, Metin N.

    2017-03-01

    Accurate detection and quantification of normal lung tissue in the context of Mycobacterium tuberculosis infection is of interest from a biological perspective. The automatic detection and quantification of normal lung allows biologists to focus more intensely on regions of interest within normal and infected tissues. We present a computational framework to extract individual tissue sections from whole slide images that contain multiple tissue sections. It automatically detects the background, red blood cells and handwritten digits, bringing efficiency as well as accuracy to the quantification of tissue sections. For efficiency, we build the framework on logical and morphological operations, as they can be performed in linear time. We further divide these individual tissue sections into normal and infected areas using a deep neural network. The computational framework was trained on 60 whole slide images and achieved an overall accuracy of 99.2% when extracting individual tissue sections from 120 whole slide images in the test dataset. It achieved a higher accuracy (99.7%) when classifying individual lung sections into normal and infected areas. Our preliminary findings suggest that the proposed framework agrees well with biologists on how to define normal and infected lung areas.
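
    A minimal sketch of the kind of logical and morphological preprocessing described above (background removal and extraction of individual tissue sections); the grayscale thumbnail, threshold and minimum-area values are assumptions, and scipy.ndimage stands in for whatever implementation the authors used.

    ```python
    import numpy as np
    from scipy import ndimage

    def extract_tissue_sections(slide_gray, background_thresh=0.85, min_area=5000):
        """Label individual tissue sections in a grayscale (0-1) whole-slide thumbnail."""
        # Logical operation: tissue is darker than the bright slide background.
        tissue_mask = slide_gray < background_thresh
        # Morphological operations (linear time) to remove specks and fill small holes.
        tissue_mask = ndimage.binary_opening(tissue_mask, structure=np.ones((5, 5)))
        tissue_mask = ndimage.binary_fill_holes(tissue_mask)
        # Connected-component labelling; discard tiny components (debris, digits, etc.).
        labels, n = ndimage.label(tissue_mask)
        sections = []
        for i in range(1, n + 1):
            component = labels == i
            if component.sum() >= min_area:
                sections.append(component)
        return sections

    # Hypothetical thumbnail: bright background with two darker "tissue" blobs.
    thumb = np.ones((512, 512))
    thumb[100:300, 80:220] = 0.5
    thumb[150:400, 300:460] = 0.6
    print("sections found:", len(extract_tissue_sections(thumb)))
    ```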

  11. The integration of elastic wave properties and machine learning for the distribution of petrophysical properties in reservoir modeling

    NASA Astrophysics Data System (ADS)

    Ratnam, T. C.; Ghosh, D. P.; Negash, B. M.

    2018-05-01

    Conventional reservoir modeling employs variograms to predict the spatial distribution of petrophysical properties. This study aims to improve property distribution by incorporating elastic wave properties. In this study, elastic wave properties obtained from seismic inversion are used as input for an artificial neural network to predict neutron porosity in between well locations. The method employed in this study is supervised learning based on available well logs. This method converts every seismic trace into a pseudo-well log, hence reducing the uncertainty between well locations. By incorporating the seismic response, the reliance on geostatistical methods such as variograms for the distribution of petrophysical properties is reduced drastically. The results of the artificial neural network show good correlation with the neutron porosity log which gives confidence for spatial prediction in areas where well logs are not available.
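
    A hedged sketch of the supervised-learning step described above: a small neural-network regressor maps seismic-derived elastic attributes to neutron porosity at well locations and is then applied trace by trace to produce pseudo-logs; the attribute choices and synthetic data are assumptions, with scikit-learn standing in for the authors' implementation.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)

    # Training data at well locations: elastic attributes from inversion vs. logged porosity.
    n_well_samples = 400
    acoustic_impedance = rng.uniform(5e6, 9e6, n_well_samples)
    vp_vs_ratio = rng.uniform(1.6, 2.2, n_well_samples)
    # Synthetic "true" relation with noise (illustrative only).
    neutron_porosity = 0.45 - 3.5e-8 * acoustic_impedance + 0.02 * (vp_vs_ratio - 1.9)
    neutron_porosity += rng.normal(0.0, 0.01, n_well_samples)

    X = np.column_stack([acoustic_impedance, vp_vs_ratio])
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0))
    model.fit(X, neutron_porosity)

    # Apply to attributes extracted along a seismic trace between wells (pseudo-well log).
    trace_attributes = np.column_stack([rng.uniform(5e6, 9e6, 100), rng.uniform(1.6, 2.2, 100)])
    pseudo_porosity_log = model.predict(trace_attributes)
    print("predicted porosity range:", pseudo_porosity_log.min(), pseudo_porosity_log.max())
    ```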

  12. Induced polarization imaging of volcanoes

    NASA Astrophysics Data System (ADS)

    Revil, Andre; Soueid Ahmed, Abdellahi

    2017-04-01

    The first part of the presentation is related to the petrophysics of induced polarization of volcanic rocks. We describe the induced polarization of these rocks using a dynamic Stern layer model that accounts for the polarization of the electrical double layer around the mineral grains. This model shows that the normalized chargeability and quadrature conductivity of volcanic rocks are sensitive to the cation exchange capacity (CEC) of these materials and therefore to their alteration. In the second part of the presentation, we use a geostatistical inversion framework to image chargeability in 2.5D or in 3D. This new framework is benchmarked using synthetic data and data from various volcanoes (Kilauea, Furnas, Yellowstone). We show that chargeability tomography is very complementary to the now classical electrical resistivity tomography for imaging volcanic structures and for separating conduction in the bulk pore network from interfacial effects such as surface conductivity. This approach appears promising as a first step toward joint inversion with seismic and gravity data.

  13. Geologic setting, petrophysical characteristics, and regional heterogeneity patterns of the Smackover in southwest Alabama. Draft topical report on Subtasks 2 and 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopaska-Merkel, D.C.; Mann, S.D.; Tew, B.H.

    1992-06-01

    This is the draft topical report on Subtasks 2 and 3 of DOE contract number DE-FG22-89BC14425, entitled "Establishment of an oil and gas database for increased recovery and characterization of oil and gas carbonate reservoir heterogeneity." This volume constitutes the final report on Subtask 3, which had as its primary goal the geological modeling of reservoir heterogeneity in Smackover reservoirs of southwest Alabama. This goal was interpreted to include a thorough analysis of Smackover reservoirs, which was required for an understanding of Smackover reservoir heterogeneity. This report is divided into six sections (including this brief introduction). Section two, entitled "Geologic setting," presents a concise summary of Jurassic paleogeography, structural setting, and stratigraphy in southwest Alabama. This section also includes a brief review of sedimentologic characteristics and stratigraphic framework of the Smackover, and a summary of the diagenetic processes that strongly affected Smackover reservoirs in Alabama. Section three, entitled "Analytical methods," summarizes all nonroutine aspects of the analytical procedures used in this project. The major topics are thin-section description, analysis of commercial porosity and permeability data, capillary-pressure analysis, and field characterization. "Smackover reservoir characteristics" are described in section four, which begins with a general summary of the petrographic characteristics of porous and permeable Smackover strata. This is followed by a more detailed petrophysical description of Smackover reservoirs.

  14. Field-scale permeability and temperature of volcanic crust from borehole data: Campi Flegrei, southern Italy

    NASA Astrophysics Data System (ADS)

    Carlino, Stefano; Piochi, Monica; Tramelli, Anna; Mormone, Angela; Montanaro, Cristian; Scheu, Bettina; Klaus, Mayer

    2018-05-01

    We report combined measurements of petrophysical and geophysical parameters for a 501-m-deep borehole located on the eastern side of the active Campi Flegrei caldera (southern Italy), namely (i) in situ permeability from pumping tests, (ii) laboratory-determined permeability of the drill core, and (iii) thermal gradients from distributed fiber optic and thermocouple sensors. The borehole was drilled during the Campi Flegrei Deep Drilling Project (in the framework of the International Continental Scientific Drilling Program) and gives information on the least explored caldera sector down to the pre-caldera deposits. The results allow a comparative assessment of permeability obtained from both the borehole test (at depths between 422 and 501 m) and the laboratory test (on a core sampled at the same depth); the values of 10⁻¹³ m² (borehole test) and 10⁻¹⁵ m² (laboratory test) confirm the scale dependency of permeability at this site. Additional geochemical and petrophysical determinations (porosity, density, chemistry, mineralogy and texture), together with gas flow measurements, corroborate the hypothesis that the discrepancy in permeability values is likely related to in situ fracturing. The continuous distributed temperature profile points to a thermal gradient of about 200 °C km⁻¹. Our findings (i) indicate that the scale dependency of permeability has to be carefully considered in modelling the hydrothermal system at Campi Flegrei, and (ii) improve the understanding of caldera dynamics for monitoring and mitigation in this very high volcanic risk area.

  15. Assessment of Undiscovered Oil and Gas Resources of the Uinta-Piceance Province of Colorado and Utah, 2002

    USGS Publications Warehouse

    ,

    2002-01-01

    The U.S. Geological Survey (USGS) recently completed an assessment of the undiscovered oil and gas potential of the Uinta-Piceance Province of northwestern Colorado and northeastern Utah (fig. 1). The assessment of the Uinta-Piceance Province is geology based and uses the Total Petroleum System concept. The geologic elements of Total Petroleum Systems include hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (sequence stratigraphy, petrophysical properties), and hydrocarbon traps (trap formation and timing). Using this geologic framework, the USGS defined five Total Petroleum Systems and 20 Assessment Units within these Total Petroleum Systems, and quantitatively estimated the undiscovered oil and gas resources within each Assessment Unit (table 1).

  16. Joint two-dimensional inversion of magnetotelluric and gravity data using correspondence maps

    NASA Astrophysics Data System (ADS)

    Carrillo, Jonathan; Gallardo, Luis A.

    2018-05-01

    An accurate characterization of subsurface targets relies on the interpretation of multiple geophysical properties and their relationships. There are essentially two ways to link different geophysical parameters in a joint inversion: structural and petrophysical relationships. Structural approaches aim at minimizing topological differences and are widely popular since they require few assumptions about the models. Conversely, methods based on petrophysical links rely mostly on the property values themselves and can provide a strong coupling between models, but they need to be treated carefully because a specific direct relationship must be known or assumed. While some petrophysical relationships are widely accepted, the question remains whether we may be able to detect them directly from the geophysical data. Currently, there is no reported development that takes full advantage of the flexibility of jointly estimating in-situ empirical relationships and geophysical models for a given geological scenario. We thus developed an algorithm for the two-dimensional joint inversion of gravity and magnetotelluric data that simultaneously seeks a density-resistivity relationship, described through a polynomial function, that is optimal for each studied site. The iterative two-dimensional scheme is tested using synthetic and field data from Cerro Prieto, Mexico. The resulting models show enhanced resolution with increased structural and petrophysical correlation. We show that by fitting a functional relationship we significantly increase the coupled geological sense of the models at little cost in terms of data misfit.
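
    A minimal sketch of the kind of in-situ empirical relationship the method estimates: a low-order polynomial linking log-resistivity to density, fitted from co-located model values at one site; the numbers are synthetic, not the Cerro Prieto data.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Co-located model estimates (e.g., from the two inversions) at the cells of one site.
    density = rng.uniform(2.0, 2.7, 200)                        # g/cm^3
    log_resistivity = 3.5 - 1.2 * density + 0.15 * density**2   # assumed "true" relation
    log_resistivity += rng.normal(0.0, 0.05, density.size)      # noise

    # Fit a second-order polynomial log10(rho) = a0 + a1*d + a2*d^2 for this site.
    coeffs = np.polyfit(density, log_resistivity, deg=2)        # highest degree first: a2, a1, a0
    predict = np.poly1d(coeffs)
    print("fitted coefficients (a2, a1, a0):", coeffs)
    print("predicted log10-resistivity at density 2.4:", predict(2.4))
    ```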

  17. Development of a Framework for Model-Based Analysis, Uncertainty Quantification, and Robust Control Design of Nonlinear Smart Composite Systems

    DTIC Science & Technology

    2015-06-04

    control, vibration and noise control, health monitoring, and energy harvesting. However, these advantages come at the cost of rate-dependent hysteresis...configuration used for energy harvesting. Uncertainty Quantification: Uncertainty quantification is pursued in two steps: (i) determination of densities...Crews and R.C. Smith, “Quantification of parameter and model uncertainty for shape memory alloy bending actuators,” Journal of Intelligent Material

  18. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
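
    As a hedged illustration of the simulation-based optimization idea (not FOQUS, ALAMO, or PSUADE themselves), a derivative-free optimizer can be wrapped around a black-box process model; the toy cost function and design variables below are invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def black_box_process_model(design):
        """Stand-in for an expensive process simulation returning a cost to minimize
        (the variables and cost shape are purely illustrative)."""
        sorbent_flow, regen_temp = design
        capture_cost = (sorbent_flow - 3.0) ** 2 + 0.5 * (regen_temp - 120.0) ** 2 / 100.0
        return capture_cost

    # Derivative-free optimization (Nelder-Mead) treats the simulator as a black box.
    result = minimize(black_box_process_model, x0=np.array([1.0, 100.0]), method="Nelder-Mead")
    print("optimal design:", result.x, "cost:", result.fun)
    ```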

  19. Statistical analyses on sandstones: Systematic approach for predicting petrographical and petrophysical properties

    NASA Astrophysics Data System (ADS)

    Stück, H. L.; Siegesmund, S.

    2012-04-01

    Sandstones are a popular natural stone due to their wide occurrence and availability. The different applications for these stones have led to an increase in demand. From the viewpoint of conservation and the natural stone industry, an understanding of the material behaviour of this construction material is very important. Sandstones are a highly heterogeneous material. Based on statistical analyses with a sufficiently large dataset, a systematic approach to predicting the material behaviour should be possible. Since the literature already contains a large volume of data concerning the petrographical and petrophysical properties of sandstones, a large dataset could be compiled for the statistical analyses. The aim of this study is to develop constraints on the material behaviour and especially on the weathering behaviour of sandstones. Approximately 300 samples from historical and presently mined natural sandstones in Germany, as well as ones described worldwide, were included in the statistical approach. The mineralogical composition and fabric characteristics were determined from detailed thin section analyses and descriptions in the literature. Particular attention was paid to evaluating the compositional and textural maturity, grain contact and contact thickness, type of cement, degree of alteration and the intergranular volume. Statistical methods were used to test for normal distributions and to calculate linear regressions between the basic petrophysical properties of density, porosity, water uptake and strength. The sandstones were classified into three different pore size distributions and evaluated against the other petrophysical properties. Weathering behaviour, such as hygric swelling and salt loading tests, was also included. To identify similarities between individual sandstones or to define groups of specific sandstone types, principal component analysis, cluster analysis and factor analysis were applied. Our results show that composition and porosity evolution during diagenesis are very important controls on the petrophysical properties of a building stone. The relationship between intergranular volume, cementation and grain contact can also provide valuable information for predicting the strength properties. Since the samples investigated mainly originate from the Triassic German epicontinental basin, arkoses and feldspar-arenites are underrepresented. In general, the sandstones can be grouped as follows: i) quartzites, highly mature, with a primary porosity of about 40%; ii) quartzites, highly mature, showing a primary porosity of 40% but with early clay infiltration; iii) sublitharenites-lithic arenites exhibiting a lower primary porosity and higher cementation with quartz and ferritic Fe-oxides; and iv) sublitharenites-lithic arenites with a higher content of pseudomatrix. However, in the last two groups the feldspar and lithoclasts can also show considerable alteration. All sandstone groups differ with respect to the pore space and strength data, as well as water uptake properties, which were obtained by linear regression analysis. Similar petrophysical properties are discernible for each type when using principal component analysis. Furthermore, the strength as well as the porosity of the sandstones shows distinct differences with respect to stratigraphic age and composition. The relationship between porosity, strength and salt resistance could also be verified.
Hygric swelling shows an interrelation with pore size type, porosity and strength, but also with the degree of alteration (e.g. lithoclasts, pseudomatrix). To summarize, the different regression analyses and the calculated confidence regions provide a significant tool to classify the petrographical and petrophysical parameters of sandstones. Based on this, the durability and the weathering behaviour of the sandstone groups can be constrained. Keywords: sandstones, petrographical & petrophysical properties, predictive approach, statistical investigation
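
    A compact, hedged illustration of the multivariate step described above (principal component analysis followed by clustering of standardized petrophysical variables); the variables and synthetic values are placeholders for the roughly 300-sample dataset.

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)

    # Placeholder table: one row per sandstone sample, columns are petrophysical properties.
    n_samples = 300
    porosity = rng.uniform(0.02, 0.30, n_samples)
    density = 2.65 * (1.0 - porosity) + rng.normal(0.0, 0.02, n_samples)      # bulk density proxy
    strength = 120.0 * np.exp(-6.0 * porosity) + rng.normal(0.0, 5.0, n_samples)
    water_uptake = 25.0 * porosity + rng.normal(0.0, 0.5, n_samples)

    X = StandardScaler().fit_transform(np.column_stack([porosity, density, strength, water_uptake]))

    # Principal components summarize the correlated properties; clusters suggest sandstone groups.
    scores = PCA(n_components=2).fit_transform(X)
    groups = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
    print("samples per group:", np.bincount(groups))
    ```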

  20. Secondary Students' Quantification of Ratio and Rate: A Framework for Reasoning about Change in Covarying Quantities

    ERIC Educational Resources Information Center

    Johnson, Heather Lynn

    2015-01-01

    Contributing to a growing body of research addressing secondary students' quantitative and covariational reasoning, the multiple case study reported in this article investigated secondary students' quantification of ratio and rate. This article reports results from a study investigating students' quantification of rate and ratio as…

  1. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery.

    PubMed

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W; Chen, Zhuo Georgia; Fei, Baowei

    2015-01-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery, but HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which includes image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450- to 900-nm wavelength. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to a modest number of features present, but the potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.
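
    A hedged sketch of two of the spectral features mentioned above (normalized reflectance and first spectral derivatives) computed on a hypothetical hyperspectral cube sampled over 450-900 nm; the band count, cube size and feature choices are assumptions, and the real framework includes further steps such as glare removal and classification.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical HSI cube: rows x cols x bands, reflectance sampled 450-900 nm.
    wavelengths = np.linspace(450.0, 900.0, 91)       # nm
    cube = rng.uniform(0.1, 0.9, size=(64, 64, wavelengths.size))

    # Normalized reflectance: divide each spectrum by its own mean to suppress brightness changes.
    normalized = cube / cube.mean(axis=2, keepdims=True)

    # First spectral derivative along the wavelength axis (per-band slope of reflectance).
    spectral_derivative = np.gradient(normalized, wavelengths, axis=2)

    # Simple per-pixel feature vector: mean reflectance plus a few derivative statistics.
    features = np.concatenate([
        cube.mean(axis=2, keepdims=True),
        spectral_derivative.mean(axis=2, keepdims=True),
        spectral_derivative.std(axis=2, keepdims=True),
    ], axis=2)
    print("feature cube shape:", features.shape)   # (64, 64, 3)
    ```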

  2. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery

    NASA Astrophysics Data System (ADS)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W.; Chen, Zhuo Georgia; Fei, Baowei

    2015-12-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery, but HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which includes image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450- to 900-nm wavelength. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to a modest number of features present, but the potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.

  3. Ensemble-based uncertainty quantification for coordination and control of thermostatically controlled loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lian, Jianming; Engel, Dave

    2017-07-27

    This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ernest A. Mancini

    The University of Alabama in cooperation with Texas A&M University, McGill University, Longleaf Energy Group, Strago Petroleum Corporation, and Paramount Petroleum Company are undertaking an integrated, interdisciplinary geoscientific and engineering research project. The project is designed to characterize and model reservoir architecture, pore systems and rock-fluid interactions at the pore to field scale in Upper Jurassic Smackover reef and carbonate shoal reservoirs associated with varying degrees of relief on pre-Mesozoic basement paleohighs in the northeastern Gulf of Mexico. The project effort includes the prediction of fluid flow in carbonate reservoirs through reservoir simulation modeling that utilizes geologic reservoir characterization and modeling and the prediction of carbonate reservoir architecture, heterogeneity and quality through seismic imaging. The primary objective of the project is to increase the profitability, producibility and efficiency of recovery of oil from existing and undiscovered Upper Jurassic fields characterized by reef and carbonate shoals associated with pre-Mesozoic basement paleohighs. The principal research effort for Year 3 of the project has been reservoir characterization, 3-D modeling, testing of the geologic-engineering model, and technology transfer. This effort has included six tasks: (1) the study of seismic attributes, (2) petrophysical characterization, (3) data integration, (4) the building of the geologic-engineering model, (5) the testing of the geologic-engineering model and (6) technology transfer. This work was scheduled for completion in Year 3. Progress on the project is as follows: geoscientific reservoir characterization is completed. The architecture, porosity types and heterogeneity of the reef and shoal reservoirs at Appleton and Vocation Fields have been characterized using geological and geophysical data. The study of rock-fluid interactions has been completed. Observations regarding the diagenetic processes influencing pore system development and heterogeneity in these reef and shoal reservoirs have been made. Petrophysical and engineering property characterization has been completed. Porosity and permeability data at Appleton and Vocation Fields have been analyzed, and well performance analysis has been conducted. Data integration is up to date, in that the geological, geophysical, petrophysical and engineering data collected to date for Appleton and Vocation Fields have been compiled into a fieldwide digital database. 3-D geologic modeling of the structures and reservoirs at Appleton and Vocation Fields has been completed. The models represent an integration of geological, petrophysical and seismic data. 3-D reservoir simulation of the reservoirs at Appleton and Vocation Fields has been completed. The 3-D geologic models served as the framework for the simulations. The geologic-engineering models of the Appleton and Vocation Field reservoirs have been developed. These models are being tested. The geophysical interpretation for the paleotopographic feature being tested has been made, and the study of the data resulting from drilling of a well on this paleohigh is in progress. Numerous presentations on reservoir characterization and modeling at Appleton and Vocation Fields have been made at professional meetings and conferences and a short course on microbial reservoir characterization and modeling based on these fields has been prepared.

  5. Establishing the Relationship between Fracture-Related Dolomite and Primary Rock Fabric on the Distribution of Reservoirs in the Michigan Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    G. Michael Grammer

    2006-09-30

    This topical report covers year 2 of the subject 3-year grant, evaluating the relationship between fracture-related dolomite and dolomite constrained by primary rock fabric in the 3 most prolific reservoir intervals in the Michigan Basin (Ordovician Trenton-Black River Formations; Silurian Niagara Group; and the Devonian Dundee Formation). The characterization of select dolomite reservoirs has been the major focus of our efforts in Phase II/Year 2. Fields have been prioritized based upon the availability of rock data for interpretation of depositional environments, fracture density and distribution as well as thin section, geochemical, and petrophysical analyses. Structural mapping and log analysis in the Dundee (Devonian) and Trenton/Black River (Ordovician) suggest a close spatial relationship among gross dolomite distribution and regional-scale, wrench fault related NW-SE and NE-SW structural trends. A high temperature origin for much of the dolomite in the 3 studied intervals (based upon initial fluid inclusion homogenization temperatures and stable isotopic analyses), coupled with persistent association of this dolomite in reservoirs coincident with wrench fault-related features, is strong evidence for these reservoirs being influenced by hydrothermal dolomitization. For the Niagaran (Silurian), a comprehensive high resolution sequence stratigraphic framework has been developed for a pinnacle reef in the northern reef trend where we had 100% core coverage throughout the reef section. Major findings to date are that facies types, when analyzed at a detailed level, have direct links to reservoir porosity and permeability in these dolomites. This pattern is consistent with our original hypothesis of primary facies control on dolomitization and resulting reservoir quality at some level. The identification of distinct and predictable vertical stacking patterns within a hierarchical sequence and cycle framework provides a high degree of confidence at this point that results will be exportable throughout the basin. Ten petrophysically significant facies have been described in the northern reef trend, providing significantly more resolution than the standard 4-6 that are used most often in the basin (e.g. Gill, 1977). Initial petrophysical characterization (sonic velocity analysis under confining pressures) shows a clear pattern that is dependent upon facies and resulting pore architecture. Primary facies is a key factor in the ultimate diagenetic modification of the rock and the resulting pore architecture. Facies with good porosity and permeability clearly show relatively slow velocity values as would be expected, and low porosity and permeability samples exhibit fast sonic velocity values, again as expected. What is significant is that some facies that have high porosity values, either measured directly or from wireline logs, also have very fast sonic velocity values. This is due to these facies having a pore architecture characterized by more localized pores (vugs, molds or fractures) that are not in communication.

  6. Geologic framework for the national assessment of carbon dioxide storage resources: Denver Basin, Colorado, Wyoming, and Nebraska: Chapter G in Geologic framework for the national assessment of carbon dioxide storage resources

    USGS Publications Warehouse

    Drake II, Ronald M.; Brennan, Sean T.; Covault, Jacob A.; Blondes, Madalyn S.; Freeman, P.A.; Cahan, Steven M.; DeVera, Christina A.; Lohr, Celeste D.

    2014-01-01

    This is a report about the geologic characteristics of five storage assessment units (SAUs) within the Denver Basin of Colorado, Wyoming, and Nebraska. These SAUs are Cretaceous in age and include (1) the Plainview and Lytle Formations, (2) the Muddy Sandstone, (3) the Greenhorn Limestone, (4) the Niobrara Formation and Codell Sandstone, and (5) the Terry and Hygiene Sandstone Members. The described characteristics, as specified in the methodology, affect the potential carbon dioxide storage resource in the SAUs. The specific geologic and petrophysical properties of interest include depth to the top of the storage formation, average thickness, net-porous thickness, porosity, permeability, groundwater quality, and the area of structural reservoir traps. Descriptions of the SAU boundaries and the overlying sealing units are also included. Assessment results are not contained in this report; however, the geologic information included here will be used to calculate a statistical Monte Carlo-based distribution of potential storage volume in the SAUs.

  7. Applying a probabilistic seismic-petrophysical inversion and two different rock-physics models for reservoir characterization in offshore Nile Delta

    NASA Astrophysics Data System (ADS)

    Aleardi, Mattia

    2018-01-01

    We apply a two-step probabilistic seismic-petrophysical inversion for the characterization of a clastic, gas-saturated reservoir located in the offshore Nile Delta. In particular, we discuss and compare the results obtained when two different rock-physics models (RPMs) are employed in the inversion. The first RPM is an empirical, linear model derived directly from the available well log data by means of an optimization procedure. The second RPM is a theoretical, non-linear model based on the Hertz-Mindlin contact theory. The first step of the inversion procedure is a Bayesian linearized amplitude versus angle (AVA) inversion in which the elastic properties, and the associated uncertainties, are inferred from pre-stack seismic data. The estimated elastic properties constitute the input to the second step, a probabilistic petrophysical inversion, in which we account for the noise contaminating the recorded seismic data and the uncertainties affecting both the derived rock-physics models and the estimated elastic parameters. In particular, a Gaussian mixture a priori distribution is used to properly take into account the facies-dependent behavior of petrophysical properties, related to the different fluid and rock properties of the different litho-fluid classes. In both the synthetic and the field data tests, the very minor differences between the results obtained with the two RPMs, and the good match between the estimated properties and well log information, confirm the applicability of the inversion approach and the suitability of the two RPMs for reservoir characterization in the investigated area.
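
    One way to picture the facies-dependent a priori distribution described above is a Gaussian mixture over a petrophysical property with one component per litho-fluid class; the class names, weights, means and standard deviations below are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical litho-fluid classes: (prior weight, mean porosity, std of porosity).
    classes = {
        "shale":      (0.5, 0.08, 0.02),
        "brine sand": (0.3, 0.22, 0.03),
        "gas sand":   (0.2, 0.25, 0.03),
    }

    def mixture_pdf(phi, classes):
        """Evaluate the Gaussian-mixture prior p(phi) = sum_k w_k * N(phi; mu_k, sigma_k^2)."""
        pdf = 0.0
        for weight, mu, sigma in classes.values():
            pdf += weight * np.exp(-0.5 * ((phi - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
        return pdf

    phi_axis = np.linspace(0.0, 0.35, 8)
    print(np.round(mixture_pdf(phi_axis, classes), 3))
    ```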

  8. Geomechanical Anisotropy and Rock Fabric in Shales

    NASA Astrophysics Data System (ADS)

    Huffman, K. A.; Connolly, P.; Thornton, D. A.

    2017-12-01

    Digital rock physics (DRP) is an emerging area of qualitative and quantitative scientific analysis that has been employed on a variety of rock types at various scales to characterize petrophysical, mechanical, and hydraulic rock properties. This contribution presents a generic geomechanically focused DRP workflow involving image segmentation by geomechanical constituents, generation of finite element (FE) meshes, and application of various boundary conditions (i.e. at the edge of the domain and at boundaries of various components such as edges of individual grains). The generic workflow enables use of constituent geological objects and relationships in a computational based approach to address specific questions in a variety of rock types at various scales. Two examples are 1) modeling stress dependent permeability, where it occurs and why it occurs at the grain scale; 2) simulating the path and complexity of primary fractures and matrix damage in materials with minerals or intervals of different mechanical behavior. Geomechanical properties and fabric characterization obtained from 100 micron shale SEM images using the generic DRP workflow are presented. Image segmentation and development of FE simulation composed of relatively simple components (elastic materials, frictional contacts) and boundary conditions enable the determination of bulk static elastic properties. The procedure is repeated for co-located images at pertinent orientations to determine mechanical anisotropy. The static moduli obtained are benchmarked against lab derived measurements since material properties (esp. frictional ones) are poorly constrained at the scale of investigation. Once confidence in the input material parameters is gained, the procedure can be used to characterize more samples (i.e. images) than is possible from rock samples alone. Integration of static elastic properties with grain statistics and geologic (facies) conceptual models derived from core and geophysical logs enables quantification of the impact that variations in rock fabric and grain interactions have on bulk mechanical rock behavior. When considered in terms of the stratigraphic framework of two different shale reservoirs it is found that silica distribution, clay content and orientation play a first order role in mechanical anisotropy.

  9. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is also necessary for reliability certification. To quantify the uncertainty, it is most important to analyze how the uncertainties occur and develop, and how the simulations develop from benchmark models to new models. Based on the practical needs of engineering and the technology of verification and validation, a framework for quantification of uncertainty (QU) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  10. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework, for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models, that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.

  11. Machine Learning for Mapping Groundwater Salinity with Oil Well Log Data

    NASA Astrophysics Data System (ADS)

    Chang, W. H.; Shimabukuro, D.; Gillespie, J. M.; Stephens, M.

    2016-12-01

    An oil field may have thousands of wells with detailed petrophysical logs, and far fewer direct measurements of groundwater salinity. Can the former be used to extrapolate the latter into a detailed map of groundwater salinity? California Senate Bill 4, with its requirement to identify Underground Sources of Drinking Water, makes this a question worth answering. A well-known obstacle is that the basic petrophysical equations describe ideal scenarios ("clean wet sand"), and even these equations contain many parameters that may vary with location and depth. Accounting for other common scenarios such as high-conductivity shaly sands or low-permeability diatomite (both characteristic of California's Central Valley) causes the parameters to proliferate to the point where the model is underdetermined by the data. When parameters outnumber data points, however, is when machine learning methods are most advantageous. We present a method for modeling a generic oil field, where groundwater salinity and lithology are depth-series parameters, and the constants in petrophysical equations are scalar parameters. The data are well log measurements (resistivity, porosity, spontaneous potential, and gamma ray) and a small number of direct groundwater salinity measurements. Embedded in the model are petrophysical equations that account for shaly sand and diatomite formations. As a proof of concept, we feed in well logs and salinity measurements from the Lost Hills Oil Field in Kern County, California, and show that with proper regularization and validation the model makes reasonable predictions of groundwater salinity despite the large number of parameters. The model is implemented using TensorFlow, an open-source machine learning library released by Google in November 2015 that has been rapidly and widely adopted by machine learning researchers. The code will be made available on GitHub, and we encourage scrutiny and modification by machine learning researchers and hydrogeologists alike.
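
    The clean-sand end of the problem described above rests on Archie's equation; a minimal sketch that inverts it for formation-water resistivity (a salinity proxy) from resistivity and porosity logs, with the constants a, m, n and the assumption Sw = 1 stated explicitly as placeholders.

    ```python
    import numpy as np

    def archie_rw(rt, phi, a=1.0, m=2.0, n=2.0, sw=1.0):
        """Invert Archie's equation Rt = a*Rw/(phi**m * Sw**n) for formation-water resistivity.
        The constants a, m, n and the assumption Sw = 1 (clean wet sand) are placeholders."""
        return rt * phi**m * sw**n / a

    # Hypothetical log samples from a wet sand interval: deep resistivity (ohm-m) and porosity.
    rt = np.array([4.0, 5.5, 3.2, 6.0])
    phi = np.array([0.30, 0.27, 0.33, 0.25])

    rw = archie_rw(rt, phi)
    print("formation-water resistivity (ohm-m):", np.round(rw, 2))
    # Lower Rw implies higher dissolved-solids content; converting Rw to salinity
    # additionally requires the formation temperature.
    ```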

  12. An investigation into preserving spatially-distinct pore systems in multi-component rocks using a fossiliferous limestone example

    NASA Astrophysics Data System (ADS)

    Jiang, Zeyun; Couples, Gary D.; Lewis, Helen; Mangione, Alessandro

    2018-07-01

    Limestones containing abundant disc-shaped fossil Nummulites can form significant hydrocarbon reservoirs but they have a distinctly heterogeneous distribution of pore shapes, sizes and connectivities, which make it particularly difficult to calculate petrophysical properties and consequent flow outcomes. The severity of the problem rests on the wide length-scale range from the millimetre scale of the fossil's pore space to the micron scale of rock matrix pores. This work develops a technique to incorporate multi-scale void systems into a pore network, which is used to calculate the petrophysical properties for subsequent flow simulations at different stages in the limestone's petrophysical evolution. While rock pore size, shape and connectivity can be determined, with varying levels of fidelity, using techniques such as X-ray computed tomography (CT) or scanning electron microscopy (SEM), this work represents a more challenging class where the rock of interest is insufficiently sampled or, as here, has been overprinted by extensive chemical diagenesis. The main challenge is integrating multi-scale void structures derived from both SEM and CT images, into a single model or a pore-scale network while still honouring the nature of the connections across these length scales. Pore network flow simulations are used to illustrate the technique but of equal importance, to demonstrate how supportable earlier-stage petrophysical property distributions can be used to assess the viability of several potential geological event sequences. The results of our flow simulations on generated models highlight the requirement for correct determination of the dominant pore scales (one or more of nm, μm, mm, cm), the spatial correlation and the cross-scale connections.

  13. Is There a Link Between Mineralogy, Petrophysics, and the Hydraulic and Seismic Behaviors of the Soultz-sous-Forêts Granite During Stimulation? A Review and Reinterpretation of Petro-Hydromechanical Data Toward a Better Understanding of Induced Seismicity and Fluid Flow

    NASA Astrophysics Data System (ADS)

    Meller, Carola; Ledésert, Béatrice

    2017-12-01

    In the framework of the European Soultz-sous-Forêts enhanced geothermal system (EGS) in Alsace, France, 20 years of scientific and preindustrial tests had to be performed before the site began production of electricity in 2008. Stimulation tests were designed to enhance the permeability because most of the numerous natural fractures that crosscut the granite body were sealed by secondary minerals that crystallized as an effect of the circulation of local hot brines. The deep-seated granitic reservoir is located between 4,500 and 5,000 m depths. Hydraulic stimulations were conducted in the four deep wells (GPK1, GPK2, GPK3, and GPK4) inducing different microseismic event patterns, which cannot be explained by tectonic structures alone. In the present work, we provide a review of the hydraulic tests and reinterpret them in the light of mineralogical data obtained along the boreholes. A clear relationship appears between mineralogy (mainly clay and calcite content) and the petrophysical, mechanical, and hydraulic behaviors of the rock mass. High calcite contents are correlated with an abundance of clay minerals, low Young's modulus, low magnetic susceptibility, and variation in spectral gamma ray. Microearthquakes are generated in the fresh granite zones, while clay and calcite-rich zones, linked with hydrothermal alteration, might behave aseismically during hydraulic stimulations. These findings highlight the importance of a detailed knowledge of the petrography of a reservoir to conduct an effective stimulation while keeping the seismic hazard at a minimum.

  14. Flow in Coal Seams: An Unconventional Challenge

    NASA Astrophysics Data System (ADS)

    Armstrong, R. T.; Mostaghimi, P.; Jing, Y.; Gerami, A.

    2016-12-01

    A significant unconventional resource for energy is the methane gas stored in shallow coal beds, known as coal seam gas. An integrated imaging and modelling framework is developed for analysing the petrophysical behaviour of coals. X-ray micro-computed tomography (micro-CT) is applied using a novel contrast agent method for visualising micrometer-sized fractures in coal. The technique allows for the visualisation of coal features not visible with conventional imaging methods. A Late Permian medium volatile bituminous coal from Moura Coal Mine (Queensland, Australia) is imaged and the resulting three-dimensional coal fracture system is extracted for fluid flow simulations. The results demonstrate a direct relationship between coal lithotype and permeability. Scanning electron microscopy and energy dispersive spectrometry (SEM-EDS) together with X-ray diffraction (XRD) methods are used for identifying mineral matter at high resolution. SEM high-resolution images are also used to calibrate the micro-CT images and measure the exact aperture size of fractures, which leads to a more accurate estimation of permeability from micro-CT images. To study the significance of the geometry and topology of the fracture system, a fracture reconstruction method based on statistical properties of coal is also developed. The network properties include the frequency, aperture size distribution, length, and spacing of the imaged coal fracture system. This allows for a sensitivity analysis of the effects that coal fracture topology and geometry have on coal petrophysical properties. Furthermore, we generate microfluidic chips based on coal fracture observations. The chip is used for flow experiments to visualise multi-fluid processes and measure the recovery of gas. A combined numerical and experimental approach is applied to obtain relative permeability curves for different regions of interest. A number of challenges associated with coal samples are discussed and insights are provided for a better understanding of these complex porous media systems.

  15. a Matlab Toolbox for Basin Scale Fluid Flow Modeling Applied to Hydrology and Geothermal Energy

    NASA Astrophysics Data System (ADS)

    Alcanie, M.; Lupi, M.; Carrier, A.

    2017-12-01

    Recent boosts in the development of geothermal energy were fostered by the latest oil crises and by the need to reduce the CO2 emissions generated by the combustion of fossil fuels. Various numerical codes (e.g. FEHM, CSMP++, HYDROTHERM, TOUGH) have thus been implemented for the simulation and quantification of fluid flow in the upper crust. One possible limitation of such codes is their limited accessibility and the complex structure of the simulators. For this reason, we began to develop a Hydrothermal Fluid Flow Matlab library as part of MRST (Matlab Reservoir Simulation Toolbox). MRST is designed for the simulation of oil and gas problems including carbon capture and storage; however, a geothermal module is still missing. We selected the Geneva Basin as a natural laboratory because of the large amount of data available in the region. The Geneva Basin has been intensely investigated in the past with exploration wells, active seismic and gravity surveys. In addition, the energy strategy of Switzerland promotes the development of geothermal energy, which has led to recent geophysical prospecting. Previous and ongoing projects have shown the geothermal potential of the Geneva Basin, but a consistent fluid flow model assessing the deep circulation in the region is yet to be defined. The first step of the study was to create the basin-scale static model. We integrated available active seismic, gravity inversions and borehole data to describe the principal geologic and tectonic features of the Geneva Basin. Petrophysical parameters were obtained from available and widespread well logs. This required adapting MRST to standard text-format file imports and outlining a new methodology for quick static model creation in an open-source environment. We implemented several basin-scale fluid flow models to test the effects of petrophysical properties on the circulation dynamics of deep fluids in the Geneva Basin. Preliminary results allow the identification of preferential fluid flow pathways, which are critical information for defining geothermal exploitation locations. The next step will be the implementation of the equation of state for pure water, CO2 - H2O and H2O - CH4 fluid mixtures.

  16. Characterization of Carbonates by Spectral Induced Polarization

    NASA Astrophysics Data System (ADS)

    Hupfer, Sarah; Halisch, Matthias; Weller, Andreas

    2017-04-01

    This study investigates the complex electrical conductivity of carbonate samples by Spectral Induced Polarization (SIP). The analysis is conducted in combination with petrophysical, mineralogical and geochemical measurements. SIP is a useful tool to obtain more detailed information about rock properties and to achieve a better characterization of the pore space. Rock parameters such as permeability, pore size and pore surface area can be predicted. Until now, mainly sandstones and sandy materials have been investigated in detail by laboratory SIP measurements, and several robust empirical relationships were found that connect IP signals and petrophysical parameters (surface area, surface conductivity and cation exchange capacity). Here, different types of carbonates were analyzed with laboratory SIP measurements. Rock properties such as grain density, porosity, permeability and surface area were determined by petrophysical measurements, and geochemistry and mineralogy were used to differentiate the carbonate types. First results of the SIP measurements showed polarization effects for all types. Four different kinds of phase behavior were observed in the phase spectra: a constant phase angle, a constant slope, a combination of both, and a maximum type. Each phase behavior can be assigned to a specific carbonate type, although the constant phase angle occurs for two carbonate types. Further experiments were conducted to gain more insight into the phase behavior and find explanations. First approach: an expected phase peak frequency for each sample was calculated to check whether this frequency lies within the measured spectrum of 2 mHz to 100 Hz. Second approach: the fluid conductivity was significantly reduced to increase the phase signal for a better interpretation. Third approach: the cation exchange capacity (CEC) was considered as a factor as well; a dependence between the imaginary part of conductivity and CEC was detected. Fourth approach: imaging procedures (scanning electron microscopy, X-ray computed tomography, optical microscopy) were used to create a qualitative image of the carbonate samples and to investigate the pore space, for example the ratio of connected to non-connected pore space. A comparison between the SIP data and the petrophysical data of the sample set showed that the phase behavior of carbonates is highly complicated and challenging compared with sandstones; it seems that there is no simple correlation between the polarization effects and any single petrophysical parameter. Ongoing investigations and measurements will be conducted to gain more insight into the polarization effects of carbonates.
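
    The expected phase peak frequency mentioned in the first approach can be illustrated with a standard Cole-Cole complex-resistivity model, whose phase spectrum peaks near f ≈ 1/(2πτ); the chargeability, time constant and frequency exponent below are arbitrary choices, not fitted to the carbonate samples.

    ```python
    import numpy as np

    def cole_cole_resistivity(freq, rho0=100.0, m=0.1, tau=0.1, c=0.5):
        """Cole-Cole complex resistivity: rho(w) = rho0 * (1 - m*(1 - 1/(1 + (i*w*tau)**c)))."""
        w = 2.0 * np.pi * freq
        return rho0 * (1.0 - m * (1.0 - 1.0 / (1.0 + (1j * w * tau) ** c)))

    freqs = np.logspace(-3, 2, 60)                    # 1 mHz to 100 Hz, as in the lab setup
    phase_mrad = 1000.0 * np.angle(cole_cole_resistivity(freqs))

    peak_freq = freqs[np.argmin(phase_mrad)]          # resistivity phase is negative; peak = minimum
    print(f"phase peak near {peak_freq:.3g} Hz, expected near 1/(2*pi*tau) = {1/(2*np.pi*0.1):.3g} Hz")
    ```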

  17. Multidisciplinary exploratory study of a geothermal resource in the active volcanic arc of Basse-Terre (Guadeloupe, Lesser Antilles)

    NASA Astrophysics Data System (ADS)

    Navelot, Vivien; Favier, Alexiane; Géraud, Yves; Diraison, Marc; Corsini, Michel; Verati, Chrystèle; Lardeaux, Jean-Marc; Mercier de Lépinay, Jeanne; Munschy, Marc

    2017-04-01

    The GEOTREF project (high enthalpy geothermal energy in fractured reservoirs), supported by the French government program "Investissements d'avenir", develops a sustainable geothermal resource in the Vieux-Habitants area, 8 km south of the currently exploited Bouillante geothermal field. Basse-Terre Island is a recent volcanic arc (< 3 Myr) belonging to the Lesser Antilles subduction zone and is composed of calc-alkaline volcanic rocks typical of arcs. Outcrops of the studied area consist of andesitic lava flows, volcano-sedimentary facies or dikes. Field studies allow us to propose a structural framework and highlight three major directions, N000°E, N050°E and N090°E, which are consistent with the regional tectonic trends of the arc. Petrographical and petrophysical studies show that most of the outcropping facies in the Vieux-Habitants area are not altered. Andesitic lava flows have poor reservoir properties, with porosity and permeability lower than 5% and 10⁻¹⁵ m², respectively. These results contrast with measurements performed on volcano-sedimentary rocks, which have heterogeneous petrophysical properties ranging from 15 to 50% for porosity and from 10⁻¹⁵ to 10⁻⁹ m² for permeability. Such surface values would probably decrease as depth increases. As there is a lack of subsurface data under the Vieux-Habitants area (wireline, drill core), exhumed rocks outcropping in the northern part of Basse-Terre Island have been studied. Such rocks have been identified in the Basal Complex (2.5-3 Myr) located in the northern part of the island. Previous work has demonstrated a 1000 m/Myr erosion rate, which corresponds to at least 2-3 km of exhumation. The petrographic study of the Basal Complex reveals sub-greenschist mineralogical transformations (chlorite, white mica, quartz...) changing the andesitic protolith into a meta-andesite. This metamorphism forms cleavage planes through a pressure-solution mechanism. The mineralogical transformations associated with these cleavage planes have an impact on petrophysical properties: the solid-phase density and porosity decrease, and an anisotropy of permeability develops along the cleavage planes. Thermodynamic modelling based on the rock chemical composition and petrographic observations indicates a steady-state mineral assemblage between 1.5-2 kbar and 280-320 °C. This is consistent with an in situ measured volcanic arc conductive geothermal gradient of 70 °C/km.

  18. A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis

    DTIC Science & Technology

    2012-01-01

    probability distribution for the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre for uniformly...parameters and windfields will drive our simulations. We will use uncertainty quantification methodology – polynomial chaos quadrature in combination with data integration to complete the DDDAS loop.
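
    A minimal sketch of quadrature-based uncertainty propagation in the spirit of this snippet: Gauss-Hermite nodes, matched to a normally distributed input, estimate the mean and variance of a model output; the toy plume-height model and input statistics are invented.

    ```python
    import numpy as np

    def plume_model(eruption_velocity):
        """Toy stand-in for an ash-transport simulation output (e.g., plume height in km)."""
        return 0.2 * eruption_velocity ** 0.8

    # Uncertain input: eruption velocity ~ N(mu, sigma^2). Hermite quadrature matches this weight.
    mu, sigma = 150.0, 20.0
    nodes, weights = np.polynomial.hermite.hermgauss(7)          # physicists' Hermite rule

    # Change of variables: x = mu + sqrt(2)*sigma*t, so E[f(X)] = (1/sqrt(pi)) * sum_i w_i f(x_i).
    x = mu + np.sqrt(2.0) * sigma * nodes
    mean = np.sum(weights * plume_model(x)) / np.sqrt(np.pi)
    second_moment = np.sum(weights * plume_model(x) ** 2) / np.sqrt(np.pi)
    variance = second_moment - mean ** 2
    print(f"mean plume height = {mean:.2f}, std = {np.sqrt(variance):.2f}")
    ```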

  19. Assessment of undiscovered oil and gas resources of the Williston Basin Province of North Dakota, Montana, and South Dakota, 2010

    USGS Publications Warehouse

    ,

    2011-01-01

    Using a geology-based assessment method, the U.S. Geological Survey estimated mean undiscovered volumes of 3.8 billion barrels of oil, 3.7 trillion cubic feet of associated/dissolved natural gas, and 0.2 billion barrels of natural gas liquids in the Williston Basin Province, North Dakota, Montana, and South Dakota. The U.S. Geological Survey (USGS) recently completed a comprehensive oil and gas assessment of the Williston Basin, which encompasses more than 90 million acres in parts of North Dakota, eastern Montana, and northern South Dakota. The assessment is based on the geologic elements of each total petroleum system (TPS) defined in the province, including hydrocarbon source rocks (source-rock maturation, hydrocarbon generation, and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). Using this geologic framework, the USGS defined 11 TPS and 19 Assessment Units (AU).

  20. National Assessment of Oil and Gas Project: Petroleum Systems and Geologic Assessment of Undiscovered Oil and Gas, Hanna, Laramie, and Shirley Basins Province, Wyoming

    USGS Publications Warehouse

    U.S. Geological Survey Hanna, Laramie

    2007-01-01

    The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geologically based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States. The U.S. Geological Survey (USGS) recently completed an assessment of the undiscovered oil and gas potential of the Hanna, Laramie, and Shirley Basins Province in Wyoming and northeastern Colorado. The assessment is based on the geologic elements of each total petroleum system (TPS) defined in the province, including hydrocarbon source rocks (source-rock maturation, hydrocarbon generation, and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). Using this geologic framework, the USGS defined three TPSs and seven assessment units (AUs) within them; undiscovered resources for three of the seven AUs were quantitatively assessed.

  1. Executive Summary -- assessment of undiscovered oil and gas resources of the San Joaquin Basin Province of California, 2003: Chapter 1 in Petroleum systems and geologic assessment of oil and gas in the San Joaquin Basin Province, California

    USGS Publications Warehouse

    Gautier, Donald L.; Scheirer, Allegra Hosford; Tennyson, Marilyn E.; Peters, Kenneth E.; Magoon, Leslie B.; Lillis, Paul G.; Charpentier, Ronald R.; Cook, Troy A.; French, Christopher D.; Klett, Timothy R.; Pollastro, Richard M.; Schenk, Christopher J.

    2007-01-01

    In 2003, the U.S. Geological Survey (USGS) completed an assessment of the oil and gas resource potential of the San Joaquin Basin Province of California (fig. 1.1). The assessment is based on the geologic elements of each Total Petroleum System defined in the province, including hydrocarbon source rocks (source-rock type and maturation and hydrocarbon generation and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). Using this geologic framework, the USGS defined five total petroleum systems and ten assessment units within these systems. Undiscovered oil and gas resources were quantitatively estimated for the ten assessment units (table 1.1). In addition, the potential was estimated for further growth of reserves in existing oil fields of the San Joaquin Basin.

  2. Jurassic-Cretaceous Composite Total Petroleum System and Geologic Assessment of Oil and Gas Resources of the North Cuba Basin, Cuba

    USGS Publications Warehouse

    ,

    2008-01-01

    The purpose of the U.S. Geological Survey's (USGS) World Oil and Gas Assessment is to develop geologically based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the world. The U.S. Geological Survey (USGS) completed an assessment of the undiscovered oil and gas potential of the North Cuba Basin. The assessment is based on the geologic elements of the total petroleum system (TPS) defined in the province, including petroleum source rocks (source-rock maturation, generation, and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and petroleum traps (trap formation and timing). Using this geologic framework, the USGS defined a Jurassic-Cretaceous Total Petroleum System in the North Cuba Basin Province. Within this TPS, three assessment units were defined and assessed for undiscovered oil and gas resources.

  3. In situ thermal conductivity of gas-hydrate-bearing sediments of the Mallik 5L-38 well

    NASA Astrophysics Data System (ADS)

    Henninges, J.; Huenges, E.; Burkhardt, H.

    2005-11-01

    Detailed knowledge about thermal properties of rocks containing gas hydrate is required in order to quantify processes involving gas hydrate formation and decomposition in nature. In the framework of the Mallik 2002 program, three wells penetrating a continental gas hydrate occurrence under permafrost were successfully equipped with permanent fiber-optic distributed temperature sensing cables. Temperature data were collected over a 21-month period after completing the wells. Thermal conductivity profiles were calculated from the geothermal data as well as from a petrophysical model derived from the available logging data and application of mixing law models. Results indicate that thermal conductivity variations are mainly lithologically controlled with a minor influence from hydrate saturation. Average thermal conductivity values of the hydrate-bearing sediments range between 2.35 and 2.77 W m⁻¹ K⁻¹. Maximum gas hydrate saturations can reach up to about 90% at an average porosity of 0.3.
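
    The abstract mentions a petrophysical model built from logging data and mixing-law models. The fragment below is an illustrative sketch of one common choice, a volume-fraction-weighted geometric-mean mixing law; the component conductivities and saturations are assumed values, not the Mallik results.

        # Illustrative sketch, not the study's actual model: effective thermal
        # conductivity of a hydrate-bearing sediment from a geometric-mean
        # mixing law. Component conductivities and fractions are assumed values.
        import numpy as np

        def geometric_mean_conductivity(porosity, s_hydrate,
                                        k_matrix=2.9, k_hydrate=0.57, k_water=0.60):
            """Volume-fraction-weighted geometric mean, W m^-1 K^-1."""
            f_matrix = 1.0 - porosity
            f_hydrate = porosity * s_hydrate          # hydrate fills part of the pores
            f_water = porosity * (1.0 - s_hydrate)    # remainder is water
            return (k_matrix ** f_matrix) * (k_hydrate ** f_hydrate) * (k_water ** f_water)

        print(geometric_mean_conductivity(porosity=0.3, s_hydrate=0.9))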

  4. Quantification and characterization of leakage errors

    NASA Astrophysics Data System (ADS)

    Wood, Christopher J.; Gambetta, Jay M.

    2018-03-01

    We present a general framework for the quantification and characterization of leakage errors that result when a quantum system is encoded in the subspace of a larger system. To do this we introduce metrics for quantifying the coherent and incoherent properties of the resulting errors and we illustrate this framework with several examples relevant to superconducting qubits. In particular, we propose two quantities, the leakage and seepage rates, which together with average gate fidelity allow for characterizing the average performance of quantum gates in the presence of leakage and show how the randomized benchmarking protocol can be modified to enable the robust estimation of all three quantities for a Clifford gate set.
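
    As a rough illustration of how leakage and seepage rates are typically extracted in leakage randomized benchmarking, the sketch below fits the population remaining in the computational subspace to an exponential decay and converts the fit parameters to per-gate rates. The data and the exact parameterization are assumptions; the paper's estimators may differ in detail.

        # Minimal sketch (the paper's exact estimators may differ): fit the
        # population remaining in the computational subspace after m Cliffords
        # to p(m) = A + B*lam**m, then read off leakage L1 = (1 - A)*(1 - lam)
        # and seepage L2 = A*(1 - lam), a parameterization commonly used in
        # leakage randomized benchmarking.
        import numpy as np
        from scipy.optimize import curve_fit

        def decay(m, A, B, lam):
            return A + B * lam**m

        # synthetic example data: sequence lengths and measured subspace populations
        m = np.array([1, 5, 10, 25, 50, 100, 200])
        p_comp = np.array([0.999, 0.996, 0.992, 0.981, 0.963, 0.932, 0.884])

        (A, B, lam), _ = curve_fit(decay, m, p_comp, p0=[0.9, 0.1, 0.999])
        L1 = (1.0 - A) * (1.0 - lam)   # average leakage rate per Clifford
        L2 = A * (1.0 - lam)           # average seepage rate per Clifford
        print(L1, L2)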

  5. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    NASA Technical Reports Server (NTRS)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

    This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel from NATO Science and Technology Organization, with the Task Group number AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework, and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
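
    A minimal sketch of the sampling-based propagation described above (draw uncertain structural and aerodynamic inputs from assumed distributions and push them through an analysis) is given below; flutter_margin is a hypothetical surrogate, not the AVT-191 or NASA toolchain.

        # Minimal sketch of sampling-based uncertainty propagation (not the
        # NATO/NASA toolchain): draw inputs from assumed distributions and
        # propagate them through a stand-in response function.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 2000
        stiffness = rng.normal(1.00, 0.05, n)        # assumed normal uncertainty
        dyn_pressure = rng.uniform(0.90, 1.10, n)    # assumed uniform uncertainty

        def flutter_margin(k, q):
            # hypothetical surrogate for an expensive aeroelastic analysis
            return np.sqrt(k) / q

        samples = flutter_margin(stiffness, dyn_pressure)
        print(samples.mean(), samples.std(), np.percentile(samples, [5, 95]))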

  6. Borehole petrophysical chemostratigraphy of Pennsylvanian black shales in the Kansas subsurface

    USGS Publications Warehouse

    Doveton, J.H.; Merriam, D.F.

    2004-01-01

    Pennsylvanian black shales in Kansas have been studied on outcrop for decades as the core unit of the classic Midcontinent cyclothem. These shales appear to be highstand condensed sections in the sequence stratigraphic paradigm. Nuclear log suites provide several petrophysical measurements of rock chemistry that are a useful data source for chemostratigraphic studies of Pennsylvanian black shales in the subsurface. Spectral gamma-ray logs partition natural radioactivity between contributions by U, Th, and K sources. Elevated U contents in black shales can be related to reducing depositional environments, whereas the K and Th contents are indicators of clay-mineral abundance and composition. The photoelectric factor log measurement is a direct function of aggregate atomic number and so is affected by clay-mineral volume, clay-mineral iron content, and other black shale compositional elements. Neutron porosity curves are primarily a response to hydrogen content. Although good quality logs are available for many black shales, borehole washout features invalidate readings from the nuclear contact devices, whereas black shales thinner than tool resolution will be averaged with adjacent beds. Statistical analysis of nuclear log data between black shales in successive cyclothems allows systematic patterns of their chemical and petrophysical properties to be discriminated in both space and time. © 2004 Elsevier B.V. All rights reserved.
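
    As a small illustration of the kind of quantities derived from spectral gamma-ray logs, the sketch below computes Th/U and Th/K ratios and flags possibly reducing (uraniferous) intervals; the log values and the Th/U < 2 rule of thumb are assumptions, not data or criteria from the paper.

        # Illustrative sketch (not from the paper): simple ratios computed from
        # a spectral gamma-ray log. The Th/U < 2 flag for reducing conditions
        # is a commonly quoted rule of thumb and is only an assumption here.
        import numpy as np

        u_ppm = np.array([12.0, 3.5, 25.0, 1.8])    # uranium, ppm
        th_ppm = np.array([9.0, 11.0, 8.0, 10.5])   # thorium, ppm
        k_pct = np.array([2.1, 2.8, 1.9, 3.0])      # potassium, weight %

        th_u = th_ppm / u_ppm      # low values suggest reducing (uraniferous) shales
        th_k = th_ppm / k_pct      # used qualitatively for clay-mineral typing
        reducing_flag = th_u < 2.0
        print(th_u, th_k, reducing_flag)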

  7. Geopressure modeling from petrophysical data: An example from East Kalimantan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herkommer, M.A.

    1994-07-01

    Localized models of abnormal formation pressure (geopressure) are important economic and safety tools frequently used for well planning and drilling operations. Simplified computer-based procedures have been developed that permit these models to be developed more rapidly and with greater accuracy. These techniques are broadly applicable to basins throughout the world where abnormal formation pressures occur. An example from the Attaka field of East Kalimantan, southeast Asia, shows how geopressure models are developed. Using petrophysical and engineering data, empirical correlations between observed pressure and petrophysical logs can be created by computer-assisted data-fitting techniques. These correlations serve as the basis for models of the geopressure. By performing repeated analyses on wells at various locations, contour maps on the top of abnormal geopressure can be created. Methods that are simple in their development and application make the task of geopressure estimation less formidable to the geologist and petroleum engineer. Further, more accurate estimates can significantly improve drilling speeds while reducing the incidence of stuck pipe, kicks, and blowouts. In general, geopressure estimates are used in all phases of drilling operations: to develop mud plans and specify equipment ratings, to assist in the recognition of geopressured formations and determination of mud weights, and to improve predictions at offset locations and geologically comparable areas.
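
    The abstract describes empirical correlations between observed pressure and petrophysical logs. One widely used correlation of this general kind is Eaton's resistivity method, sketched below purely as an example; the exponent, gradients and normal-compaction trend are assumed values, not the Attaka calibration.

        # Sketch of one widely used empirical geopressure correlation (Eaton's
        # resistivity method), shown only as an example of log-based pressure
        # estimation; the exponent and normal-compaction trend are assumptions.
        import numpy as np

        def eaton_pore_pressure(obg, p_normal, r_obs, r_normal, exponent=1.2):
            """Pore pressure gradient from observed vs normal-trend shale resistivity."""
            return obg - (obg - p_normal) * (r_obs / r_normal) ** exponent

        obg = 1.0          # overburden gradient, psi/ft (assumed)
        p_normal = 0.465   # normal (hydrostatic) gradient, psi/ft
        r_obs = np.array([0.9, 0.7, 0.5])     # observed shale resistivity, ohm-m
        r_normal = np.array([1.0, 1.0, 1.0])  # normal-compaction trend value, ohm-m

        print(eaton_pore_pressure(obg, p_normal, r_obs, r_normal))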

  8. Reservoir Models for Gas Hydrate Numerical Simulation

    NASA Astrophysics Data System (ADS)

    Boswell, R.

    2016-12-01

    Scientific and industrial drilling programs are now providing detailed information on gas hydrate systems that will increasingly be the subject of field experiments. The need to carefully plan these programs requires reliable prediction of reservoir response to hydrate dissociation. Currently, a major emphasis in gas hydrate modeling is the integration of thermodynamic/hydrologic phenomena with the geomechanical response of both the reservoir and the bounding strata. Also critical to the ultimate success of these efforts, however, is the appropriate development of input geologic models, including several emerging issues: (1) reservoir heterogeneity; (2) understanding of the initial petrophysical characteristics of the system (reservoirs and seals), the dynamic evolution of those characteristics during active dissociation, and the interdependency of petrophysical parameters; and (3) the nature of reservoir boundaries. Heterogeneity is a ubiquitous aspect of every natural reservoir, and appropriate characterization is vital; however, heterogeneity is not random. Vertical variation can be evaluated with core and well log data, although core data are often challenged by incomplete recovery. Well logs also present interpretation challenges, particularly where reservoirs are thinly bedded, because of limitations in vertical resolution. This imprecision extends to any petrophysical measurements derived from the evaluation of log data. Lateral extrapolation of log data is also complex and should be supported by geologic mapping. Key petrophysical parameters include porosity, permeability in its many aspects, and water saturation. Field data collected to date suggest that the degree of hydrate saturation is strongly controlled by reservoir quality and that the ratio of free to bound water in the remaining pore space is likely also controlled by reservoir quality. Furthermore, those parameters will also evolve during dissociation, and not necessarily in a simple or linear way. Significant progress has also occurred in recent years with regard to the geologic characterization of reservoir boundaries. Vertical boundaries with overlying clay-rich "seals" are now widely appreciated to have non-zero permeability, and lateral boundaries are sources of potential lateral fluid flow.

  9. Geophysics in Mejillones Basin, Chile: Dynamic analysis and associated seismic hazard

    NASA Astrophysics Data System (ADS)

    Maringue, J. I.; Yanez, G. A.; Lira, E.; Podestá, L., Sr.; Figueroa, R.; Estay, N. P.; Saez, E.

    2016-12-01

    The active margin of South America has a high seismogenic potential. In particular, the Mejillones Peninsula, located in northern Chile, is a site of interest for seismic hazard because of a 100-year seismic gap, potentially large site effects, and the presence of the most important port in the region. We perform a dynamic analysis of the zone based on a spatial and petrophysical model of the Mejillones Basin to understand its behavior under realistic seismic scenarios. The geometry and petrophysics of the basin were obtained from integrated modeling of geophysical observations (gravity, seismic, and electromagnetic data) distributed mainly in Pampa Mejillones, whose western edge is limited by the north-south-oriented Mejillones Fault. This regional-scale normal fault shows a half-graben geometry that controls the development of the Mejillones Basin eastwards. The gravimetric and magnetotelluric methods allow the geometry of the basin to be defined through a cover/basement density contrast and the transition zone from very low to moderate electrical resistivities, respectively. The seismic method complements the petrophysics in terms of the shear-wave velocity depth profile. The results show soil thicknesses of up to 700 meters in the deepest zone, with steeper slopes to the west and gentler slopes to the east, in agreement with the normal-fault half-graben basin geometry. Along the N-S direction there are no great differences in basin depth, so the problem is almost two-dimensional. In terms of petrophysics, the sedimentary stratum is characterized by shear velocities between 300 and 700 m/s, extremely low electrical resistivities (below 1 ohm-m), and densities from 1.4 to 1.8 g/cc. Numerical simulation of seismic wave amplification gives values on the order of 0.8g, which implies large surface damage. The results demonstrate a potential risk in Mejillones Bay from future events; it is therefore very important to generate mitigation policies for infrastructure and human settlements.

  10. Variations in petrophysical properties of shales along a stratigraphic section in the Whitby mudstone (UK)

    NASA Astrophysics Data System (ADS)

    Barnhoorn, Auke; Houben, Maartje; Lie-A-Fat, Joella; Ravestein, Thomas; Drury, Martyn

    2015-04-01

    In unconventional tough gas reservoirs (e.g., tight sandstones or shales) the presence of fractures, either naturally formed or hydraulically induced, is almost always a prerequisite for hydrocarbon productivity to be economically viable. One of the formations classified so far as potentially interesting for shale gas exploration in the Netherlands is the Lower Jurassic Posidonia Shale Formation (PSF). However, data on the Posidonia Shale Formation are scarce and samples are hard to come by; little is known, in particular, about the variability and heterogeneity of the petrophysical parameters of this shale. Research and sample collection are therefore being conducted on a time and depositional analogue of the PSF: the Whitby Mudstone Formation (WMF) in the United Kingdom. A large number of samples along a ~7 m stratigraphic section of the Whitby Mudstone Formation have been collected and analysed. Standard petrophysical properties such as porosity and matrix density are quantified for a number of samples throughout the section, together with mineral composition analysis based on XRD/XRF and SEM. Seismic velocity measurements are also conducted at multiple heights in the section and in multiple directions to characterize the anisotropy of the material. Attenuation anisotropy is incorporated, as are Thomsen's parameters combined with elastic parameters (e.g., Young's modulus and Poisson's ratio) to quantify the elastic anisotropy. Furthermore, rock mechanical experiments are conducted to determine the elastic constants, rock strength, fracture characteristics, brittleness index, fraccability and rock mechanical anisotropy across the stratigraphic section of the Whitby Mudstone Formation. Results show that the WMF is highly anisotropic, at the upper limit of the anisotropy reported for US gas shales. The high anisotropy of the Whitby shales exerts an even larger control on the formation of the fracture network. Furthermore, most petrophysical properties are highly variable: they vary from sample to sample, and even within a sample large variations in, e.g., porosity occur at the mm scale. These relatively large variations influence the potential for future shale gas exploration of these Lower Jurassic shales in northern Europe and need to be quantified in detail beforehand. Compositional analyses and rock deformation experiments on the first samples indicate relatively low brittleness indices for the Whitby shale, but these parameters vary within the stratigraphy. All petrophysical analyses combined will provide a complete assessment of the potential for shale gas exploration of these Lower Jurassic shales.
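
    As an illustration of how the elastic anisotropy mentioned above can be quantified, the sketch below computes Thomsen's epsilon and gamma from velocities measured parallel and perpendicular to bedding; the velocities are hypothetical, not WMF measurements.

        # Sketch (illustrative values, not the WMF data): Thomsen's epsilon and
        # gamma from velocities parallel and perpendicular to bedding, using
        # epsilon = (C11 - C33) / (2*C33) and gamma = (C66 - C44) / (2*C44).
        def thomsen_epsilon(vp_par, vp_perp):
            return (vp_par**2 - vp_perp**2) / (2.0 * vp_perp**2)

        def thomsen_gamma(vsh_par, vsh_perp):
            return (vsh_par**2 - vsh_perp**2) / (2.0 * vsh_perp**2)

        # hypothetical lab measurements in m/s
        vp_par, vp_perp = 3600.0, 2900.0     # P-wave parallel / perpendicular to bedding
        vsh_par, vsh_perp = 2100.0, 1700.0   # SH-wave parallel / perpendicular to bedding

        print(thomsen_epsilon(vp_par, vp_perp), thomsen_gamma(vsh_par, vsh_perp))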

  11. Instantaneous Attributes Applied to Full Waveform Sonic Log and Seismic Data in Integration of Elastic Properties of Shale Gas Formations in Poland

    NASA Astrophysics Data System (ADS)

    Wawrzyniak-Guz, Kamila

    2018-03-01

    Seismic attributes calculated from the full waveform sonic log were proposed as a method that may enhance the interpretation of data acquired at log and seismic scales. Though the attributes calculated in the study were mathematical transformations of the amplitude, frequency, phase or time of the acoustic full waveforms and seismic traces, they could be related to geological factors and/or petrophysical properties of rock formations. Attributes calculated from acoustic full waveforms were combined with selected attributes obtained for seismic traces recorded in the vicinity of the borehole and with petrophysical parameters. Such relations may be helpful in estimating elastic and reservoir properties over the area covered by the seismic survey.
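
    The instantaneous attributes referred to in the title are conventionally obtained from the analytic signal. The sketch below computes the envelope, instantaneous phase and instantaneous frequency of a synthetic trace with the Hilbert transform; it is a generic illustration, not the study's processing chain.

        # Minimal sketch of standard instantaneous attributes computed with the
        # Hilbert transform (envelope, phase, frequency); the synthetic trace
        # is only a stand-in for a full-waveform sonic or seismic trace.
        import numpy as np
        from scipy.signal import hilbert

        dt = 0.001                                            # sample interval, s
        t = np.arange(0, 0.5, dt)
        trace = np.sin(2 * np.pi * 40 * t) * np.exp(-5 * t)   # synthetic damped wavelet

        analytic = hilbert(trace)
        envelope = np.abs(analytic)                           # instantaneous amplitude
        phase = np.unwrap(np.angle(analytic))                 # instantaneous phase, radians
        inst_freq = np.gradient(phase, dt) / (2 * np.pi)      # instantaneous frequency, Hz

        print(envelope[:3], inst_freq[:3])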

  12. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  13. Calibration and Propagation of Uncertainty for Independence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Troy Michael; Kress, Joel David; Bhat, Kabekode Ghanasham

    This document reports on progress and methods for the calibration and uncertainty quantification of the Independence model developed at UT Austin. The Independence model is an advanced thermodynamic and process model framework for piperazine solutions as a high-performance CO2 capture solvent. Progress is presented in the context of the CCSI standard basic data model inference framework. Recent work has largely focused on the thermodynamic submodels of Independence.

  14. A framework for quantification of groundwater dynamics - concepts and hydro(geo-)logical metrics

    NASA Astrophysics Data System (ADS)

    Haaf, Ezra; Heudorfer, Benedikt; Stahl, Kerstin; Barthel, Roland

    2017-04-01

    Fluctuation patterns in groundwater hydrographs are generally assumed to contain information on aquifer characteristics, climate and environmental controls. However, attempts to disentangle this information and map the dominant controls have been few. This is due to the substantial heterogeneity and complexity of groundwater systems, which is reflected in the abundance of morphologies of groundwater time series. To describe the structure and shape of hydrographs, descriptive terms like "slow"/"fast" or "flashy"/"inert" are frequently used, which are subjective, irreproducible and limited. This lack of objective and refined concepts limits approaches for regionalization of hydrogeological characteristics as well as our understanding of the dominant processes controlling groundwater dynamics. Therefore, we propose a novel framework for groundwater hydrograph characterization in an attempt to categorize morphologies explicitly and quantitatively, based on perceptual concepts of aspects of the dynamics. This quantitative framework is inspired by the existing and operational eco-hydrological classification frameworks for streamflow. The need for a new framework for groundwater systems is justified by the fundamental differences between the state variable groundwater head and the flow variable streamflow. Conceptually, we extracted exemplars of specific dynamic patterns, attributing descriptive terms as a means of systematisation. Metrics, primarily taken from the streamflow literature, were subsequently adapted to groundwater and assigned to the described patterns as a means of quantification. In this study, we focused on the particularities of groundwater as a state variable. Furthermore, we investigated the descriptive skill of individual metrics as well as their usefulness for groundwater hydrographs. The ensemble of categorized metrics results in a framework that can be used to describe and quantify groundwater dynamics. It is a promising tool for the setup of a successful similarity classification framework for groundwater hydrographs. However, the overabundance of available metrics calls for a systematic redundancy analysis of the metrics, which we describe in a second study (Heudorfer et al., 2017). Heudorfer, B., Haaf, E., Barthel, R., Stahl, K., 2017. A framework for quantification of groundwater dynamics - redundancy and transferability of hydro(geo-)logical metrics. EGU General Assembly 2017, Vienna, Austria.
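
    As a hedged illustration of the kind of metrics adapted from the streamflow literature, the sketch below evaluates two simple examples (a Richards-Baker style flashiness index and lag-1 autocorrelation) on a synthetic groundwater head series; these particular metrics are examples, not necessarily those selected in the framework.

        # Sketch of two example hydrograph metrics of the kind adapted from the
        # streamflow literature (these particular choices are illustrative):
        # a flashiness index and lag-1 autocorrelation of a head series.
        import numpy as np

        def flashiness_index(head):
            """Richards-Baker style index: summed absolute day-to-day changes
            normalized by the summed signal (here applied to groundwater head)."""
            head = np.asarray(head, dtype=float)
            return np.sum(np.abs(np.diff(head))) / np.sum(head)

        def lag1_autocorrelation(head):
            head = np.asarray(head, dtype=float)
            d = head - head.mean()
            return np.sum(d[:-1] * d[1:]) / np.sum(d * d)

        rng = np.random.default_rng(1)
        head = 10.0 + np.cumsum(rng.normal(0, 0.02, 365))   # synthetic daily head series
        print(flashiness_index(head), lag1_autocorrelation(head))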

  15. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    PubMed Central

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168

  16. Merging information in geophysics: the triumvirate of geology, geophysics, and petrophysics

    NASA Astrophysics Data System (ADS)

    Revil, A.

    2016-12-01

    We know that geophysical inversion is non-unique and that many classical regularization techniques are unphysical. Despite this, we like to use them because of their simplicity and because geophysicists are often afraid to bias the inverse problem by introducing too much prior information (in a broad sense). It is also clear that geophysics is done on geological objects that are not random structures. Spending some time with a geologist in the field, before organizing a field geophysical campaign, is always an instructive experience. Finally, the measured properties are connected to physicochemical and textural parameters of the porous media and the interfaces between the various phases of a porous body. Some fundamental parameters may control the geophysical observations or their time variations. If we want to improve our geophysical tomograms, we need to be risk-takers and acknowledge, or rather embrace, the cross-fertilization arising from coupling geology, geophysics, and petrophysics. In this presentation, I will discuss various techniques to do so. They will include non-stationary geostatistical descriptors, facies deformation, cross-coupled petrophysical properties using petrophysical clustering, and image-guided inversion. I will show various applications to a number of relevant cases in hydrogeophysics. From these applications, it may become clear that there are many ways to address inverse or time-lapse inverse problems and geophysicists have to be pragmatic regarding the methods used depending on the degree of available prior information.

  17. Petrophysical evaluation of the hydrocarbon potential of the Lower Cretaceous Kharita clastics, North Qarun oil field, Western Desert, Egypt

    NASA Astrophysics Data System (ADS)

    Teama, Mostafa A.; Nabawy, Bassem S.

    2016-09-01

    Based on the available well log data of six wells in the North Qarun oil field in the Western Desert of Egypt, a petrophysical evaluation of the Lower Cretaceous Kharita Formation was carried out. The lithology of the Kharita Formation was analyzed using the neutron porosity-density and the neutron porosity-gamma ray crossplots as well as the litho-saturation plot. The petrophysical parameters, including shale volume, effective porosity, water saturation and hydrocarbon pore volume, were determined and traced laterally across the studied field through iso-parametric maps. The lithology crossplots of the studied wells show that sandstone is the main lithology of the Kharita Formation, intercalated with some calcareous shale. The cutoff values of shale volume, porosity and water saturation for the productive hydrocarbon pay zones are defined as 40%, 10% and 50%, respectively, based on the applied crossplot approach and its limits. Iso-parametric contour maps for the average reservoir parameters, such as net-pay thickness, average porosity, shale volume, water saturation and hydrocarbon pore volume, were constructed. From the present study, it is found that the Kharita Formation in the North Qarun oil field has promising reservoir characteristics, particularly in the northwestern part of the study area, which is considered a prospective area for oil accumulation.
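
    A minimal sketch of how such cutoffs are typically applied is given below: shale volume from a linear gamma-ray index, porosity from the density log, water saturation from the classical Archie equation, and a net-pay flag using the 40%, 10% and 50% cutoffs quoted above. The relations and log values are common textbook choices and assumed numbers, not the authors' exact workflow or data.

        # Minimal sketch (common relations, not necessarily the authors' exact
        # workflow): gamma-ray shale volume, density porosity, Archie water
        # saturation, and the quoted cutoffs (Vsh <= 40%, phi >= 10%, Sw <= 50%).
        import numpy as np

        gr, gr_clean, gr_shale = np.array([45.0, 80.0, 110.0]), 20.0, 120.0
        rhob, rho_matrix, rho_fluid = np.array([2.32, 2.45, 2.55]), 2.65, 1.0
        rt, rw = np.array([20.0, 8.0, 3.0]), 0.05          # resistivities, ohm-m

        vsh = np.clip((gr - gr_clean) / (gr_shale - gr_clean), 0, 1)   # linear GR index
        phi = np.clip((rho_matrix - rhob) / (rho_matrix - rho_fluid), 0, 1)
        sw = np.clip(np.sqrt(rw / (phi**2 * rt)), 0, 1)                # Archie, a=1, m=n=2

        net_pay = (vsh <= 0.40) & (phi >= 0.10) & (sw <= 0.50)
        print(vsh.round(2), phi.round(2), sw.round(2), net_pay)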

  18. Determination of petrophysical properties of sedimentary rocks by optical methods

    NASA Astrophysics Data System (ADS)

    Korte, D.; Kaukler, D.; Fanetti, M.; Cabrera, H.; Daubront, E.; Franko, M.

    2017-04-01

    Petrophysical properties of rocks (thermal diffusivity and conductivity, porosity and density) as well as the correlations between them are of great importance for many geoscientific applications. The porosity of reservoir rocks and their permeability are the most fundamental physical properties with respect to the storage and transmission of fluids, mainly for oil characterization. Accurate knowledge of these parameters for any hydrocarbon reservoir is required for efficient development, management, and prediction of future performance of the oilfield. Thus, the porosity and permeability, as well as the chemical composition, must be quantified as precisely as possible. This should be done along with the thermal properties (density, conductivity, diffusivity and effusivity) that are intimately related to them. For this reason, the photothermal beam deflection spectrometry (BDS) technique for determination of materials' thermal properties, together with other methods such as energy dispersive X-ray scanning electron microscopy (SEM-EDX) for determining the chemical composition and sample structure, as well as optical microscopy to determine the particle size, were applied for characterization of sedimentary rocks. The rocks were obtained from the southern flank of the Andes in Venezuela's western basin. The applicability of BDS for determination of petrophysical properties was validated on three sedimentary rocks of different texture and composition (all Late Cretaceous, associated with the Luna, Capacho and Colón-Mito Juan geological formations). The rocks' thermal properties were correlated to the microstructures and chemical composition of the examined samples.

  19. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*

    PubMed Central

    Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-01-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314

  20. Luminescent microporous metal–organic framework with functional Lewis basic sites on the pore surface: Quantifiable evaluation of luminescent sensing mechanisms towards Fe³⁺

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Jun-Cheng; Guo, Rui-Li

    2016-11-15

    A systematic study has been conducted on a novel luminescent metal-organic framework, ([Zn(bpyp)(L-OH)]·DMF·2H₂O)ₙ (1), to explore its sensing mechanisms towards Fe³⁺. Structure analyses show that compound 1 has pyridine N atoms and -OH groups on the pore surface for specific sensing of metal ions via Lewis acid-base interactions. On this basis, the quenching mechanisms are studied; the process is controlled by multiple mechanisms, and the corresponding dynamic and static quenching constants are calculated, achieving a quantitative evaluation of the quenching process. This work not only achieves the quantitative evaluation of the luminescence quenching but also provides certain insights into the quenching process, and the possible mechanisms explored in this work may inspire future research and design of target luminescent metal-organic frameworks (LMOFs) with specific functions.
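
    The dynamic and static quenching constants mentioned above are conventionally obtained from a combined Stern-Volmer analysis. The sketch below fits I0/I = (1 + K_D[Q])(1 + K_S[Q]) to illustrative quenching data; the concentrations and intensities are assumed, and separating the two constants in practice also requires lifetime measurements.

        # Sketch of a combined dynamic/static Stern-Volmer analysis
        # (illustrative data, not the paper's): fit I0/I to increasing Fe3+
        # concentration. Note that intensity data alone cannot distinguish
        # K_D from K_S; lifetime measurements are needed for that.
        import numpy as np
        from scipy.optimize import curve_fit

        def stern_volmer(q, k_dynamic, k_static):
            return (1.0 + k_dynamic * q) * (1.0 + k_static * q)

        q = np.array([0.0, 0.1e-3, 0.2e-3, 0.4e-3, 0.8e-3])      # [Fe3+], mol/L
        i0_over_i = np.array([1.00, 1.45, 2.00, 3.40, 7.10])     # quenching response

        (k_dynamic, k_static), _ = curve_fit(stern_volmer, q, i0_over_i, p0=[1e3, 1e3])
        print(k_dynamic, k_static)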

  1. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography

    PubMed Central

    Loss, Leandro A.; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2016-01-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides. PMID:28090597
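
    Step (i), Hessian-based enhancement of filamentous structures, can be illustrated with the readily available Frangi vesselness filter in scikit-image, used here only as a stand-in for the paper's own Hessian filtering; the tensor-voting and delineation steps are not reproduced.

        # Sketch of step (i) only: Hessian-based enhancement of filamentous
        # (ridge-like) structures using scikit-image's Frangi filter as a
        # stand-in for the paper's own Hessian filtering.
        import numpy as np
        from skimage.filters import frangi

        # synthetic 2D "tomogram" slice: a faint bright filament in noise
        img = np.random.default_rng(0).normal(0.0, 0.1, (128, 128))
        rr = np.arange(128)
        img[rr, (0.5 * rr + 20).astype(int) % 128] += 1.0    # diagonal filament

        enhanced = frangi(img, sigmas=(1, 2, 3), black_ridges=False)
        mask = enhanced > enhanced.mean() + 3 * enhanced.std()   # crude segmentation
        print(mask.sum(), "candidate filament pixels")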

  2. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    PubMed

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.

  3. Analysing deterioration of marble stones exposed to underwater conditions

    NASA Astrophysics Data System (ADS)

    Cámara, Beatriz; Álvarez de Buergo, Mónica; Bethencourt, Manuel; Freire-Lista, David; Fort, Rafael

    2016-04-01

    The peculiar conditions of the marine environment make the conservation of underwater archaeological sites an extremely complex procedure. This is due to the fact that the prevailing conditions in this environment promote the development of deterioration phenomena in submerged artefacts through the synergistic action of physical, chemical and biological factors. The objective of the present investigation was to determine how the petrophysical properties of cultural heritage materials can be affected by exposure to the specific underwater conditions of the sea bottom, and thus to evaluate how this can affect, in the long term, their durability and evolution when they form part of an archaeological site. For this purpose, two types of marble (the Italian Carrara and the Spanish Macael) were subjected to an experiment consisting of exposing stone materials for one and a half years to underwater conditions. The experimental test was located in an archaeological site in the Bay of Cadiz (southern Spain), Bajo del Chapitel (a recognized site of Cultural Interest), which includes remains of shipwrecks from different periods. At this site, samples were submerged to 12 m depth and placed on the sea bottom simulating the different positions in which underwater archaeological objects can be found (fully exposed, half buried and covered). Petrophysical characterisation involved determination of the apparent and bulk densities, water saturation (the maximum water content a material may contain), open porosity (porosity accessible to water), chromatic parameters and ultrasonic velocity. Before measuring, samples were subjected to mechanical cleaning (for those samples with biological colonization) and to removal of salt deposits. Results showed significant differences in these petrophysical properties after underwater submersion, which were directly related to the type of underwater exposure condition. Comparative analysis of petrophysical properties, like the one conducted in this study, provides useful information for evaluating the deterioration processes of heritage stones in a marine environment, and for conservation measures aimed at the in situ preservation of archaeological sites. Acknowledgements: Community of Madrid for financing the Geomateriales2 program (P2013/MIT2914), CEI-Moncloa UCM-UPM, Applied Petrology for Heritage Stone Materials Conservation Research Group.
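
    The open porosity, densities and water saturation listed above are conventionally obtained from dry, water-saturated and submerged weights. The sketch below applies those standard buoyancy relations to hypothetical weights, not to the Carrara or Macael measurements.

        # Sketch of the standard triple-weight (buoyancy) relations for open
        # porosity and densities from dry, saturated, and submerged weights;
        # the weights below are hypothetical, not measurements from the study.
        RHO_WATER = 1.000  # g/cm^3

        def open_porosity_pct(w_dry, w_sat, w_sub):
            return 100.0 * (w_sat - w_dry) / (w_sat - w_sub)

        def bulk_density(w_dry, w_sat, w_sub):
            return w_dry / (w_sat - w_sub) * RHO_WATER

        def water_saturation_pct(w_dry, w_sat):
            """Maximum water content relative to dry mass."""
            return 100.0 * (w_sat - w_dry) / w_dry

        w_dry, w_sat, w_sub = 270.1, 271.0, 170.4   # grams (hypothetical marble sample)
        print(open_porosity_pct(w_dry, w_sat, w_sub),
              bulk_density(w_dry, w_sat, w_sub),
              water_saturation_pct(w_dry, w_sat))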

  4. Multivariate Formation Pressure Prediction with Seismic-derived Petrophysical Properties from Prestack AVO inversion and Poststack Seismic Motion Inversion

    NASA Astrophysics Data System (ADS)

    Yu, H.; Gu, H.

    2017-12-01

    A novel multivariate seismic formation pressure prediction methodology is presented, which incorporates high-resolution seismic velocity data from prestack AVO inversion and petrophysical data (porosity and shale volume) derived from poststack seismic motion inversion. In contrast to traditional seismic formation pressure prediction methods, the proposed methodology is based on a multivariate pressure prediction model and uses a trace-by-trace multivariate regression analysis on seismic-derived petrophysical properties to calibrate model parameters, in order to make accurate predictions with higher resolution in both the vertical and lateral directions. With the prestack time migration velocity as the initial velocity model, an AVO inversion was first applied to the prestack dataset to obtain high-resolution, higher-frequency seismic velocities to be used as the velocity input for seismic pressure prediction, along with the density dataset to calculate an accurate overburden pressure (OBP). Seismic Motion Inversion (SMI) is an inversion technique based on Markov Chain Monte Carlo simulation. Both structural variability and similarity of seismic waveform are used to incorporate well log data to characterize the variability of the property to be obtained. In this research, porosity and shale volume are first interpreted on well logs and then combined with poststack seismic data using SMI to build porosity and shale volume datasets for seismic pressure prediction. A multivariate effective stress model is used to convert the velocity, porosity and shale volume datasets to effective stress. After a thorough study of the regional stratigraphic and sedimentary characteristics, a regional normally compacted interval model is built, and the coefficients in the multivariate prediction model are then determined in a trace-by-trace multivariate regression analysis on the petrophysical data. The coefficients are used to convert velocity, porosity and shale volume to effective stress and then to calculate formation pressure with the OBP. Application of the proposed methodology to a research area in the East China Sea has proved that the method can bridge the gap between seismic and well log pressure prediction and gives predicted pressure values close to pressure measurements from well testing.
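
    A hedged sketch of the trace-by-trace multivariate regression idea is shown below: effective stress is regressed on velocity, porosity and shale volume at calibration points, and pore pressure is then predicted as overburden minus the predicted effective stress. The linear model form and all numbers are illustrative assumptions; the paper's multivariate effective stress model may differ.

        # Sketch of the multivariate idea only (the paper's effective-stress
        # model may differ): regress effective stress on velocity, porosity and
        # shale volume at calibration points, then predict pore pressure as
        # overburden minus the predicted effective stress. Numbers are illustrative.
        import numpy as np

        # calibration data (e.g., from wells): velocity (m/s), porosity, Vsh,
        # overburden pressure and measured pore pressure (MPa)
        vel = np.array([2500.0, 2800.0, 3100.0, 3400.0])
        phi = np.array([0.31, 0.26, 0.23, 0.18])
        vsh = np.array([0.56, 0.49, 0.46, 0.40])
        obp = np.array([40.0, 46.0, 52.0, 58.0])
        pp_measured = np.array([22.0, 24.5, 26.0, 28.5])

        sigma_eff = obp - pp_measured                       # effective stress at wells
        X = np.column_stack([np.ones_like(vel), vel, phi, vsh])
        coeffs, *_ = np.linalg.lstsq(X, sigma_eff, rcond=None)   # linear multivariate fit

        # apply to one seismic-derived trace sample
        x_new = np.array([1.0, 3000.0, 0.24, 0.47])
        pp_predicted = 50.0 - x_new @ coeffs                # OBP(trace) - predicted sigma_eff
        print(pp_predicted)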

  5. Shale Gas Petrophysical Models: an evaluation of contrasting approaches and assumptions

    NASA Astrophysics Data System (ADS)

    Inwood, Jennifer; Lovell, Mike; Davies, Sarah; Fishwick, Stewart; Taylor, Kevin

    2015-04-01

    Shale gas refers to fine-grained formations, or mudstones, where organic matter has matured sufficiently to produce predominantly gas, but that gas has not migrated any significant distance and hence the source rock is effectively the reservoir. Due to the success of shale gas extraction in the USA, many European countries are assessing their potential resources. A key uncertainty in evaluating the resource is the estimation of gas in place, and most models are based on North American plays. However, it would seem that no single model to date can confidently predict the gas in place for a 'new' shale formation. Shale gas is frequently characterized by two distinct gas components: free gas is able to move and occupies the pores, while adsorbed gas is fixed onto organic surfaces and held in place by pressure. There are a number of different published methodologies that attempt to account for this complicated distribution of gas within the rock, ranging from models where the importance of the adsorbed gas is assumed to be negligible to those where all gas is assumed to exist within the organic pores and none within the mineral pore spaces. Models that assume both components are important and occupy adjacent volumes need to consider how to separate out the two to avoid double counting. Due to the heterogeneity of mudstones, the most appropriate model may vary downhole as well as across adjacent wells. In this pilot study we consider the underlying assumptions and categorize models depending on the deterministic or probabilistic approach used. We use an initial dataset from North America to test and compare a number of different approaches before expanding the analysis to further formations that span a range of geological and petrophysical characteristics. We then review and evaluate the models, identifying key variables and, where possible, determining their importance through sensitivity analysis. This work aims to establish guidelines for selecting the most appropriate petrophysical model for evaluating the gas in place in a shale gas play, and as such provides a more informed understanding of this petrophysical maze for both specialists and non-specialists.
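
    The two-component accounting described above (free gas in the pore space plus adsorbed gas following a Langmuir-type isotherm) reduces to simple arithmetic once the parameters are fixed, as in the sketch below; all parameter values are illustrative, and a real model must also avoid double counting the pore volume occupied by adsorbed gas.

        # Sketch of the two-component gas-in-place arithmetic (free gas in the
        # pore space plus Langmuir-type adsorbed gas). All parameters are
        # illustrative assumptions.
        def free_gas_scf_per_ton(porosity, sw, bg, rho_bulk_g_cc):
            """Free gas per short ton of rock: gas-filled pore volume / Bg."""
            cuft_per_ton = 2000.0 / (rho_bulk_g_cc * 62.428)   # bulk volume of 1 ton, ft^3
            return cuft_per_ton * porosity * (1.0 - sw) / bg

        def adsorbed_gas_scf_per_ton(pressure_psi, langmuir_volume, langmuir_pressure):
            """Langmuir isotherm: V = VL * P / (P + PL)."""
            return langmuir_volume * pressure_psi / (pressure_psi + langmuir_pressure)

        gf = free_gas_scf_per_ton(porosity=0.06, sw=0.35, bg=0.004, rho_bulk_g_cc=2.5)
        ga = adsorbed_gas_scf_per_ton(pressure_psi=3500.0, langmuir_volume=100.0,
                                      langmuir_pressure=650.0)
        print(gf, ga, gf + ga)   # scf/ton: free, adsorbed, total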

  6. Application of uniaxial confining-core clamp with hydrous pyrolysis in petrophysical and geochemical studies of source rocks at various thermal maturities

    USGS Publications Warehouse

    Lewan, Michael D.; Birdwell, Justin E.; Baez, Luis; Beeney, Ken; Sonnenberg, Steve

    2013-01-01

    Understanding changes in petrophysical and geochemical parameters during source rock thermal maturation is a critical component in evaluating source-rock petroleum accumulations. Natural core data are preferred, but obtaining cores that represent the same facies of a source rock at different thermal maturities is seldom possible. An alternative approach is to induce thermal maturity changes by laboratory pyrolysis on aliquots of a source-rock sample of a given facies of interest. Hydrous pyrolysis is an effective way to induce thermal maturity on source-rock cores and provide expelled oils that are similar in composition to natural crude oils. However, net-volume increases during bitumen and oil generation result in expanded cores due to opening of bedding-plane partings. Although meaningful geochemical measurements on expanded, recovered cores are possible, the expanded core is not suitable for measuring petrophysical properties relevant to natural subsurface cores. This problem created during hydrous pyrolysis is alleviated by using a stainless steel uniaxial confinement clamp on rock cores cut perpendicular to bedding fabric. The clamp prevents expansion just as overburden does during natural petroleum formation in the subsurface. As a result, intact cores can be recovered at various thermal maturities for the measurement of petrophysical properties as well as for geochemical analyses. This approach has been applied to 1.7-inch diameter cores taken perpendicular to the bedding fabric of a 2.3- to 2.4-inch thick slab of Mahogany oil shale from the Eocene Green River Formation. Cores were subjected to hydrous pyrolysis at 360 °C for 72 h, which represents near maximum oil generation. One core was heated unconfined and the other was heated in the uniaxial confinement clamp. The unconfined core developed open tensile fractures parallel to the bedding fabric, resulting in a 38% vertical expansion of the core. These open fractures did not occur in the confined core, but short, discontinuous vertical fractures on the core periphery occurred as a result of lateral expansion.

  7. Petroleum Systems and Geologic Assessment of Oil and Gas Resources in the Wind River Basin Province, Wyoming

    USGS Publications Warehouse

    ,

    2007-01-01

    The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geologically based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States. The U.S. Geological Survey (USGS) recently completed an assessment of the undiscovered oil and gas potential of the Wind River Basin Province, which encompasses about 4.7 million acres in central Wyoming. The assessment is based on the geologic elements of each total petroleum system (TPS) defined in the province, including hydrocarbon source rocks (source-rock maturation, hydrocarbon generation, and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). Using this geologic framework, the USGS defined three TPSs: (1) Phosphoria TPS, (2) Cretaceous-Tertiary TPS, and (3) Waltman TPS. Within these systems, 12 Assessment Units (AU) were defined and undiscovered oil and gas resources were quantitatively estimated within 10 of the 12 AUs.

  8. Petroleum Systems and Geologic Assessment of Undiscovered Oil and Gas, Navarro and Taylor Groups, Western Gulf Province, Texas

    USGS Publications Warehouse

    ,

    2006-01-01

    The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geologically based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States. The USGS recently completed an assessment of undiscovered oil and gas potential of the Late Cretaceous Navarro and Taylor Groups in the Western Gulf Province in Texas (USGS Province 5047). The Navarro and Taylor Groups have moderate potential for undiscovered oil resources and good potential for undiscovered gas resources. This assessment is based on geologic principles and uses the total petroleum system concept. The geologic elements of a total petroleum system include hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). The USGS used this geologic framework to define one total petroleum system and five assessment units. Five assessment units were quantitatively assessed for undiscovered oil and gas resources.

  9. National Assessment of Oil and Gas Project: petroleum systems and geologic assessment of oil and gas in the Southwestern Wyoming Province, Wyoming, Colorado and Utah

    USGS Publications Warehouse

    ,

    2005-01-01

    The U.S. Geological Survey (USGS) completed an assessment of the undiscovered oil and gas potential of the Southwestern Wyoming Province of southwestern Wyoming, northwestern Colorado, and northeastern Utah (fig. 1). The USGS Southwestern Wyoming Province for this assessment included the Green River Basin, Moxa arch, Hoback Basin, Sandy Bend arch, Rock Springs uplift, Great Divide Basin, Wamsutter arch, Washakie Basin, Cherokee ridge, and the Sand Wash Basin. The assessment of the Southwestern Wyoming Province is based on geologic principles and uses the total petroleum system concept. The geologic elements of a total petroleum system include hydrocarbon source rocks (source rock maturation, hydrocarbon generation, and migration), reservoir rocks (sequence stratigraphy, petrophysical properties), and hydrocarbon traps (trap types, formation, and timing). Using this geologic framework, the USGS defined 9 total petroleum systems (TPS) and 23 assessment units (AU) within these TPSs, and quantitatively estimated the undiscovered oil and gas resources within 21 of the 23 AUs.

  10. National Assessment of Oil and Gas Project: Geologic Assessment of Undiscovered Oil and Gas Resources of the Eastern Great Basin Province, Nevada, Utah, Idaho, and Arizona

    USGS Publications Warehouse

    ,

    2007-01-01

    The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geologically based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States. The U.S. Geological Survey (USGS) recently completed an assessment of the undiscovered oil and gas potential of the Eastern Great Basin Province of eastern Nevada, western Utah, southeastern Idaho, and northwestern Arizona. This assessment is based on geologic principles and uses the total petroleum system concept. The geologic elements of a total petroleum system include hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). The USGS used this geologic framework to define one total petroleum system and three assessment units. All three assessment units were quantitatively assessed for undiscovered oil and gas resources.

  11. Discriminant function analysis as tool for subsurface geologist

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chesser, K.

    1987-05-01

    Sedimentary structures such as cross-bedding control porosity, permeability, and other petrophysical properties in sandstone reservoirs. Understanding the distribution of such structures in the subsurface not only aids in the prediction of reservoir properties but also provides information about depositional environments. Discriminant function analysis (DFA) is a simple yet powerful method for classifying petrophysical data from wireline logs, core analyses, or other sources into groups that have been previously defined through direct observation of sedimentary structures in cores. Once data have been classified into meaningful groups, the geologist can predict the distribution of specific sedimentary structures or important reservoir properties in areas where cores are unavailable. DFA is efficient. Given several variables, DFA will choose the best combination to discriminate among groups. The initial classification function can be computed from relatively few observations, and additional data may be included as necessary. Furthermore, DFA provides quantitative goodness-of-fit estimates for each observation. Such estimates can be used as mapping parameters or to assess risk in petroleum ventures. Petrophysical data from the Skinner sandstone of Strauss field in southeastern Kansas tested the ability of DFA to discriminate between cross-bedded and ripple-bedded sandstones. Petroleum production in Strauss field is largely restricted to the more permeable cross-bedded sandstones. DFA based on permeability correctly placed 80% of samples into cross-bedded or ripple-bedded groups. Addition of formation factor to the discriminant function increased correct classifications to 83% - a small but statistically significant gain.
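
    A compact sketch of the DFA workflow described above, using synthetic data in place of the Skinner sandstone measurements and scikit-learn's linear discriminant analysis, classifies samples into cross-bedded and ripple-bedded groups from permeability and formation factor.

        # Compact sketch of the DFA approach, with synthetic data standing in
        # for the Skinner sandstone measurements: classify samples into
        # cross-bedded vs ripple-bedded groups from permeability and formation factor.
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        # log-permeability (md) and formation factor for two bedding groups (synthetic)
        cross = np.column_stack([rng.normal(2.0, 0.3, 40), rng.normal(12.0, 2.0, 40)])
        ripple = np.column_stack([rng.normal(1.2, 0.3, 40), rng.normal(18.0, 2.0, 40)])

        X = np.vstack([cross, ripple])
        y = np.array([0] * 40 + [1] * 40)            # 0 = cross-bedded, 1 = ripple-bedded

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("training accuracy:", lda.score(X, y))  # goodness-of-fit style check
        new_sample = np.array([[1.8, 14.0]])          # log-perm, formation factor
        print("predicted group:", lda.predict(new_sample), lda.predict_proba(new_sample))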

  12. Integrated petrophysical and reservoir characterization workflow to enhance permeability and water saturation prediction

    NASA Astrophysics Data System (ADS)

    Al-Amri, Meshal; Mahmoud, Mohamed; Elkatatny, Salaheldin; Al-Yousef, Hasan; Al-Ghamdi, Tariq

    2017-07-01

    Accurate estimation of permeability is essential in reservoir characterization and in determining fluid flow in porous media, which greatly assists in optimizing field production. Permeability prediction techniques such as porosity-permeability transforms and, more recently, artificial intelligence and neural networks are encouraging but still show only a moderate to good match to core data. This may be because they are limited to homogeneous media, while knowledge about geology and heterogeneity is only indirectly incorporated or absent. Geological information from core descriptions, such as lithofacies that include diagenetic information, shows a link to permeability when categorized into rock types exposed to similar depositional environments. The objective of this paper is to develop a robust combined workflow integrating geology, petrophysics, and wireline logs in an extremely heterogeneous carbonate reservoir to accurately predict permeability. Permeability prediction is carried out using a pattern recognition algorithm called multi-resolution graph-based clustering (MRGC). We benchmark the prediction results against hard data from core and well test analysis. As a result, we show how much the permeability prediction improves when geology is integrated into the analysis. Finally, we use the predicted permeability as an input parameter in the J-function and correct for uncertainties in the saturation calculated from wireline logs using the classical Archie equation. Eventually, a high level of confidence in hydrocarbon volume estimation is reached when robust permeability and saturation height functions are estimated in the presence of important geological details that are petrophysically meaningful.
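
    The workflow relies on the classical Archie equation and the Leverett J-function named above; the following is a minimal sketch of those two standard relations, with hypothetical constants (a, m, n, the interfacial-tension term) and input values rather than the study's calibrated parameters.

      # Minimal sketch of the classical Archie and Leverett J-function calculations;
      # constants and inputs are hypothetical examples, not the study's calibration.
      import numpy as np

      def archie_sw(rt, rw, phi, a=1.0, m=2.0, n=2.0):
          """Archie water saturation from true resistivity rt (ohm.m),
          formation water resistivity rw (ohm.m) and porosity phi (fraction)."""
          return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

      def leverett_j(pc_psi, k_md, phi, sigma_cos_theta=26.0):
          """Dimensionless Leverett J-function; pc in psi, k in mD, phi fractional,
          sigma*cos(theta) in dyn/cm (26 is a common lab air-brine value)."""
          return 0.21645 * pc_psi / sigma_cos_theta * np.sqrt(k_md / phi)

      # Example: log-derived inputs for a single depth level.
      sw = archie_sw(rt=20.0, rw=0.03, phi=0.18)
      j = leverett_j(pc_psi=15.0, k_md=50.0, phi=0.18)
      print(f"Sw = {sw:.2f}, J = {j:.2f}")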

  13. Application of conditional simulation of heterogeneous rock properties to seismic scattering and attenuation analysis in gas hydrate reservoirs

    NASA Astrophysics Data System (ADS)

    Huang, Jun-Wei; Bellefleur, Gilles; Milkereit, Bernd

    2012-02-01

    We present a conditional simulation algorithm to parameterize three-dimensional heterogeneities and construct heterogeneous petrophysical reservoir models. The models match the data at borehole locations, simulate heterogeneities at the same resolution as borehole logging data elsewhere in the model space, and simultaneously honor the correlations among multiple rock properties. The model provides a heterogeneous environment in which a variety of geophysical experiments can be simulated. This includes the estimation of petrophysical properties and the study of geophysical response to the heterogeneities. As an example, we model the elastic properties of a gas hydrate accumulation located at Mallik, Northwest Territories, Canada. The modeled properties include compressional and shear-wave velocities that primarily depend on the saturation of hydrate in the pore space of the subsurface lithologies. We introduce the conditional heterogeneous petrophysical models into a finite difference modeling program to study seismic scattering and attenuation due to multi-scale heterogeneity. Similarities between resonance scattering analysis of synthetic and field Vertical Seismic Profile data reveal heterogeneity with a horizontal-scale of approximately 50 m in the shallow part of the gas hydrate interval. A cross-borehole numerical experiment demonstrates that apparent seismic energy loss can occur in a pure elastic medium without any intrinsic attenuation of hydrate-bearing sediments. This apparent attenuation is largely attributed to attenuative leaky mode propagation of seismic waves through large-scale gas hydrate occurrence as well as scattering from patchy distribution of gas hydrate.

  14. Nanoscale Pore Imaging and Pore Scale Fluid Flow Modeling in Chalk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomutsa, Liviu; Silin, Dmitriy

    2004-08-19

    For many rocks of high economic interest such as chalk, diatomite, tight gas sands or coal, nanometer scale resolution is needed to resolve the 3D pore structure, which controls the flow and trapping of fluids in the rocks. Such resolutions cannot be achieved with existing tomographic technologies. A new 3D imaging method, based on serial sectioning and using Focused Ion Beam (FIB) technology, has been developed. FIB allows for the milling of layers as thin as 10 nanometers by using accelerated Ga+ ions to sputter atoms from the sample surface. After each milling step, as a new surface is exposed, a 2D image of this surface is generated. Next, the 2D images are stacked to reconstruct the 3D pore or grain structure. Resolutions as high as 10 nm are achievable using such a technique. A new robust method of pore-scale fluid flow modeling has been developed and applied to sandstone and chalk samples. The method uses direct morphological analysis of the pore space to characterize the petrophysical properties of diverse formations. Not only can petrophysical properties (porosity, permeability, relative permeability and capillary pressures) be computed, but flow processes, such as those encountered in various IOR approaches, can also be simulated. Petrophysical properties computed with the new method using the new FIB data will be presented. The present study is part of the development of an Electronic Core Laboratory at LBNL/UCB.

  15. Integrated geomechanical, petrographical and petrophysical study of the sandstones of the Wajid Group, SW Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Benaafi, Mohammed; Hariri, Mustafa; Al-Shaibani, Abdulaziz; Abdullatif, Osman; Makkawi, Mohammed

    2018-07-01

    The Cambro-Permian siliciclastic succession in southwestern Saudi Arabia is represented by the Wajid Group, which consists mainly of fluvial, shallow marine, aeolian, and glacial sandstones. The Wajid Group comprises the Dibsiyah, Sanamah, Qalibah, Khusayyayn, and Juwayl Formations. It is exposed in the Wadi Al-Dawasir area and extends to Najran City. The sandstones of the Wajid Group serve as groundwater aquifers in the Wadi Al-Dawasir and Najran areas and host hydrocarbon (mainly gas) reservoirs in the Rub' Al-Khali Basin. This study aims to characterize the geomechanical properties (rock strength and Young's modulus) of the sandstones of the Wajid Group using field and experimental techniques. A further objective is to investigate the relationships between the geomechanical properties and the petrographical and petrophysical properties of the studied sandstones. The geomechanical properties differ between glacial and non-glacial sandstones; the glacial sandstones display high values of the geomechanical properties, with high variability indices. Four geological factors, including grain size, cement content, porosity, and permeability, were observed to be the main controls on the geomechanical behaviour of the studied sandstones, except for the Khusayyayn sandstone, where mineral composition was also important. Significant correlations were observed between the petrographical and petrophysical properties and the geomechanical properties of the glacial sandstones. Predictive models of the geomechanical properties (RN, UCS, and E) were generated for the glacial sandstones using regression analysis.

  16. Laboratory measurements of the seismic velocities and other petrophysical properties of the Outokumpu deep drill core samples, eastern Finland

    NASA Astrophysics Data System (ADS)

    Elbra, Tiiu; Karlqvist, Ronnie; Lassila, Ilkka; Høgström, Edward; Pesonen, Lauri J.

    2011-01-01

    Petrophysical, in particular seismic velocity, measurements of the Outokumpu deep drill core (depth 2.5 km) have been carried out to characterize the geophysical nature of the Paleoproterozoic crustal section of eastern Finland and to find lithological and geophysical interpretations for the distinct crustal reflectors observed in seismic surveys. The results show that different lithological units can be identified based on the petrophysical data. The density of the samples remained nearly constant throughout the drilled section; only diopside-tremolite skarns and black schists exhibit higher densities. The samples are dominated by paramagnetic behaviour, with an occasional ferromagnetic signature caused by serpentinitic rocks. Large variations in seismic velocities, both at ambient pressure and under in situ crustal conditions, are observed. The porosity of the samples, which is extremely low, is either intrinsic by nature or caused by decompaction related to fracturing during core retrieval. It is noteworthy that these microfractures have dramatically lowered the VP and VS values. From the measured velocities and density data we have calculated the seismic impedances, Young's modulus and Poisson's ratios for the lithological units of the Outokumpu section, and from these data the reflection coefficients for the major lithological boundaries, evident in the surveyed section, were determined. The data show that the strong and distinct reflections visible in wide-angle seismic surveys are caused by interfaces between diopside-tremolite skarn and either serpentinites, mica schist or black schist.
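
    The reflection coefficients discussed above follow directly from the measured velocities and densities; the sketch below shows the standard normal-incidence calculation, with hypothetical values loosely inspired by the lithologies named in the abstract rather than the measured Outokumpu data.

      # Minimal sketch of acoustic impedance and the normal-incidence reflection
      # coefficient used to relate velocities and densities to reflectors.
      def acoustic_impedance(vp_m_s, rho_kg_m3):
          """Acoustic impedance Z = rho * Vp (kg m^-2 s^-1)."""
          return rho_kg_m3 * vp_m_s

      def reflection_coefficient(z_upper, z_lower):
          """Normal-incidence reflection coefficient at an interface."""
          return (z_lower - z_upper) / (z_lower + z_upper)

      # Hypothetical mica schist over diopside-tremolite skarn interface.
      z_schist = acoustic_impedance(5800.0, 2750.0)
      z_skarn = acoustic_impedance(7000.0, 3100.0)
      print(f"R = {reflection_coefficient(z_schist, z_skarn):.3f}")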

  17. Rapid development of Proteomic applications with the AIBench framework.

    PubMed

    López-Fernández, Hugo; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Méndez Reboredo, José R; Santos, Hugo M; Carreira, Ricardo J; Capelo-Martínez, José L; Fdez-Riverola, Florentino

    2011-09-15

    In this paper we present two case studies of proteomics application development using the AIBench framework, a Java desktop application framework mainly focused on scientific software development. The applications presented in this work are Decision Peptide-Driven, for rapid and accurate protein quantification, and Bacterial Identification, for Tuberculosis biomarker search and diagnosis. Both tools work with mass spectrometry data, specifically with MALDI-TOF spectra, minimizing the time required to process and analyze the experimental data. Copyright 2011 The Author(s). Published by Journal of Integrative Bioinformatics.

  18. Identification and determination of trapping parameters as key site parameters for CO2 storage for the active CO2 storage site in Ketzin (Germany) - Comparison of different experimental approaches and analysis of field data

    NASA Astrophysics Data System (ADS)

    Zemke, Kornelia; Liebscher, Axel

    2015-04-01

    Petrophysical properties like porosity and permeability are key parameters for a safe long-term storage of CO2 but also for the injection operation itself. The accurate quantification of residual trapping is difficult, but very important for both storage containment security and storage capacity; it is also an important parameter for dynamic simulation. The German CO2 pilot storage in Ketzin is a Triassic saline aquifer with initial conditions of the target sandstone horizon of 33.5 °C/6.1 MPa at 630 m. One injection and two observation wells were drilled in 2007, and nearly 200 m of core material was recovered for site characterization. From June 2008 to September 2013, slightly more than 67 kt of food-grade CO2 was injected and continuously monitored. A fourth observation well was drilled in summer 2012, after 61 kt of CO2 had been injected, at only 25 m distance from the injection well, and new core material was recovered that allows studying CO2-induced changes in petrophysical properties. The only minor differences observed between pre-injection and post-injection petrophysical parameters of the heterogeneous formation have no severe consequences for reservoir and cap rock integrity or for the injection behavior. Residual brine saturation for the Ketzin reservoir core material was estimated by different methods. Brine-CO2 flooding experiments for two reservoir samples resulted in 36% and 55% residual brine saturation (Kiessling, 2011). Centrifuge capillary pressure measurements (pc = 0.22 MPa) yielded the smallest residual brine saturation values, with ~20% for the lower part of the reservoir sandstone and ~28% for the upper part (Fleury, 2010). The method by Cerepi (2002), which calculates the residual mercury saturation after pressure release on the imbibition path as trapped porosity and the retracted mercury volume as free porosity, yielded unrealistically low free porosity values of only a few percent, because over 80% of the penetrated mercury remained in the samples after pressure release to atmospheric pressure. The results from the centrifuge capillary pressure measurements were then used for calibrating the cutoff time of NMR T2 relaxation (average value 8 ms) to differentiate between the mobile and immobile water fractions (the standard for clean sandstone is 33 ms). Following Norden (2010), a cutoff time of 10 ms was applied to estimate the residual saturation as Bound Fluid Volume for the Ketzin core materials and to estimate NMR permeability after Timur-Coates. This adapted cutoff value is also consistent with results from RST logging after injection. The maximum measured CO2 saturation corresponds to the effective porosity for the uppermost CO2-filled sandstone horizon. The directly measured values and the residual brine saturations estimated from NMR measurements with the adapted cutoff time of 10 ms are within the expected range compared to the literature data, with a mean residual brine saturation of 53%. A. Cerepi et al., 2002, Journal of Petroleum Science and Engineering, 35. M. Fleury et al., 2011, SCA2010-06. D. Kiessling et al., 2010, International Journal of Greenhouse Gas Control, 4. B. Norden et al., 2010, SPE Reservoir Evaluation & Engineering, 13.
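
    For readers unfamiliar with the cutoff-based NMR workflow described above, the following is a minimal sketch of partitioning a T2 distribution at a cutoff time into bound and free fluid and applying the Timur-Coates permeability relation; the synthetic T2 distribution, porosity and coefficient C are illustrative assumptions, not Ketzin data.

      # Minimal sketch: partition an NMR T2 distribution at a cutoff time and
      # estimate permeability with the Timur-Coates model; all inputs are synthetic.
      import numpy as np

      t2_ms = np.logspace(-1, 4, 51)                               # T2 bins, 0.1 ms .. 10 s
      amp = np.exp(-0.5 * ((np.log10(t2_ms) - 1.3) / 0.5) ** 2)    # synthetic distribution
      amp /= amp.sum()

      phi_total = 0.22          # total NMR porosity (fraction), hypothetical
      cutoff_ms = 10.0          # adapted cutoff discussed in the text

      bvi = phi_total * amp[t2_ms < cutoff_ms].sum()   # bound fluid volume
      ffv = phi_total - bvi                            # free fluid volume

      def timur_coates_k(phi, ffv, bvi, c=10.0):
          """Permeability (mD) from the Timur-Coates relation; phi as a fraction
          (converted to porosity units inside), c is an empirical coefficient."""
          return ((phi * 100.0 / c) ** 2 * (ffv / bvi)) ** 2

      print(f"BVI={bvi:.3f}, FFV={ffv:.3f}, k_TC={timur_coates_k(phi_total, ffv, bvi):.1f} mD")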

  19. A framework for quantification of groundwater dynamics - redundancy and transferability of hydro(geo-)logical metrics

    NASA Astrophysics Data System (ADS)

    Heudorfer, Benedikt; Haaf, Ezra; Barthel, Roland; Stahl, Kerstin

    2017-04-01

    A new framework for quantification of groundwater dynamics has been proposed in a companion study (Haaf et al., 2017). In this framework, a number of conceptual aspects of dynamics, such as seasonality, regularity, flashiness or inter-annual forcing, are described and then linked to quantitative metrics. A large number of possible metrics are readily available from the literature, such as Pardé Coefficients, Colwell's Predictability Indices or the Base Flow Index. In the present work, we focus on finding multicollinearity and, in consequence, redundancy among the metrics representing different patterns of dynamics found in groundwater hydrographs. This also serves to verify the categories of dynamics aspects suggested by Haaf et al. (2017). To determine the optimal set of metrics, we need to balance the desired minimum number of metrics against the desired maximum descriptive power of the metrics. To do this, a substantial number of candidate metrics are applied to a diverse set of groundwater hydrographs from France, Germany and Austria within the northern alpine and peri-alpine region. By applying Principal Component Analysis (PCA) to the correlation matrix of the metrics, we determine a limited number of relevant metrics that describe the majority of variation in the dataset. The resulting reduced set of metrics comprises an optimized set that can be used to describe the aspects of dynamics identified within the groundwater dynamics framework. For some aspects of dynamics a single significant metric could be attributed. Other aspects have a fuzzier quality that can only be described by an ensemble of metrics and are re-evaluated. The PCA is furthermore applied to groups of groundwater hydrographs containing regimes of similar behaviour in order to explore transferability when applying the metric-based characterization framework to groups of hydrographs from diverse groundwater systems. In conclusion, we identify an optimal number of metrics, readily available for use in studies on groundwater dynamics, intended to help overcome analytical limitations that exist due to the complexity of groundwater dynamics. Haaf, E., Heudorfer, B., Stahl, K., Barthel, R., 2017. A framework for quantification of groundwater dynamics - concepts and hydro(geo-)logical metrics. EGU General Assembly 2017, Vienna, Austria.
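
    As an illustration of the redundancy screening described above, the sketch below computes a few simple metrics for a set of synthetic hydrographs and applies PCA to their correlation matrix; the metrics and data are placeholders, not those of the study.

      # Minimal sketch: compute hydrograph metrics, then PCA on their correlation
      # matrix to gauge redundancy; hydrographs and metrics are synthetic stand-ins.
      import numpy as np

      rng = np.random.default_rng(0)
      n_wells, n_weeks = 40, 520
      hydrographs = np.cumsum(rng.normal(size=(n_wells, n_weeks)), axis=1)

      # A few simple (hypothetical) metrics per hydrograph.
      metrics = np.column_stack([
          hydrographs.std(axis=1),                               # variability
          hydrographs.max(axis=1) - hydrographs.min(axis=1),     # range
          np.abs(np.diff(hydrographs, axis=1)).mean(axis=1),     # flashiness proxy
          hydrographs.mean(axis=1),                              # mean level
      ])

      # PCA via eigendecomposition of the metric correlation matrix.
      corr = np.corrcoef(metrics, rowvar=False)
      eigvals, eigvecs = np.linalg.eigh(corr)
      order = np.argsort(eigvals)[::-1]
      explained = eigvals[order] / eigvals.sum()
      print("explained variance ratio:", np.round(explained, 2))
      print("loadings of first component:", np.round(eigvecs[:, order[0]], 2))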

  20. Exploration geophysics calculator programs for use on Hewlett-Packard models 67 and 97 programmable calculators

    USGS Publications Warehouse

    Campbell, David L.; Watts, Raymond D.

    1978-01-01

    Program listing, instructions, and example problems are given for 12 programs for the interpretation of geophysical data, for use on Hewlett-Packard models 67 and 97 programmable hand-held calculators. These are (1) gravity anomaly over 2D prism with ≤ 9 vertices--Talwani method; (2) magnetic anomaly (ΔT, ΔV, or ΔH) over 2D prism with ≤ 8 vertices--Talwani method; (3) total-field magnetic anomaly profile over thick sheet/thin dike; (4) single dipping seismic refractor--interpretation and design; (5) ≤ 4 dipping seismic refractors--interpretation; (6) ≤ 4 dipping seismic refractors--design; (7) vertical electrical sounding over ≤ 10 horizontal layers--Schlumberger or Wenner forward calculation; (8) vertical electric sounding: Dar Zarrouk calculations; (9) magnetotelluric planewave apparent conductivity and phase angle over ≤ 9 horizontal layers--forward calculation; (10) petrophysics: a.c. electrical parameters; (11) petrophysics: elastic constants; (12) digital convolution with ≤ 10-length filter.

  1. Micro- and macro-scale petrophysical characterization of potential reservoir units from the Northern Israel

    NASA Astrophysics Data System (ADS)

    Haruzi, Peleg; Halisch, Matthias; Katsman, Regina; Waldmann, Nicolas

    2016-04-01

    Lower Cretaceous sandstone serves as a hydrocarbon reservoir in several places around the world, and potentially in the Hatira Formation in the Golan Heights, northern Israel. The purpose of the current research is to characterize the petrophysical properties of these sandstone units. The study is carried out by two alternative methods: conventional macroscopic lab measurements, and CT scanning, image processing and subsequent fluid mechanics simulations at the microscale, followed by upscaling to the conventional macroscopic rock parameters (porosity and permeability). A comparison between the upscaled properties and those measured in the lab will be conducted. The best way to upscale the microscopic rock characteristics will be analyzed based on the models suggested in the literature. Proper characterization of the potential reservoir will provide the analytical parameters necessary for future experiments and modeling of macroscopic fluid flow behavior in the Lower Cretaceous sandstone.

  2. Facies Distribution and Petrophysical Properties of Shoreface-Offshore Transition Environment in Sandakan Formation, NE Sabah Basin

    NASA Astrophysics Data System (ADS)

    Majid, M. Firdaus A.; Suhaili Ismail, M.; Rahman, A. Hadi A.; Azfar Mohamed, M.

    2017-10-01

    A newly exposed outcrop of Miocene shallow marine sandstone in the Sandakan Formation allows characterization of the facies distribution and petrophysical properties of a shoreface to offshore transition environment. Six facies are defined: (1) poorly bioturbated hummocky cross-stratified (HCS) sandstone (F1), (2) moderately bioturbated HCS sandstone (F2), (3) well bioturbated HCS sandstone (F3), (4) poorly bioturbated swaley cross-stratified (SCS) sandstone (F4), (5) interbedded HCS sandstone with sand-silt mudstone (F5), and (6) heterolithic mudstone (F6). The sedimentary successions were deposited in upper to lower shoreface and offshore transition environments. Facies F3, F4 and F5 show good reservoir quality, with good porosity values of 20% to 21% and fair permeability values of 14 mD to 33 mD, respectively, while Facies F1 exhibits poor reservoir quality, with a low permeability value of 3.13 mD.

  3. Compiling a national resistivity atlas of Denmark based on airborne and ground-based transient electromagnetic data

    NASA Astrophysics Data System (ADS)

    Barfod, Adrian A. S.; Møller, Ingelise; Christiansen, Anders V.

    2016-11-01

    We present a large-scale study of the petrophysical relationship of resistivities obtained from densely sampled ground-based and airborne transient electromagnetic surveys and lithological information from boreholes. The overriding aim of this study is to develop a framework for examining the resistivity-lithology relationship in a statistical manner and apply this framework to gain a better description of the large-scale resistivity structures of the subsurface. In Denmark very large and extensive datasets are available through the national geophysical and borehole databases, GERDA and JUPITER respectively. In a 10 by 10 km grid, these data are compiled into histograms of resistivity versus lithology. To do this, the geophysical data are interpolated to the position of the boreholes, which allows for a lithological categorization of the interpolated resistivity values, yielding different histograms for a set of desired lithological categories. By applying the proposed algorithm to all available boreholes and airborne and ground-based transient electromagnetic data we build nation-wide maps of the resistivity-lithology relationships in Denmark. The presented Resistivity Atlas reveals varying patterns in the large-scale resistivity-lithology relations, reflecting geological details such as available source material for tills. The resistivity maps also reveal a clear ambiguity in the resistivity values for different lithologies. The Resistivity Atlas is highly useful when geophysical data are to be used for geological or hydrological modeling.
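
    The gridding and histogram step described above can be sketched as follows; the column names and values are hypothetical and do not reflect the GERDA or JUPITER database schemas.

      # Minimal sketch: group borehole-interpolated resistivities by lithology and
      # by 10 x 10 km grid cell, and build a histogram per (cell, lithology) pair.
      import numpy as np
      import pandas as pd

      data = pd.DataFrame({
          "x_m": [512_300, 514_800, 523_100, 531_900],
          "y_m": [6_210_400, 6_212_100, 6_218_700, 6_224_500],
          "lithology": ["clay till", "sand", "sand", "clay till"],
          "resistivity_ohm_m": [28.0, 95.0, 120.0, 35.0],
      })

      cell = 10_000  # 10 km grid cells
      data["cell_id"] = (data["x_m"] // cell).astype(int).astype(str) + "_" + \
                        (data["y_m"] // cell).astype(int).astype(str)

      bins = np.logspace(0, 3, 31)  # 1 to 1000 ohm.m, log-spaced
      for (cell_id, lith), grp in data.groupby(["cell_id", "lithology"]):
          hist, _ = np.histogram(grp["resistivity_ohm_m"], bins=bins)
          print(cell_id, lith, hist.sum(), "values")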

  4. Quantification of substrate and cellular strains in stretchable 3D cell cultures: an experimental and computational framework.

    PubMed

    González-Avalos, P; Mürnseer, M; Deeg, J; Bachmann, A; Spatz, J; Dooley, S; Eils, R; Gladilin, E

    2017-05-01

    The mechanical cell environment is a key regulator of biological processes. In living tissues, cells are embedded into the 3D extracellular matrix and permanently exposed to mechanical forces. Quantification of the cellular strain state in a 3D matrix is therefore the first step towards understanding how physical cues determine single cell and multicellular behaviour. The majority of cell assays are, however, based on 2D cell cultures that lack many essential features of the in vivo cellular environment. Furthermore, nondestructive measurement of substrate and cellular mechanics requires appropriate computational tools for microscopic image analysis and interpretation. Here, we present an experimental and computational framework for generation and quantification of the cellular strain state in 3D cell cultures using a combination of a 3D substrate stretcher, multichannel microscopic imaging and computational image analysis. The 3D substrate stretcher enables deformation of living cells embedded in bead-labelled 3D collagen hydrogels. Local substrate and cell deformations are determined by tracking displacement of fluorescent beads with subsequent finite element interpolation of cell strains over a tetrahedral tessellation. In this feasibility study, we discuss diverse aspects of deformable 3D culture construction, quantification and evaluation, and present an example of its application for quantitative analysis of a cellular model system based on primary mouse hepatocytes undergoing transforming growth factor (TGF-β)-induced epithelial-to-mesenchymal transition. © 2017 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of the Royal Microscopical Society.

  5. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    NASA Astrophysics Data System (ADS)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micrometer to tens-of-nanometers resolution and is quickly turning into a new technology for studying petrophysical properties of materials. An important step is the upscaling of these properties, since micron or sub-micron resolution imaging can only be performed on samples of millimeter scale or smaller. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at the micro- to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples as well as biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big data problem of analyzing petrophysical properties and its subsequent upscaling. We discuss the following challenges: 1) Characterization of microtomography for extremely large data sets - our current capability. 2) Computational fluid dynamics simulations at pore-scale for permeability estimation - methods, computing cost and accuracy. 3) Solid mechanical computations at pore-scale for estimating elasto-plastic properties - computational stability, cost, and efficiency. 4) Extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from the current research problem into a robust computational big data tool for multi-scale scientific and engineering problems.

  6. Geological characterization and statistical comparison of outcrop and subsurface facies: Shannon Shelf sand ridges

    NASA Astrophysics Data System (ADS)

    Jackson, S.; Szpaklewicz, M.; Tomutsa, L.

    1987-09-01

    The primary objective of this research is to develop a methodology for constructing accurate quantitative models of reservoir heterogeneities. The resulting models are expected to improve predictions of flow patterns, spatial distribution of residual oil after secondary and tertiary recovery operations, and ultimate oil recovery. The purpose of this study is to provide preliminary evaluation of the usefulness of outcrop information in characterizing analogous reservoirs and to develop research techniques necessary for model development. The Shannon Sandstone, a shelf sand ridge deposit in the Powder River Basin, Wyoming, was studied. Sedimentologic and petrophysical features of an outcrop exposure of the High-Energy Ridge-Margin facies (HERM) within the Shannon were compared with those from a Shannon sandstone reservoir in Teapot Dome field. Comparisons of outcrop and subsurface permeability and porosity histograms, cumulative distribution functions, correlation lengths and natural logarithm of permeability versus porosity plots indicate a strong similarity between Shannon outcrop and Teapot Dome HERM facies petrophysical properties. Permeability classes found in outcrop samples can be related to crossbedded zones and shaley, rippled, and bioturbated zones. Similar permeability classes related to similar sedimentologic features were found in Teapot Dome field. The similarities of outcrop and Teapot Dome petrophysical properties, which are from the same geologic facies but from different depositional episodes, suggest that rocks deposited under similar depositional processes within a given deposystem have similar reservoir properties. The results of the study indicate that the use of quantitative outcrop information in characterizing reservoirs may provide a significant improvement in reservoir characterization.

  7. Calculation of thermal conductivity, thermal diffusivity and specific heat capacity of sedimentary rocks using petrophysical well logs

    NASA Astrophysics Data System (ADS)

    Fuchs, Sven; Balling, Niels; Förster, Andrea

    2015-12-01

    In this study, equations are developed that predict thermal properties (thermal conductivity, specific heat capacity and thermal diffusivity) for synthetic sedimentary rocks (clastics, carbonates and evaporites). The rock groups are composed of mineral assemblages with variable contents of 15 major rock-forming minerals and porosities of 0-30 per cent. Petrophysical properties and their well-logging-tool-characteristic readings were assigned to these rock-forming minerals and to pore-filling fluids. Relationships are explored between each thermal property and other petrophysical properties (density, sonic interval transit time, hydrogen index, volume fraction of shale and photoelectric absorption index) using multivariate statistics. The application of these relations allows computing continuous borehole profiles for each rock thermal property. The uncertainties in the prediction of each property vary depending on the selected well-log combination. The best prediction is in the range of 2-8 per cent for the specific heat capacity, 5-10 per cent for the thermal conductivity, and 8-15 per cent for the thermal diffusivity. Well-log-derived thermal conductivity is validated by laboratory data measured on cores from deep boreholes of the Danish Basin, the North German Basin, and the Molasse Basin. Additional validation of thermal conductivity was performed by comparing predicted and measured temperature logs. The maximum deviation between these logs is <3 °C. The thermal-conductivity calculation allowed an evaluation of the depth range in which the palaeoclimatic effect on the subsurface temperature field can be observed in the North German Basin. This effect reduces the surface heat-flow density by 25 mW m-2.
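
    The multivariate-statistics step described above amounts to regressing each thermal property on selected log readings; the following is a minimal ordinary-least-squares sketch on synthetic data, whose coefficients are illustrative only and are not those derived in the study.

      # Minimal sketch of a multivariate well-log regression for thermal conductivity.
      # The "true" relation and noise are synthetic assumptions for illustration.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 200
      rho = rng.uniform(2.0, 2.9, n)          # bulk density, g/cm3
      dt = rng.uniform(55.0, 140.0, n)        # sonic interval transit time, us/ft
      vsh = rng.uniform(0.0, 0.8, n)          # shale volume fraction
      lam = 0.9 * rho - 0.012 * dt - 0.8 * vsh + 2.0 + rng.normal(0, 0.1, n)  # W/(m K)

      # Ordinary least squares fit: lam ~ b0 + b1*rho + b2*dt + b3*vsh
      A = np.column_stack([np.ones(n), rho, dt, vsh])
      coef, *_ = np.linalg.lstsq(A, lam, rcond=None)
      print("fitted coefficients:", np.round(coef, 3))

      # Apply the relation to obtain a continuous predicted profile (first 3 levels).
      print("lambda_pred:", np.round(A[:3] @ coef, 2))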

  8. Studying Petrophysical and Geomechanical Properties of Utica Point-Pleasant Shale and its Variations Across the Northern Appalachian Basin

    NASA Astrophysics Data System (ADS)

    Raziperchikolaee, S.; Kelley, M. E.; Burchwell, A.

    2017-12-01

    Understanding petrophysical and geomechanical parameters of shale formations and their variations across the basin are necessary to optimize the design of a hydraulic fracturing program aimed at enhancing long term oil/gas production from unconventional wells. Dipole sonic logging data (compressional-wave and shear-wave slowness) from multiple wells across the study area, coupled with formation bulk density log data, were used to calculate dynamic elastic parameters, including shear modulus, bulk modulus, Poisson's ratio, and Young's modulus for the shale formations. The individual-well data were aggregated into a single histogram for each parameter to gain an understanding of the variation in the properties (including brittleness) of the Utica Point-Pleasant formations across the entire study area. A crossplot of the compressional velocity and bulk density and a crossplot between the compressional velocity, the shear velocity, and depth of the measurement were used for a high level petrophysical characterization of the Utica Point-Pleasant. Detailed interpretation of drilling induced fractures recorded in image logs, and an analysis of shear wave anisotropy using multi-receiver sonic logs were also performed. Orientation of drilling induced fractures was measured to determine the maximum horizontal stress azimuth. Also, an analysis of shear wave anisotropy to predict stress anisotropy around the wellbore was performed to determine the direction of maximum horizontal stress. Our study shows how the detailed interpretation of borehole breakouts, drilling induced fractures, and sonic wave data can be used to reduce uncertainty and produce a better hydraulic fracturing design in the Utica Point Pleasant formations across the northern Appalachian Basin region of Ohio.
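
    The dynamic elastic parameters mentioned above follow from standard relations between sonic slowness, bulk density and the elastic moduli; the sketch below shows that calculation for a single hypothetical depth level (unit conversion assumes slowness in microseconds per foot).

      # Minimal sketch of dynamic elastic moduli from dipole sonic slowness and
      # bulk density; the input values are hypothetical, not Utica Point-Pleasant data.
      import numpy as np

      def dynamic_moduli(dtc_us_ft, dts_us_ft, rhob_g_cc):
          """Return shear modulus, bulk modulus, Poisson's ratio, Young's modulus (GPa)."""
          vp = 304800.0 / dtc_us_ft   # m/s (1 ft = 0.3048 m; slowness in us/ft)
          vs = 304800.0 / dts_us_ft
          rho = rhob_g_cc * 1000.0    # kg/m3
          g = rho * vs**2
          k = rho * (vp**2 - (4.0 / 3.0) * vs**2)
          nu = (vp**2 - 2.0 * vs**2) / (2.0 * (vp**2 - vs**2))
          e = 2.0 * g * (1.0 + nu)
          return g / 1e9, k / 1e9, nu, e / 1e9

      g, k, nu, e = dynamic_moduli(dtc_us_ft=65.0, dts_us_ft=110.0, rhob_g_cc=2.65)
      print(f"G={g:.1f} GPa, K={k:.1f} GPa, nu={nu:.2f}, E={e:.1f} GPa")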

  9. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    PubMed

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25-45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  10. Spatial delineation, fluid-lithology characterization, and petrophysical modeling of deepwater Gulf of Mexico reservoirs though joint AVA deterministic and stochastic inversion of three-dimensional partially-stacked seismic amplitude data and well logs

    NASA Astrophysics Data System (ADS)

    Contreras, Arturo Javier

    This dissertation describes a novel Amplitude-versus-Angle (AVA) inversion methodology to quantitatively integrate pre-stack seismic data, well logs, geologic data, and geostatistical information. Deterministic and stochastic inversion algorithms are used to characterize flow units of deepwater reservoirs located in the central Gulf of Mexico. A detailed fluid/lithology sensitivity analysis was conducted to assess the nature of AVA effects in the study area. Standard AVA analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generate typical Class III AVA responses. Layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution, indicating that presence of light saturating fluids clearly affects the elastic response of sands. Accordingly, AVA deterministic and stochastic inversions, which combine the advantages of AVA analysis with those of inversion, have provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties and fluid-sensitive modulus attributes (P-Impedance, S-Impedance, density, and LambdaRho, in the case of deterministic inversion; and P-velocity, S-velocity, density, and lithotype (sand-shale) distributions, in the case of stochastic inversion). The quantitative use of rock/fluid information through AVA seismic data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, provides accurate 3D models of petrophysical properties such as porosity, permeability, and water saturation. Pre-stack stochastic inversion provides more realistic and higher-resolution results than those obtained from analogous deterministic techniques. Furthermore, 3D petrophysical models can be more accurately co-simulated from AVA stochastic inversion results. By combining AVA sensitivity analysis techniques with pre-stack stochastic inversion, geologic data, and awareness of inversion pitfalls, it is possible to substantially reduce the risk in exploration and development of conventional and non-conventional reservoirs. From the final integration of deterministic and stochastic inversion results with depositional models and analogous examples, the M-series reservoirs have been interpreted as stacked terminal turbidite lobes within an overall fan complex (the Miocene MCAVLU Submarine Fan System); this interpretation is consistent with previous core data interpretations and regional stratigraphic/depositional studies.
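
    The Biot-Gassmann fluid-substitution step mentioned above can be sketched as follows; the dry-rock, mineral and fluid moduli are generic example values, not those calibrated for the M-series reservoirs.

      # Minimal sketch of Gassmann fluid substitution to test the sensitivity of
      # P-wave velocity and density to pore fluid; all moduli are hypothetical.
      import numpy as np

      def gassmann_ksat(k_dry, k_min, k_fl, phi):
          """Saturated bulk modulus from Gassmann's equation (all moduli in GPa)."""
          return k_dry + (1.0 - k_dry / k_min) ** 2 / (
              phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
          )

      def vp_from_moduli(k_sat, mu, rho_g_cc):
          """P-wave velocity (m/s) from bulk and shear moduli (GPa) and density (g/cc)."""
          return np.sqrt((k_sat + 4.0 * mu / 3.0) * 1e9 / (rho_g_cc * 1000.0))

      phi, k_min, k_dry, mu = 0.25, 36.0, 12.0, 9.0
      rho_matrix, rho_brine, rho_gas = 2.65, 1.03, 0.25
      k_brine, k_gas = 2.8, 0.05

      for name, k_fl, rho_fl in [("brine", k_brine, rho_brine), ("gas", k_gas, rho_gas)]:
          rho_bulk = (1 - phi) * rho_matrix + phi * rho_fl
          k_sat = gassmann_ksat(k_dry, k_min, k_fl, phi)
          print(f"{name}: Vp = {vp_from_moduli(k_sat, mu, rho_bulk):.0f} m/s")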

  11. Assessing the utility of FIB-SEM images for shale digital rock physics

    NASA Astrophysics Data System (ADS)

    Kelly, Shaina; El-Sobky, Hesham; Torres-Verdín, Carlos; Balhoff, Matthew T.

    2016-09-01

    Shales and other unconventional or low permeability (tight) reservoirs house vast quantities of hydrocarbons, often demonstrate considerable water uptake, and are potential repositories for fluid sequestration. The pore-scale topology and fluid transport mechanisms within these nanoporous sedimentary rocks remain to be fully understood. Image-informed pore-scale models are useful tools for studying porous media: a debated question in shale pore-scale petrophysics is whether there is a representative elementary volume (REV) for shale models? Furthermore, if an REV exists, how does it differ among petrophysical properties? We obtain three dimensional (3D) models of the topology of microscale shale volumes from image analysis of focused ion beam-scanning electron microscope (FIB-SEM) image stacks and investigate the utility of these models as a potential REV for shale. The scope of data used in this work includes multiple local groups of neighboring FIB-SEM images of different microscale sizes, corresponding core-scale (milli- and centimeters) laboratory data, and, for comparison, series of two-dimensional (2D) cross sections from broad ion beam SEM images (BIB-SEM), which capture a larger microscale field of view than the FIB-SEM images; this array of data is larger than the majority of investigations with FIB-SEM-derived microscale models of shale. Properties such as porosity, organic matter content, and pore connectivity are extracted from each model. Assessments of permeability with single phase, pressure-driven flow simulations are performed in the connected pore space of the models using the lattice-Boltzmann method. Calculated petrophysical properties are compared to those of neighboring FIB-SEM images and to core-scale measurements of the sample associated with the FIB-SEM sites. Results indicate that FIB-SEM images below ∼5000 μm3 volume (the largest volume analyzed) are not a suitable REV for shale permeability and pore-scale networks; i.e. field of view is compromised at the expense of detailed, but often unconnected, nanopore morphology. Further, we find that it is necessary to acquire several local FIB-SEM or BIB-SEM images and correlate their extracted geometric properties to improve the likelihood of achieving representative values of porosity and organic matter volume. Our work indicates that FIB-SEM images of microscale volumes of shale are a qualitative tool for petrophysical and transport analysis. Finally, we offer alternatives for quantitative pore-scale assessments of shale.
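
    In the spirit of the REV assessment above, the sketch below extracts total porosity and z-direction connected porosity from a segmented (binary) pore volume; the random synthetic array stands in for an actual FIB-SEM image stack.

      # Minimal sketch: porosity and connected porosity from a binary pore volume.
      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(42)
      pores = rng.random((100, 100, 100)) < 0.12   # True = pore voxel (hypothetical)

      porosity = pores.mean()

      # Label pore clusters (6-connectivity) and keep those spanning the z-faces.
      labels, n_clusters = ndimage.label(pores)
      spanning = np.intersect1d(np.unique(labels[0, :, :]), np.unique(labels[-1, :, :]))
      spanning = spanning[spanning != 0]
      connected_porosity = np.isin(labels, spanning).mean()

      print(f"total porosity = {porosity:.3f}, "
            f"z-connected porosity = {connected_porosity:.3f} ({n_clusters} clusters)")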

  12. Detailed petrophysical characterization enhances geological mapping of a buried substratum using aeromagnetic and gravity data; application to the southwestern Paris basin

    NASA Astrophysics Data System (ADS)

    Baptiste, Julien; Martelet, Guillaume; Faure, Michel; Beccaletto, Laurent; Chen, Yan; Reninger, Pierre-Alexandre

    2016-04-01

    Mapping the geometries (structure and lithology) of a buried basement is a key for targeting resources and for improving the regional geological knowledge. The Paris basin is a Mesozoic to Cenozoic intraplate basin set up on a Variscan substratum, which crops out in the surrounding massifs. We focus our study on the southwestern part of the Paris basin at its junction with the Aquitaine basin. This Mezo-Cenozoic cover separates the Armorican Massif and the Massif Central which compose of several litho-tectonic units bounded by crustal-scale shear zones. In spite of several lithological and structural correlations between various domains of the two massifs, their geological connection, hidden below the Paris basin sedimentary cover, is still largely debated. Potential field geophysics have proven effective for mapping buried basin/basement interfaces. In order to enhance the cartographic interpretation of these data, we have set up a detailed petrophysical library (field magnetic susceptibility data and density measurements on rock samples) of the Paleozoic rocks outcropping in the Variscan massifs. The combination of aeromagnetic and gravity data supported by the petrophysical signatures and field/borehole geological information, is carried out to propose a new map of the architecture of the Variscan substratum. The new synthetic map of geophysical signature of the Paris basin basement combines: i) the magnetic anomaly reduced to the pole, ii) the vertical gradient of the Bouguer anomaly and iii) the tilt derivative of the magnetic anomaly reduced to the pole. Based on this information, the Eastern extension of the major shear zones below the sedimentary cover is assessed. The petrophysical signatures were classified in three classes of magnetic susceptibility and density: low, intermediate and high. Basic rocks have high magnetization and density values whereas granite, migmatite and orthogneiss show low magnetization and density values, Proterozoic and Paleozoic sediments, micaschists and metagrauwackes have intermediate to low magnetization and density values. Detailed lithological attribution of geophysical anomalies was achieved separately for each geological sub-domain (in between 2 major structures). This methodology will be generalized at the scale of the entire Paris basin in order to propose a tectonic reconstruction of this segment of the Variscan belt, and provide guides for the exploration of hidden resources.

  13. Correlating Petrophysical Well Logs Using Fractal-based Analysis to Identify Changes in the Signal Complexity Across Neutron, Density, Dipole Sonic, and Gamma Ray Tool Types

    NASA Astrophysics Data System (ADS)

    Matthews, L.; Gurrola, H.

    2015-12-01

    Typical petrophysical well log correlation is accomplished by manual pattern recognition, leading to subjective correlations. The change in character in a well log depends upon the change in the response of the tool to lithology. The petrophysical interpreter looks for a change in one log type that would correspond to the way a different tool responds to the same lithology. To develop an objective way to pick changes in well log character, we adapt a method of first-arrival picking used in seismic data to analyze changes in the character of well logs. We chose to use the fractal method developed by Boschetti et al. (1996) [1]. This method worked better than we expected, and we found similar changes in the fractal dimension across very different tool types (sonic vs. density vs. gamma ray). We reason that the fractal response of the log is not dependent on the physics of the tool response but rather on the change in the complexity of the log data. When a formation changes physical character in time or space, the recorded magnitude in the tool data changes complexity at the same time, even if the original tool responses are very different. The relative complexity of the data, regardless of the tool used, depends upon the complexity of the medium relative to the tool measurement, and it changes as a tool transitions from one character type to another. The character we are measuring is the roughness or complexity of the petrophysical curve. Our method provides a way to directly compare different log types based on a quantitative change in signal complexity. For example, using changes in data complexity allows us to correlate gamma ray suites with sonic logs within a well and then across to an adjacent well with similar signatures. Our method allows reliable and automatic correlations to be made in data sets beyond the reasonable cognitive limits of geoscientists, in both speed and consistent pattern recognition. [1] Fabio Boschetti, Mike D. Dentith, and Ron D. List (1996). A fractal-based algorithm for detecting first arrivals on seismic traces. Geophysics, Vol. 61, No. 4, p. 1095-1102.
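
    The sketch below illustrates the general idea of tracking log complexity with a sliding-window fractal dimension; Katz's estimator is used here as a simple stand-in for the Boschetti et al. (1996) algorithm, and the two-interval synthetic log is hypothetical.

      # Minimal sketch: sliding-window Katz fractal dimension of a synthetic well log.
      import numpy as np

      def katz_fd(y):
          """Katz fractal dimension of a 1-D signal sampled at unit spacing."""
          dy = np.diff(y)
          dist = np.sqrt(1.0 + dy ** 2)           # step lengths along the curve
          L = dist.sum()                          # total curve length
          d = np.sqrt(np.arange(1, len(y)) ** 2 + (y[1:] - y[0]) ** 2).max()
          n = len(y) - 1
          return np.log10(n) / (np.log10(n) + np.log10(d / L))

      rng = np.random.default_rng(7)
      log_a = np.concatenate([rng.normal(60, 2, 300),      # quiet interval
                              rng.normal(90, 15, 300)])    # rougher interval

      window = 50
      fd = np.array([katz_fd(log_a[i:i + window])
                     for i in range(len(log_a) - window)])
      print("median FD, quiet interval:", round(np.median(fd[:250]), 3),
            "| rougher interval:", round(np.median(fd[300:]), 3))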

  14. Spatiotemporal sedimentological and petrophysical characterization of El Gueria reservoir (Ypresian) in Sfax and Gulf of Gabes Basins (SE Tunisia)

    NASA Astrophysics Data System (ADS)

    Nadhem, Kassabi; Zahra, Njahi; Ménendez, Béatriz; Salwa, Jeddi; Jamel, Touir

    2017-06-01

    The El Gueria carbonate Formation (Ypresian) in Tunisia is a proven hydrocarbon reservoir. In the Gulf of Gabes, the El Gueria reservoir consists mainly of a nummulitic limestone developed in an inner shelf environment. In order to characterize the evolution of the depositional facies and the petrophysical parameters, and to understand the origin of the heterogeneity of the El Gueria reservoir, we first conducted a sedimentological and sequence stratigraphy study of this Formation in more than 10 wells, especially in P1, and then established a detailed petrophysical study of the El Gueria reservoir on the P1, P3c and P7d cores. Based on lithostratigraphic and gamma ray correlations of a large number of wells in the study area, a detailed sedimentological study has been established. The latter shows that (i) the Ypresian deposits were laid down on an inner shelf (El Gueria Formation) in the south and on an outer shelf (Boudabbous Formation) in the north of the study area, which is structured as horsts and grabens, and (ii) three distinct members and seven principal facies are distinguished within the El Gueria Formation. The coupling of logging data with data from the P1 core shows that the El Gueria deposits include 10 transgressive-regressive depositional sequences, while showing from bottom to top a broad regressive tendency from a subtidal domain during the early Ypresian to an intertidal domain during the middle Ypresian, reaching a supratidal environment during the late Ypresian-early Lutetian. The petrophysical parameters (porosity and permeability) of the El Gueria reservoir vary in time and space (laterally and vertically) following the variation of the depositional environment. In particular, the porosity variation is controlled by eustatic cycles, so that high porosities are linked with transgressive phases and low porosities with regressive phases. In addition, the vertical evolution of porosity through the El Gueria reservoir varies with (i) the depositional environments, (ii) the type and morphology of the nummulites (large nummulites are more porous than small nummulites and nummulithoclasts), (iii) the matrix and cement (micrite is more porous than sparite), and (iv) the microfacies and diagenetic structures (fractures, stylolitic seams …): fractured wackestones are the most porous and permeable.

  15. An Uncertainty Quantification Framework for Remote Sensing Retrievals

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Hobbs, J.

    2017-12-01

    Remote sensing data sets produced by NASA and other space agencies are the result of complex algorithms that infer geophysical state from observed radiances using retrieval algorithms. The processing must keep up with the downlinked data flow, and this necessitates computational compromises that affect the accuracies of retrieved estimates. The algorithms are also limited by imperfect knowledge of physics and of ancillary inputs that are required. All of this contributes to uncertainties that are generally not rigorously quantified by stepping outside the assumptions that underlie the retrieval methodology. In this talk we discuss a practical framework for uncertainty quantification that can be applied to a variety of remote sensing retrieval algorithms. Ours is a statistical approach that uses Monte Carlo simulation to approximate the sampling distribution of the retrieved estimates. We will discuss the strengths and weaknesses of this approach, and provide a case-study example from the Orbiting Carbon Observatory 2 mission.
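
    A minimal sketch of the Monte Carlo idea described above is given below: a toy linear retrieval is applied to many noisy synthetic radiance realizations to approximate the sampling distribution of the retrieved state; the forward model, state vector and noise level are illustrative assumptions, not an actual OCO-2 configuration.

      # Minimal sketch: Monte Carlo approximation of a retrieval's sampling distribution.
      import numpy as np

      rng = np.random.default_rng(3)
      n_channels, n_state = 8, 2
      K = rng.normal(size=(n_channels, n_state))      # linearized forward model (toy)
      x_true = np.array([400.0, 0.8])                 # hypothetical state vector
      noise_sd = 0.5

      def retrieve(y):
          """Least-squares retrieval of the state from radiances y."""
          x_hat, *_ = np.linalg.lstsq(K, y, rcond=None)
          return x_hat

      # Simulate many noisy observations and collect the retrieved states.
      samples = np.array([retrieve(K @ x_true + rng.normal(0, noise_sd, n_channels))
                          for _ in range(2000)])
      print("retrieval mean:", np.round(samples.mean(axis=0), 2))
      print("retrieval std :", np.round(samples.std(axis=0), 3))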

  16. Uncertainty quantification for PZT bimorph actuators

    NASA Astrophysics Data System (ADS)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used in micro-air vehicles, including the Robobee. The actuator model is built within the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then pose an inverse problem for the model and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  17. EEG-Based Quantification of Cortical Current Density and Dynamic Causal Connectivity Generalized across Subjects Performing BCI-Monitored Cognitive Tasks

    PubMed Central

    Courellis, Hristos; Mullen, Tim; Poizner, Howard; Cauwenberghs, Gert; Iversen, John R.

    2017-01-01

    Quantification of dynamic causal interactions among brain regions constitutes an important component of conducting research and developing applications in experimental and translational neuroscience. Furthermore, cortical networks with dynamic causal connectivity in brain-computer interface (BCI) applications offer a more comprehensive view of brain states implicated in behavior than do individual brain regions. However, models of cortical network dynamics are difficult to generalize across subjects because current electroencephalography (EEG) signal analysis techniques are limited in their ability to reliably localize sources across subjects. We propose an algorithmic and computational framework for identifying cortical networks across subjects in which dynamic causal connectivity is modeled among user-selected cortical regions of interest (ROIs). We demonstrate the strength of the proposed framework using a “reach/saccade to spatial target” cognitive task performed by 10 right-handed individuals. Modeling of causal cortical interactions was accomplished through measurement of cortical activity using (EEG), application of independent component clustering to identify cortical ROIs as network nodes, estimation of cortical current density using cortically constrained low resolution electromagnetic brain tomography (cLORETA), multivariate autoregressive (MVAR) modeling of representative cortical activity signals from each ROI, and quantification of the dynamic causal interaction among the identified ROIs using the Short-time direct Directed Transfer function (SdDTF). The resulting cortical network and the computed causal dynamics among its nodes exhibited physiologically plausible behavior, consistent with past results reported in the literature. This physiological plausibility of the results strengthens the framework's applicability in reliably capturing complex brain functionality, which is required by applications, such as diagnostics and BCI. PMID:28566997

  18. Chapter 1: Executive Summary - 2003 Assessment of Undiscovered Oil and Gas Resources in the Upper Cretaceous Navarro and Taylor Groups, Western Gulf Province, Gulf Coast Region, Texas

    USGS Publications Warehouse

    ,

    2006-01-01

    The U.S. Geological Survey (USGS) recently completed an assessment of the undiscovered oil and gas potential of the Upper Cretaceous Navarro and Taylor Groups in the Western Gulf Province of the Gulf Coast region (fig. 1) as part of a national oil and gas assessment effort (USGS Navarro and Taylor Groups Assessment Team, 2004). The assessment of the petroleum potential of the Navarro and Taylor Groups was based on the general geologic elements used to define a total petroleum system (TPS), including hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). Using this geologic framework, the USGS defined five assessment units (AU) in the Navarro and Taylor Groups as parts of a single TPS, the Smackover-Austin-Eagle Ford Composite TPS: Travis Volcanic Mounds Oil AU, Uvalde Volcanic Mounds Gas and Oil AU, Navarro-Taylor Updip Oil and Gas AU, Navarro-Taylor Downdip Gas and Oil AU, and Navarro-Taylor Slope-Basin Gas AU (table 1).

  19. Petrophysical characterization of first ever drilled core samples from an active CO2 storage site, the German Ketzin Pilot Site - Comparison with long term experiments

    NASA Astrophysics Data System (ADS)

    Zemke, Kornelia; Liebscher, Axel

    2014-05-01

    Petrophysical properties like porosity and permeability are key parameters for a safe long-term storage of CO2 but also for the injection operation itself. These parameters may change during and/or after the CO2 injection due to geochemical reactions in the reservoir system that are triggered by the injected CO2. Here we present petrophysical data of the first ever drilled cores from a newly drilled well at the active CO2 storage site - the Ketzin pilot site in the Federal State of Brandenburg, Germany. By comparison with pre-injection baseline data from core samples recovered prior to injection, the new samples provide the unique opportunity to evaluate the impact of CO2 on pore size related properties of reservoir and cap rocks at a real injection site under in-situ reservoir conditions. After injection of 61,000 tons of CO2, an additional well was drilled and new rock cores were recovered. In total, 100 core samples from the reservoir and the overlying caprock were investigated by NMR relaxation. Permeability of 20 core samples was estimated with nitrogen, and porosity by helium pycnometry. The determined data are comparable between pre-injection and post-injection core samples. The lower part of the reservoir sandstone is unaffected by the injected CO2. The upper part of the reservoir sandstone shows consistently slightly lower NMR porosity and permeability values in the post-injection samples when compared to the pre-injection data. This upper sandstone part lies above the fluid level, where CO2 is present as a free gas phase, and a possible residual gas saturation of the cores distorted the NMR results. The potash-containing drilling fluid can also influence these results: NMR investigation of twin samples from the inner and outer parts of the cores shows a reduced fraction of larger pores for the outer core samples, together with lower porosities and T2 times. The drill mud penetration depth can be checked using the added fluorescent tracer. Due to the heterogeneous character of the Stuttgart Formation it is difficult to estimate definite CO2-induced changes from petrophysical measurements. The observed changes are only minor. Several batch experiments on Ketzin samples drilled prior to injection confirm the results from the investigation of the in-situ rock cores. Core samples from the pre-injection wells were exposed to CO2 and brine in autoclaves over various time periods. Samples were characterized prior to and after the experiments by NMR and Mercury Injection Porosimetry (MIP). The results are consistent with the logging data and show only minor change. Unfortunately, in these experiments too, the observed mineralogical and petrophysical changes were within the natural heterogeneity of the Ketzin reservoir and precluded unequivocal conclusions. However, given the only minor differences between the post-injection well and the pre-injection wells, it is reasonable to assume that potential dissolution-precipitation processes have no severe consequences on reservoir and cap rock integrity or on the injection behaviour. This is also in line with the continuously recorded injection operation parameters, which do not point to any changes in reservoir injectivity.

  20. Petrophysics and hydrocarbon potential of Paleozoic rocks in Kuwait

    NASA Astrophysics Data System (ADS)

    Abdullah, Fowzia; Shaaban, Fouad; Khalaf, Fikry; Bahaman, Fatma; Akbar, Bibi; Al-Khamiss, Awatif

    2017-10-01

    Well logs from nine deep exploratory and development wells in Kuwaiti oil fields have been used to study the petrophysical characteristics and their effect on the reservoir quality of the subsurface Paleozoic Khuff and Unayzah formations. Petrophysical log data have been calibrated with core analyses available at some intervals. The study indicates a complex lithological facies of the Khuff Formation, composed mainly of dolomite and anhydrite interbeds with dispersed argillaceous material and a few limestone intercalations. This facies greatly lowers the formation matrix porosity and permeability index. The porosity is fully saturated with water, which is reflected by the low resistivity log responses, except at some intervals where a few hydrocarbon shows are recorded. The impermeable anhydrites, massive (low-permeability) carbonate rock and shale in the lower part of the formation combine to form intraformational seals for the clastic reservoirs of the underlying Unayzah Formation. By contrast, the log interpretation revealed the clastic lithological nature of the Unayzah Formation, with cycles of conglomerate, sandstone, siltstone, mudstone and shale. The recorded argillaceous material is mainly of disseminated habit, which controls, to some extent, the matrix porosity; porosity ranges from 2% to 15% and water saturation from 65% to 100%. Cementation, dissolution, compaction and clay-mineral authigenesis are the most significant diagenetic processes affecting reservoir quality. Calibration with the available core analyses at some intervals of the formation indicates that the siliciclastic sequence is fluvial, with more than one climatic cycle changing from humid through semi-arid to arid conditions, and displays the impact of both physical and chemical diagenesis. In general, the study revealed that the Unayzah Formation has better reservoir quality than the Khuff Formation and contains possible gas-bearing zones.

  1. A genetic meta-algorithm-assisted inversion approach: hydrogeological study for the determination of volumetric rock properties and matrix and fluid parameters in unsaturated formations

    NASA Astrophysics Data System (ADS)

    Szabó, Norbert Péter

    2018-03-01

    An evolutionary inversion approach is suggested for the interpretation of nuclear and resistivity logs measured by direct-push tools in shallow unsaturated sediments. The efficiency of formation evaluation is improved by estimating simultaneously (1) the petrophysical properties that vary rapidly along a drill hole with depth and (2) the zone parameters that can be treated as constant, in one inversion procedure. In the workflow, the fractional volumes of water, air, matrix and clay are estimated in adjacent depths by linearized inversion, whereas the clay and matrix properties are updated using a float-encoded genetic meta-algorithm. The proposed inversion method provides an objective estimate of the zone parameters that appear in the tool response equations applied to solve the forward problem, which can significantly increase the reliability of the petrophysical model as opposed to setting these parameters arbitrarily. The global optimization meta-algorithm not only assures the best fit between the measured and calculated data but also gives a reliable solution, practically independent of the initial model, as laboratory data are unnecessary in the inversion procedure. The feasibility test uses engineering geophysical sounding logs observed in an unsaturated loessy-sandy formation in Hungary. The multi-borehole extension of the inversion technique is developed to determine the petrophysical properties and their estimation errors along a profile of drill holes. The genetic meta-algorithmic inversion method is recommended for hydrogeophysical logging applications of various kinds to automatically extract the volumetric ratios of rock and fluid constituents as well as the most important zone parameters in a reliable inversion procedure.
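    The outer loop of such a float-encoded genetic search can be illustrated in a few lines of Python. The sketch below is hypothetical (a toy response equation, invented parameter bounds, and the inner linearized depth-by-depth inversion reduced to a misfit stub); it is not the author's implementation.

        # Minimal sketch of a float-encoded genetic loop that tunes constant "zone
        # parameters" while an inner, linearized step (stubbed here) fits the data.
        import numpy as np

        rng = np.random.default_rng(0)

        def inner_misfit(zone_params, observed):
            """Stand-in for the linearized depth-by-depth inversion: returns the data
            misfit obtained with the given zone parameters (toy response equation)."""
            rho_matrix, rho_clay = zone_params
            predicted = 0.6 * rho_matrix + 0.4 * rho_clay
            return np.sum((observed - predicted) ** 2)

        def genetic_search(observed, bounds, pop_size=40, generations=100,
                           crossover=0.8, mutation=0.1):
            lo, hi = np.array(bounds).T
            pop = rng.uniform(lo, hi, size=(pop_size, len(lo)))   # float-encoded individuals
            for _ in range(generations):
                fitness = np.array([inner_misfit(ind, observed) for ind in pop])
                pop = pop[np.argsort(fitness)]                    # elitist ranking
                children = pop[:pop_size // 2].copy()
                for child in children:
                    mate = pop[rng.integers(0, pop_size // 2)]
                    mask = rng.random(len(child)) < crossover     # arithmetic crossover
                    child[mask] = 0.5 * (child[mask] + mate[mask])
                    jitter = rng.random(len(child)) < mutation    # Gaussian mutation
                    child[jitter] += 0.05 * (hi - lo)[jitter] * rng.standard_normal(jitter.sum())
                pop[pop_size // 2:] = np.clip(children, lo, hi)
            return pop[0]

        observed = np.array([2.30])                 # hypothetical measured response
        best = genetic_search(observed, bounds=[(2.0, 3.0), (1.8, 2.8)])
        print("estimated zone parameters:", best)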

  2. pyGIMLi: An open-source library for modelling and inversion in geophysics

    NASA Astrophysics Data System (ADS)

    Rücker, Carsten; Günther, Thomas; Wagner, Florian M.

    2017-12-01

    Many tasks in applied geosciences cannot be solved by single measurements, but require the integration of geophysical, geotechnical and hydrological methods. Numerical simulation techniques are essential both for planning and interpretation, as well as for the process understanding of modern geophysical methods. These trends encourage open, simple, and modern software architectures aiming at a uniform interface for interdisciplinary and flexible modelling and inversion approaches. We present pyGIMLi (Python Library for Inversion and Modelling in Geophysics), an open-source framework that provides tools for modelling and inversion of various geophysical as well as hydrological methods. The modelling component supplies discretization management and the numerical basis for finite-element and finite-volume solvers in 1D, 2D and 3D on arbitrarily structured meshes. The generalized inversion framework solves the minimization problem with a Gauss-Newton algorithm for any physical forward operator and provides opportunities for uncertainty and resolution analyses. More general requirements, such as flexible regularization strategies, time-lapse processing and different sorts of coupling between individual methods, are provided independently of the actual methods used. The usage of pyGIMLi is first demonstrated by solving the steady-state heat equation, followed by a demonstration of more complex capabilities for the combination of different geophysical data sets. A fully coupled hydrogeophysical inversion of electrical resistivity tomography (ERT) data from a simulated tracer experiment is presented that allows direct reconstruction of the underlying hydraulic conductivity distribution of the aquifer. Another example demonstrates the improvement gained by jointly inverting ERT and ultrasonic data for saturation using a new approach that incorporates petrophysical relations in the inversion. Potential applications of the presented framework are manifold and include time-lapse, constrained, joint, and coupled inversions of various geophysical and hydrological data sets.
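    The Gauss-Newton minimization at the heart of such a framework can be illustrated generically. The sketch below deliberately does not use the pyGIMLi API; it is a self-contained toy example assuming a simple nonlinear forward operator, a first-difference smoothness operator and a fixed regularization weight.

        # Generic regularized Gauss-Newton update: minimize ||d - f(m)||^2 + lam*||W m||^2
        import numpy as np

        def forward(m):
            return np.exp(-m) + 0.5 * m**2           # toy nonlinear forward operator

        def jacobian(m, eps=1e-6):
            J = np.zeros((m.size, m.size))
            f0 = forward(m)
            for j in range(m.size):
                dm = m.copy()
                dm[j] += eps
                J[:, j] = (forward(dm) - f0) / eps   # finite-difference sensitivities
            return J

        def gauss_newton(d, m0, lam=0.1, iters=10):
            n = m0.size
            W = np.eye(n) - np.eye(n, k=1)           # first-difference smoothness operator
            m = m0.copy()
            for _ in range(iters):
                r = d - forward(m)
                J = jacobian(m)
                A = J.T @ J + lam * W.T @ W
                b = J.T @ r - lam * W.T @ (W @ m)
                m += np.linalg.solve(A, b)           # Gauss-Newton model update
            return m

        m_true = np.linspace(0.5, 1.5, 20)
        data = forward(m_true) + 0.01 * np.random.default_rng(1).standard_normal(20)
        m_est = gauss_newton(data, m0=np.ones(20))
        print("rms model error:", np.sqrt(np.mean((m_est - m_true) ** 2)))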

  3. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of the reconstructed three-dimensional particle location, which in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position, which is a function of particle-detection and mapping-function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty, which is estimated as an extension of the 2D PIV uncertainty framework. Finally, the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources is also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
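    A first-order propagation of the two contributions described above might look as follows; the quadrature combination, voxel size, time separation and all numerical values are illustrative assumptions, not quantities from the paper.

        # Toy propagation of position and correlation uncertainties into u = dx/dt.
        import math

        def displacement_uncertainty(sigma_position, sigma_correlation):
            # two (assumed independent) contributions combined in quadrature, in voxels
            return math.sqrt(sigma_position**2 + sigma_correlation**2)

        def velocity_uncertainty(sigma_dx_vox, dx_vox, voxel_size_mm, dt_s, sigma_dt_s=0.0):
            # first-order propagation for u = dx/dt with dx in physical units
            dx_mm = dx_vox * voxel_size_mm
            sigma_dx_mm = sigma_dx_vox * voxel_size_mm
            u = dx_mm / dt_s
            sigma_u = abs(u) * math.sqrt((sigma_dx_mm / dx_mm) ** 2 + (sigma_dt_s / dt_s) ** 2)
            return u, sigma_u

        sigma_dx = displacement_uncertainty(sigma_position=0.08, sigma_correlation=0.10)
        u, sigma_u = velocity_uncertainty(sigma_dx, dx_vox=4.0, voxel_size_mm=0.05, dt_s=1e-3)
        print(f"u = {u:.1f} mm/s +/- {sigma_u:.2f} mm/s (1 sigma)")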

  4. Comparison of laboratory and in-situ measurements of waterflood residual oil saturations for the Cormorant field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Poelgeest, F.; Niko, H.; Modwid, A.R.

    1991-03-01

    Shell Expro and Koninklijke/Shell E and P Laboratorium (KSEPL) have been engaged in a multidisciplinary effort to determine the waterflood residual oil saturation (ROS) in two principal reservoirs of the Cormorant oil field in the U.K. sector of the North Sea. Data acquisition included special coring and testing. The study, which involved new reservoir-engineering and petrophysical techniques, was aimed at establishing consistent ROS values. This paper reports that the reservoir-engineering work centered on reservoir-condition corefloods in the relative-permeability-at-reservoir-conditions (REPARC) apparatus, in which restoration of representative wettability conditions was attempted with the aging technique. Aging results in a consistent reduction of water-wetness of all core samples. The study indicated that ROS values obtained on aged cores at water throughputs of at least 5 PV represented reservoir conditions. The petrophysical part of the study involved ROS estimation from sponge-core analysis and log evaluation.

  5. CORRELATOR 5.2 - A program for interactive lithostratigraphic correlation of wireline logs

    USGS Publications Warehouse

    Olea, R.A.

    2004-01-01

    The limited radius of investigation of petrophysical measurements made in boreholes and the relatively large distances between wells result in an incomplete sensing of the subsurface through well logging. CORRELATOR is a program for estimating geological properties between logged boreholes. An initial and fundamental step is the lithostratigraphic correlation of logs in different wells. The method employed by the program closely emulates the process of visual inspection used by experienced subsurface geologists in manual correlation. Mathematically, the determination of lithostratigraphical equivalence is based on the simultaneous assessment of similarity in shale content, similarity in the patterns of vertical variation in a petrophysical property that is measured with high vertical resolution, and spatial consistency of stratigraphic relationships as determined by an expert system. Multiple additional options for processing log readings allow maximization in the extraction of information from pairs of logs per well and great flexibility in the final display of results in the form of cross sections and dip diagrams. © 2004 Elsevier Ltd. All rights reserved.

  6. A comparison of petrophysical data inputs for establishing time-depth relationships: a guide for future drilling expeditions

    NASA Astrophysics Data System (ADS)

    Boaga, J.; Sauermilch, I.; Mateo, Z. R. P.

    2017-12-01

    Time-depth relationships (TDRs) are crucial in correlating drillhole and core information to seismic reflection profiles, for accurate resource estimation, scientific interpretation and to guide drilling operations. Conventional seismic time-depth domain conversion utilizes downhole sonic logs (DSI), calibrated using available checkshot data, which are local travel times from the surface to a particular depth. Scientific drilling programs (ODP and IODP) also measure P-wave velocity (PWL or C) on recovered core samples. Only three percent of all ODP and IODP sites record all three velocity measurements; however, this information can be instructive because these data inputs sometimes yield dissimilar TDRs. These representative sites provide us with an opportunity to perform a comparative analysis highlighting the differences and similarities of TDRs derived from checkshot, downhole, and laboratory measurements. We then discuss the impact of lithology, stratigraphy, water column and other petrophysical properties on the predictive accuracy of TDR calculations, in an effort to provide guidance for future drilling and coring expeditions.
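    The core of any TDR built from an interval-velocity (sonic or core P-wave) profile is a cumulative travel-time integration; the short sketch below uses a synthetic velocity gradient purely for illustration.

        # Two-way time from a depth/velocity profile: twt(z) = 2 * sum(dz / v).
        import numpy as np

        depth = np.arange(0.0, 500.0, 10.0)            # metres below seafloor
        velocity = 1600.0 + 1.2 * depth                # synthetic interval velocity, m/s

        dz = np.diff(depth, prepend=depth[0])          # layer thicknesses (first is 0)
        twt = 2.0 * np.cumsum(dz / velocity)           # two-way time, s

        def twt_at(z):
            """Interpolate the time-depth curve; a checkshot would be used to calibrate it."""
            return float(np.interp(z, depth, twt))

        print(f"TWT at 250 m: {twt_at(250.0) * 1000:.1f} ms")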

  7. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    PubMed

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but share the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. In an evaluation on a real data set, the transcript expression estimated by Strawberry has the highest correlation with NanoString probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14 and is available as open-source software at https://github.com/ruolin/strawberry under the MIT license.

  8. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  9. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  10. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  11. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
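    For readers unfamiliar with variance-based global sensitivity analysis, a compact pick-freeze Monte Carlo estimator of first-order Sobol indices is sketched below; the test function and sample size are illustrative and unrelated to the scramjet model.

        # First-order Sobol indices via the pick-freeze (Saltelli-style) estimator.
        import numpy as np

        def model(x):
            # toy test function with three uncertain inputs on [-pi, pi]
            return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

        def first_order_sobol(f, k, n=20000, seed=0):
            rng = np.random.default_rng(seed)
            A = rng.uniform(-np.pi, np.pi, size=(n, k))
            B = rng.uniform(-np.pi, np.pi, size=(n, k))
            fA, fB = f(A), f(B)
            var = np.var(np.concatenate([fA, fB]))
            S = np.empty(k)
            for i in range(k):
                ABi = A.copy()
                ABi[:, i] = B[:, i]                      # swap ("pick") column i, freeze the rest
                S[i] = np.mean(fB * (f(ABi) - fA)) / var
            return S

        print("first-order Sobol indices:", np.round(first_order_sobol(model, k=3), 3))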

  12. Standardized evaluation framework for evaluating coronary artery stenosis detection, stenosis quantification and lumen segmentation algorithms in computed tomography angiography.

    PubMed

    Kirişli, H A; Schaap, M; Metz, C T; Dharampal, A S; Meijboom, W B; Papadopoulou, S L; Dedic, A; Nieman, K; de Graaf, M A; Meijs, M F L; Cramer, M J; Broersen, A; Cetin, S; Eslami, A; Flórez-Valencia, L; Lor, K L; Matuszewski, B; Melki, I; Mohr, B; Oksüz, I; Shahzad, R; Wang, C; Kitslaar, P H; Unal, G; Katouzian, A; Örkisz, M; Chen, C M; Precioso, F; Najman, L; Masood, S; Ünay, D; van Vliet, L; Moreno, R; Goldenberg, R; Vuçini, E; Krestin, G P; Niessen, W J; van Walsum, T

    2013-12-01

    Though conventional coronary angiography (CCA) has been the standard of reference for diagnosing coronary artery disease in the past decades, computed tomography angiography (CTA) has rapidly emerged and is nowadays widely used in clinical practice. Here, we introduce a standardized evaluation framework to reliably evaluate and compare the performance of algorithms devised to detect and quantify coronary artery stenoses and to segment the coronary artery lumen in CTA data. The objective of this evaluation framework is to demonstrate the feasibility of dedicated algorithms to: (1) (semi-)automatically detect and quantify stenosis on CTA, in comparison with quantitative coronary angiography (QCA) and CTA consensus reading, and (2) (semi-)automatically segment the coronary lumen on CTA, in comparison with experts' manual annotations. A database consisting of 48 multicenter, multivendor cardiac CTA datasets with corresponding reference standards is described and made available. The algorithms from 11 research groups were quantitatively evaluated and compared. The results show that (1) some of the current stenosis detection/quantification algorithms may be used for triage or as a second reader in clinical practice, and that (2) automatic lumen segmentation is possible with a precision similar to that obtained by experts. The framework is open for new submissions through the website at http://coronary.bigr.nl/stenoses/. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Methods for detection of GMOs in food and feed.

    PubMed

    Marmiroli, Nelson; Maestri, Elena; Gullì, Mariolina; Malcevschi, Alessio; Peano, Clelia; Bordoni, Roberta; De Bellis, Gianluca

    2008-10-01

    This paper reviews aspects relevant to detection and quantification of genetically modified (GM) material within the feed/food chain. The GM crop regulatory framework at the international level is evaluated with reference to traceability and labelling. Current analytical methods for the detection, identification, and quantification of transgenic DNA in food and feed are reviewed. These methods include quantitative real-time PCR, multiplex PCR, and multiplex real-time PCR. Particular attention is paid to methods able to identify multiple GM events in a single reaction and to the development of microdevices and microsensors, though they have not been fully validated for application.
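    As a purely hypothetical example of the real-time PCR arithmetic involved, GM content is often expressed as a transgene-to-reference-gene copy ratio calibrated against a certified reference material; the Ct values, amplification efficiencies and CRM content below are invented for illustration.

        # Relative quantification from Ct values, assuming N0 proportional to eff**(-Ct).
        def copies_ratio(ct_transgene, ct_reference, eff_transgene=2.0, eff_reference=2.0):
            return (eff_transgene ** -ct_transgene) / (eff_reference ** -ct_reference)

        sample_ratio = copies_ratio(ct_transgene=31.2, ct_reference=24.8)
        crm_ratio = copies_ratio(ct_transgene=29.5, ct_reference=24.9)   # certified reference material
        gm_percent = 1.0 * (sample_ratio / crm_ratio)                    # CRM assumed certified at 1.0 % GM
        print(f"estimated GM content: {gm_percent:.2f} %")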

  14. Convex geometry of quantum resource quantification

    NASA Astrophysics Data System (ADS)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, and a measure of k-coherence which generalises the …
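    For orientation, one widely used quantifier of the kind surveyed here is the generalized robustness of a state rho with respect to the convex set F of resource-free states (a standard definition, quoted as background rather than as this paper's contribution):

        R_{\mathcal{F}}(\rho) \;=\; \min\Big\{\, s \ge 0 \;:\; \exists\ \text{state } \sigma \ \text{such that}\ \frac{\rho + s\,\sigma}{1 + s} \in \mathcal{F} \Big\}.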

  15. Characterization and calibration of seawater intrusion models using electrical resistivity tomography (Invited)

    NASA Astrophysics Data System (ADS)

    Nguyen, F. H.; Kemna, A.; Antonsson, A.; Engesgaard, P. K.; Beaujean, J.

    2009-12-01

    The urban development of coastal regions creates seawater intrusion (SWI) problems that threaten groundwater quality and coastal ecosystems. To study SWI, one needs both robust measuring technologies and reliable predictions. A key aspect in the calibration of SWI models involves reproducing measured groundwater chloride concentrations, yet drilling multi-screen wells to obtain a whole concentration profile is a risky task if reliable information about the position of the salt wedge is not available. Electrical resistivity tomography (ERT) is increasingly being used to characterize seawater intrusion and constrain corresponding models, given its high sensitivity to ion concentration in groundwater and its relatively high spatial resolution. We have investigated the potential of ERT using field data from a site in Almeria, SE Spain, and synthetic data. Simulations have been run for several scenarios, with a simple hydrogeological model reflecting the local site conditions. The simulations showed that only the lower salt concentrations of the seawater-freshwater transition zone could be recovered, due to the loss of resolution with depth. We quantified this capability in terms of image appraisal indicators (cumulative sensitivity) associated with the measurement setup and showed that the mismatch between the targeted and imaged parameter values occurs beyond a certain threshold. Similarly, heterogeneity may only be determined accurately if located in an adequately sensitive area. Inversion of the synthetic data was performed by coupling an inversion code (PEST) with a finite-difference density-dependent flow and transport modeling code (HTS). The numerical results demonstrate the capacity of sensitivity-filtered ERT images to constrain the transverse hydraulic dispersivity and longitudinal hydraulic conductivity of homogeneous seawater intrusion models. At the field site, we identified SWI at the scale of a few kilometers, down to a hundred meters depth. Borehole logs show a remarkable correlation with the image obtained from surface data but indicate that the electrically derived mass fraction of pure seawater could not be recovered, due to the discrepancy between the in-situ and laboratory-derived petrophysical relationships. Inversion of hydrologic model parameters using the field ERT image was not possible because a 2D representation of the geology at the site is inadequate. Using ERT-derived data to estimate hydrological parameters requires addressing resolution loss and the non-stationarity of the petrophysical relationship. The first issue may be approached using objective criteria. The most crucial limitation, however, is probably the non-stationarity of the petrophysical relationship, which is currently being investigated using more realistic models based on geostatistical modeling (SGeMS) of the petrophysical properties of a coastal aquifer and transient simulations.
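    The petrophysical conversion referred to above is typically an Archie-type relation linking imaged bulk resistivity to pore-water resistivity and hence salinity; the sketch below uses illustrative, not site-specific, parameters.

        # Archie's law for a saturated medium plus a rough conductivity-to-TDS scaling.
        import numpy as np

        def fluid_resistivity_from_bulk(rho_bulk, porosity, a=1.0, m=2.0):
            """Archie: rho_bulk = a * rho_w * phi**(-m), solved for rho_w."""
            return rho_bulk * porosity ** m / a

        def tds_from_fluid_resistivity(rho_w, k=0.15):
            """Very rough proportionality between fluid conductivity (S/m) and TDS (g/L);
            k is a site-specific calibration constant (assumed here)."""
            return (1.0 / rho_w) / k

        rho_bulk = np.array([50.0, 10.0, 1.5])      # ohm-m, e.g. from an inverted ERT image
        rho_w = fluid_resistivity_from_bulk(rho_bulk, porosity=0.30)
        print("pore-water resistivity (ohm-m):", np.round(rho_w, 2))
        print("approximate TDS (g/L):", np.round(tds_from_fluid_resistivity(rho_w), 1))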

  16. A framework for testing the use of electric and electromagnetic data to reduce the prediction error of groundwater models

    NASA Astrophysics Data System (ADS)

    Christensen, N. K.; Christensen, S.; Ferre, T. P. A.

    2015-09-01

    Although geophysics is being used increasingly, it is still unclear how and when the integration of geophysical data improves the construction and predictive capability of groundwater models. This paper therefore presents the newly developed HYdrogeophysical TEst-Bench (HYTEB), a collection of geological, groundwater and geophysical modeling and inversion software wrapped to form a platform for generating and considering multi-modal data for objective hydrologic analysis. It is intentionally flexible to allow for simple or sophisticated treatments of geophysical responses, hydrologic processes, parameterization, and inversion approaches. It can also be used to discover potential errors that can be introduced through petrophysical models and approaches to correlating geophysical and hydrologic parameters. With HYTEB we study alternative uses of electromagnetic (EM) data for groundwater modeling in a hydrogeological environment consisting of various types of glacial deposits, with typical hydraulic conductivities and electrical resistivities, covering impermeable bedrock with low resistivity. We investigate to what extent groundwater model calibration and, often more importantly, model predictions can be improved by including in the calibration process electrical resistivity estimates obtained from TEM data. In all calibration cases, the hydraulic conductivity field is highly parameterized and the estimation is stabilized by regularization. For purely hydrologic inversion (HI, using only hydrologic data) we used Tikhonov regularization combined with singular value decomposition. For joint hydrogeophysical inversion (JHI) and sequential hydrogeophysical inversion (SHI), the resistivity estimates from TEM are used together with a petrophysical relationship to formulate the regularization term. In all cases the regularization stabilizes the inversion, but neither the HI nor the JHI objective function could be minimized uniquely. SHI or JHI with regularization based on the TEM data produced estimated hydraulic conductivity fields that bear more resemblance to the reference fields than HI with Tikhonov regularization. However, for the studied system the resistivities estimated by SHI or JHI must be used with caution as estimators of hydraulic conductivity or as regularization means for subsequent hydrological inversion. Much of the lack of value of the geophysical data arises from a mistaken faith in the power of the petrophysical model in combination with geophysical data of low sensitivity, which propagates geophysical estimation errors into the hydrologic model parameters. With respect to reducing model prediction error, whether it is worthwhile to include geophysical data in the model calibration depends on the type of prediction. All calibrated models are found to be good predictors of hydraulic head, but when the stress situation is changed from that of the hydrologic calibration data, all models make biased predictions of head change. All calibrated models turn out to be very poor predictors of the pumping well's recharge area and groundwater age. The reason is that distributed recharge is parameterized as depending on the estimated hydraulic conductivity of the upper model layer, which tends to be underestimated. Another important insight from the HYTEB analysis is thus that either recharge should be parameterized and estimated in a different way, or other types of data should be added to better constrain the recharge estimates.

  17. Hydrogen underground storage in siliciclastic reservoirs - intention and topics of the H2STORE project

    NASA Astrophysics Data System (ADS)

    Pudlo, Dieter; Ganzer, Leonhard; Henkel, Steven; Liebscher, Axel; Kühn, Michael; De Lucia, Marco; Panfilov, Michel; Pilz, Peter; Reitenbach, Viktor; Albrecht, Daniel; Würdemann, Hilke; Gaupp, Reinhard

    2013-04-01

    The transfer of energy supply from nuclear and CO2-emitting power generation to renewable energy production relies strongly on the potential for storing large amounts of energy safely and reliably over time spans of several months. One conceivable option is the storage of hydrogen, and of synthetic natural gas (SNG) produced from it, in appropriate underground structures such as salt caverns and pore-space reservoirs. Successful storage of hydrogen in the form of town gas in salt caverns has been proven in several demonstration projects and can be considered state-of-the-art technology. However, salt structures have only limited importance for hydrogen storage because of their small cavern volumes and the limited occurrence of salt deposits suitable for leaching caverns. Thus, regarding potential high-volume storage sites, siliciclastic deposits such as saline aquifers and depleted gas reservoirs are of increasing interest. Motivated by a project call and sponsored by the German government, the H2STORE ("Hydrogen to Store") collaborative project will investigate the feasibility of, and the requirements for, pore-space storage of hydrogen, with depleted gas reservoirs as the major focus of the study. This type of geological structure is chosen because of its well investigated geological setting and proven sealing capacity, which already enable present (and future) use as natural (and synthetic) gas storage. Nonetheless, hydrogen and hydrocarbons in porous media show major differences in physico-chemical behaviour, essentially due to the high diffusivity and reactivity of hydrogen. The biotic and abiotic reactions of hydrogen with rocks and fluids must therefore be observed in siliciclastic sediments, which consist of numerous inorganic and organic compounds and contain original formation fluids. These features strongly control petrophysical behaviour (e.g., porosity, permeability) and therefore fluid (hydrogen) migration. To reveal the relevance of these interactions and their impact on petrophysics and fluid mechanics, H2STORE includes six subprojects devoted to various aspects of hydrogen storage in pore-space reservoirs. The analytical and laboratory experimental studies will be based on rock and fluid samples taken from different reservoir sandstone and cap rock mudstone types originating from different depths across Germany. Data on the sedimentological, geochemical, mineralogical, hydrochemical, petrophysical and microbiological rock composition will thereby be gained. These studies will be complemented by conceptual mathematical and numerical modelling of dynamic reservoir processes, including basin/facies burial evolution, mineralogical alteration, hydro-/geochemical reactions and gas mixing processes coupled with the population dynamics of methanogenic microorganisms and dynamic displacement instability effects. Estimating the impact of hydrogen on the reservoir behaviour of different rock types at depth will enable an evaluation of the feasibility of "eco/green" methane and synthetic natural gas (SNG) generation by hydrogen reaction with CO2. The verification or falsification of specific processes will also improve predictions of the operational reliability, ecological tolerance and economic efficiency of future energy storage plants. These aspects are the main motivations for industrial investors and for public acceptance of such new technologies within the framework of an overall power supply based on renewable energy production.

  18. An optimized workflow for building 3D models from balanced sections and potential field geophysics: a study case in NE Spain.

    NASA Astrophysics Data System (ADS)

    Ayala, Conxi; Izquierdo-Llavall, Esther; Pueyo, Emilio Luis; Rubio, Félix; Rodríguez-Pintó, Adriana; María Casas, Antonio; Oliva-Urcía, Belén; Rey-Moral, Carmen

    2015-04-01

    Obtaining an accurate 3D image of the geometry and physical properties of geological structures at depth is a challenge regardless of the scale and the aim of the investigation. In this framework, assessing the origin of the uncertainties and reducing them is a key issue when building a 3D reconstruction of a target area. Usually, this process involves an interdisciplinary approach and also the use of different software whose inputs and outputs have to be interoperable. We have designed a new workflow for 2.5D and 3D geological and potential field modelling, especially useful in areas where no seismic data are available. The final aim is to obtain a 3D geological model, at a regional or local scale, with the smallest possible uncertainty. Once the study area and the working scale are decided, the first obvious step is to compile all preexisting data and to determine their uncertainties. If necessary, a survey is carried out to acquire additional data (e.g., gravity, magnetic or petrophysical data) to obtain an appropriate coverage of information and rock samples. A thorough study of the petrophysical properties is made to determine the density, magnetic susceptibility and remanence that will be assigned to each lithology, together with their corresponding uncertainties. Finally, the modelling process is started; it includes a feedback between geology and potential fields in order to progressively refine the model until it fits all the existing data. The procedure starts with the construction of balanced geological cross sections from field work, available geological maps and data from stratigraphic columns, boreholes, etc. These geological cross sections are exported and imported into the GMSYS software to carry out the 2.5D potential field modelling. The model improves and its uncertainty is reduced through the feedback between the geologists and the geophysicists. Once the potential field anomalies are well adjusted, the cross sections are exported into 3DMove (Midland Valley) to construct a preliminary balanced 3D model. Inversion of the potential field data in GeoModeller is the final step to obtain a 3D model consistent with the input data and with the minimum possible uncertainty. Our case study is a 3D model of the Linking Zone between the Iberian Range and the Catalonian Coastal Ranges (NE Spain, an area of 11,325 km2). No seismic data were available, so we carried out several surveys to acquire new gravity data and rock samples to complement the data from the IGME petrophysical databases. A total of 1470 samples have been used to define the physical properties of the modelled lithologies. The gravity data consist of 2902 stations. The initial model is based on the surface geology, eleven boreholes and 8 balanced geological cross sections built in the frame of this research. The final model, resulting from gravimetric inversion, has allowed us to define the geometry of the top of the basement as well as to identify two structures (anticlines) as potential CO2 reservoirs.
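    A quick sanity check that links assumed petrophysical densities to observable anomalies is the forward gravity response of a simple body; the buried-sphere example below is a generic illustration and not part of the GMSYS/GeoModeller workflow itself.

        # Vertical gravity anomaly (in mGal) of a buried sphere with density contrast drho.
        import numpy as np

        G = 6.674e-11                                   # gravitational constant, m^3 kg^-1 s^-2

        def sphere_anomaly_mgal(x, depth, radius, drho):
            mass = (4.0 / 3.0) * np.pi * radius ** 3 * drho
            gz = G * mass * depth / (x ** 2 + depth ** 2) ** 1.5   # m/s^2
            return gz * 1e5                                        # convert to mGal

        x = np.linspace(-5000.0, 5000.0, 11)            # profile coordinates, m
        print(np.round(sphere_anomaly_mgal(x, depth=2000.0, radius=800.0, drho=300.0), 3))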

  19. Challenges in soil erosion research and prediction model development

    USDA-ARS?s Scientific Manuscript database

    Quantification of soil erosion has been traditionally considered as a surface hydrologic process with equations for soil detachment and sediment transport derived from the mechanics and hydraulics of the rainfall and surface flow. Under the current erosion modeling framework, the soil has a constant...

  20. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  1. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  2. Quantifying circular RNA expression from RNA-seq data using model-based framework.

    PubMed

    Li, Musheng; Xie, Xueying; Zhou, Jing; Sheng, Mengying; Yin, Xiaofeng; Ko, Eun-A; Zhou, Tong; Gu, Wanjun

    2017-07-15

    Circular RNAs (circRNAs) are a class of non-coding RNAs that are widely expressed in various cell lines and tissues of many organisms. Although the exact function of many circRNAs is largely unknown, their cell type- and tissue-specific expression has implicated crucial functions in many biological processes. Hence, quantification of circRNA expression from high-throughput RNA-seq data is becoming important. Although many model-based methods have been developed to quantify linear RNA expression from RNA-seq data, these methods are not applicable to circRNA quantification. Here, we propose a novel strategy that transforms circular transcripts to pseudo-linear transcripts and estimates the expression values of both circular and linear transcripts using an existing model-based algorithm, Sailfish. The new strategy can accurately estimate the expression of both linear and circular transcripts from RNA-seq data. Several factors, such as gene length, amount of expression and the ratio of circular to linear transcripts, had impacts on the quantification performance for circular transcripts. In comparison to count-based tools, the new computational framework had superior performance in estimating the amount of circRNA expression from both simulated and real ribosomal-RNA-depleted (rRNA-depleted) RNA-seq datasets. On the other hand, considering circular transcripts in expression quantification from rRNA-depleted RNA-seq data substantially increased the accuracy of linear transcript expression estimates. Our proposed strategy is implemented in a program named Sailfish-cir. Sailfish-cir is freely available at https://github.com/zerodel/Sailfish-cir . tongz@medicine.nevada.edu or wanjun.gu@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
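    The pseudo-linearization idea can be sketched in a few lines; the construction below (extending the circle across its back-splice junction by read_length - 1 bases) is a generic illustration and may differ in detail from what Sailfish-cir implements.

        # Represent a circular transcript as a linear reference so junction reads map contiguously.
        def to_pseudo_linear(circ_seq: str, read_length: int) -> str:
            if len(circ_seq) <= read_length:
                return circ_seq + circ_seq               # short circles: duplicate the sequence
            return circ_seq + circ_seq[: read_length - 1]

        circle = "ATGCGTACGTTAGCCGATAC"                  # toy 20-nt circular transcript
        print(to_pseudo_linear(circle, read_length=6))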

  3. Compendium of Arab exploratory wells and petroleum fields, 1985 edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-01-01

    This book provides a compilation of primary and secondary information giving well name and province, operating company, completion date, exploration technique, bottom hole formation, total depth, producing formations, lithology, geologic age, drilling results, and geologic, petrophysical, and production data. It covers all the Arab countries in a new format.

  4. Accurate frequency domain measurement of the best linear time-invariant approximation of linear time-periodic systems including the quantification of the time-periodic distortions

    NASA Astrophysics Data System (ADS)

    Louarroudi, E.; Pintelon, R.; Lataire, J.

    2014-10-01

    Time-periodic (TP) phenomena occurring, for instance, in wind turbines, helicopters, anisotropic shaft-bearing systems, and cardiovascular/respiratory systems are often not addressed when classical frequency response function (FRF) measurements are performed. As the traditional FRF concept is based on linear time-invariant (LTI) system theory, it is only approximately valid for systems with varying dynamics. Accordingly, the quantification of any deviation from this ideal LTI framework is more than welcome. The “measure of deviation” allows us to define the notion of the best LTI (BLTI) approximation, which yields the best LTI description, in the mean-square sense, of a linear time-periodic (LTP) system. By taking the TP effects into consideration, it is shown in this paper that the variability of the BLTI measurement can be reduced significantly compared with that of classical FRF estimators. From a single experiment, the proposed identification methods can handle (non-)linear time-periodic [(N)LTP] systems in open loop with a quantification of (i) the noise and/or the NL distortions, (ii) the TP distortions and (iii) the transient (leakage) errors. Besides, a geometrical interpretation of the BLTI approximation is provided, leading to a framework called vector FRF analysis. The theory presented is supported by numerical simulations as well as real measurements mimicking the well-known mechanical Mathieu oscillator.

  5. Application of ecological site information to transformative changes on Great Basin sagebrush rangelands

    USDA-ARS?s Scientific Manuscript database

    Ecological Site Description (ESD) concepts are broadly applicable and provide a necessary framework to inform and guide rangeland management decisions. In this paper, we demonstrate how understanding and quantification of key vegetation, hydrology, and soil relationships in the ESD context can info...

  6. A Graphical Aid for Introducing the Climatic Water Budget.

    ERIC Educational Resources Information Center

    Shelton, Marlyn L.

    1986-01-01

    The climatic water budget model provides an analytical framework to help geography students examine the processes shaping the environment. Examples illustrate how the model can be used in geography classes. Two flow diagrams are presented to help students master quantification of water budget variables. (RM)

  7. A Bayes network approach to uncertainty quantification in hierarchically developed computational models

    DOE PAGES

    Urbina, Angel; Mahadevan, Sankaran; Paez, Thomas L.

    2012-03-01

    Here, performance assessment of complex systems is ideally accomplished through system-level testing, but because such tests are expensive, they are seldom performed. On the other hand, for economic reasons, data from tests on individual components that are parts of complex systems are more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in the representation of actual system behavior. This paper proposes a framework for the complex but very practical problem of quantification of uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (i.e., components, subsystems, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatory uncertainty is included in the present uncertainty quantification. An example showing application of the techniques to uncertainty quantification of measures of response of a real, complex aerospace system is included.

  8. The T-TEL Method for Assessing Water, Sediment, and Chemical Connectivity

    NASA Astrophysics Data System (ADS)

    Ali, Genevieve; Oswald, Claire; Spence, Christopher; Wellen, Christopher

    2018-02-01

    The concept of connectivity has been the subject of a great deal of recent research and provided new insights and breakthroughs on runoff generation processes and watershed biogeochemistry. However, a consensus definition and cohesive mathematical framework that would permit the consistent quantification of hydrologic connectivity, the examination of the interrelationships between water and material (e.g., sediment and chemicals) connectivity, or rigorous study intercomparison, have not been presented by the water resource community. Building on previous conceptualizations and site-specific or process-specific metrics, this paper aimed to review the current state of science on hydrologic connectivity and its role in water-mediated connectivity of material such as solutes and sediment before introducing a conceptual and a mathematical connectivity assessment framework. These frameworks rely on the quantification of Time scales, Thresholds, Excesses and Losses related to water and water-mediated material transport dynamics and are referred to as the T-TEL method. Through a small case study, we show how the T-TEL method allows a wide range of properties to be quantified, namely the occurrence, frequency, duration, magnitude, and spatial extent of water and water-mediated material connectivity. We also propose a research agenda to refine the T-TEL method and ensure its usefulness for facilitating the research and management of connectivity in pristine and human-impacted landscapes.
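    A toy version of the threshold/excess bookkeeping implied by such a framework is sketched below; the metric names, units and threshold are placeholders rather than the authors' definitions.

        # Frequency, duration and excess volume of threshold exceedances in a discharge series.
        import numpy as np

        def exceedance_metrics(q, threshold, dt_hours=1.0):
            above = q > threshold
            starts = np.flatnonzero(above & ~np.r_[False, above[:-1]])     # starts of exceedance events
            duration_h = above.sum() * dt_hours
            excess_m3 = np.sum(np.clip(q - threshold, 0.0, None)) * dt_hours * 3600.0
            return starts.size, duration_h, excess_m3

        q = np.array([0.2, 0.3, 1.4, 2.0, 0.9, 0.1, 0.1, 1.1, 1.6, 0.4])   # m^3/s, hourly
        events, hours, volume = exceedance_metrics(q, threshold=1.0)
        print(f"{events} events, {hours:.0f} h above threshold, {volume:.0f} m^3 excess")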

  9. Nuclear Data Uncertainty Quantification: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  10. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies.

    PubMed

    Almalki, Manal; Gray, Kathleen; Martin-Sanchez, Fernando

    2016-05-27

    Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers' activities and their contextual components or constructs to pursue these health related goals. Establishing such a framework is important because it is the first step to operationalize health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers' health systematically. The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and constructs of health SQ activity. A Health Self-Quantification Activity Framework is presented, which shows SQ tool use in context, in relation to the goals, plans, and competence of the user. This makes it easier to analyze issues affecting SQ activity, and thereby makes it more feasible to address them. This review makes two significant contributions to research in this field: it explores health SQ work and its constructs thoroughly and it adapts Activity Theory to describe health SQ activity systematically.

  11. A Generic Privacy Quantification Framework for Privacy-Preserving Data Publishing

    ERIC Educational Resources Information Center

    Zhu, Zutao

    2010-01-01

    In recent years, the concerns about the privacy for the electronic data collected by government agencies, organizations, and industries are increasing. They include individual privacy and knowledge privacy. Privacy-preserving data publishing is a research branch that preserves the privacy while, at the same time, withholding useful information in…

  12. Identification of lithology in Gulf of Mexico Miocene rocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hilterman, F.J.; Sherwood, J.W.C.; Schellhorn, R.

    1996-12-31

    In the Gulf of Mexico, many gas-saturated sands are not Bright Spots and thus are difficult to detect on conventional 3D seismic data. These small amplitude reflections occur frequently in Pliocene-Miocene exploration plays when the acoustic impedances of the gas-saturated sands and shales are approximately the same. In these areas, geophysicists have had limited success using AVO to reduce the exploration risk. The interpretation of the conventional AVO attributes is often difficult and contains questionable relationships to the physical properties of the media. A 3D AVO study was conducted utilizing numerous well-log suites, core analyses, and production histories to help calibrate the seismic response to the petrophysical properties. This study resulted in an extension of the AVO method to a technique that now displays Bright Spots when very clean sands and gas-saturated sands occur. These litho-stratigraphic reflections on the new AVO technique are related to Poisson's ratio, a petrophysical property that is normally mixed with the acoustic impedance on conventional 3D migrated data.
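    The sensitivity of reflection amplitude to Poisson's ratio can be seen from the standard Shuey-type two-term AVO parameterization (quoted for context; the proprietary attribute described above is not reproduced here):

        R(\theta) \;\approx\; A + B\,\sin^{2}\theta,
        \qquad
        B \;=\; A\,A_{0} + \frac{\Delta\sigma}{(1 - \bar{\sigma})^{2}},

    where A is the zero-offset reflectivity (intercept), B the AVO gradient, \bar{\sigma} and \Delta\sigma the average and contrast of Poisson's ratio across the interface, and A_{0} a term collecting the velocity and density contrasts.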

  13. A new interpretation of seismic tomography in the southern Dead Sea basin using neural network clustering techniques

    NASA Astrophysics Data System (ADS)

    Braeuer, Benjamin; Bauer, Klaus

    2015-11-01

    The Dead Sea is a prime location to study the structure and development of pull-apart basins. We analyzed tomographic models of Vp, Vs, and Vp/Vs using self-organizing map clustering techniques. The method allows us to identify major lithologies by their petrophysical signatures. Remapping the clusters into the subsurface reveals the distribution of basin sediments, prebasin sedimentary rocks, and crystalline basement. The Dead Sea basin shows an asymmetric structure with thickness variation from 5 km in the west to 13 km in the east. Most importantly, we identified a distinct, well-defined body under the eastern part of the basin down to 18 km depth. Considering its geometry and petrophysical signature, this unit is interpreted as a buried counterpart of the shallow prebasin sediments encountered outside of the basin and not as crystalline basement. The seismicity distribution supports our results, where events are concentrated along boundaries of the basin and the deep prebasin sedimentary body. Our results suggest that the Dead Sea basin is about 4 km deeper than assumed from previous studies.
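    A minimal self-organizing map in plain NumPy conveys the clustering step; the synthetic attribute vectors, grid size and training schedule below are arbitrary choices for illustration, not the authors' settings.

        # Tiny SOM: cluster (Vp, Vs, Vp/Vs) attribute vectors onto a 6x6 node grid.
        import numpy as np

        rng = np.random.default_rng(0)

        def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=2.0):
            gx, gy = grid
            coords = np.array([(i, j) for i in range(gx) for j in range(gy)], float)
            weights = rng.uniform(data.min(0), data.max(0), size=(gx * gy, data.shape[1]))
            for t in range(iters):
                x = data[rng.integers(len(data))]
                bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))     # best-matching unit
                lr = lr0 * np.exp(-t / iters)
                sigma = sigma0 * np.exp(-t / iters)
                h = np.exp(-np.sum((coords - coords[bmu]) ** 2, axis=1) / (2 * sigma ** 2))
                weights += lr * h[:, None] * (x - weights)              # neighbourhood update
            return weights

        # synthetic "sediment" and "basement" populations in (Vp, Vs, Vp/Vs) space
        sediments = rng.normal([3.5, 1.8, 1.94], 0.15, size=(200, 3))
        basement = rng.normal([6.0, 3.5, 1.71], 0.15, size=(200, 3))
        data = np.vstack([sediments, basement])
        prototypes = train_som(data)
        labels = np.argmin(((data[:, None, :] - prototypes[None]) ** 2).sum(-1), axis=1)
        print("occupied SOM nodes:", np.unique(labels).size)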

  14. Geologic Assessment of Undiscovered Gas Resources of the Eastern Oregon and Washington Province

    USGS Publications Warehouse

    U.S. Geological Survey Eastern Oregon and Washington Province Assessment Team, (compiler)

    2008-01-01

    The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geology-based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States, focusing on the distribution, quantity, and availability of oil and natural gas resources. The USGS has completed an assessment of the undiscovered oil and gas potential of the Eastern Oregon and Washington Province of Oregon and Washington (USGS Province 5005). The province is a priority Energy Policy and Conservation Act (EPCA) province for the National Assessment because of its potential for oil and gas resources. The assessment of this province is based on geologic principles and uses the total petroleum system concept. The geologic elements of a total petroleum system include hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (stratigraphy, sedimentology, petrophysical properties), and hydrocarbon traps (trap formation and timing). In the Eastern Oregon and Washington Province, the USGS used this geologic framework to define one total petroleum system and two assessment units within the total petroleum system, and quantitatively estimated the undiscovered gas resources within each assessment unit.

  15. Petroleum systems and geologic assessment of undiscovered oil and gas, Cotton Valley group and Travis Peak-Hosston formations, East Texas basin and Louisiana-Mississippi salt basins provinces of the northern Gulf Coast region. Chapters 1-7.

    USGS Publications Warehouse

    ,

    2006-01-01

    The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geologically based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States. The USGS recently completed an assessment of undiscovered oil and gas potential of the Cotton Valley Group and Travis Peak and Hosston Formations in the East Texas Basin and Louisiana-Mississippi Salt Basins Provinces in the Gulf Coast Region (USGS Provinces 5048 and 5049). The Cotton Valley Group and Travis Peak and Hosston Formations are important because of their potential for natural gas resources. This assessment is based on geologic principles and uses the total petroleum system concept. The geologic elements of a total petroleum system include hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). The USGS used this geologic framework to define one total petroleum system and eight assessment units. Seven assessment units were quantitatively assessed for undiscovered oil and gas resources.

  16. Pitfalls and Limitations in the Interpretation of Geophysical Images for Hydrologic Properties and Processes

    NASA Astrophysics Data System (ADS)

    Day-Lewis, F. D.

    2014-12-01

    Geophysical imaging (e.g., electrical, radar, seismic) can provide valuable information for the characterization of hydrologic properties and monitoring of hydrologic processes, as evidenced in the rapid growth of literature on the subject. Geophysical imaging has been used for monitoring tracer migration and infiltration, mapping zones of focused groundwater/surface-water exchange, and verifying emplacement of amendments for bioremediation. Despite the enormous potential for extraction of hydrologic information from geophysical images, there also is potential for misinterpretation and over-interpretation. These concerns are particularly relevant when geophysical results are used within quantitative frameworks, e.g., conversion to hydrologic properties through petrophysical relations, geostatistical estimation and simulation conditioned to geophysical inversions, and joint inversion. We review pitfalls to interpretation associated with limited image resolution, spatially variable image resolution, incorrect data weighting, errors in the timing of measurements, temporal smearing resulting from changes during data acquisition, support-volume/scale effects, and incorrect assumptions or approximations involved in modeling geophysical or other jointly inverted data. A series of numerical and field-based examples illustrate these potential problems. Our goal in this talk is to raise awareness of common pitfalls and present strategies for recognizing and avoiding them.

  17. Modeling Stokes flow in real pore geometries derived by high resolution micro CT imaging

    NASA Astrophysics Data System (ADS)

    Halisch, M.; Müller, C.

    2012-04-01

    Numerical modeling of rock properties now forms an important part of modern petrophysics. Equivalent rock models are used to describe and assess specific properties and phenomena, such as fluid transport or complex electrical properties. In recent years, non-destructive X-ray computed tomography has become increasingly important, not only for obtaining a quick three-dimensional view into rock samples but also for accessing in-situ sample information for highly accurate modeling. Because modern 3D CT data sets reach micron- to submicron resolution, even very small structures and sample features, e.g. microporosity, can be visualized and used to build highly accurate numerical models. Particular care is needed, however, before numerical modeling can take place. Inappropriate filtering (e.g. an improper filter type or wrong kernel size) may significantly corrupt the spatial sample structure and even the sample or void-space volume. Because of these difficulties, small-scale mineral and pore-space textures are often lost and valuable in-situ information is erased. Segmentation of the key sample features, porosity as well as rock matrix, based on grayscale values depends strongly on scan quality and on the experience of the application engineer. If the threshold for matrix-porosity separation is set too low, porosity is quickly underestimated (all the more so given the limits of scanning resolution). Conversely, a threshold set too high overestimates porosity, and small void-space features as well as interfaces are altered and falsified. Image-based phase separation combined closely with conventional analytics, such as scanning electron microscopy or thin sectioning, greatly increases the reliability of this preliminary work. For segmentation and quantification, a dedicated CT imaging and processing software package (Avizo Fire) has been used. With this tool, 3D rock data can be assessed and interpreted in petrophysical terms, and pore structures can be segmented directly and used for an image-based modeling approach. The XLabHydro module provides a finite-volume solver for the direct assessment of Stokes flow (incompressible fluid, constant dynamic viscosity, stationary conditions, and laminar flow) in real pore geometries. Pore-network extraction and numerical modeling with standard finite-element or lattice Boltzmann solvers are also possible. By using the achieved voxel resolution as the smallest node distance, fluid-flow properties can be analyzed even in very small sample structures with very high accuracy, especially in their interaction with larger parts of the pore network. These results, combined with direct 3D visualization within the structures, offer new insights into meso- and microscopic pore-space phenomena.
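    The threshold sensitivity discussed above can be illustrated with a toy grayscale volume; the sketch below is a simplified stand-in for the Avizo Fire workflow, with entirely synthetic data:

```python
# Toy illustration of how the matrix-porosity threshold biases segmented porosity.
import numpy as np

rng = np.random.default_rng(1)
volume = rng.normal(loc=150.0, scale=20.0, size=(100, 100, 100))   # "matrix" grayscale
pore_mask = rng.random(volume.shape) < 0.15                        # ~15% true pore voxels
volume[pore_mask] = rng.normal(loc=60.0, scale=10.0, size=pore_mask.sum())

for threshold in (80, 100, 120):
    porosity = float(np.mean(volume < threshold))   # fraction of voxels classified as pore
    print(f"threshold={threshold}: segmented porosity={porosity:.3f}")
```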

  18. The discovery and geophysical response of the Atlántida Cu-Au porphyry deposit, Chile

    NASA Astrophysics Data System (ADS)

    Hope, Matthew; Andersson, Steve

    2016-03-01

    The discovery of the Atlántida Cu-Au-Mo porphyry deposit, which is unconformably overlain by 25-80 m of gravels, is a recent example of exploration success under cover in a traditional mining jurisdiction. Early acquisition of geophysics was a key tool in the discovery, and later in guiding further exploration drilling throughout the life of the project. A detailed review of the geophysical response of the deposit, with respect to the distribution of lithologies and alteration, coupled with their petrophysical properties, has allowed full characterisation, despite there being no surface exposure of host rock or porphyry-style mineralisation. Data acquired over the project include induced polarisation, magnetotellurics, ground and airborne magnetics, ground-based gravimetry, and petrophysical sampling. The distribution of the key geological features of the deposit has been inferred via acquisition of petrophysical properties and interpretation of surface geophysical datasets. Magnetic susceptibility is influenced strongly by both alteration and primary lithology, whilst density variations are dominated by primary lithological control. Several studies have shown that electrical properties may map the footprint of the hydrothermal system and associated mineralisation, via a combination of chargeability and resistivity. These properties are observed in geophysical datasets acquired at surface and allow further targeting and sterilisation at the deposit and project scale. By understanding these geophysical characteristics in a geological context, these data can be used to infer the distribution of lithological units, depth to exploration targets, and the potential for high grade mineralisation. Future exploration will likely be increasingly reliant on the understanding of the surface manifestations of buried deposits in remotely acquired data. This review summarises the application and results of these principles at the Atlántida project of northern Chile. Geophysical data can be used to improve the chances of discovery beneath post-mineral cover, and also to improve drilling results throughout the advanced exploration program. The process of reviewing data against geological control information is essential.

  19. Petrophysical Effects during karstification

    NASA Astrophysics Data System (ADS)

    Mai, Franziska; Kirsch, Reinhard; Rücker, Carsten; Börner, Frank

    2017-04-01

    Sinkholes are depression or collapse structures caused by dissolution in the subsurface or subrosion processes and occur in a vast variety of geological settings. They pose a considerable threat to people's safety and can cause severe economic loss, especially in highly populated areas. Commonly, sinkholes are linked to anomalies in groundwater flow and to heterogeneities in the soluble sediment. To develop an early recognition system for sinkhole instability, unrest, and collapse, it is necessary to obtain a better understanding of sinkhole generation. With this intent, the joint project "SIMULTAN" studies sinkholes by applying a combination of structural, geophysical, petrophysical, and hydrological mapping methods, accompanied by sensor development and multi-scale monitoring. Studying the solution process of gypsum and limestone, as well as the accompanying processes and their relation to hydrologic mechanisms, from a petrophysical point of view is essential to understand geophysically detected anomalies related to sinkholes. The focus lies on measurements of the complex, frequency-dependent electrical conductivity, the self potential, and the travel time of elastic waves. First, systematic laboratory measurements of the complex electrical conductivity were conducted on samples consisting of unconsolidated sand. The fully saturated samples differed in the ionic composition of their pore water (e.g. calcium sulfate and/or sodium chloride). The results indicate that it is possible to detect the effects of higher gypsum concentration in the ground- or pore-water using electrical conductivity. This applies both to the karstifiable sediments and to the adjacent, non-soluble sediments such as clean sand or shaly sand. To monitor karstification and subrosion processes on a field scale, a stationary measuring system was installed in Münsterdorf, Schleswig-Holstein, in northern Germany, an area highly at risk of sinkhole development. The complex electrical conductivity is measured in two boreholes, located 5 meters apart. The results of these measurements are used to investigate possible solution of the subterranean chalk.

  20. Petrofacies Analysis - A Petrophysical Tool for Geologic/Engineering Reservoir Characterization

    USGS Publications Warehouse

    Watney, W.L.; Guy, W.J.; Doveton, J.H.; Bhattacharya, S.; Gerlach, P.M.; Bohling, Geoffrey C.; Carr, T.R.

    1998-01-01

    Petrofacies analysis is defined as the characterization and classification of pore types and fluid saturations as revealed by petrophysical measurements of a reservoir. The word "petrofacies" makes an explicit link between petroleum engineers' concerns with pore characteristics as arbiters of production performance and the facies paradigm of geologists as a methodology for genetic understanding and prediction. In petrofacies analysis, the porosity and resistivity axes of the classical Pickett plot are used to map water saturation, bulk volume water, and estimated permeability, as well as capillary pressure information where it is available. When data points are connected in order of depth within a reservoir, the characteristic patterns reflect reservoir rock character and its interplay with the hydrocarbon column. A third variable can be presented at each point on the crossplot by assigning a color scale that is based on other well logs, often gamma ray or photoelectric effect, or other derived variables. Contrasts between reservoir pore types and fluid saturations are reflected in changing patterns on the crossplot and can help discriminate and characterize reservoir heterogeneity. Many hundreds of analyses of well logs facilitated by spreadsheet and object-oriented programming have provided the means to distinguish patterns typical of certain complex pore types (size and connectedness) for sandstones and carbonate reservoirs, occurrences of irreducible water saturation, and presence of transition zones. The result has been an improved means to evaluate potential production, such as bypassed pay behind pipe and in old exploration wells, or to assess zonation and continuity of the reservoir. Petrofacies analysis in this study was applied to distinguishing flow units, including discriminating pore type as an assessment of reservoir conformance and continuity. The analysis is facilitated through the use of color-image cross sections depicting depositional sequences, natural gamma ray, porosity, and permeability. Also, cluster analysis was applied to discriminate petrophysically similar reservoir rock.
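    The quantities mapped on a Pickett plot follow from Archie's equation; the sketch below is a generic illustration with assumed Archie parameters (a, m, n) and formation-water resistivity, not values from the study:

```python
# Sketch of Pickett-plot quantities via Archie's equation (parameters are illustrative).
import numpy as np

def archie_sw(phi, rt, rw=0.05, a=1.0, m=2.0, n=2.0):
    """Water saturation from porosity (v/v) and true resistivity (ohm-m)."""
    return ((a * rw) / (phi**m * rt)) ** (1.0 / n)

phi = np.array([0.08, 0.15, 0.22])   # porosity from logs
rt = np.array([40.0, 12.0, 3.0])     # deep resistivity, ohm-m
sw = archie_sw(phi, rt)
bvw = phi * sw                       # bulk volume water
for p, r, s, b in zip(phi, rt, sw, bvw):
    print(f"phi={p:.2f}  Rt={r:5.1f}  Sw={s:.2f}  BVW={b:.3f}")
```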

  1. Hydrogeological characterisation of a glacially affected barrier island - the North Frisian Island of Föhr

    NASA Astrophysics Data System (ADS)

    Burschil, T.; Scheer, W.; Kirsch, R.; Wiederhold, H.

    2012-04-01

    We present the application of geophysical investigations to characterise and improve the geological/hydrogeological model through the estimation of petrophysical parameters for groundwater modelling. Seismic reflection and airborne electromagnetic surveys in combination with borehole information enhance the 3-D geological model and allow a petrophysical interpretation of the subsurface. The North Sea island of Föhr has a very complex underground structure, as was already known from boreholes. The local waterworks use a freshwater body embedded in saline groundwater. Several glaciations disordered the youngest Tertiary and Quaternary sediments by glaciotectonic thrust-faulting as well as incision and refill of glacial valleys. Both underground structures have a strong impact on the distribution of freshwater-bearing aquifers. An initial hydrogeological model of Föhr was built from borehole data alone and was restricted to the southern part of the island, where a large freshwater body formed in the sandy areas of the Geest. We improved the geological/hydrogeological model by adding data from different geophysical methods, e.g. airborne electromagnetics (EM) for mapping the resistivity of the entire island, seismic reflection for detailed cross sections in the groundwater catchment area, and geophysical borehole logging for calibration of these measurements. An integrated evaluation of the results from the different geophysical methods yields reliable data. To determine petrophysical parameters, about 18 borehole logs, each more than 75 m deep, and nearby airborne EM inversion models were analyzed with respect to resistivity. We established an empirical relation between measured resistivity and hydraulic conductivity for the specific area, the North Sea island of Föhr. Seismic interval velocities from five boreholes discriminate sand and till. The interpretation of these data was the basis for building the geological/hydrogeological 3-D model. We fitted the relevant model layers to all geophysical and geological data and created a consistent 3-D model. This model is the foundation for groundwater simulations considering forecasted changes in precipitation and sea level rise due to climate change.
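    A minimal sketch of how such a site-specific resistivity-hydraulic conductivity relation can be fitted; the power-law form and all numbers below are illustrative assumptions, not the relation derived for Föhr:

```python
# Sketch: fitting an empirical power-law K = a * rho^b from synthetic data.
import numpy as np

rng = np.random.default_rng(2)
rho = rng.uniform(20, 200, 18)                      # resistivity from logs/AEM, ohm-m
k_true = 1e-6 * rho**1.5                            # hypothetical "true" relation, m/s
k_obs = k_true * rng.lognormal(0.0, 0.3, rho.size)  # scatter from heterogeneity

# Linear regression in log-log space: log10(K) = log10(a) + b * log10(rho)
b, log_a = np.polyfit(np.log10(rho), np.log10(k_obs), 1)
print(f"K ~ {10**log_a:.2e} * rho^{b:.2f}  (site-specific, not transferable)")
```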

  2. Gas storage in the Upper Devonian-Lower Mississippian Woodford Shale, Arbuckle Mountains, Oklahoma: how much of a role do the cherts play?

    USGS Publications Warehouse

    Fishman, Neil S.; Ellis, Geoffrey S.; Paxton, Stanley T.; Abbott, Marvin M.; Boehlke, Adam

    2010-01-01

    Microfractures also contribute to Woodford Shale porosity but they appear to be lithologically controlled. Fractures are relatively well-developed and are typically perpendicular to bedding in cherts, but these fractures typically end abruptly or become much more diffuse in adjacent mudstones. The brittle nature of the cherts, due to their high quartz content, is most likely the reason for their excellent fracture development, particularly relative to the mudstones, which are composed of much more ductile clay and Tasmanites constituents. Interestingly, the overlap of some petrophysical properties of cherts and mudstones (e.g., porosity, pore apertures) in the Woodford Shale for samples from the Arbuckle Mountains indicates that for shallowly-buried (i.e. minimally compacted) parts of the formation, both lithologies may have exhibited similar behavior relative to fluid movement. Where the Woodford has been more deeply buried and subjected to more intense compaction (i.e. in the Anadarko Basin), the petrophysical characteristics of cherts are likely to have changed only minimally due to their rigid fabric, whereas the petrophysical characteristics of the mudstones are likely to have changed significantly due to compaction and the resultant compression and collapse of ductile constituents such as clays and Tasmanites microfossils (those without quartz infilling). Moldic porosity, which could be expected to develop in kerogen as a consequence of maturation (Loucks and others, 2009), is more likely in the high TOC mudstones, but would also occur in Woodford cherts, which contain lower TOC contents. Owing to the potential for Woodford cherts to better retain porosity, coupled with their contained TOC, cherts may indeed provide important overlooked intervals of gas generation and overall gas storage in the formation. Thus, Woodford cherts may contribute a significant portion of the gas that is produced from the formation. As such, chert beds may play a very significant, heretofore overlooked role as source and reservoir intervals within the Woodford in the Anadarko Basin.

  3. Petrophysical Properties of the Yeso, Abo and Cisco Formations in the Permian Basin in New Mexico, U.S.A

    NASA Astrophysics Data System (ADS)

    Mann, Griffin

    The area that comprises the Northwest Shelf in Lea Co., New Mexico has been heavily drilled over the past half century, with the main targets being shallow reservoirs within the Permian section (San Andres and Grayburg Formations). With the focus shifting towards deeper horizons, there is a need for more petrophysical data pertaining to these formations, which this study addresses through a variety of techniques. This study involves the use of contact angle measurements, fluid imbibition tests, Mercury Injection Capillary Pressure (MICP) and log analysis to evaluate the nano-petrophysical properties of the Yeso, Abo and Cisco Formations within the Northwest Shelf area of southeast New Mexico. From contact angle measurements, all of the samples studied were found to be oil-wetting, as n-decane spreads onto the rock surface much more quickly than the other fluids (deionized water and API brine) tested. Imbibition tests indicated a well-connected pore network for all of the samples, with the highest imbibition slopes recorded for the Abo samples. MICP provided a variety of pore structure data, including porosity, pore-throat size distributions, permeability and tortuosity. The Abo samples showed the highest porosities, above 15%, with all the other samples ranging from 4-7%. The majority of the pore-throat sizes for most of the samples fell within the 1-10 μm range. The only exceptions were the Paddock Member within the Yeso Formation, which showed a higher percentage of larger pores (10-1000 μm), and one of the Cisco Formation samples, for which the majority of pore sizes fell in the 0.1-1 μm range. The log analysis produced calculations and curves for cross-plot porosity and water saturation that were then used to derive a value for permeability. The porosity and permeability values were comparable with those measured by MICP and with literature values.
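    MICP pore-throat sizes are conventionally derived from the Washburn equation; the sketch below uses commonly cited mercury parameters (surface tension, contact angle) as assumptions, not values reported in this study:

```python
# Sketch of the Washburn relation behind MICP pore-throat interpretation.
import math

GAMMA = 0.485                  # mercury surface tension, N/m (typical value)
THETA = math.radians(140.0)    # mercury contact angle (typical value)

def throat_radius_um(pc_mpa):
    """Pore-throat radius (micrometres) from mercury capillary pressure (MPa)."""
    pc_pa = pc_mpa * 1.0e6
    return 2.0 * GAMMA * abs(math.cos(THETA)) / pc_pa * 1.0e6

for pc in (0.07, 0.7, 7.0):    # MPa
    print(f"Pc = {pc:5.2f} MPa  ->  r = {throat_radius_um(pc):7.2f} um")
```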

  4. A framework for assessing cumulative effects in watersheds: an introduction to Canadian case studies.

    PubMed

    Dubé, Monique G; Duinker, Peter; Greig, Lorne; Carver, Martin; Servos, Mark; McMaster, Mark; Noble, Bram; Schreier, Hans; Jackson, Lee; Munkittrick, Kelly R

    2013-07-01

    From 2008 to 2013, a series of studies supported by the Canadian Water Network were conducted in Canadian watersheds in an effort to improve methods to assess cumulative effects. These studies fit under a common framework for watershed cumulative effects assessment (CEA). This article presents an introduction to the Special Series on Watershed CEA in IEAM including the framework and its impetus, a brief introduction to each of the articles in the series, challenges, and a path forward. The framework includes a regional water monitoring program that produces 3 core outputs: an accumulated state assessment, stressor-response relationships, and development of predictive cumulative effects scenario models. The framework considers core values, indicators, thresholds, and use of consistent terminology. It emphasizes that CEA requires 2 components, accumulated state quantification and predictive scenario forecasting. It recognizes both of these components must be supported by a regional, multiscale monitoring program. Copyright © 2013 SETAC.

  5. Optimizing Fracture Treatments in a Mississippian "Chat" Reservoir, South-Central Kansas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    K. David Newell; Saibal Bhattacharya; Alan Byrnes

    2005-10-01

    This project is a collaboration of Woolsey Petroleum Corporation (a small independent operator) and the Kansas Geological Survey. The project will investigate geologic and engineering factors critical for designing hydraulic fracture treatments in Mississippian "chat" reservoirs. Mississippian reservoirs, including the chat, account for 159 million m³ (1 billion barrels) of the cumulative oil produced in Kansas. Mississippian reservoirs presently represent ~40% of the state's 5.6×10⁶ m³ (35 million barrels) annual production. Although geographically widespread, the "chat" is a heterogeneous reservoir composed of chert, cherty dolomite, and argillaceous limestone. Fractured chert with micro-moldic porosity is the best reservoir in this 18- to 30-m-thick (60- to 100-ft) unit. The chat will be cored in an infill well in the Medicine Lodge North field (417,638 m³ [2,626,858 bbls] oil; 217,811,000 m³ [7,692,010 mcf] gas cumulative production; discovered 1954). The core and modern wireline logs will provide geological and petrophysical data for designing a fracture treatment. Optimum hydraulic fracturing design is poorly defined in the chat, with poor correlation of treatment size to production increase. To establish new geologic and petrophysical guidelines for these treatments, data from core petrophysics, wireline logs, and oil-field maps will be input to a fracture-treatment simulation program. Parameters will be established for the optimal size of the treatment and the geologic characteristics of the predicted fracturing. The fracturing will be performed and subsequent wellsite tests will ascertain the results for comparison to predictions. A reservoir simulation program will then predict the rate and volumetric increase in production. Comparison of the predicted increase in production with the actual increase, and of the hypothetical fracturing behavior of the reservoir with its actual behavior, will serve as tests of the geologic and petrophysical characterization of the oil field. After this feedback, a second well will be cored and logged, and the procedure will be repeated to test characteristics determined to be critical for designing cost-effective fracture treatments. Most oil and gas production in Kansas, and that of the Midcontinent oil industry, is dominated by small companies. The overwhelming majority of these independent operators employ fewer than 20 people. These companies have limited scientific and engineering expertise and increasingly need guidelines and technical examples that will help them avoid wasting their limited financial resources and petroleum reserves. To aid these operators, the technology transfer capabilities of the Kansas Geological Survey will disseminate the results of this study to the local, regional, and national oil industry. Internet access, seminars, presentations, and publications by Woolsey Petroleum Company and Kansas Geological Survey geologists and engineers are anticipated.

  6. MoisturEC: an R application for geostatistical estimation of moisture content from electrical conductivity data

    NASA Astrophysics Data System (ADS)

    Terry, N.; Day-Lewis, F. D.; Werkema, D. D.; Lane, J. W., Jr.

    2017-12-01

    Soil moisture is a critical parameter for agriculture, water supply, and management of landfills. Whereas direct data (as from TDR or soil moisture probes) provide localized point scale information, it is often more desirable to produce 2D and/or 3D estimates of soil moisture from noninvasive measurements. To this end, geophysical methods for indirectly assessing soil moisture have great potential, yet are limited in terms of quantitative interpretation due to uncertainty in petrophysical transformations and inherent limitations in resolution. Simple tools to produce soil moisture estimates from geophysical data are lacking. We present a new standalone program, MoisturEC, for estimating moisture content distributions from electrical conductivity data. The program uses an indicator kriging method within a geostatistical framework to incorporate hard data (as from moisture probes) and soft data (as from electrical resistivity imaging or electromagnetic induction) to produce estimates of moisture content and uncertainty. The program features data visualization and output options as well as a module for calibrating electrical conductivity with moisture content to improve estimates. The user-friendly program is written in R - a widely used, cross-platform, open source programming language that lends itself to further development and customization. We demonstrate use of the program with a numerical experiment as well as a controlled field irrigation experiment. Results produced from the combined geostatistical framework of MoisturEC show improved estimates of moisture content compared to those generated from individual datasets. This application provides a convenient and efficient means for integrating various data types and has broad utility to soil moisture monitoring in landfills, agriculture, and other problems.
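    As an illustration of the kind of petrophysical transform MoisturEC calibrates, the sketch below converts bulk electrical conductivity to moisture content with an Archie-type model; the parameter values are assumptions, and the code is not part of the MoisturEC application:

```python
# Sketch: Archie-type conversion of bulk EC to volumetric moisture content.
import numpy as np

def moisture_from_ec(sigma_b, sigma_w=0.05, phi=0.35, m=1.5, n=2.0):
    """Volumetric moisture content (m3/m3) from bulk electrical conductivity (S/m)."""
    sw = (np.asarray(sigma_b, float) / (sigma_w * phi**m)) ** (1.0 / n)
    return phi * np.clip(sw, 0.0, 1.0)

sigma_bulk = np.array([0.002, 0.005, 0.010])   # S/m, e.g. from ERT or EMI
print(moisture_from_ec(sigma_bulk))            # moisture content estimates
```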

  7. A new framework for UAV-based remote sensing data processing and its application in almond water stress quantification

    USDA-ARS?s Scientific Manuscript database

    With the rapid development of small imaging sensors and unmanned aerial vehicles (UAVs), remote sensing is undergoing a revolution with greatly increased spatial and temporal resolutions. While more relevant detail becomes available, it is a challenge to analyze the large number of images to extract...

  8. The Politics of Language. Lektos: Interdisciplinary Working Papers in Language Sciences, Vol. 3, No. 2.

    ERIC Educational Resources Information Center

    St. Clair, Robert N.

    The areas of language planning and the language of oppression are discussed within the theoretical framework of existential sociolinguistics. This tradition is contrasted with the contemporary models of positivism with its assumptions about constancy and quantification. The proposed model brings in social history, intent, consciousness, and other…

  9. Indigenous Wellbeing Frameworks in Australia and the Quest for Quantification

    ERIC Educational Resources Information Center

    Prout, Sarah

    2012-01-01

    There is an emerging global recognition of the inadequacies of conventional socio-economic and demographic data in being able to reflect the relative wellbeing of Indigenous peoples. This paper emerges out of a recent desktop study commissioned by an Australian Indigenous organization who identified a need to enhance local literacies in data…

  10. Nuclear Data Uncertainty Quantification: Past, Present and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  11. Nuclear Data Uncertainty Quantification: Past, Present and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D.L., E-mail: Donald.L.Smith@anl.gov

    2015-01-15

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  12. Quantitative diagnosis and prognosis framework for concrete degradation due to alkali-silica reaction

    NASA Astrophysics Data System (ADS)

    Mahadevan, Sankaran; Neal, Kyle; Nath, Paromita; Bao, Yanqing; Cai, Guowei; Orme, Peter; Adams, Douglas; Agarwal, Vivek

    2017-02-01

    This research is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in nuclear power plants that are subjected to physical, chemical, environment, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification, and prognosis. The current work focuses on degradation caused by ASR (alkali-silica reaction). Controlled concrete specimens with reactive aggregate are prepared to develop accelerated ASR degradation. Different monitoring techniques — infrared thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM) — are studied for ASR diagnosis of the specimens. Both DIC and mechanical measurements record the specimen deformation caused by ASR gel expansion. Thermography is used to compare the thermal response of pristine and damaged concrete specimens and generate a 2-D map of the damage (i.e., ASR gel and cracked area), thus facilitating localization and quantification of damage. NIRAS and VAM are two separate vibration-based techniques that detect nonlinear changes in dynamic properties caused by the damage. The diagnosis results from multiple techniques are then fused using a Bayesian network, which also helps to quantify the uncertainty in the diagnosis. Prognosis of ASR degradation is then performed based on the current state of degradation obtained from diagnosis, by using a coupled thermo-hydro-mechanical-chemical (THMC) model for ASR degradation. This comprehensive approach of monitoring, data analytics, and uncertainty-quantified diagnosis and prognosis will facilitate the development of a quantitative, risk informed framework that will support continuous assessment and risk management of structural health and performance.
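    The Bayesian fusion step can be illustrated with a toy two-technique example; the prior and likelihood values below are hypothetical, and the discrete two-state model is a simplification of the Bayesian network described above:

```python
# Toy Bayesian fusion of two independent diagnostic indications (values illustrative).
prior = {"damaged": 0.3, "pristine": 0.7}

# P(positive indication | state) for two techniques, e.g. DIC and NIRAS
likelihood_dic   = {"damaged": 0.85, "pristine": 0.20}
likelihood_niras = {"damaged": 0.75, "pristine": 0.30}

unnorm = {s: prior[s] * likelihood_dic[s] * likelihood_niras[s] for s in prior}
z = sum(unnorm.values())
posterior = {s: p / z for s, p in unnorm.items()}
print(posterior)   # P(damaged | both indications) is roughly 0.82 here
```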

  13. A Demonstration of Concrete Structural Health Monitoring Framework for Degradation due to Alkali-Silica Reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Sankaran; Agarwal, Vivek; Neal, Kyle

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant that are subjected to physical, chemical, environment, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification and prognosis. This report focuses on degradation caused by ASR (alkali-silica reaction). Controlled specimens were prepared to develop accelerated ASR degradation. Different monitoring techniques – thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM) – were used to detect the damage caused by ASR. Heterogeneous data from the multiple techniques were used for damage diagnosis and prognosis, and for quantification of the associated uncertainty using a Bayesian network approach. Additionally, the MapReduce technique has been demonstrated with synthetic data. This technique can be used in the future to handle large amounts of observation data obtained from the online monitoring of realistic structures.

  14. Towards tributyltin quantification in natural water at the Environmental Quality Standard level required by the Water Framework Directive.

    PubMed

    Alasonati, Enrica; Fettig, Ina; Richter, Janine; Philipp, Rosemarie; Milačič, Radmila; Sčančar, Janez; Zuliani, Tea; Tunç, Murat; Bilsel, Mine; Gören, Ahmet Ceyhan; Fisicaro, Paola

    2016-11-01

    The European Union (EU) has included tributyltin (TBT) and its compounds in the list of priority water pollutants. Quality standards demanded by the EU Water Framework Directive (WFD) require determination of TBT at such a low concentration level that chemical analysis is still difficult, and further research is needed to improve the sensitivity, the accuracy and the precision of existing methodologies. Within the frame of the joint research project "Traceable measurements for monitoring critical pollutants under the European Water Framework Directive" in the European Metrology Research Programme (EMRP), four metrological and designated institutes have developed a primary method to quantify TBT in natural water using liquid-liquid extraction (LLE) and species-specific isotope dilution mass spectrometry (SSIDMS). The procedure has been validated at the Environmental Quality Standard (EQS) level (0.2 ng L⁻¹ as cation) and at the WFD-required limit of quantification (LOQ) (0.06 ng L⁻¹ as cation). The LOQ of the methodology was 0.06 ng L⁻¹ and the average measurement uncertainty at the LOQ was 36%, which agreed with WFD requirements. The analytical difficulties of the method, namely the presence of TBT in blanks and the sources of measurement uncertainties, as well as the interlaboratory comparison results, are discussed in detail. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. X-ray fluorescence at nanoscale resolution for multicomponent layered structures: A solar cell case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, Bradley M.; Stuckelberger, Michael; Jeffries, April

    The study of a multilayered and multicomponent system by spatially resolved X-ray fluorescence microscopy poses unique challenges in achieving accurate quantification of elemental distributions. This is particularly true for the quantification of materials with high X-ray attenuation coefficients, depth-dependent composition variations and thickness variations. A widely applicable procedure for use after spectrum fitting and quantification is described. This procedure corrects the elemental distribution from the measured fluorescence signal, taking into account attenuation of the incident beam and generated fluorescence from multiple layers, and accounts for sample thickness variations. Deriving from Beer–Lambert's law, formulae are presented in a general integral form and numerically applicable framework. Here, the procedure is applied using experimental data from a solar cell with a Cu(In,Ga)Se2 absorber layer, measured at two separate synchrotron beamlines with varied measurement geometries. This example shows the importance of these corrections in real material systems, which can change the interpretation of the measured distributions dramatically.

  16. X-ray fluorescence at nanoscale resolution for multicomponent layered structures: A solar cell case study

    DOE PAGES

    West, Bradley M.; Stuckelberger, Michael; Jeffries, April; ...

    2017-01-01

    The study of a multilayered and multicomponent system by spatially resolved X-ray fluorescence microscopy poses unique challenges in achieving accurate quantification of elemental distributions. This is particularly true for the quantification of materials with high X-ray attenuation coefficients, depth-dependent composition variations and thickness variations. A widely applicable procedure for use after spectrum fitting and quantification is described. This procedure corrects the elemental distribution from the measured fluorescence signal, taking into account attenuation of the incident beam and generated fluorescence from multiple layers, and accounts for sample thickness variations. Deriving from Beer–Lambert's law, formulae are presented in a general integral form and numerically applicable framework. Here, the procedure is applied using experimental data from a solar cell with a Cu(In,Ga)Se2 absorber layer, measured at two separate synchrotron beamlines with varied measurement geometries. This example shows the importance of these corrections in real material systems, which can change the interpretation of the measured distributions dramatically.
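    The correction rests on Beer-Lambert attenuation of the incident and fluorescent beams through the layer stack; the sketch below shows the standard depth-averaged self-absorption factor for a single layer, with placeholder attenuation coefficients and geometry rather than the beamline parameters of the study:

```python
# Sketch: depth-averaged Beer-Lambert self-absorption factor for one layer.
import numpy as np

def self_absorption_factor(mu_in, mu_out, thickness_cm,
                           theta_in_deg=90.0, theta_out_deg=75.0):
    """Average attenuation factor for fluorescence generated through a layer.

    mu_in / mu_out: linear attenuation coefficients (1/cm) at the incident
    and fluorescence energies; thickness in cm; angles to the sample surface.
    """
    chi = (mu_in / np.sin(np.radians(theta_in_deg))
           + mu_out / np.sin(np.radians(theta_out_deg)))
    x = chi * thickness_cm
    return (1.0 - np.exp(-x)) / x      # -> 1 for a thin layer, -> 0 when opaque

# Example: a 2 um absorber layer with strong attenuation of the emerging line
print(self_absorption_factor(mu_in=500.0, mu_out=900.0, thickness_cm=2e-4))
```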

  17. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological and remote sensing datasets are assimilated, processed and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), visualization (ViSiT, Paraview, D3, QGIS) as well as numerous tools written in python and R for petrophysical mapping, stochastic modeling, data analysis and so on. These capabilities use data collected using sensors and analytical tools developed by multiple manufacturers which produce many different measurements. While scientists obviously leverage tools, capabilities and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands) of sites. This requires that monitoring can be done in a near-real-time, affordable, auditable and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud based software framework which was designed from the ground up for effective site and process monitoring. This software framework, PAF (Predictive Assimilation Framework), is multitenant software and provides automation of data ingestion, processing and visualization of hydrological, geochemical and geophysical (ERT/DTS) data. The core organizational element of PAF is the project/user pair, in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows automated ingestion and processing of electrical geophysical data, along with co-analysis and visualization of the raw and processed data with other data of interest (e.g. soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  18. Activity Theory as a Theoretical Framework for Health Self-Quantification: A Systematic Review of Empirical Studies

    PubMed Central

    2016-01-01

    Background Self-quantification (SQ) is a way of working in which, by using tracking tools, people aim to collect, manage, and reflect on personal health data to gain a better understanding of their own body, health behavior, and interaction with the world around them. However, health SQ lacks a formal framework for describing the self-quantifiers’ activities and their contextual components or constructs to pursue these health-related goals. Establishing such a framework is important because it is the first step to operationalize health SQ fully. This may in turn help to achieve the aims of health professionals and researchers who seek to make or study changes in the self-quantifiers’ health systematically. Objective The aim of this study was to review studies on health SQ in order to answer the following questions: What are the general features of the work and the particular activities that self-quantifiers perform to achieve their health objectives? What constructs of health SQ have been identified in the scientific literature? How have these studies described such constructs? How would it be possible to model these constructs theoretically to characterize the work of health SQ? Methods A systematic review of peer-reviewed literature was conducted. A total of 26 empirical studies were included. The content of these studies was thematically analyzed using Activity Theory as an organizing framework. Results The literature provided varying descriptions of health SQ as data-driven and objective-oriented work mediated by SQ tools. From the literature, we identified two types of SQ work: work on data (ie, data management activities) and work with data (ie, health management activities). Using Activity Theory, these activities could be characterized into 6 constructs: users, tracking tools, health objectives, division of work, community or group setting, and SQ plan and rules. We could not find a reference to any single study that accounted for all these activities and constructs of health SQ activity. Conclusions A Health Self-Quantification Activity Framework is presented, which shows SQ tool use in context, in relation to the goals, plans, and competence of the user. This makes it easier to analyze issues affecting SQ activity, and thereby makes it more feasible to address them. This review makes two significant contributions to research in this field: it explores health SQ work and its constructs thoroughly and it adapts Activity Theory to describe health SQ activity systematically. PMID:27234343

  19. Experimental investigations of the wettability of clays and shales

    NASA Astrophysics Data System (ADS)

    Borysenko, Artem; Clennell, Ben; Sedev, Rossen; Burgar, Iko; Ralston, John; Raven, Mark; Dewhurst, David; Liu, Keyu

    2009-07-01

    Wettability in argillaceous materials is poorly understood, yet it is critical to hydrocarbon recovery in clay-rich reservoirs and capillary seal capacity in both caprocks and fault gouges. The hydrophobic or hydrophilic nature of clay-bearing soils and sediments also controls to a large degree the movement of spilled nonaqueous phase liquids in the subsurface and the options available for remediation of these pollutants. In this paper the wettability of hydrocarbons contacting shales in their natural state and the tendencies for wettability alteration were examined. Water-wet, oil-wet, and mixed-wet shales from wells in Australia were investigated and were compared with simplified model shales (single and mixed minerals) artificially treated in crude oil. The intact natural shale samples (preserved with their original water content) were characterized petrophysically by dielectric spectroscopy and nuclear magnetic resonance, plus scanning electron, optical and fluorescence microscopy. Wettability alteration was studied using spontaneous imbibition, pigment extraction, and the sessile drop method for contact angle measurement. The mineralogy and chemical compositions of the shales were determined by standard methods. By studying pure minerals and natural shales in parallel, a correlation between the petrophysical properties and wetting behavior was observed. These correlations may potentially be used to assess wettability in downhole measurements.

  20. Resolution dependence of petrophysical parameters derived from X-ray tomography of chalk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Müter, D.; Sørensen, H. O.; Jha, D.

    2014-07-28

    X-ray computed tomography data from chalk drill cuttings were taken over a series of voxel dimensions, ranging from 320 to 25 nm. From these data sets, standard petrophysical parameters (porosity, surface area, and permeability) were derived and we examined the effect of the voxel dimension (i.e., image resolution) on these properties. We found that for the larger voxel dimensions they are severely over- or underestimated, whereas for 50 and 25 nm voxel dimensions the resulting values (5%–30% porosity, 0.2–2 m²/g specific surface area, and 0.06–0.34 mD permeability) are within the expected range for this type of rock. We compared our results to macroscopic measurements and, in the case of surface area, also to measurements using the Brunauer-Emmett-Teller (BET) method, and found that, independent of the degree of compaction, the results from tomography amount to about 30% of the BET method. Finally, we concluded that at 25 nm voxel dimension the essential features of the nanoscopic pore network in chalk are captured, but better resolution is still needed to derive surface area.
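    The resolution effect can be mimicked on a synthetic pore geometry by coarsening the voxel grid; the sketch below uses a random sphere pack, not the chalk data, and a crude voxel-face surface measure:

```python
# Sketch: how coarsening the voxel grid biases porosity and a voxel-face surface count.
import numpy as np

def porosity_and_surface(binary_pore):
    """Porosity and a crude voxel-face surface estimate of a 3-D pore mask."""
    phi = binary_pore.mean()
    faces = sum(np.abs(np.diff(binary_pore.astype(int), axis=ax)).sum() for ax in range(3))
    return phi, faces

# Fine-resolution synthetic pore space: random spheres removed from a solid block
rng = np.random.default_rng(3)
n = 120
x, y, z = np.indices((n, n, n))
pore = np.zeros((n, n, n), dtype=bool)
for _ in range(60):
    cx, cy, cz = rng.integers(0, n, 3)
    r = rng.integers(4, 10)
    pore |= (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 < r ** 2

for factor in (1, 2, 4):   # coarsen by block-averaging, then re-threshold
    m = n // factor
    blocks = pore.reshape(m, factor, m, factor, m, factor)
    coarse = blocks.mean(axis=(1, 3, 5)) > 0.5
    phi, faces = porosity_and_surface(coarse)
    print(f"voxel x{factor}: porosity={phi:.3f}, face count={int(faces)}")
```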

  1. Methods and techniques for measuring gas emissions from agricultural and animal feeding operations.

    PubMed

    Hu, Enzhu; Babcock, Esther L; Bialkowski, Stephen E; Jones, Scott B; Tuller, Markus

    2014-01-01

    Emissions of gases from agricultural and animal feeding operations contribute to climate change, produce odors, degrade sensitive ecosystems, and pose a threat to public health. The complexity of processes and environmental variables affecting these emissions complicate accurate and reliable quantification of gas fluxes and production rates. Although a plethora of measurement technologies exist, each method has its limitations that exacerbate accurate quantification of gas fluxes. Despite a growing interest in gas emission measurements, only a few available technologies include real-time, continuous monitoring capabilities. Commonly applied state-of-the-art measurement frameworks and technologies were critically examined and discussed, and recommendations for future research to address real-time monitoring requirements for forthcoming regulation and management needs are provided.

  2. Characterization of structures of the Nankai Trough accretionary prism from integrated analyses of LWD log response, resistivity images and clay mineralogy of cuttings: Expedition 338 Site C0002

    NASA Astrophysics Data System (ADS)

    Jurado, Maria Jose; Schleicher, Anja

    2014-05-01

    The objective of our research is a detailed characterization of structures on the basis of LWD oriented images and logs, and clay mineralogy of cuttings from Hole C0002F of the Nankai Trough accretionary prism. Our results show an integrated interpretation of structures derived from borehole images, petrophysical characterization based on LWD logs, and cuttings mineralogy. The geometry of the structure intersected at Hole C0002F has been characterized by the interpretation of oriented borehole resistivity images acquired during IODP Expedition 338. The characterization of structural features, faults and fracture zones is based on a detailed post-cruise interpretation of bedding and fractures on borehole images and also on the analysis of Logging While Drilling (LWD) log response (gamma radioactivity, resistivity and sonic logs). The interpretation and complete characterization of structures (fractures, fracture zones, fault zones, folds) was achieved after detailed shorebased reprocessing of resistivity images, which allowed us to enhance the imaging of bedding and fractures for geometry and orientation interpretation. To characterize distinctive petrophysical properties, the LWD log response was compared with compositional changes derived from cuttings analyses. Cuttings analyses were used to calibrate and characterize log response and to verify interpretations in terms of changes in composition and texture at fractures and fault zones defined on borehole images. Cuttings were taken routinely every 5 m during Expedition 338, indicating a clay-dominated lithology of silty claystone with interbeds of weakly consolidated, fine sandstones. The main mineralogical components are clay minerals, quartz, feldspar and calcite. Selected cuttings were taken from areas of interest as defined on LWD logs and images. The clay mineralogy was investigated on the <2 micron clay-size fraction, with special focus on smectite and illite minerals. Based on X-ray diffraction analysis measured at room temperature and a relative humidity of ~30%, we compared the shape and size of illite and smectite, as well as their water content and their polytypes. The comparison of cuttings mineralogy with logging while drilling (LWD) data allowed us to characterize structural, petrophysical and mineralogical properties at fracture and fault zones. We also analyzed the relationship between deformation structures and compositional and mineralogical changes. We established a correlation between observed results on clay mineralogy and log responses in relation to the structures and trends characterized on logging data. In general, the log data provide a good correlation with the actual mineralogy and the relative abundance of clay. In particular, we analyzed trends characterized by smectite water layers as an indication of compaction. These trends were correlated with log response (sonic velocity) within Unit IV. Our results show that the integration of logging data and cuttings sample analyses is a valuable tool for characterizing petrophysical and mineralogical changes of the structures of the Nankai accretionary prism.

  3. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defence, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis for future work in this field.
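    For context, a Central-Limit-Theorem style metric of the kind referred to above amounts to rolling up independent cost-element variances into a total-estimate interval; the sketch below uses hypothetical cost elements:

```python
# Toy CLT-style roll-up of cost-element uncertainties (values are hypothetical).
import math

elements = [  # (mean cost, standard deviation) per work-breakdown element
    (120.0, 18.0),
    (75.0, 9.0),
    (240.0, 40.0),
    (60.0, 12.0),
]
mean_total = sum(m for m, _ in elements)
sd_total = math.sqrt(sum(s**2 for _, s in elements))   # assumes independent elements
print(f"total = {mean_total:.0f} +/- {1.96 * sd_total:.0f} (approx. 95% CLT interval)")
```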

  4. Experimental design for TBT quantification by isotope dilution SPE-GC-ICP-MS under the European water framework directive.

    PubMed

    Alasonati, Enrica; Fabbri, Barbara; Fettig, Ina; Yardin, Catherine; Del Castillo Busto, Maria Estela; Richter, Janine; Philipp, Rosemarie; Fisicaro, Paola

    2015-03-01

    In Europe the maximum allowable concentration for tributyltin (TBT) compounds in surface water has been regulated by the Water Framework Directive (WFD) and its daughter directive, which impose a limit of 0.2 ng L⁻¹ in whole water (as tributyltin cation). Despite the large number of different methodologies for the quantification of organotin species developed in the last two decades, standardised analytical methods at the required concentration level do not exist. TBT quantification at the picogram level requires efficient and accurate sample preparation and preconcentration, and maximum care to avoid blank contamination. To meet the WFD requirement, a method for the quantification of TBT in mineral water at the environmental quality standard (EQS) level, based on solid phase extraction (SPE), was developed and optimised. The quantification was done using species-specific isotope dilution (SSID) followed by gas chromatography (GC) coupled to inductively coupled plasma mass spectrometry (ICP-MS). The analytical process was optimised using a design of experiments (DOE) based on a fractional factorial plan. The DOE allowed the evaluation of 3 qualitative factors (type of stationary phase and eluent, phase mass and eluent volume, pH and analyte ethylation procedure), for a total of 13 levels studied, and a sample volume in the range of 250-1000 mL. Four different models fitting the results were defined and evaluated with statistical tools: one of them was selected and optimised to find the best procedural conditions. C18 was found to be the best stationary phase for the SPE experiments. The 4 solvents tested with C18, the pH and ethylation conditions, the mass of the phases, the volume of the eluents and the sample volume can all be optimal, depending on their respective combination. For that reason, the equation of the model conceived in this work is a useful decision tool for planning experiments, because it can be applied to predict the TBT mass fraction recovery for a chosen set of experimental conditions. This work shows that SPE is a convenient technique for TBT pre-concentration at pico-trace levels and a robust approach: (i) a number of different experimental conditions led to satisfactory results and (ii) the participation of two institutes in the experimental work did not impact the developed model. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Uncertainty Quantification using Exponential Epi-Splines

    DTIC Science & Technology

    2013-06-01

    Leibler divergence. The choice of κ in applications can be informed by the fact that the Kullback-Leibler divergence between two normal densities, ϕ1... of random output quantities of interest. The framework systematically incorporates hard information derived from physics-based sensors, field test ... information, and determines the 'best' estimate within that family. Bayesian estimation makes use of prior soft information
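    The excerpt above points to the Kullback-Leibler divergence between two normal densities as a guide for choosing κ. The following is a minimal sketch of that closed-form divergence only; it is independent of the report's epi-spline machinery, and the example values are illustrative.

    ```python
    import math

    def kl_normal(mu1, sigma1, mu2, sigma2):
        """Closed-form KL divergence KL( N(mu1, sigma1^2) || N(mu2, sigma2^2) )."""
        return (math.log(sigma2 / sigma1)
                + (sigma1**2 + (mu1 - mu2)**2) / (2.0 * sigma2**2)
                - 0.5)

    # Example: divergence between two candidate densities of an output quantity
    print(kl_normal(0.0, 1.0, 0.5, 1.2))
    ```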

  6. Quantifying understorey vegetation in the US Lake States: a proposed framework to inform regional forest carbon stocks

    Treesearch

    Matthew B. Russell; Anthony W. D' Amato; Bethany K. Schulz; Christopher W. Woodall; Grant M. Domke; John B. Bradford

    2014-01-01

    The contribution of understorey vegetation (UVEG) to forest ecosystem biomass and carbon (C) across diverse forest types has, to date, eluded quantification at regional and national scales. Efforts to quantify UVEG C have been limited to field-intensive studies or broad-scale modelling approaches lacking field measurements. Although large-scale inventories of UVEG C are...

  7. Challenges in leveraging existing human performance data for quantifying the IDHEAS HRA method

    DOE PAGES

    Liao, Huafei N.; Groth, Katrina; Stevens-Adams, Susan

    2015-07-29

    Our article documents an exploratory study for collecting and using human performance data to inform human error probability (HEP) estimates for a new human reliability analysis (HRA) method, the IntegrateD Human Event Analysis System (IDHEAS). The method was based on cognitive models and mechanisms underlying human behaviour and employs a framework of 14 crew failure modes (CFMs) to represent human failures typical of human performance in nuclear power plant (NPP) internal, at-power events [1]. A decision tree (DT) was constructed for each CFM to assess the probability of the CFM occurring in different contexts. Data needs for IDHEAS quantification are discussed. Then, the data collection framework and process are described, and how the collected data were used to inform HEP estimation is illustrated with two examples. Next, five major technical challenges are identified for leveraging human performance data for IDHEAS quantification. These challenges reflect the data needs specific to IDHEAS. More importantly, they also represent general issues with current human performance data and can provide insight for a path forward to support HRA data collection, use, and exchange for HRA method development, implementation, and validation.

  8. Quantification of causal couplings via dynamical effects: A unifying perspective

    NASA Astrophysics Data System (ADS)

    Smirnov, Dmitry A.

    2014-12-01

    Quantitative characterization of causal couplings from time series is crucial in studies of complex systems of different origin. Various statistical tools for that exist and new ones are still being developed with a tendency toward creating a single, universal, model-free quantifier of coupling strength. However, a clear and generally applicable way of interpreting such universal characteristics is lacking. This work suggests a general conceptual framework for causal coupling quantification, which is based on state space models and extends the concepts of virtual interventions and dynamical causal effects. Namely, two basic kinds of interventions (state space and parametric) and effects (orbital or transient and stationary or limit) are introduced, giving four families of coupling characteristics. The framework provides a unifying view of apparently different well-established measures and allows us to introduce new characteristics, always with a definite "intervention-effect" interpretation. It is shown that diverse characteristics cannot be reduced to any single coupling strength quantifier and their interpretation is inevitably model based. The proposed set of dynamical causal effect measures quantifies different aspects of "how the coupling manifests itself in the dynamics," reformulating the very question about the "causal coupling strength."

  9. Uncertainty Quantification for Polynomial Systems via Bernstein Expansions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper presents a unifying framework for uncertainty quantification of systems having polynomial response metrics that depend on both aleatory and epistemic uncertainties. The proposed approach, which is based on the Bernstein expansions of polynomials, enables bounding the range of moments and failure probabilities of response metrics as well as finding supersets of the extreme epistemic realizations where the limits of such ranges occur. These bounds and supersets, whose analytical structure renders them free of approximation error, can be made arbitrarily tight with additional computational effort. Furthermore, this framework enables determining the importance of particular uncertain parameters according to the extent to which they affect the first two moments of response metrics and failure probabilities. This analysis enables determining the parameters that should be considered uncertain as well as those that can be assumed to be constants without incurring significant error. The analytical nature of the approach eliminates the numerical error that characterizes the sampling-based techniques commonly used to propagate aleatory uncertainties, as well as the possibility of underpredicting the range of the statistic of interest that may result from searching for the best- and worst-case epistemic values via nonlinear optimization or sampling.
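    As a minimal illustration of the range-enclosure property that Bernstein expansions provide for a univariate polynomial on [0, 1], the sketch below converts monomial coefficients to Bernstein coefficients and takes their min/max as bounds. It shows only the bounding idea, not the paper's treatment of mixed aleatory/epistemic uncertainty; the example polynomial is arbitrary.

    ```python
    from math import comb

    def bernstein_range_bounds(coeffs):
        """Enclose the range of p(x) = sum_i coeffs[i] * x**i on [0, 1]
        using the min/max of its Bernstein coefficients."""
        n = len(coeffs) - 1  # polynomial degree
        b = [sum(comb(k, i) / comb(n, i) * coeffs[i] for i in range(k + 1))
             for k in range(n + 1)]
        return min(b), max(b)

    # p(x) = 1 - 3x + 2x^2 on [0, 1]: exact range is [-0.125, 1]
    lo, hi = bernstein_range_bounds([1.0, -3.0, 2.0])
    print(lo, hi)  # the bounds contain the exact range; they tighten under subdivision
    ```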

  10. Uncertainty Quantification of Medium-Term Heat Storage From Short-Term Geophysical Experiments Using Bayesian Evidential Learning

    NASA Astrophysics Data System (ADS)

    Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef

    2018-04-01

    In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer to increase the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or non-favorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat tracing experiment. BEL is based on two main stages: pre- and post-field data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data to reduce the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo simulations are used to directly assess uncertainty on key prediction variables from observations. The result is a full quantification of the posterior distribution of the prediction conditioned to observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.
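    A toy sketch of the central BEL idea described above: learn a direct statistical relation between simulated data and the prediction variable from prior Monte Carlo samples, then condition on the observed data, with no explicit model inversion. The forward models, noise levels, and the simple linear regression here are illustrative assumptions, not the authors' workflow.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # --- Prior Monte Carlo stage (hypothetical toy forward models) ---
    n = 500
    theta = rng.uniform(0.5, 2.0, n)                 # unknown parameter (e.g., heat storage proxy)
    d_sim = 3.0 * theta + rng.normal(0, 0.2, n)      # simulated short-term experiment data
    h_sim = 10.0 * theta + rng.normal(0, 0.5, n)     # simulated medium-term prediction variable

    # --- Learn a direct data -> prediction relation from the prior samples ---
    A = np.vstack([d_sim, np.ones(n)]).T
    coef, *_ = np.linalg.lstsq(A, h_sim, rcond=None)
    resid_std = np.std(h_sim - A @ coef)

    # --- Condition on the observed data ---
    d_obs = 4.2
    h_mean = coef[0] * d_obs + coef[1]
    print(f"posterior prediction ~ N({h_mean:.2f}, {resid_std:.2f}^2)")
    ```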

  11. Quantification of fibrous cap thickness in intracoronary optical coherence tomography with a contour segmentation method based on dynamic programming.

    PubMed

    Zahnd, Guillaume; Karanasos, Antonios; van Soest, Gijs; Regar, Evelyn; Niessen, Wiro; Gijsen, Frank; van Walsum, Theo

    2015-09-01

    Fibrous cap thickness is the most critical component of plaque stability. Therefore, in vivo quantification of cap thickness could yield valuable information for estimating the risk of plaque rupture. In the context of preoperative planning and perioperative decision making, intracoronary optical coherence tomography imaging can provide a very detailed characterization of the arterial wall structure. However, visual interpretation of the images is laborious, subject to variability, and therefore not always sufficiently reliable for immediate treatment decisions. A novel semiautomatic segmentation method to quantify coronary fibrous cap thickness in optical coherence tomography is introduced. To cope with the most challenging issue when estimating cap thickness (namely the diffuse appearance of the anatomical abluminal interface to be detected), the proposed method is based on a robust dynamic programming framework using a geometric prior. To determine the optimal parameter settings, a training phase was conducted on 10 patients. Validated on a dataset of 179 images from 21 patients, the present framework could successfully extract the fibrous cap contours. When assessing minimal cap thickness, segmentation results from the proposed method were in good agreement with the reference tracings performed by a medical expert (mean absolute error and standard deviation of 22 ± 18 μm) and were similar to inter-observer reproducibility (21 ± 19 μm, R = .74), while being significantly faster and fully reproducible. The proposed framework demonstrated promising performance and could potentially be used for online identification of high-risk plaques.
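    A generic sketch of the kind of dynamic-programming contour search such frameworks build on: pixel costs are accumulated column by column with a smoothness penalty, and the minimal-cost path is backtracked. The cost image, jump limit, and penalty weight are illustrative assumptions, not the authors' parameter settings.

    ```python
    import numpy as np

    def dp_contour(cost, max_jump=1, smooth=0.5):
        """Find a left-to-right contour (one row index per column) minimizing
        accumulated pixel cost plus a smoothness penalty, via dynamic programming."""
        n_rows, n_cols = cost.shape
        acc = np.full((n_rows, n_cols), np.inf)
        back = np.zeros((n_rows, n_cols), dtype=int)
        acc[:, 0] = cost[:, 0]
        for c in range(1, n_cols):
            for r in range(n_rows):
                lo, hi = max(0, r - max_jump), min(n_rows, r + max_jump + 1)
                prev = acc[lo:hi, c - 1] + smooth * np.abs(np.arange(lo, hi) - r)
                k = int(np.argmin(prev))
                acc[r, c] = cost[r, c] + prev[k]
                back[r, c] = lo + k
        # Backtrack from the cheapest endpoint in the last column
        path = [int(np.argmin(acc[:, -1]))]
        for c in range(n_cols - 1, 0, -1):
            path.append(back[path[-1], c])
        return path[::-1]

    # Toy cost image with a low-cost band along row 3
    img = np.ones((8, 10))
    img[3, :] = 0.1
    print(dp_contour(img))
    ```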

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ernest A. Mancini

    The University of Alabama, in cooperation with Texas A&M University, McGill University, Longleaf Energy Group, Strago Petroleum Corporation, and Paramount Petroleum Company, is undertaking an integrated, interdisciplinary geoscientific and engineering research project. The project is designed to characterize and model reservoir architecture, pore systems and rock-fluid interactions at the pore to field scale in Upper Jurassic Smackover reef and carbonate shoal reservoirs associated with varying degrees of relief on pre-Mesozoic basement paleohighs in the northeastern Gulf of Mexico. The project effort includes the prediction of fluid flow in carbonate reservoirs through reservoir simulation modeling, which utilizes geologic reservoir characterization and modeling, and the prediction of carbonate reservoir architecture, heterogeneity and quality through seismic imaging. The primary objective of the project is to increase the profitability, producibility and efficiency of recovery of oil from existing and undiscovered Upper Jurassic fields characterized by reef and carbonate shoals associated with pre-Mesozoic basement paleohighs. The principal research effort for Year 1 of the project has been reservoir description and characterization. This effort has included four tasks: (1) geoscientific reservoir characterization, (2) the study of rock-fluid interactions, (3) petrophysical and engineering characterization and (4) data integration. This work was scheduled for completion in Year 1. Overall, the project work is on schedule. Geoscientific reservoir characterization is essentially completed. The architecture, porosity types and heterogeneity of the reef and shoal reservoirs at Appleton and Vocation Fields have been characterized using geological and geophysical data. The study of rock-fluid interactions has been initiated. Observations regarding the diagenetic processes influencing pore system development and heterogeneity in these reef and shoal reservoirs have been made. Petrophysical and engineering property characterization is progressing. Data on reservoir production rate and pressure history at Appleton and Vocation Fields have been tabulated, and porosity data from core analysis have been correlated with porosity as observed from well log response. Data integration is on schedule, in that the geological, geophysical, petrophysical and engineering data collected to date for Appleton and Vocation Fields have been compiled into a fieldwide digital database for reservoir characterization, modeling and simulation for the reef and carbonate shoal reservoirs of each of these fields.

  13. Structural architecture and petrophysical properties of the Rocca di Neto extensional fault zone developed in the shallow marine sediments of the Crotone Basin (Southern Apennines, Italy).

    NASA Astrophysics Data System (ADS)

    Pizzati, Mattia; Balsamo, Fabrizio; Iacumin, Paola; Swennen, Rudy; Storti, Fabrizio

    2017-04-01

    In this contribution we describe the architecture and petrophysical properties of the Rocca di Neto extensional fault zone in loose and poorly lithified sediments, located in the Crotone forearc basin (southern Italy). To this end, we combined fieldwork with microstructural observations, grain size analysis, and in situ permeability measurements. The studied fault zone has an estimated maximum displacement of 80-90 m and separates early Pleistocene (Gelasian) sands in the footwall from middle Pleistocene (Calabrian) silty clay in the hangingwall. The analysed outcrop exposes an about 70 m-long section through the fault zone, mostly developed in the footwall block. The fault zone consists of four structural domains characterized by distinctive features: (1) a <1 m-thick fault core (where the majority of the displacement is accommodated) in which bedding is transposed into a foliation imparted by preferential grain orientation and black gouges decorate the main slip surfaces; (2) a zone of tectonic mixing characterized by a set of closely spaced and anastomosed deformation bands parallel to the main slip surface; (3) an about 8 m-thick footwall damage zone characterized by synthetic and antithetic sets of deformation bands; (4) a zone of background deformation with a few, widely spaced conjugate minor faults and deformation bands. The boundary between the relatively undeformed sediments and the damage zone is not sharp and is characterized by a progressive decrease in deformation intensity. The silty clay in the hangingwall damage zone is characterized by minor faults. Grain size and microstructural data indicate that particulate flow with a small amount of cataclasis is the dominant deformation mechanism in both the fault core rocks and the deformation bands. Permeability of the undeformed sediments is about 70,000 mD, whereas the permeability in deformation bands ranges from 1,000 to 18,000 mD; within the fault core rocks, permeability is reduced by up to 3-4 orders of magnitude with respect to the undeformed domains. Structural and petrophysical data suggest that the Rocca di Neto fault zone may compartmentalize the footwall block due to both the juxtaposition of clay-rich lithology in the hangingwall and the development of low-permeability fault core rocks.

  14. Application of real rock pore-throat statistics to a regular pore network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rakibul, M.; Sarker, H.; McIntyre, D.

    2011-01-01

    This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high-resolution micro-CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of the pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area or the throat radius was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail, and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, groundwater flow, etc. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug from both the pore network model and the experimental procedure. The shape and size of the relative permeability curves were compared and analyzed: a good match was observed for the wetting-phase relative permeability, but the non-wetting-phase simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the need for routine core analysis, which can be time consuming and expensive, so a numerical technique is expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to more time-consuming results. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.

  15. Magnetic resonance imaging in laboratory petrophysical core analysis

    NASA Astrophysics Data System (ADS)

    Mitchell, J.; Chandrasekera, T. C.; Holland, D. J.; Gladden, L. F.; Fordham, E. J.

    2013-05-01

    Magnetic resonance imaging (MRI) is a well-known technique in medical diagnosis and materials science. In the more specialized arena of laboratory-scale petrophysical rock core analysis, the role of MRI has undergone a substantial change in focus over the last three decades. Initially, alongside the continual drive to exploit higher magnetic field strengths in MRI applications for medicine and chemistry, the same trend was followed in core analysis. However, the spatial resolution achievable in heterogeneous porous media is inherently limited due to the magnetic susceptibility contrast between solid and fluid. As a result, imaging resolution at the length-scale of typical pore diameters is not practical and so MRI of core-plugs has often been viewed as an inappropriate use of expensive magnetic resonance facilities. Recently, there has been a paradigm shift in the use of MRI in laboratory-scale core analysis. The focus is now on acquiring data in the laboratory that are directly comparable to data obtained from magnetic resonance well-logging tools (i.e., a common physics of measurement). To maintain consistency with well-logging instrumentation, it is desirable to measure distributions of transverse (T2) relaxation time, the industry-standard metric in well-logging, at the laboratory scale. These T2 distributions can be spatially resolved over the length of a core-plug. The use of low-field magnets in the laboratory environment is optimal for core analysis not only because the magnetic field strength is closer to that of well-logging tools, but also because the magnetic susceptibility contrast is minimized, allowing the acquisition of quantitative image voxel (or pixel) intensities that are directly scalable to liquid volume. Beyond simple determination of macroscopic rock heterogeneity, it is possible to utilize the spatial resolution for monitoring forced displacement of oil by water or chemical agents, determining capillary pressure curves, and estimating wettability. The history of MRI in petrophysics is reviewed and future directions considered, including advanced data processing techniques such as compressed sensing reconstruction and Bayesian inference analysis of under-sampled data. Although this review focuses on rock core analysis, the techniques described are applicable in a wider context to porous media in general, such as cements, soils, ceramics, and catalytic materials.

  16. Application of real rock pore-throat statistics to a regular pore network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, M.R.; McIntyre, D.; Ferer, M.

    2011-01-01

    This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high-resolution micro-CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of the pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area or the throat radius was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail, and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, groundwater flow, etc. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug from both the pore network model and the experimental procedure. The shape and size of the relative permeability curves were compared and analyzed: a good match was observed for the wetting-phase relative permeability, but the non-wetting-phase simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the need for routine core analysis, which can be time consuming and expensive, so a numerical technique is expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to more time-consuming results. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.

  17. Acoustic and Petrophysical Evolution of Organic-Rich Chalk Following Maturation Induced by Unconfined Pyrolysis

    NASA Astrophysics Data System (ADS)

    Shitrit, Omri; Hatzor, Yossef H.; Feinstein, Shimon; Vinegar, Harold J.

    2017-12-01

    Thermal maturation is known to influence the rock physics of organic-rich rocks. While most studies have been performed on low-porosity organic-rich shales, here we examine the effect of thermal maturation on a high-porosity organic-rich chalk. We compare the physical properties of native-state immature rock with the properties at two pyrolysis-simulated maturity levels: early-mature and over-mature. We further evaluate the applicability of results from unconfined pyrolysis experiments to naturally matured rock properties. Special attention is dedicated to the elastic properties of the organic phase and the influence of bitumen and kerogen contents. Rock physics is studied based on confined petrophysical measurements of porosity, density and permeability, and measurements of bedding-normal acoustic velocities at estimated field stresses. Geochemical parameters such as total organic carbon (TOC), bitumen content and thermal maturation indicators are used to monitor variations in the density and volume fraction of each phase. We find that porosity increases significantly upon pyrolysis and that P wave velocity decreases accordingly. Solids density versus TOC relationships indicate that the kerogen density increases from 1.43 to 1.49 g/cc at the immature and early-mature stages to 2.98 g/cc at the over-mature stage. This density value is unusually high, although the increase in S wave velocity and backscattered SEM images of the over-mature samples verify that the over-mature kerogen is significantly denser and stiffer. Using the petrophysical and acoustic properties, the elastic moduli of the rock are estimated by two Hashin-Shtrikman (HS)-based models: "HS + BAM" and "HS kerogen." The "HS + BAM" model is calibrated to the post-pyrolysis measurements to describe the mechanical effect of the unconfined pyrolysis on the rock. The absence of compaction in the pyrolysis process causes the post-pyrolysis samples to be extremely porous. The "HS kerogen" model, which simulates a kerogen-supported matrix, depicts a compacted version of the matrix and is believed to be more representative of a naturally matured rock. Rock physics analysis using the "HS kerogen" model indicates strong mechanical dominance of porosity and organic content, and only small maturity-associated effects.

  18. Sedimentary Geology Context and Challenges for Cyberinfrastructure Data Management

    NASA Astrophysics Data System (ADS)

    Chan, M. A.; Budd, D. A.

    2014-12-01

    A cyberinfrastructure data management system for sedimentary geology is crucial to multiple facets of interdisciplinary Earth science research, as sedimentary systems form the deep-time framework for many geoscience communities. The breadth and depth of the sedimentary field spans research on the processes that form, shape and affect the Earth's sedimentary crust and distribute resources such as hydrocarbons, coal, and water. The sedimentary record is used by Earth scientists to explore questions such as the continental crust evolution, dynamics of Earth's past climates and oceans, evolution of the biosphere, and the human interface with Earth surface processes. Major challenges to a data management system for sedimentary geology are the volume and diversity of field, analytical, and experimental data, along with many types of physical objects. Objects include rock samples, biological specimens, cores, and photographs. Field data runs the gamut from discrete location and spatial orientation to vertical records of bed thickness, textures, color, sedimentary structures, and grain types. Ex situ information can include geochemistry, mineralogy, petrophysics, chronologic, and paleobiologic data. All data types cover multiple order-of-magnitude scales, often requiring correlation of the multiple scales with varying degrees of resolution. The stratigraphic framework needs dimensional context with locality, time, space, and depth relationships. A significant challenge is that physical objects represent discrete values at specific points, but measured stratigraphic sections are continuous. In many cases, field data is not easily quantified, and determining uncertainty can be difficult. Despite many possible hurdles, the sedimentary community is anxious to embrace geoinformatic resources that can provide better tools to integrate the many data types, create better search capabilities, and equip our communities to conduct high-impact science at unprecedented levels.

  19. Geologic Assessment of Undiscovered, Technically Recoverable Coalbed-Gas Resources in Cretaceous and Tertiary Rocks, North Slope and Adjacent State Waters, Alaska

    USGS Publications Warehouse

    Roberts, Stephen B.

    2008-01-01

    The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geology-based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States, focusing on the distribution, quantity, and availability of oil and natural gas resources. The USGS has completed an assessment of the undiscovered, technically recoverable coalbed-gas resources in Cretaceous and Tertiary rocks underlying the North Slope and adjacent State waters of Alaska (USGS Northern Alaska Province 5001). The province is a priority Energy Policy and Conservation Act (EPCA) province for the National Assessment because of its potential for oil and gas resources. The assessment of this province is based on geologic principles and uses the total petroleum system concept. The geologic elements of a total petroleum system include hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (stratigraphy, sedimentology, petrophysical properties), and hydrocarbon traps (trap formation and timing). In the Northern Alaska Province, the USGS used this geologic framework to define one composite coalbed gas total petroleum system and three coalbed gas assessment units within the petroleum system, and quantitatively estimated the undiscovered coalbed-gas resources within each assessment unit.

  20. Petroleum Systems and Assessment of Undiscovered Oil and Gas in the Raton Basin - Sierra Grande Uplift Province, Colorado and New Mexico - USGS Province 41

    USGS Publications Warehouse

    Higley, Debra K.

    2007-01-01

    Introduction The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geologically based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States. The USGS recently completed an assessment of undiscovered oil and gas resources of the Raton Basin-Sierra Grande Uplift Province of southeastern Colorado and northeastern New Mexico (USGS Province 41). The Cretaceous Vermejo Formation and Cretaceous-Tertiary Raton Formation have production and undiscovered resources of coalbed methane. Other formations in the province exhibit potential for gas resources and limited production. This assessment is based on geologic principles and uses the total petroleum system concept. The geologic elements of a total petroleum system include hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). The USGS used this geologic framework to define two total petroleum systems and five assessment units. All five assessment units were quantitatively assessed for undiscovered gas resources. Oil resources were not assessed because of the limited potential due to levels of thermal maturity of petroleum source rocks.

  1. National Assessment of Oil and Gas Project: Petroleum systems and assessment of undiscovered oil and gas in the Denver Basin Province, Colorado, Kansas, Nebraska, South Dakota, and Wyoming - USGS Province 39

    USGS Publications Warehouse

    Higley, Debra K.

    2007-01-01

    The purpose of the U.S. Geological Survey's (USGS) National Oil and Gas Assessment is to develop geologically based hypotheses regarding the potential for additions to oil and gas reserves in priority areas of the United States. The USGS recently completed an assessment of undiscovered oil and gas resources of the Denver Basin Province (USGS Province 39), Colorado, Kansas, Nebraska, South Dakota, and Wyoming. Petroleum is produced in the province from sandstone, shale, and limestone reservoirs that range from Pennsylvanian to Upper Cretaceous in age. This assessment is based on geologic principles and uses the total petroleum system concept. The geologic elements of a total petroleum system include hydrocarbon source rocks (source rock maturation, hydrocarbon generation and migration), reservoir rocks (sequence stratigraphy and petrophysical properties), and hydrocarbon traps (trap formation and timing). The USGS used this geologic framework to define seven total petroleum systems and twelve assessment units. Nine of these assessment units were quantitatively assessed for undiscovered oil and gas resources. Gas was not assessed for two coal bed methane assessment units due to lack of information and limited potential; oil resources were not assessed for the Fractured Pierre Shale Assessment Unit due to its mature development status.

  2. A hybrid segmentation approach for geographic atrophy in fundus auto-fluorescence images for diagnosis of age-related macular degeneration.

    PubMed

    Lee, Noah; Laine, Andrew F; Smith, R Theodore

    2007-01-01

    Fundus auto-fluorescence (FAF) images with hypo-fluorescence indicate geographic atrophy (GA) of the retinal pigment epithelium (RPE) in age-related macular degeneration (AMD). Manual quantification of GA is time consuming and prone to inter- and intra-observer variability. Automatic quantification is important for determining disease progression and facilitating clinical diagnosis of AMD. In this paper we describe a hybrid segmentation method for GA quantification by identifying hypo-fluorescent GA regions from other interfering retinal vessel structures. First, we employ background illumination correction exploiting a non-linear adaptive smoothing operator. Then, we use the level set framework to perform segmentation of hypo-fluorescent areas. Finally, we present an energy function combining morphological scale-space analysis with a geometric model-based approach to perform segmentation refinement of false positive hypo-fluorescent areas due to interfering retinal structures. The clinically apparent areas of hypo-fluorescence were drawn by an expert grader and compared on a pixel-by-pixel basis to our segmentation results. The mean sensitivity and specificity of the ROC analysis were 0.89 and 0.98, respectively.
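    A small sketch of the pixel-by-pixel comparison against an expert tracing that yields sensitivity and specificity; the masks here are synthetic placeholders, not the study's data.

    ```python
    import numpy as np

    def pixel_sensitivity_specificity(pred, truth):
        """Pixel-wise sensitivity and specificity of a binary segmentation
        against an expert-drawn reference mask."""
        pred, truth = pred.astype(bool), truth.astype(bool)
        tp = np.sum(pred & truth)
        tn = np.sum(~pred & ~truth)
        fp = np.sum(pred & ~truth)
        fn = np.sum(~pred & truth)
        return tp / (tp + fn), tn / (tn + fp)

    # Toy masks standing in for the segmentation result and the expert tracing
    truth = np.zeros((100, 100), dtype=bool); truth[20:60, 20:60] = True
    pred  = np.zeros((100, 100), dtype=bool); pred[22:60, 22:62] = True
    print(pixel_sensitivity_specificity(pred, truth))
    ```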

  3. Uncertainty Quantification and Certification Prediction of Low-Boom Supersonic Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Reuter, Bryan W.; Walker, Eric L.; Kleb, Bil; Park, Michael A.

    2014-01-01

    The primary objective of this work was to develop and demonstrate a process for accurate and efficient uncertainty quantification and certification prediction of low-boom, supersonic, transport aircraft. High-fidelity computational fluid dynamics models of multiple low-boom configurations were investigated, including the Lockheed Martin SEEB-ALR body of revolution, the NASA 69 Delta Wing, and the Lockheed Martin 1021-01 configuration. A nonintrusive polynomial chaos surrogate modeling approach was used to reduce the computational cost of propagating mixed, inherent (aleatory) and model-form (epistemic) uncertainty from both the computational fluid dynamics model and the near-field to ground level propagation model. A methodology has also been introduced to quantify the plausibility of a design to pass a certification under uncertainty. Results of this study include the analysis of each of the three configurations of interest under inviscid and fully turbulent flow assumptions. A comparison of the uncertainty outputs and sensitivity analyses between the configurations is also given. The results of this study illustrate the flexibility and robustness of the developed framework as a tool for uncertainty quantification and certification prediction of low-boom, supersonic aircraft.
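    A minimal one-dimensional sketch of a nonintrusive polynomial chaos surrogate, built by least-squares regression on probabilists' Hermite polynomials, with surrogate moments obtained from orthogonality. The toy model, sample size, and degree are assumptions; the paper's mixed aleatory/epistemic treatment and CFD models are not reproduced.

    ```python
    import numpy as np
    from numpy.polynomial.hermite_e import hermevander
    from math import factorial

    rng = np.random.default_rng(1)

    def model(xi):
        # Hypothetical expensive simulator response to a standard-normal input
        return np.exp(0.3 * xi) + 0.1 * xi**2

    # Nonintrusive PCE: sample the input, evaluate the model, regress on He_k(xi)
    deg = 6
    xi = rng.standard_normal(200)
    Psi = hermevander(xi, deg)                     # design matrix of He_0 .. He_deg
    coef, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

    # Surrogate moments follow from orthogonality: E[He_j He_k] = k! * delta_jk
    mean = coef[0]
    var = sum(coef[k]**2 * factorial(k) for k in range(1, deg + 1))
    print(mean, var)
    ```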

  4. DNA methylation analysis from saliva samples for epidemiological studies.

    PubMed

    Nishitani, Shota; Parets, Sasha E; Haas, Brian W; Smith, Alicia K

    2018-06-18

    Saliva is a non-invasive, easily accessible tissue, which is regularly collected in large epidemiological studies to examine genetic questions. Recently, it has become more common to use saliva to assess DNA methylation. However, DNA extracted from saliva is a mixture of both bacterial and human DNA derived from epithelial and immune cells in the mouth. Thus, there are unique challenges to using salivary DNA in methylation studies that can influence data quality. This study assesses: (1) quantification of human DNA after extraction; (2) delineation of human and bacterial DNA; (3) bisulfite conversion (BSC); (4) quantification of BSC DNA; (5) PCR amplification of BSC DNA from saliva; and (6) quantitation of DNA methylation with a targeted assay. The proposed framework will allow saliva samples to be more widely used in targeted epigenetic studies.

  5. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action

    PubMed Central

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748

  6. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    PubMed

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels, from individual dynamics, to dyadic dynamics, up to the global group level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.
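    A minimal sketch of the recurrence-rate computation that MdRQA generalizes: build a recurrence matrix from a multidimensional time series and report the percentage of recurrent points. It is not the authors' MATLAB implementation, and the radius and toy signal are assumptions.

    ```python
    import numpy as np

    def recurrence_rate(ts, radius):
        """Percent recurrence of a multidimensional time series.
        ts: array of shape (n_samples, n_dimensions); radius: distance threshold."""
        d = np.linalg.norm(ts[:, None, :] - ts[None, :, :], axis=-1)  # pairwise distances
        rec = d <= radius                                             # recurrence matrix
        n = len(ts)
        off_diag = rec.sum() - n                                      # exclude the main diagonal
        return 100.0 * off_diag / (n * n - n)

    # Toy 3-dimensional "group" signal
    t = np.linspace(0, 8 * np.pi, 400)
    ts = np.column_stack([np.sin(t), np.sin(t + 0.5), np.cos(t)])
    print(recurrence_rate(ts, radius=0.3))
    ```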

  7. Application of a Monte Carlo framework with bootstrapping for quantification of uncertainty in baseline map of carbon emissions from deforestation in Tropical Regions

    Treesearch

    William Salas; Steve Hagen

    2013-01-01

    This presentation will provide an overview of an approach for quantifying uncertainty in spatial estimates of carbon emission from land use change. We generate uncertainty bounds around our final emissions estimate using a randomized, Monte Carlo (MC)-style sampling technique. This approach allows us to combine uncertainty from different sources without making...
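    The abstract only outlines the approach, so the following is a generic sketch of combining bootstrap resampling with Monte Carlo perturbation to place bounds on an aggregated emissions estimate. All variable names, distributions, and error magnitudes are illustrative assumptions, not values from the presentation.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical per-pixel inputs: deforested area (ha) and carbon density (t C/ha)
    area = rng.uniform(0.5, 1.5, size=1000)
    carbon = rng.normal(120.0, 25.0, size=1000)

    n_mc = 2000
    totals = np.empty(n_mc)
    for i in range(n_mc):
        # Bootstrap the sampled pixels, then perturb carbon density by an assumed error
        idx = rng.integers(0, len(area), len(area))
        c = carbon[idx] + rng.normal(0.0, 10.0, len(idx))  # illustrative measurement error
        totals[i] = np.sum(area[idx] * c)

    lo, mid, hi = np.percentile(totals, [2.5, 50, 97.5])
    print(f"emissions estimate {mid:.0f} t C (95% bounds {lo:.0f}-{hi:.0f})")
    ```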

  8. A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico

    2016-03-01

    Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values and the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
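    A compact sketch of the three BPE measures and the Spearman correlation check described above; the array names, threshold, voxel volume, and toy category scores are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def bpe_measures(rel_enh, fgt_mask, breast_mask, threshold=0.1, voxel_vol=1.0):
        """BPEabs: volume of enhancing FGT voxels; BPErf: BPEabs / FGT volume;
        BPErb: BPEabs / breast volume. rel_enh is the relative-enhancement map."""
        enhancing = (rel_enh >= threshold) & fgt_mask
        bpe_abs = enhancing.sum() * voxel_vol
        bpe_rf = bpe_abs / (fgt_mask.sum() * voxel_vol)
        bpe_rb = bpe_abs / (breast_mask.sum() * voxel_vol)
        return bpe_abs, bpe_rf, bpe_rb

    # Correlating ordinal automated categories with radiologist scores (toy values)
    auto_cat = [1, 2, 2, 3, 4, 1, 3]
    radiologist = [1, 2, 3, 3, 4, 1, 2]
    rho, p = spearmanr(auto_cat, radiologist)
    print(rho, p)
    ```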

  9. Geophysical assessments of renewable gas energy compressed in geologic pore storage reservoirs.

    PubMed

    Al Hagrey, Said Attia; Köhn, Daniel; Rabbel, Wolfgang

    2014-01-01

    Renewable energy resources can indisputably minimize the threat of global warming and climate change. However, they are intermittent and need buffer storage to bridge the time-gap between production (off peak) and demand peaks. Based on geologic and geochemical considerations, the North German Basin has a very large capacity for compressed air/gas energy storage (CAES) in porous saltwater aquifers and salt cavities. Replacing pore reservoir brine with CAES causes changes in physical properties (elastic moduli, density and electrical properties) and justifies the application of integrative geophysical methods for monitoring this energy storage. Here we apply techniques of elastic full waveform inversion (FWI), electric resistivity tomography (ERT) and gravity to map and quantify a gradually saturated gas plume injected into a thin deep saline aquifer within the North German Basin. For this subsurface model scenario we generated different synthetic data sets, without and with added random noise, in order to test the robustness of the applied techniques for real field applications. Datasets are inverted by posing different constraints on the initial model. Results principally reveal the capability of the applied integrative geophysical approach to resolve the CAES targets (plume, host reservoir, and cap rock). Constrained inversion models of elastic FWI and ERT are even able to recover the gradual gas desaturation with depth. The spatial parameters accurately recovered from each technique are applied in the appropriate petrophysical equations to yield precise quantifications of gas saturation. The resulting models of gas saturation, independently determined from the elastic FWI and ERT techniques, are in accordance with each other and with the input (true) saturation model. Moreover, the gravity technique shows high sensitivity to the mass deficit resulting from the gas storage and can resolve saturations and temporal saturation changes down to ±3% after reducing any shallow fluctuation such as that of the groundwater table.

  10. Prediction of shear wave velocity using empirical correlations and artificial intelligence methods

    NASA Astrophysics Data System (ADS)

    Maleki, Shahoo; Moradzadeh, Ali; Riabi, Reza Ghavami; Gholami, Raoof; Sadeghzadeh, Farhad

    2014-06-01

    Good understanding of the mechanical properties of rock formations is essential during the development and production phases of a hydrocarbon reservoir. Conventionally, these properties are estimated from petrophysical logs, with compressional and shear sonic data being the main input to the correlations. However, in many cases the shear sonic data are not acquired during well logging, often for cost-saving reasons. In this case, shear wave velocity is estimated using available empirical correlations or the artificial intelligence methods proposed during the last few decades. In this paper, petrophysical logs corresponding to a well drilled in the southern part of Iran were used to estimate the shear wave velocity using empirical correlations as well as two robust artificial intelligence methods known as Support Vector Regression (SVR) and Back-Propagation Neural Network (BPNN). Although the results obtained by SVR seem to be reliable, the estimated values are not very precise, and considering the importance of shear sonic data as the input to different models, this study suggests acquiring shear sonic data during well logging. It is important to note that the benefits of having reliable shear sonic data for the estimation of rock formation mechanical properties will compensate for the possible additional costs of acquiring a shear log.
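    A hedged scikit-learn sketch of the SVR ingredient: predicting shear wave velocity from conventional log curves. The synthetic logs, the linear Vs-Vp trend used to fabricate them, and the hyperparameters are placeholders, not the study's data or tuned values.

    ```python
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)

    # Synthetic stand-ins for petrophysical logs: Vp (km/s), bulk density (g/cc), gamma ray (API)
    n = 400
    vp = rng.uniform(2.5, 5.5, n)
    rhob = rng.uniform(2.2, 2.7, n)
    gr = rng.uniform(20, 120, n)
    vs = 0.77 * vp - 0.87 + rng.normal(0, 0.1, n)   # illustrative linear Vp trend plus noise

    X = np.column_stack([vp, rhob, gr])
    X_tr, X_te, y_tr, y_te = train_test_split(X, vs, test_size=0.25, random_state=0)

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.05))
    model.fit(X_tr, y_tr)
    print("R^2 on held-out samples:", model.score(X_te, y_te))
    ```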

  11. Joint Stochastic Inversion of Pre-Stack 3D Seismic Data and Well Logs for High Resolution Hydrocarbon Reservoir Characterization

    NASA Astrophysics Data System (ADS)

    Torres-Verdin, C.

    2007-05-01

    This paper describes the successful implementation of a new 3D AVA stochastic inversion algorithm to quantitatively integrate pre-stack seismic amplitude data and well logs. The stochastic inversion algorithm is used to characterize flow units of a deepwater reservoir located in the central Gulf of Mexico. Conventional fluid/lithology sensitivity analysis indicates that the shale/sand interface represented by the top of the hydrocarbon-bearing turbidite deposits generates typical Class III AVA responses. On the other hand, layer-dependent Biot-Gassmann analysis shows significant sensitivity of the P-wave velocity and density to fluid substitution. Accordingly, AVA stochastic inversion, which combines the advantages of AVA analysis with those of geostatistical inversion, provided quantitative information about the lateral continuity of the turbidite reservoirs based on the interpretation of inverted acoustic properties (P-velocity, S-velocity, density) and lithotype (sand-shale) distributions. The quantitative use of rock/fluid information through AVA seismic amplitude data, coupled with the implementation of co-simulation via lithotype-dependent multidimensional joint probability distributions of acoustic/petrophysical properties, yields accurate 3D models of petrophysical properties such as porosity and permeability. Finally, by fully integrating pre-stack seismic amplitude data and well logs, the vertical resolution of the inverted products is higher than that of deterministic inversion methods.

  12. An interpretation of core and wireline logs for the Petrophysical evaluation of Upper Shallow Marine sandstone reservoirs of the Bredasdorp Basin, offshore South Africa

    NASA Astrophysics Data System (ADS)

    Magoba, Moses; Opuwari, Mimonitu

    2017-04-01

    This paper embodies a study carried out to assess the petrophysical properties of upper shallow marine sandstone reservoirs in 10 selected wells in the Bredasdorp Basin, offshore South Africa. The studied wells were selected randomly across the upper shallow marine formation with the purpose of conducting a regional study to assess the differences in reservoir properties across the formation. The data sets used in this study were geophysical wireline logs, conventional core analysis and geological well completion reports. The physical rock properties, for example lithology, fluid type, and hydrocarbon-bearing zones, were qualitatively characterized, while parameters such as volume of clay, porosity, permeability, water saturation, hydrocarbon saturation, storage capacity and flow capacity were quantitatively estimated. The quantitative results were calibrated with the core data. The upper shallow marine reservoirs were penetrated at depths ranging from about 2442 m to 3715 m. The average volume of clay, average effective porosity, average water saturation, hydrocarbon saturation and permeability range from 8.6%-43%, 9%-16%, 12%-68%, 32%-87.8% and 0.093 mD-151.8 mD, respectively. The estimated rock properties indicate good reservoir quality. Storage and flow capacity results presented a fair to good distribution of hydrocarbon flow.
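    The abstract does not give its equations, so the following is a sketch using standard textbook relations for the quantities listed above (a linear gamma-ray shale-volume index, density porosity, and Archie water saturation); every parameter value and log reading is an assumption for illustration only.

    ```python
    def shale_volume(gr, gr_clean=20.0, gr_shale=120.0):
        """Linear gamma-ray index as a simple shale-volume estimate (fraction)."""
        igr = (gr - gr_clean) / (gr_shale - gr_clean)
        return min(max(igr, 0.0), 1.0)

    def density_porosity(rhob, rho_matrix=2.65, rho_fluid=1.0):
        """Density porosity from the bulk-density log (fraction)."""
        return (rho_matrix - rhob) / (rho_matrix - rho_fluid)

    def archie_sw(rt, phi, rw=0.03, a=1.0, m=2.0, n=2.0):
        """Archie water saturation from deep resistivity and porosity."""
        return ((a * rw) / (phi**m * rt)) ** (1.0 / n)

    # Illustrative single-depth evaluation with assumed log readings
    gr, rhob, rt = 45.0, 2.35, 20.0
    vsh = shale_volume(gr)
    phi = density_porosity(rhob)
    sw = archie_sw(rt, phi)
    print(f"Vsh={vsh:.2f}, phi={phi:.2f}, Sw={sw:.2f}, Sh={1 - sw:.2f}")
    ```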

  13. Dynamic characterization of fractured carbonates at the Hontomín CO2 storage site

    NASA Astrophysics Data System (ADS)

    Le Gallo, yann; de Dios, José Carlos; Salvador, Ignacio; Acosta Carballo, Taimara

    2017-04-01

    The geological storage of CO2 is investigated at the Technology Development Plant (TDP) at Hontomín (Burgos, Spain) in a deep saline aquifer formed by fractured carbonates with poor matrix porosity. During the hydraulic characterization tests, 2,300 tons of liquid CO2 and 14,000 m3 of synthetic brine were co-injected on site in various sequences to determine the pressure and temperature responses of the fracture network. The results of the pressure tests were analyzed using an analytical approach to determine the overall petrophysical characteristics of the storage formation. Later on, these characteristics were implemented in a 3-D numerical model. The model is a compositional dual-medium (fracture + matrix) model which accounts for temperature effects, as CO2 is liquid at the well bottom-hole, and for multiphase flow hysteresis, as alternating water and CO2 injection tests were performed. The pressure and temperature responses of the storage formation were history-matched mainly through the petrophysical and geometrical characteristics of the fracture network. This dynamic characterization of the fracture network controls the CO2 migration, while the matrix does not appear to contribute significantly to the storage capacity. Consequently, the hydrodynamic behavior of the aquifer is one of the main challenges of the modeling workflow.

  14. Computation of fluid flow and pore-space properties estimation on micro-CT images of rock samples

    NASA Astrophysics Data System (ADS)

    Starnoni, M.; Pokrajac, D.; Neilson, J. E.

    2017-09-01

    Accurate determination of the petrophysical properties of rocks, namely REV, mean pore and grain size and absolute permeability, is essential for a broad range of engineering applications. Here, the petrophysical properties of rocks are calculated using an integrated approach comprising image processing, statistical correlation and numerical simulations. The Stokes equations of creeping flow for incompressible fluids are solved using the Finite-Volume SIMPLE algorithm. Simulations are then carried out on three-dimensional digital images obtained from micro-CT scanning of two rock formations: one sandstone and one carbonate. Permeability is predicted from the computed flow field using Darcy's law. It is shown that REV, REA and mean pore and grain size are effectively estimated using the two-point spatial correlation function. Homogeneity and anisotropy are also evaluated using the same statistical tools. A comparison of different absolute permeability estimates is also presented, revealing a good agreement between the numerical value and the experimentally determined one for the carbonate sample, but a large discrepancy for the sandstone. Finally, a new convergence criterion for the SIMPLE algorithm, and more generally for the family of pressure-correction methods, is presented. This criterion is based on satisfaction of bulk momentum balance, which makes it particularly useful for pore-scale modelling of reservoir rocks.
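    As a small illustration of the final step described above, predicting permeability from the computed flow field using Darcy's law, the sketch below backs out absolute permeability from bulk flow quantities. The numerical values are placeholders for outputs of a pore-scale Stokes simulation, not results from the paper.

    ```python
    def darcy_permeability(q_total, mu, length, area, dp):
        """Absolute permeability k = Q * mu * L / (A * dP), in m^2."""
        return q_total * mu * length / (area * dp)

    # Placeholder values standing in for a pore-scale simulation's outputs
    q_total = 2.0e-12      # volumetric flow rate through the sample, m^3/s
    mu = 1.0e-3            # water viscosity, Pa.s
    length = 1.0e-3        # sample length along the flow direction, m
    area = 1.0e-6          # cross-sectional area, m^2
    dp = 100.0             # imposed pressure drop, Pa

    k_m2 = darcy_permeability(q_total, mu, length, area, dp)
    print(f"k = {k_m2:.3e} m^2 = {k_m2 / 9.869e-16:.1f} mD")
    ```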

  15. An Overview of Geologic Carbon Sequestration Potential in California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cameron Downey; John Clinkenbeard

    2005-10-01

    As part of the West Coast Regional Carbon Sequestration Partnership (WESTCARB), the California Geological Survey (CGS) conducted an assessment of geologic carbon sequestration potential in California. An inventory of sedimentary basins was screened for preliminary suitability for carbon sequestration. Criteria included porous and permeable strata, seals, and depth sufficient for critical-state carbon dioxide (CO2) injection. Of 104 basins inventoried, 27 met the criteria for further assessment. Petrophysical and fluid data from oil and gas reservoirs were used to characterize both saline aquifers and hydrocarbon reservoirs. Where available, well log or geophysical information was used to prepare basin-wide maps showing depth-to-basement and gross sand distribution. California's Cenozoic marine basins were determined to possess the most potential for geologic sequestration. These basins contain thick sedimentary sections, multiple saline aquifers and oil and gas reservoirs, widespread shale seals, and significant petrophysical data from oil and gas operations. Potential sequestration areas include the San Joaquin, Sacramento, Ventura, Los Angeles, and Eel River basins, followed by the smaller Salinas, La Honda, Cuyama, Livermore, Orinda, and Sonoma marine basins. California's terrestrial basins are generally too shallow for carbon sequestration. However, the Salton Trough and several smaller basins may offer opportunities for localized carbon sequestration.

  16. Black dimensional stones: Geology, technical properties and deposit characterization of the dolerites from Uruguay

    NASA Astrophysics Data System (ADS)

    Morales Demarco, M.; Oyhantçabal, P.; Stein, K.-J.; Siegesmund, S.

    2012-04-01

    Dimensional stones with a black color occupy a prominent place on the international market. Uruguayan dolerite dikes of andesitic and andesitic-basaltic composition are mined for commercial blocks of black dimensional stone. A total of 16 dikes of both compositions were studied and sampled for geochemical and petrographical analysis. Color measurements were performed on different black dimensional stones in order to compare them with the Uruguayan dolerites. Samples of the two commercial varieties (Absolute Black and Moderate Black) were obtained for petrophysical analysis (e.g. density, porosity, uniaxial compressive strength, tensile strength, etc.). Detailed structural analyses were performed in several quarries. Geochemistry and petrography determine the intensity of the black color. When compared with commercial samples from China, Brazil, India and South Africa, among others, the Uruguayan dolerite Absolute Black is the darkest black dimensional stone analyzed. In addition, the petrophysical properties of the Uruguayan dolerites make them one of the highest quality black dimensional stones. Structural analyses recognized five joint sets: two sub-vertical, one horizontal and two diagonal. These joint sets are one of the most important factors controlling the deposits, since they control the block size distribution and the amount of waste material.

  17. Integration of seismic and petrophysics to characterize reservoirs in "ALA" oil field, Niger Delta.

    PubMed

    Alao, P A; Olabode, S O; Opeloye, S A

    2013-01-01

    In the exploration and production business, by far the largest component of geophysical spending is driven by the need to characterize (potential) reservoirs. The simple reason is that better reservoir characterization means higher success rates and fewer wells for reservoir exploitation. In this research work, seismic and well log data were integrated to characterize the reservoirs of the "ALA" field in the Niger Delta. Three-dimensional seismic data were used to identify the faults and map the horizons. Petrophysical parameters and time-depth structure maps were obtained. Seismic attributes were also employed in characterizing the reservoirs. Seven hydrocarbon-bearing reservoirs with thicknesses ranging from 9.9 to 71.6 m were delineated. Structural maps of horizons in six wells containing hydrocarbon-bearing zones, with tops and bottoms in the range of -2,453 to -3,950 m, were generated; these portray the trapping mechanism to be mainly fault-assisted anticlinal closures. The identified prospective zones have good porosity, permeability, and hydrocarbon saturation. The environments of deposition were identified from log shapes, which indicate a transitional-to-deltaic depositional environment. In this research work, new prospects have been recommended for drilling and further research. Geochemical and biostratigraphic studies should be done to better characterize the reservoirs and reliably interpret the depositional environments.

  18. Integration between well logging and seismic reflection techniques for structural a

    NASA Astrophysics Data System (ADS)

    Mohamed, Adel K.; Ghazala, Hosni H.; Mohamed, Lamees

    2016-12-01

    Abu El Gharadig basin is located in the northern part of the Western Desert, Egypt. Geophysical data in the form of thirty 3D seismic lines and well logging data from five wells have been analyzed in the BED-1 oil field, which is located in the northwestern part of the Abu El Gharadig basin. The reflection sections have been used to shed more light on the tectonic setting of the Late Jurassic-Early Cretaceous rocks, while the well logging data have been analyzed to delineate the petrophysical characteristics of the two main reservoirs, the Bahariya and Kharita Formations. The constructed subsurface geologic cross sections, seismic sections, and isochronous reflection maps indicate that the area is structurally controlled by the tectonic trends affecting the current shape of the Abu El Gharadig basin. Different types of faults, particularly normal faults, are well represented in the area. The analysis of the average and interval velocities versus depth shows that they are affected by facies changes and/or fluid content. On the other hand, the derived petrophysical parameters of the Bahariya and Kharita Formations vary from one well to another and have been affected by the gas effect and/or the presence of organic matter, complex lithology, clay content in dispersed habitat, and the pore volume.

  19. Shale characterization on Barito field, Southeast Kalimantan for shale hydrocarbon exploration

    NASA Astrophysics Data System (ADS)

    Sumotarto, T. A.; Haris, A.; Riyanto, A.; Usman, A.

    2017-07-01

    Exploration and exploitation in Indonesia are still focused on conventional hydrocarbon resources rather than unconventional resources such as shale gas. The Tanjung Formation is a source rock of the Barito Basin, South Kalimantan, with potential as a shale hydrocarbon play. In this research, integrated methods using geochemical analysis, mineralogy, petrophysical analysis and seismic interpretation have been applied to explore the shale hydrocarbon potential of the Tanjung Formation in the Barito Field. The first step is geochemical and mineralogical analysis of the shale rock samples. Our analysis shows that the organic richness ranges from 1.26 to 5.98 wt.% (good to excellent), with the top of the early mature window at a depth of 2170 m. The brittleness index averages 0.44-0.56 (less brittle), and the kerogen is classified as type II/III, which potentially produces oil and gas. The second step is a petrophysical analysis, which includes continuous Total Organic Carbon (TOC) and brittleness index calculations along the well. The result has been validated against laboratory measurements, showing good correlation. In addition, seismic interpretation based on inverted acoustic impedance is applied to map the distribution of shale hydrocarbon potential. Our interpretation shows that the shale hydrocarbon potential is localized in the eastern and southeastern parts of the study area.
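
    The abstract refers to a continuous log-based TOC calculation validated against laboratory data; a common transform for this purpose is Passey's delta-log-R overlay, sketched below as an illustration (the study does not state which transform it used). The baseline picks, maturity level and log samples are hypothetical.

```python
import numpy as np

def toc_passey(resistivity, sonic, r_baseline, dt_baseline, lom):
    """Estimate TOC (wt.%) from resistivity and sonic logs with the
    Passey delta-log-R overlay method.

    resistivity : deep resistivity log (ohm.m)
    sonic       : sonic transit time log (us/ft)
    r_baseline, dt_baseline : values picked in a non-source shale interval
    lom         : level of organic metamorphism (maturity proxy)
    """
    dlogr = np.log10(resistivity / r_baseline) + 0.02 * (sonic - dt_baseline)
    return dlogr * 10 ** (2.297 - 0.1688 * lom)

# Hypothetical log samples, baselines and maturity -- illustration only.
rt = np.array([12.0, 35.0, 80.0])      # ohm.m
dt = np.array([95.0, 105.0, 110.0])    # us/ft
print(toc_passey(rt, dt, r_baseline=10.0, dt_baseline=90.0, lom=8.5))
```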

  20. Iterative refinement of implicit boundary models for improved geological feature reproduction

    NASA Astrophysics Data System (ADS)

    Martin, Ryan; Boisvert, Jeff B.

    2017-12-01

    Geological domains contain non-stationary features that cannot be described by a single direction of continuity. Non-stationary estimation frameworks generate more realistic curvilinear interpretations of subsurface geometries. A radial basis function (RBF) based implicit modeling framework using domain decomposition is developed that permits introduction of locally varying orientations and magnitudes of anisotropy for boundary models to better account for the local variability of complex geological deposits. The interpolation framework is paired with a method to automatically infer the locally predominant orientations, which results in a rapid and robust iterative non-stationary boundary modeling technique that can refine locally anisotropic geological shapes automatically from the sample data. The method also permits quantification of the volumetric uncertainty associated with the boundary modeling. The methodology is demonstrated on a porphyry dataset and shows improved local geological features.
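
    As a rough illustration of the implicit-boundary idea (not the authors' domain-decomposition implementation, which varies the anisotropy locally), the sketch below fits a Gaussian RBF interpolant with a single anisotropic metric to signed indicator data and evaluates it on a grid; the zero level set plays the role of the boundary model. The sample points, anisotropy angle and ranges are hypothetical.

```python
import numpy as np

def aniso_dist(x, y, angle, ranges):
    """Anisotropic distance: rotate by `angle` (radians), scale axes by `ranges`."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, s], [-s, c]])
    d = (x[:, None, :] - y[None, :, :]) @ R.T / np.asarray(ranges)
    return np.linalg.norm(d, axis=-1)

def fit_rbf(pts, vals, angle, ranges, eps=1e-8):
    """Solve for Gaussian-RBF weights interpolating signed indicator data
    (+1 inside the domain, -1 outside)."""
    K = np.exp(-aniso_dist(pts, pts, angle, ranges) ** 2)
    return np.linalg.solve(K + eps * np.eye(len(pts)), vals)

def evaluate(grid, pts, w, angle, ranges):
    """Implicit function on a grid; the zero level set is the boundary model."""
    return np.exp(-aniso_dist(grid, pts, angle, ranges) ** 2) @ w

# Hypothetical drill-hole samples coded +1 (inside domain) / -1 (outside).
pts = np.array([[0., 0.], [1., 0.2], [2., 0.5], [0., 1.5], [1., 1.8], [2., 2.1]])
vals = np.array([1., 1., 1., -1., -1., -1.])
w = fit_rbf(pts, vals, angle=np.deg2rad(15), ranges=(2.0, 0.7))
xx, yy = np.meshgrid(np.linspace(-1, 3, 5), np.linspace(-1, 3, 5))
grid = np.column_stack([xx.ravel(), yy.ravel()])
print(evaluate(grid, pts, w, angle=np.deg2rad(15), ranges=(2.0, 0.7)).reshape(xx.shape))
```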

  1. Hierarchical Bayesian Modeling of Fluid-Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.

    2017-11-01

    In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to develop a probabilistic forecasting model. We apply the framework to the Basel 2006 fluid-induced seismicity case study to show that the hierarchical Bayesian model offers a suitable framework to coherently encode both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.
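
    A minimal sketch of the rate model as described (a nonhomogeneous Poisson process whose co-injection rate is proportional to the injection flow rate, followed by a post-shut-in decay); the proportionality constant, relaxation time and injection schedule below are hypothetical placeholders, not the calibrated Basel values.

```python
import numpy as np

def seismicity_rate(t, flow_rate, a_fb, t_shutin, tau):
    """Event rate (events/day) at times t (days): proportional to the injection
    flow rate during injection, exponential decay after shut-in."""
    rate = a_fb * flow_rate(t)                       # co-injection term
    post = t > t_shutin
    rate[post] = a_fb * flow_rate(t_shutin) * np.exp(-(t[post] - t_shutin) / tau)
    return rate

flow = lambda t: np.where(t <= 6.0, 3000.0, 0.0)     # m^3/day, stops at day 6
t = np.linspace(0.0, 12.0, 121)
lam = seismicity_rate(t, flow, a_fb=2e-3, t_shutin=6.0, tau=1.5)
expected_events = np.trapz(lam, t)                   # integral of the Poisson rate
print(f"expected number of events above completeness: {expected_events:.1f}")
```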

  2. V&V framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hills, Richard G.; Maniaci, David Charles; Naughton, Jonathan W.

    2015-09-01

    A Verification and Validation (V&V) framework is presented for the development and execution of coordinated modeling and experimental programs to assess the predictive capability of computational models of complex systems through focused, well-structured, and formal processes. The elements of the framework are based on established V&V methodology developed by various organizations, including the Department of Energy, the National Aeronautics and Space Administration, the American Institute of Aeronautics and Astronautics, and the American Society of Mechanical Engineers. Four main topics are addressed: (1) program planning based on expert elicitation of the modeling physics requirements, (2) experimental design for model assessment, (3) uncertainty quantification for experimental observations and computational model simulations, and (4) assessment of the model predictive capability. The audience for this document includes program planners, modelers, experimentalists, V&V specialists, and customers of the modeling results.

  3. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay, which is currently considered the “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  4. Tributyltin--critical pollutant in whole water samples--development of traceable measurement methods for monitoring under the European Water Framework Directive (WFD) 2000/60/EC.

    PubMed

    Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert

    2015-07-01

    Tributyltin is listed as one of the priority substances in the European Water Framework Directive (WFD). Despite its decreasing input into the environment, it is still present and has to be monitored. In the European Metrology Research Programme project ENV08, a sensitive and reliable analytical method meeting the WFD requirements was developed to quantify this environmental pollutant at a very low limit of quantification. With the development of such a primary reference method for tributyltin, the project helped to improve the quality and comparability of monitoring data. An overview of the project aims and potential analytical tools is given.

  5. Eigenspace perturbations for structural uncertainty estimation of turbulence closure models

    NASA Astrophysics Data System (ADS)

    Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable-resolution approaches are two-equation RANS closures and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such a scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at widely different levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolutions. Then, using benchmark flows along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).
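
    To make the eigenspace idea concrete, the sketch below perturbs only the eigenvalues of the Reynolds-stress anisotropy tensor toward a limiting componentiality state (the full framework also perturbs eigenvectors and the turbulence kinetic energy); the baseline stress tensor and the perturbation magnitude are hypothetical.

```python
import numpy as np

def perturb_reynolds_stress(R, delta, target=(2/3, -1/3, -1/3)):
    """Eigenvalue perturbation of the Reynolds-stress anisotropy tensor.

    R      : 3x3 Reynolds stress tensor
    delta  : perturbation magnitude in [0, 1] (0 = baseline model, 1 = limiting state)
    target : eigenvalues of the limiting state (default: one-component turbulence)
    """
    k = 0.5 * np.trace(R)                          # turbulent kinetic energy
    b = R / (2.0 * k) - np.eye(3) / 3.0            # anisotropy tensor
    lam, vecs = np.linalg.eigh(b)                  # eigenvalues ascending, eigenvectors
    lam_t = np.sort(np.asarray(target))            # match ascending ordering
    lam_star = lam + delta * (lam_t - lam)         # move toward the limiting state
    b_star = vecs @ np.diag(lam_star) @ vecs.T     # reconstruct (eigenvectors kept fixed)
    return 2.0 * k * (b_star + np.eye(3) / 3.0)

# Hypothetical baseline Reynolds stresses from a two-equation RANS model.
R = np.array([[0.40, 0.05, 0.00],
              [0.05, 0.30, 0.00],
              [0.00, 0.00, 0.30]])
print(perturb_reynolds_stress(R, delta=0.5))
```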

  6. Classification of Dynamical Diffusion States in Single Molecule Tracking Microscopy

    PubMed Central

    Bosch, Peter J.; Kanger, Johannes S.; Subramaniam, Vinod

    2014-01-01

    Single molecule tracking of membrane proteins by fluorescence microscopy is a promising method to investigate dynamic processes in live cells. Translating the trajectories of proteins to biological implications, such as protein interactions, requires the classification of protein motion within the trajectories. Spatial information of protein motion may reveal where the protein interacts with cellular structures, because binding of proteins to such structures often alters their diffusion speed. For dynamic diffusion systems, we provide an analytical framework to determine in which diffusion state a molecule is residing during the course of its trajectory. We compare different methods for the quantification of motion to utilize this framework for the classification of two diffusion states (two populations with different diffusion speed). We found that a gyration quantification method and a Bayesian statistics-based method are the most accurate in diffusion-state classification for realistic experimentally obtained datasets, of which the gyration method is much less computationally demanding. After classification of the diffusion, the lifetime of the states can be determined, and images of the diffusion states can be reconstructed at high resolution. Simulations validate these applications. We apply the classification and its applications to experimental data to demonstrate the potential of this approach to obtain further insights into the dynamics of cell membrane proteins. PMID:25099798
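
    A minimal sketch of the gyration-based classification idea under simple assumptions: a sliding-window radius of gyration is computed along a simulated two-state trajectory and thresholded into slow and fast states. The window length and threshold are arbitrary illustrative choices, not the values used in the study.

```python
import numpy as np

def gyration_radius(track, window=10):
    """Sliding-window radius of gyration along a 2D trajectory (N x 2 array)."""
    n = len(track) - window + 1
    rg = np.empty(n)
    for i in range(n):
        seg = track[i:i + window]
        rg[i] = np.sqrt(((seg - seg.mean(axis=0)) ** 2).sum(axis=1).mean())
    return rg

rng = np.random.default_rng(1)
# Hypothetical trajectory: slow diffusion (first half) then fast diffusion.
steps = np.vstack([rng.normal(0, 0.02, (200, 2)), rng.normal(0, 0.10, (200, 2))])
track = np.cumsum(steps, axis=0)

rg = gyration_radius(track, window=10)
threshold = 0.08                      # hypothetical cut separating the two states
state = (rg > threshold).astype(int)  # 0 = slow/bound, 1 = fast/free
print("fraction of windows classified as fast:", state.mean())
```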

  7. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
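
    Step (4) can be illustrated with a toy ensemble: weighted aggregation of exceedance curves from alternative model formulations gives the ensemble mean hazard and epistemic percentiles. The intensity thresholds, curves and weights below are hypothetical, not outputs of the Ionian Sea case study.

```python
import numpy as np

# Each row of `curves` is the annual probability of exceeding a set of tsunami
# intensity thresholds predicted by one alternative model formulation (an
# epistemic branch); `weights` are the corresponding credibilities.
intensity = np.array([0.5, 1.0, 2.0, 4.0])            # e.g. runup in metres
curves = np.array([[1e-2, 4e-3, 1e-3, 2e-4],
                   [2e-2, 6e-3, 2e-3, 5e-4],
                   [8e-3, 3e-3, 8e-4, 1e-4]])
weights = np.array([0.5, 0.3, 0.2])

mean_hazard = weights @ curves                         # weighted ensemble mean

def weighted_percentile(values, weights, q):
    """Approximate weighted percentile across the ensemble (epistemic band)."""
    order = np.argsort(values)
    cdf = np.cumsum(weights[order]) / weights.sum()
    return np.interp(q / 100.0, cdf, values[order])

p16 = [weighted_percentile(curves[:, j], weights, 16) for j in range(curves.shape[1])]
p84 = [weighted_percentile(curves[:, j], weights, 84) for j in range(curves.shape[1])]
print(mean_hazard, p16, p84, sep="\n")
```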

  8. Quantification of complex modular architecture in plants.

    PubMed

    Reeb, Catherine; Kaandorp, Jaap; Jansson, Fredrik; Puillandre, Nicolas; Dubuisson, Jean-Yves; Cornette, Raphaël; Jabbour, Florian; Coudert, Yoan; Patiño, Jairo; Flot, Jean-François; Vanderpoorten, Alain

    2018-04-01

    Morphometrics, the assignment of quantities to biological shapes, is a powerful tool to address taxonomic, evolutionary, functional and developmental questions. We propose a novel method for shape quantification of complex modular architecture in thalloid plants, whose extremely reduced morphologies, combined with the lack of a formal framework for thallus description, have long rendered taxonomic and evolutionary studies extremely challenging. Using graph theory, thalli are described as hierarchical series of nodes and edges, allowing for accurate, homologous and repeatable measurements of widths, lengths and angles. The computer program MorphoSnake was developed to extract the skeleton and contours of a thallus and automatically acquire, at each level of organization, width, length, angle and sinuosity measurements. Through the quantification of leaf architecture in Hymenophyllum ferns (Polypodiopsida) and a fully worked example of integrative taxonomy in the taxonomically challenging thalloid liverwort genus Riccardia, we show that MorphoSnake is applicable to all ramified plants. This new possibility of acquiring large numbers of quantitative traits in plants with complex modular architectures opens new perspectives of applications, from the development of rapid species identification tools to evolutionary analyses of adaptive plasticity. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.

  9. Adaptive quantification and longitudinal analysis of pulmonary emphysema with a hidden Markov measure field model.

    PubMed

    Hame, Yrjo; Angelini, Elsa D; Hoffman, Eric A; Barr, R Graham; Laine, Andrew F

    2014-07-01

    The extent of pulmonary emphysema is commonly estimated from CT scans by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols, and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the presented model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was applied on a longitudinal data set with 87 subjects and a total of 365 scans acquired with varying imaging protocols. The resulting emphysema estimates had very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. The generated emphysema delineations promise advantages for regional analysis of emphysema extent and progression.
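
    The spatial hidden Markov measure field component is beyond a short sketch, but the parametric intensity-modeling idea can be illustrated with a plain two-component Gaussian mixture on lung intensities, compared against the classical fixed -950 HU threshold. The simulated intensities and component means below are hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical lung CT intensities (HU): emphysematous voxels around -950 HU,
# normal parenchyma around -850 HU, mixed in unknown proportions.
hu = np.concatenate([rng.normal(-950, 25, 2000), rng.normal(-850, 40, 8000)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(hu.reshape(-1, 1))
emph_comp = np.argmin(gmm.means_.ravel())            # lower-mean component = emphysema
posterior = gmm.predict_proba(hu.reshape(-1, 1))[:, emph_comp]

emphysema_fraction = (posterior > 0.5).mean()
print(f"mixture-based emphysema extent: {100 * emphysema_fraction:.1f}%")
# The standard alternative: proportional area of voxels below -950 HU.
print(f"threshold-based %LAA-950: {100 * (hu < -950).mean():.1f}%")
```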

  10. TU-AB-201-11: A Novel Theoretical Framework for MRI-Only Image Guided LDR Prostate and Breast Brachytherapy Implant Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soliman, A; Elzibak, A; Fatemi, A

    Purpose: To propose a novel framework for accurate model-based dose calculations using only MR images for LDR prostate and breast seed implant brachytherapy. Methods: Model-based dose calculation methodologies recommended by TG-186 require further knowledge about specific tissue composition, which is challenging with MRI. However, relying on MRI only for implant dosimetry would reduce the soft tissue delineation uncertainty, costs, and uncertainties associated with multi-modality registration and fusion processes. We propose a novel framework to address this problem using quantitative MRI acquisitions and reconstruction techniques. The framework includes three steps: (1) identify the locations of the seeds; (2) identify the presence (or absence) of calcification(s); (3) quantify the water and fat content in the underlying tissue. Steps (1) and (2) consider the sources that limit patient dosimetry, particularly the inter-seed attenuation and the calcified regions, while step (3) targets the quantification of the tissue composition to account for heterogeneities in the medium. Our preliminary work has shown that the seeds and the calcifications can be identified with MRI using both the magnitude and the phase images. By employing susceptibility-weighted imaging with specific post-processing techniques, the phase images can be further explored to distinguish the seeds from the calcifications. Absolute quantification of tissue water and fat content is feasible and was previously demonstrated in phantoms and in vivo, particularly for brain diseases. The approach relies on the proportionality of the MR signal to the number of protons in an image volume. By employing appropriate correction algorithms for T1- and T2*-related biases and for B1 transmit and receive field inhomogeneities, absolute water/fat content can be determined. Results: By considering calcification and inter-seed attenuation, and through knowledge of water and fat mass density, accurate patient-specific implant dosimetry can be achieved with MRI only. Conclusion: The proposed framework showed that model-based dose calculation is feasible using MRI-only state-of-the-art techniques.

  11. Using Uncertainty Quantification to Guide Development and Improvements of a Regional-Scale Model of the Coastal Lowlands Aquifer System Spanning Texas, Louisiana, Mississippi, Alabama and Florida

    NASA Astrophysics Data System (ADS)

    Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.

    2017-12-01

    Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation for MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of an UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but it will also guide the model development process. Traditionally, model development proceeds from dataset construction to the process of deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to assess in a quantitative framework the effect model package and property improvements have on the ability to represent past-system states, as well as the effect on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty, and guide next steps in model development prior to rigorous history matching by using PEST++ parameter estimation code.
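
    A minimal sketch of the linear-Gaussian algebra underlying FOSM analysis (of the kind implemented in tools such as PEST++): the posterior parameter covariance follows from the Jacobian, observation noise and prior, and the prediction variance from a sensitivity vector. The matrices below are small hypothetical stand-ins, not quantities from the CLAS model.

```python
import numpy as np

def fosm_predictive_variance(J, C_obs, C_prior, y):
    """First-Order Second-Moment predictive uncertainty.

    J       : Jacobian of simulated observations w.r.t. parameters (n_obs x n_par)
    C_obs   : observation noise covariance (n_obs x n_obs)
    C_prior : prior parameter covariance (n_par x n_par)
    y       : sensitivity of the prediction w.r.t. parameters (n_par,)
    Returns (prior, posterior) variance of the prediction.
    """
    post_cov = np.linalg.inv(J.T @ np.linalg.inv(C_obs) @ J + np.linalg.inv(C_prior))
    return y @ C_prior @ y, y @ post_cov @ y

# Hypothetical 3-parameter, 4-observation example.
J = np.array([[1.0, 0.2, 0.0],
              [0.5, 1.0, 0.1],
              [0.0, 0.3, 1.0],
              [0.2, 0.0, 0.8]])
C_obs = 0.1 * np.eye(4)
C_prior = np.diag([1.0, 2.0, 0.5])
y = np.array([0.4, 1.0, -0.3])          # e.g. sensitivity of a base-flow forecast
prior_var, post_var = fosm_predictive_variance(J, C_obs, C_prior, y)
print(f"prior variance {prior_var:.3f} -> posterior variance {post_var:.3f}")
```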

  12. National-scale aboveground biomass geostatistical mapping with FIA inventory and GLAS data: Preparation for sparsely sampled lidar assisted forest inventory

    NASA Astrophysics Data System (ADS)

    Babcock, C. R.; Finley, A. O.; Andersen, H. E.; Moskal, L. M.; Morton, D. C.; Cook, B.; Nelson, R.

    2017-12-01

    Upcoming satellite lidar missions, such as GEDI and IceSat-2, are designed to collect laser altimetry data from space for narrow bands along orbital tracts. As a result lidar metric sets derived from these sources will not be of complete spatial coverage. This lack of complete coverage, or sparsity, means traditional regression approaches that consider lidar metrics as explanatory variables (without error) cannot be used to generate wall-to-wall maps of forest inventory variables. We implement a coregionalization framework to jointly model sparsely sampled lidar information and point-referenced forest variable measurements to create wall-to-wall maps with full probabilistic uncertainty quantification of all inputs. We inform the model with USFS Forest Inventory and Analysis (FIA) in-situ forest measurements and GLAS lidar data to spatially predict aboveground forest biomass (AGB) across the contiguous US. We cast our model within a Bayesian hierarchical framework to better model complex space-varying correlation structures among the lidar metrics and FIA data, which yields improved prediction and uncertainty assessment. To circumvent computational difficulties that arise when fitting complex geostatistical models to massive datasets, we use a Nearest Neighbor Gaussian process (NNGP) prior. Results indicate that a coregionalization modeling approach to leveraging sampled lidar data to improve AGB estimation is effective. Further, fitting the coregionalization model within a Bayesian mode of inference allows for AGB quantification across scales ranging from individual pixel estimates of AGB density to total AGB for the continental US with uncertainty. The coregionalization framework examined here is directly applicable to future spaceborne lidar acquisitions from GEDI and IceSat-2. Pairing these lidar sources with the extensive FIA forest monitoring plot network using a joint prediction framework, such as the coregionalization model explored here, offers the potential to improve forest AGB accounting certainty and provide maps for post-model fitting analysis of the spatial distribution of AGB.

  13. Micro-computed tomography pore-scale study of flow in porous media: Effect of voxel resolution

    NASA Astrophysics Data System (ADS)

    Shah, S. M.; Gray, F.; Crawshaw, J. P.; Boek, E. S.

    2016-09-01

    A fundamental understanding of flow in porous media at the pore scale is necessary to be able to upscale average displacement processes from core to reservoir scale. The study of fluid flow in porous media at the pore scale consists of two key procedures: imaging, i.e. the reconstruction of three-dimensional (3D) pore space images, and modelling, such as single- and two-phase flow simulations with Lattice-Boltzmann (LB) or Pore-Network (PN) methods. Here we analyse pore-scale results to predict petrophysical properties such as porosity, single-phase permeability and multi-phase properties at different length scales. The fundamental issue is to understand the image resolution dependency of transport properties, in order to up-scale the flow physics from pore to core scale. In this work, we use a high resolution micro-computed tomography (micro-CT) scanner to image and reconstruct three-dimensional pore-scale images of five sandstones (Bentheimer, Berea, Clashach, Doddington and Stainton) and five complex carbonates (Ketton, Estaillades, Middle Eastern sample 3, Middle Eastern sample 5 and Indiana Limestone 1) at four different voxel resolutions (4.4 μm, 6.2 μm, 8.3 μm and 10.2 μm), scanning the same physical field of view. Implementing three-phase segmentation (macro-pore phase, intermediate phase and grain phase) on the pore-scale images helps to understand the importance of connected macro-porosity to fluid flow in the samples studied. We then compute the petrophysical properties for all the samples using PN and LB simulations in order to study the influence of voxel resolution on petrophysical properties. We then introduce a numerical coarsening scheme which is used to coarsen a high voxel resolution image (4.4 μm) to lower resolutions (6.2 μm, 8.3 μm and 10.2 μm) and study the impact of coarsening on macroscopic and multi-phase properties. Numerical coarsening of high resolution data is found to be superior to using a lower resolution scan because it avoids the problem of partial volume effects and reduces the scaling effect by preserving the pore-space properties influencing the transport properties. This is demonstrated by comparing several pore network properties, such as the number of pores and throats, the average pore and throat radius and the coordination number, between the scan-based analysis and the numerically coarsened data.
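
    The numerical coarsening step can be illustrated with simple block averaging of a segmented volume followed by re-segmentation. The synthetic binary volume and the integer factor of 2 below are illustrative only (the study coarsened 4.4 μm data to the non-integer ratios of 6.2, 8.3 and 10.2 μm).

```python
import numpy as np

def coarsen(binary_image, factor):
    """Block-average a 3D binary pore image (1 = pore, 0 = grain) by an integer
    factor, then re-segment at 50% pore fraction."""
    nz, ny, nx = (s - s % factor for s in binary_image.shape)   # trim to a multiple
    img = binary_image[:nz, :ny, :nx].astype(float)
    blocks = img.reshape(nz // factor, factor, ny // factor, factor, nx // factor, factor)
    pore_fraction = blocks.mean(axis=(1, 3, 5))
    return pore_fraction >= 0.5

rng = np.random.default_rng(3)
# Hypothetical segmented micro-CT volume with ~20% porosity.
pore = rng.random((64, 64, 64)) < 0.2
coarse = coarsen(pore, 2)
print("porosity, fine grid  :", pore.mean())
print("porosity, coarse grid:", coarse.mean())
```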

  14. Integrated petrophysical and sedimentological study of the Middle Miocene Nullipore Formation (Ras Fanar Field, Gulf of Suez, Egypt): An approach to volumetric analysis of reservoirs

    NASA Astrophysics Data System (ADS)

    Afife, Mohamed M.; Sallam, Emad S.; Faris, Mohamed

    2017-10-01

    This study aims to integrate sedimentological, log and core analyses data of the Middle Miocene Nullipore Formation at the Ras Fanar Field (west central Gulf of Suez, Egypt) to evaluate and reconstruct a robust petrophysical model for this reservoir. The Nullipore Formation attains a thickness ranging from 400 to 980 ft and represents a syn-rift succession of the Middle Miocene marine facies. It consists of coralline-algal-reefal limestone, dolomitic limestone and dolostone facies, with few clay and anhydrite intercalations. Petrographically, seven microfacies types (MF1 to MF7) have been recognized and assembled genetically into three related facies associations (FA1 to FA3). These associations accumulated in three depositional environments: 1) peritidal flat, 2) restricted lagoon, and 3) back-shoal environments situated on a shallow inner ramp (homoclinal) setting. The studied rocks have been influenced by different diagenetic processes (dolomitization, cementation, compaction, authigenesis and dissolution), which led to diminishing and/or enhancing the reservoir quality. Three superimposed 3rd-order depositional sequences are included in the Nullipore succession displaying both retrogradational and aggradational packages of facies. Given the hydrocarbon potential of the Nullipore Formation, conventional well logs of six boreholes and core analyses data from one of these wells (RF-B12) are used to identify electrofacies zones of the Nullipore Formation. The Nullipore Formation has been subdivided into three electrofacies zones (the Nullipore-I, Nullipore-II, and Nullipore-III) that are well-correlated with the three depositional sequences. Results of petrographical studies and log analyses data have been employed in volumetric calculations to estimate the amount of hydrocarbon-in-place and then the ultimate recovery of the Nullipore reservoir. The volumetric calculations indicate that the total volume of oil-in-place is 371 MMSTB at 50% probability (P50), whereas the total recoverable oil is 148.5 MMSTB at P50. The volumetric calculations for the Nullipore zones match the production data indicating a good simulation for the reservoir productivity through the petrophysical parameters. Comparison of the volumetric calculations of the oil and the cumulative production of the Ras Fanar Oil Field indicates remaining reserves of less than 30% of the total recoverable oil. Therefore, the search for unconventional and/or deeper reservoirs at other water contacts is recommended.
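
    For reference, the standard volumetric form behind such oil-in-place and recovery estimates is sketched below; the area, net pay, porosity, saturation, formation volume factor and recovery factor shown are hypothetical stand-ins, not the Nullipore inputs.

```python
# Volumetric method in standard oilfield units: 7758 converts acre-ft to barrels.
def oil_in_place_stb(area_acres, net_pay_ft, porosity, sw, bo):
    """Stock-tank oil initially in place, in stock-tank barrels."""
    return 7758.0 * area_acres * net_pay_ft * porosity * (1.0 - sw) / bo

ooip = oil_in_place_stb(area_acres=2500.0, net_pay_ft=120.0,
                        porosity=0.18, sw=0.25, bo=1.25)
recovery_factor = 0.40
print(f"OOIP = {ooip / 1e6:.0f} MMSTB")
print(f"Recoverable oil = {recovery_factor * ooip / 1e6:.0f} MMSTB")
```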

  15. Simulation of the mulltizones clastic reservoir: A case study of Upper Qishn Clastic Member, Masila Basin-Yemen

    NASA Astrophysics Data System (ADS)

    Khamis, Mohamed; Marta, Ebrahim Bin; Al Natifi, Ali; Fattah, Khaled Abdel; Lashin, Aref

    2017-06-01

    The Upper Qishn Clastic Member is one of the main oil-bearing reservoirs of the Masila Basin, Yemen. It produces oil from many zones with different reservoir properties. The aim of this study is to simulate and model the Qishn sandstone reservoir to provide a better understanding of its properties. The available core plug, petrophysical, PVT, pressure and production datasets, as well as the seismic structural and geologic information, are all integrated and used in the simulation process. The Eclipse simulator was used as a powerful tool for reservoir modeling. A simplified approach based on a pseudo steady-state productivity index and a material balance relationship between the aquifer pressure and the cumulative influx is applied. The petrophysical properties of the Qishn sandstone reservoir are mainly investigated based on the well logging and core plug analyses. Three reservoir zones of good hydrocarbon potential are indicated and named, from top to bottom, S1A, S1C and S2. Among these zones, the S1A zone attains the best petrophysical and reservoir quality properties. It has an average hydrocarbon saturation of more than 65%, high effective porosity of up to 20% and good permeability (66 mD). The reservoir structure is a faulted anticline in the middle of the study area, with the closure geometry diminishing downward from zone S1A to zone S2. It is limited by NE-SW and E-W bounding faults, with a weak aquifer connection from the east. The analysis of pressure and PVT data has revealed that the reservoir fluid is dead oil with a very low gas-liquid ratio (GLR). The simulation results indicate a heterogeneous reservoir associated with a weak aquifer, supported by high initial water saturation and high water cut. Initial oil in place is estimated to be around 628 MM BBL; however, the oil recovery during the period of production is very low (<10%) because of the high water cut due to the fractures associated with many faults. Hence, secondary and tertiary methods are needed to enhance the oil recovery. Water flooding is recommended as the first step of oil recovery enhancement, by converting some of the high-water-cut wells to injectors.
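
    A minimal sketch of the simplified approach mentioned above, assuming a pseudo steady-state productivity index for well inflow and a pot-aquifer material balance for water influx; the productivity index, aquifer volume, compressibility and pressures are hypothetical, not values from the Qishn model.

```python
def well_rate(J, p_avg, p_wf):
    """Pseudo steady-state inflow: q = J * (average reservoir pressure - flowing BHP)."""
    return J * (p_avg - p_wf)

def aquifer_influx(W_i, c_t, p_init, p_avg):
    """Pot-aquifer material balance: cumulative influx proportional to aquifer
    volume, total compressibility and pressure drop."""
    return W_i * c_t * (p_init - p_avg)

q = well_rate(J=4.5, p_avg=2800.0, p_wf=2200.0)          # STB/day/psi * psi
We = aquifer_influx(W_i=5.0e8, c_t=6e-6, p_init=3000.0,  # bbl, 1/psi, psi
                    p_avg=2800.0)
print(f"well rate ~ {q:.0f} STB/day, cumulative water influx ~ {We / 1e6:.2f} MMbbl")
```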

  16. Ultra-Scalable Algorithms for Large-Scale Uncertainty Quantification in Inverse Wave Propagation

    DTIC Science & Technology

    2016-03-04

    Only report fragments are available for this record. They cite N. Petra, J. Martin, G. Stadler, and O. Ghattas, "A computational framework for infinite-dimensional Bayesian inverse problems: Part II," and list project personnel including Alen Alexanderian (NC State), Tan Bui-Thanh (UT-Austin), Carsten Burstedde (University of Bonn), Noemi Petra (UC Merced), and Georg Stadler (NYU).

  17. Seismic velocity uncertainties and their effect on geothermal predictions: A case study

    NASA Astrophysics Data System (ADS)

    Rabbel, Wolfgang; Köhn, Daniel; Bahadur Motra, Hem; Niederau, Jan; Thorwart, Martin; Wuttke, Frank; Descramble Working Group

    2017-04-01

    Geothermal exploration relies in large part on geophysical subsurface models derived from seismic reflection profiling. These models are the framework of hydro-geothermal modeling, which further requires estimating thermal and hydraulic parameters to be attributed to the seismic strata. All petrophysical and structural properties involved in this process can be determined only with limited accuracy and thus impose uncertainties onto the resulting model predictions of temperature-depth profiles and hydraulic flow. In the present study we analyze sources and effects of uncertainties in the seismic velocity field, which translate directly into depth uncertainties of the hydraulically and thermally relevant horizons. Geological sources of these uncertainties are subsurface heterogeneity and seismic anisotropy; methodical sources are limitations in spread length and physical resolution. We demonstrate these effects using data from the EU Horizon 2020 project DESCRAMBLE, which investigates a shallow super-critical geothermal reservoir in the Larderello area. The study is based on 2D and 3D seismic reflection data and laboratory measurements on representative rock samples under simulated in-situ conditions. The rock samples consistently show P-wave anisotropy values on the order of 10-20%. However, the depth uncertainty induced by anisotropy is likely to be lower, depending on the accuracy with which the spatial orientation of bedding planes can be determined from the seismic reflection images.

  18. The key to commercial-scale geological CO2 sequestration: Displaced fluid management

    USGS Publications Warehouse

    Surdam, R.C.; Jiao, Z.; Stauffer, P.; Miller, T.

    2011-01-01

    The Wyoming State Geological Survey has completed a thorough inventory and prioritization of all Wyoming stratigraphic units and geologic sites capable of sequestering commercial quantities of CO2 (5-15 Mt CO2/year). This multi-year study identified the Paleozoic Tensleep/Weber Sandstone and Madison Limestone (and stratigraphic equivalent units) as the leading clastic and carbonate reservoir candidates for commercial-scale geological CO2 sequestration in Wyoming. This conclusion was based on unit thickness, overlying low permeability lithofacies, reservoir storage and continuity properties, regional distribution patterns, formation fluid chemistry characteristics, and preliminary fluid-flow modeling. This study also identified the Rock Springs Uplift in southwestern Wyoming as the most promising geological CO2 sequestration site in Wyoming and probably in any Rocky Mountain basin. The results of the WSGS CO2 geological sequestration inventory led the agency and colleagues at the UW School of Energy Resources Carbon Management Institute (CMI) to collect available geologic, petrophysical, geochemical, and geophysical data on the Rock Springs Uplift, and to build a regional 3-D geologic framework model of the Uplift. From the results of these tasks and using the FutureGen protocol, the WSGS showed that on the Rock Springs Uplift, the Weber Sandstone has sufficient pore space to sequester 18 billion tons (Gt) of CO2, and the Madison Limestone has sufficient pore space to sequester 8 Gt of CO2. © 2011 Published by Elsevier Ltd.
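
    Screening estimates of this kind typically rest on a static capacity relation (pore volume times CO2 density times a storage efficiency factor, in the style of the DOE-NETL methodology); the sketch below shows that arithmetic with hypothetical inputs, not the FutureGen-protocol values used for the Rock Springs Uplift.

```python
def co2_capacity_gt(area_km2, thickness_m, porosity, rho_co2, efficiency):
    """Static CO2 storage capacity in gigatonnes.
    rho_co2 is the CO2 density at reservoir conditions (kg/m^3)."""
    effective_pore_volume_m3 = area_km2 * 1e6 * thickness_m * porosity * efficiency
    return effective_pore_volume_m3 * rho_co2 / 1e12   # 1 Gt = 1e12 kg

# Hypothetical reservoir: 2000 km^2, 150 m thick, 12% porosity, 10% efficiency.
print(co2_capacity_gt(area_km2=2000.0, thickness_m=150.0, porosity=0.12,
                      rho_co2=650.0, efficiency=0.10), "Gt CO2")
```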

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao Hanqing; Fu Zhiguo; Lu Xiaoguang

    Guided by sedimentation theory and knowledge of modern and ancient fluvial deposition, and utilizing the abundant information on sedimentary series, microfacies types and petrophysical parameters from well logging curves of thousands of closely spaced wells over a large area, a new method for establishing detailed sedimentation and permeability distribution models for fluvial reservoirs has been developed successfully. This study addressed the geometry and internal architecture of sandbodies according to their hierarchical levels of heterogeneity, building up sedimentation and permeability distribution models of fluvial reservoirs and describing reservoir heterogeneity in light of fluvial sedimentary rules. The results and methods obtained in outcrop and modern sedimentation studies have successfully supported this work. Taking advantage of this method, the major producing layers (PI1-2), heterogeneous and thick fluvial reservoirs of wide lateral extent, are studied in detail. These layers are subdivided into single sedimentary units vertically, and the microfacies are identified horizontally. Furthermore, a hierarchy is recognized from large to small: meander belt, single channel sandbody, meander scroll, point bar, and lateral accretion bodies within point bars. The results improve the description of the areal distribution of point-bar sandbodies and provide an accurate and detailed framework model for establishing a high-resolution predictive model. Combined with geostatistical techniques, the method also plays an important role in locating enriched zones of residual oil.

  20. Increasing the feasibility of minimally invasive procedures in type A aortic dissections: a framework for segmentation and quantification.

    PubMed

    Morariu, Cosmin Adrian; Terheiden, Tobias; Dohle, Daniel Sebastian; Tsagakis, Konstantinos; Pauli, Josef

    2016-02-01

    Our goal is to provide precise measurements of the aortic dimensions in the case of dissection pathologies. Quantification of surface lengths and aortic radii/diameters, together with the visualization of the dissection membrane, represents a crucial prerequisite for enabling minimally invasive treatment of type A dissections, which always involve the ascending aorta. We seek a measure invariant to luminance and contrast for aortic outer wall segmentation. Therefore, we propose a 2D graph-based approach using phase congruency combined with additional features. Phase congruency is extended to 3D by designing a novel conic directional filter and adding a lowpass component to the 3D Log-Gabor filterbank for extracting the fine dissection membrane, which separates the true lumen from the false one within the aorta. The result of the outer wall segmentation is compared with manually annotated axial slices belonging to 11 CTA datasets. Quantitative assessment of our novel 2D/3D membrane extraction algorithms has been obtained for 10 datasets and reveals subvoxel accuracy in all cases. Aortic inner and outer surface lengths, determined within 2 cadaveric CT datasets, are validated against manual measurements performed by a vascular surgeon on the excised aortas of the body donors. This contribution proposes a complete pipeline for segmentation and quantification of aortic dissections. Validation against ground truth of the 3D contour length quantification represents a significant step toward custom-designed stent-grafts.

  1. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    PubMed

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest in reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is represented by the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy-to-use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. In fact, experimental evidence shows that the use of reference genes adds variability to the measurement system because a second independent real-time PCR-based measurement must be performed. Moreover, for some reference genes insufficient information on copy number in and among genomes of different lines is available, making adequate quantification difficult. Once developed, the method was subsequently validated according to IUPAC and ISO 5725 guidelines. Thirteen laboratories from 8 EU countries participated in the trial. Eleven laboratories provided results complying with the predefined study requirements. Repeatability (RSDr) values ranged from 8.7 to 15.9%, with a mean value of 12%. Reproducibility (RSDR) values ranged from 16.3 to 25.5%, with a mean value of 21%. Following Codex Alimentarius Committee guidelines, both the limits of detection and quantitation were determined to be <0.1%.
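
    The plasmid-standard approach amounts to absolute quantification from a standard curve: Ct values of serial plasmid dilutions are regressed against log copy number and the fit is inverted for unknown samples. The sketch below illustrates that arithmetic with hypothetical Ct values and dilution levels.

```python
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])      # plasmid copies per reaction
std_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])     # measured Ct of the standards

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0               # ~1.0 means 100% PCR efficiency

def copies_from_ct(ct):
    """Invert the standard curve Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

unknown_ct = 25.2
print(f"slope {slope:.2f}, efficiency {100 * efficiency:.0f}%")
print(f"estimated copy number: {copies_from_ct(unknown_ct):.2e}")
```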

  2. Overview of a compre­hensive resource database for the assessment of recoverable hydrocarbons produced by carbon dioxide enhanced oil recovery

    USGS Publications Warehouse

    Carolus, Marshall; Biglarbigi, Khosrow; Warwick, Peter D.; Attanasi, Emil D.; Freeman, Philip A.; Lohr, Celeste D.

    2017-10-24

    A database called the “Comprehensive Resource Database” (CRD) was prepared to support U.S. Geological Survey (USGS) assessments of technically recoverable hydrocarbons that might result from the injection of miscible or immiscible carbon dioxide (CO2) for enhanced oil recovery (EOR). The CRD was designed by INTEK Inc., a consulting company under contract to the USGS. The CRD contains data on the location, key petrophysical properties, production, and well counts (number of wells) for the major oil and gas reservoirs in onshore areas and State waters of the conterminous United States and Alaska. The CRD includes proprietary data on petrophysical properties of fields and reservoirs from the “Significant Oil and Gas Fields of the United States Database,” prepared by Nehring Associates in 2012, and proprietary production and drilling data from the “Petroleum Information Data Model Relational U.S. Well Data,” prepared by IHS Inc. in 2012. This report describes the CRD and the computer algorithms used to (1) estimate missing reservoir property values in the Nehring Associates (2012) database, and to (2) generate values of additional properties used to characterize reservoirs suitable for miscible or immiscible CO2 flooding for EOR. Because of the proprietary nature of the data and contractual obligations, the CRD and actual data from Nehring Associates (2012) and IHS Inc. (2012) cannot be presented in this report.

  3. Against the grain: The physical properties of anisotropic partially molten rocks

    NASA Astrophysics Data System (ADS)

    Ghanbarzadeh, S.; Hesse, M. A.; Prodanovic, M.

    2014-12-01

    Partially molten rocks commonly develop textures that appear close to textural equilibrium, where the melt network evolves to minimize the energy of the melt-solid interfaces, while maintaining the dihedral angle θ at solid-solid-melt contact lines. Textural equilibrium provides a powerful model for the melt distribution that controls the petrophysical properties of partially molten rocks, e.g., permeability, elastic moduli, and electrical resistivity. We present the first level-set computations of three-dimensional texturally equilibrated melt networks in rocks with an anisotropic fabric. Our results show that anisotropy induces wetting of smaller grain boundary faces for θ > 0 at realistic porosities ϕ < 3%. This was previously not thought to be possible at textural equilibrium and reconciles the theory with experimental observations. Wetting of the grain boundary faces leads to a dramatic redistribution of the melt from the edges to the faces that introduces strong anisotropy in petrophysical properties such as permeability, effective electrical conductivity and mechanical properties. The accompanying figure (left panel) shows that smaller grain boundaries become wetted at relatively low melt fractions of 3% in stretched polyhedral grains with an elongation factor of 1.5; the right panel shows the ratio of melt electrical conductivity to the effective conductivity of the medium (the formation factor) as an example of anisotropy in physical properties, demonstrating that even slight grain anisotropy induces considerable anisotropy in electrical properties.

  4. Petrophysics of low-permeability medina sandstone, northwestern Pennsylvania, Appalachian Basin

    USGS Publications Warehouse

    Castle, J.W.; Byrnes, A.P.

    1998-01-01

    Petrophysical core testing combined with geophysical log analysis of low-permeability, Lower Silurian sandstones of the Appalachian basin provides guidelines and equations for predicting gas producibility. Permeability values are predictable from the borehole logs by applying empirically derived equations based on the correlation between in-situ porosity and in-situ effective gas permeability. An Archie-form equation provides reasonable accuracy of log-derived water saturations because of saturated brine salinities and low clay content in the sands. Although measured porosity and permeability average less than 6% and 0.1 mD, infrequent values as high as 18% and 1,048 mD occur. Values of effective gas permeability at irreducible water saturation (Swi) range from 60% to 99% of routine values for the highest permeability rocks to several orders of magnitude less for the lowest permeability rocks. Sandstones having porosity greater than 6% and effective gas permeability greater than 0.01 mD exhibit Swi less than 20%. With decreasing porosity, Swi sharply increases to values near 40% at 3% porosity. Analysis of cumulative storage and flow capacity indicates that zones with porosity greater than 6% generally contain over 90% of flow capacity and hold a major portion of storage capacity. For rocks with Swi < 20%, gas relative permeabilities exceed 45%. Gas relative permeability and hydrocarbon volume decrease rapidly with increasing Swi as porosity drops below 6%.
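
    To illustrate the kind of relations described (an Archie-form saturation equation and an empirical porosity-permeability transform), the sketch below uses the generic Archie equation and a placeholder log-linear transform; the coefficients and log readings are hypothetical stand-ins, not the published Medina equations.

```python
import numpy as np

def archie_sw(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Archie-form water saturation from deep resistivity rt, brine resistivity rw
    and fractional porosity phi. a, m, n are the usual Archie constants."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def permeability_md(phi, b0=-4.0, b1=0.6):
    """Illustrative porosity-permeability transform of the empirical form often
    used for tight sandstones: log10(k) = b0 + b1 * porosity(%).
    Coefficients are hypothetical placeholders."""
    return 10 ** (b0 + b1 * phi * 100.0)

phi = np.array([0.03, 0.06, 0.10])       # fractional porosity
rt = np.array([200.0, 80.0, 30.0])       # ohm.m, hypothetical deep resistivity
print("Sw    :", archie_sw(rt, rw=0.04, phi=phi))
print("k (mD):", permeability_md(phi))
```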

  5. Numerical simulations of groundwater flow at New Jersey Shallow Shelf

    NASA Astrophysics Data System (ADS)

    Fehr, Annick; Patterson, Fabian; Lofi, Johanna; Reiche, Sönke

    2016-04-01

    During IODP Expedition 313, three boreholes were drilled along the so-called New Jersey transect. Hydrochemical studies revealed that the groundwater situation is more complex than expected, being characterized by several sharp boundaries between fresh and saline groundwater. Two conflicting hypotheses regarding the nature of these freshwater reservoirs are currently debated. One hypothesis is that these reservoirs are connected with onshore aquifers and continuously recharged by seaward-flowing groundwater. The second hypothesis is that the fresh groundwater was emplaced during the last glacial period. In addition to the petrophysical properties measured during IODP Expedition 313, Nuclear Magnetic Resonance (NMR) measurements were performed on samples from boreholes M0027, M0028 and M0029 in order to deduce porosities and permeabilities. These results are compared with data from alternative laboratory measurements and with petrophysical properties inferred from downhole logging data. We incorporate these results into a 2D numerical model that reflects the shelf architecture as known from drillings and seismic data to perform submarine groundwater flow simulations. In order to account for uncertainties related to the spatial distribution of physical properties, such as porosity and permeability, systematic variation of input parameters was performed during simulation runs. The target is to test the two conflicting hypotheses of fresh groundwater emplacement offshore New Jersey and to improve the understanding of fluid flow processes at marine passive margins.

  6. Effects of water saturation on P-wave propagation in fractured coals: An experimental perspective

    NASA Astrophysics Data System (ADS)

    Liu, Jie; Liu, Dameng; Cai, Yidong; Gan, Quan; Yao, Yanbin

    2017-09-01

    The internal structure of coalbed methane (CBM) reservoirs can be evaluated through ultrasonic measurements. The compressional wave that propagates in a fractured coal reservoir may indicate the internal coal structure and fluid characteristics. P-wave propagation was used to study the relations between petrophysical parameters of coals (including water saturation, fractures, porosity and permeability) and the P-wave velocity (Vp), using a KON-NM-4A ultrasonic velocity meter. In this study, the relations between Vp and water saturation were established: Type I is mainly controlled by capillarity of the developed seepage pores, while the controlling factors on Type II and Type III are the internal homogeneity of pores/fractures and the developed micro-fractures, respectively. Micro-fracture density correlates linearly with Vp owing to the fracture volume and P-wave dispersion, and micro-fractures of types C and D exert the dominant influence on Vp. For dry coals, no clear relation exists between porosity, permeability and Vp. However, for water-saturated coals, the correlation coefficients of porosity, permeability and Vp are slightly improved. The Vp of saturated coals can be predicted with the equation Vp,saturated = 1.4952 Vp,dry − 26.742 m/s. The relation between petrophysical parameters of coals and Vp under various water saturations can be used to evaluate the internal structure of fractured coals. Therefore, these relations have significant implications for coalbed methane (CBM) exploration.

  7. Making the most of CZ seismics: Improving shallow critical zone characterization using surface-wave analysis

    NASA Astrophysics Data System (ADS)

    Pasquet, S.; Wang, W.; Holbrook, W. S.; Bodet, L.; Carr, B.; Flinchum, B. A.

    2017-12-01

    Estimating porosity and saturation in the shallow subsurface over large lateral scales is vitally important for understanding the development and evolution of the Critical Zone (CZ). Because elastic properties (P- and S-wave velocities) are particularly sensitive to porosity and saturation, seismic methods (in combination with petrophysical models) are effective tools for mapping CZ architecture and processes. While many studies employ P-wave refraction methods, fewer use the surface waves that are typically also recorded in those same surveys. Here we show the value of exploiting surface waves to extract supplementary shear-wave velocity (Vs) information in the CZ. We use a new, user-friendly, open-source MATLAB-based package (SWIP) to invert surface-wave data and estimate lateral variations of Vs in the CZ. Results from synthetics show that this approach enables the resolution of physical property variations in the upper 10-15 m below the surface with lateral scales of about 5 m - a vast improvement compared to P-wave tomography alone. A field example at a Yellowstone hydrothermal system also demonstrates the benefits of including Vs in the petrophysical models to estimate not only porosity but also saturation, thus highlighting subsurface gas pathways. In light of these results, we strongly suggest that surface-wave analysis should become a standard approach in CZ seismic surveys.

  8. A network model for characterizing brine channels in sea ice

    NASA Astrophysics Data System (ADS)

    Lieblappen, Ross M.; Kumar, Deip D.; Pauls, Scott D.; Obbard, Rachel W.

    2018-03-01

    The brine pore space in sea ice can form complex connected structures whose geometry is critical in the governance of important physical transport processes between the ocean, sea ice, and surface. Recent advances in three-dimensional imaging using X-ray micro-computed tomography have enabled the visualization and quantification of the brine network morphology and variability. Using imaging of first-year sea ice samples at in situ temperatures, we create a new mathematical network model to characterize the topology and connectivity of the brine channels. This model provides a statistical framework where we can characterize the pore networks via two parameters, depth and temperature, for use in dynamical sea ice models. Our approach advances the quantification of brine connectivity in sea ice, which can help investigations of bulk physical properties, such as fluid permeability, that are key in both global and regional sea ice models.

  9. AGScan: a pluggable microarray image quantification software based on the ImageJ library.

    PubMed

    Cathelin, R; Lopez, F; Klopp, Ch

    2007-01-15

    Many different programs are available to analyze microarray images. Most programs are commercial packages; some are free. In the latter group, only a few offer automatic grid alignment and a batch mode. More often than not, a program implements only one quantification algorithm. AGScan is an open source program that works on all major platforms. It is based on the ImageJ library [Rasband (1997-2006)] and offers a plug-in extension system to add new functions to manipulate images, align grids and quantify spots. It is appropriate for daily laboratory use and also as a framework for new algorithms. The program is freely distributed under the X11 Licence. The installation instructions can be found in the user manual. The software can be downloaded from http://mulcyber.toulouse.inra.fr/projects/agscan/. Questions and plug-ins can be sent to the contact listed below.

  10. Valency-Controlled Framework Nucleic Acid Signal Amplifiers.

    PubMed

    Liu, Qi; Ge, Zhilei; Mao, Xiuhai; Zhou, Guobao; Zuo, Xiaolei; Shen, Juwen; Shi, Jiye; Li, Jiang; Wang, Lihua; Chen, Xiaoqing; Fan, Chunhai

    2018-06-11

    Weak ligand-receptor recognition events are often amplified by recruiting multiple regulatory biomolecules to the action site in biological systems. However, signal amplification in in vitro biomimetic systems generally lacks the spatiotemporal regulation found in vivo. Herein we report a framework nucleic acid (FNA)-programmed strategy to develop valence-controlled signal amplifiers with high modularity for ultrasensitive biosensing. We demonstrated that the FNA-programmed signal amplifiers could recruit nucleic acids, proteins, and inorganic nanoparticles in a stoichiometric manner. The valence-controlled signal amplifier enhanced the quantification ability of electrochemical biosensors and enabled ultrasensitive detection of tumor-relevant circulating free DNA (cfDNA) with a sensitivity enhancement of 3-5 orders of magnitude and an improved dynamic range. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Quantifying drivers of wild pig movement across multiple spatial and temporal scales

    USGS Publications Warehouse

    Kay, Shannon L.; Fischer, Justin W.; Monaghan, Andrew J.; Beasley, James C; Boughton, Raoul; Campbell, Tyler A; Cooper, Susan M; Ditchkoff, Stephen S.; Hartley, Stephen B.; Kilgo, John C; Wisely, Samantha M; Wyckoff, A Christy; Vercauteren, Kurt C.; Pipen, Kim M

    2017-01-01

    The analytical framework we present can be used to assess movement patterns arising from multiple data sources for a range of species while accounting for spatio-temporal correlations. Our analyses show the magnitude by which reaction norms can change based on the temporal scale of response data, illustrating the importance of appropriately defining temporal scales of both the movement response and covariates depending on the intended implications of research (e.g., predicting effects of movement due to climate change versus planning local-scale management). We argue that consideration of multiple spatial scales within the same framework (rather than comparing across separate studies post-hoc) gives a more accurate quantification of cross-scale spatial effects by appropriately accounting for error correlation.

  12. An explainable deep machine vision framework for plant stress phenotyping.

    PubMed

    Ghosal, Sambuddha; Blystone, David; Singh, Asheesh K; Ganapathysubramanian, Baskar; Singh, Arti; Sarkar, Soumik

    2018-05-01

    Current approaches for accurate identification, classification, and quantification of biotic and abiotic stresses in crop research and production are predominantly visual and require specialized training. However, such techniques are hindered by subjectivity resulting from inter- and intrarater cognitive variability. This translates to erroneous decisions and a significant waste of resources. Here, we demonstrate a machine learning framework's ability to identify and classify a diverse set of foliar stresses in soybean [Glycine max (L.) Merr.] with remarkable accuracy. We also present an explanation mechanism, using the top-K high-resolution feature maps that isolate the visual symptoms used to make predictions. This unsupervised identification of visual symptoms provides a quantitative measure of stress severity, allowing for identification (type of foliar stress), classification (low, medium, or high stress), and quantification (stress severity) in a single framework without detailed symptom annotation by experts. We reliably identified and classified several biotic (bacterial and fungal diseases) and abiotic (chemical injury and nutrient deficiency) stresses by learning from over 25,000 images. The learned model is robust to input image perturbations, demonstrating viability for high-throughput deployment. The model also appears to be agnostic to species, suggesting a capacity for transfer learning. The availability of an explainable model that can consistently, rapidly, and accurately identify and quantify foliar stresses would have significant implications in scientific research, plant breeding, and crop production. The trained model could be deployed in mobile platforms (e.g., unmanned air vehicles and automated ground scouts) for rapid, large-scale scouting or as a mobile application for real-time detection of stress by farmers and researchers. Copyright © 2018 the Author(s). Published by PNAS.

  13. A framework for intelligent data acquisition and real-time database searching for shotgun proteomics.

    PubMed

    Graumann, Johannes; Scheltema, Richard A; Zhang, Yong; Cox, Jürgen; Mann, Matthias

    2012-03-01

    In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides "on-the-fly" within 30 ms, well within the time constraints of a shotgun fragmentation "topN" method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available.

  14. Study the effect of reservoir spatial heterogeneity on CO2 sequestration under an uncertainty quantification (UQ) software framework

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.

    2011-12-01

    In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, with a focus on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas and approaches: (1) we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the required forward calculations while exploring the parameter space and quantifying the input uncertainty; (2) we use eSTOMP as the forward modeling simulator. eSTOMP is implemented using the Global Arrays toolkit (GA), which is based on one-sided inter-processor communication and supports a shared-memory programming style on distributed-memory platforms, providing highly scalable performance. It uses a data model to partition most of the large-scale data structures into a relatively small number of distinct classes, and the lower-level simulator infrastructure (e.g., meshing support, associated data structures, and data mapping to processors) is separated from the higher-level physics and chemistry algorithmic routines through a grid component interface; and (3) beyond the faster model and more efficient algorithms that speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate the software packages and data needed for carbon sequestration simulation, computation, analysis, estimation and visualization. We demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.
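
    The study's own tools (SGSIM, eSTOMP) are not reproduced here; as a rough, hypothetical illustration of what "realizations of permeability fields with various spatial statistical attributes" means in practice, the Python sketch below draws correlated log-permeability realizations from a Gaussian random field with an exponential covariance. All grid sizes, means, and correlation lengths are invented placeholders.

      import numpy as np

      def lognormal_perm_realizations(n_cells=50, dx=10.0, corr_len=40.0,
                                      mean_logk=-13.0, sigma_logk=1.0,
                                      n_real=100, seed=0):
          """Draw 1-D log-permeability realizations with exponential covariance.

          A stand-in for geostatistical simulators such as SGSIM: it honours a
          target mean, variance and correlation length, but not conditioning data.
          """
          rng = np.random.default_rng(seed)
          x = np.arange(n_cells) * dx
          lags = np.abs(x[:, None] - x[None, :])
          cov = sigma_logk**2 * np.exp(-lags / corr_len)      # exponential covariance
          L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_cells))
          z = rng.standard_normal((n_cells, n_real))
          logk = mean_logk + L @ z                            # correlated Gaussian fields
          return 10.0**logk                                   # permeability in m^2

      perm = lognormal_perm_realizations()
      print(perm.shape, perm.mean())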

  15. A Framework for Intelligent Data Acquisition and Real-Time Database Searching for Shotgun Proteomics*

    PubMed Central

    Graumann, Johannes; Scheltema, Richard A.; Zhang, Yong; Cox, Jürgen; Mann, Matthias

    2012-01-01

    In the analysis of complex peptide mixtures by MS-based proteomics, many more peptides elute at any given time than can be identified and quantified by the mass spectrometer. This makes it desirable to optimally allocate peptide sequencing and narrow mass range quantification events. In computer science, intelligent agents are frequently used to make autonomous decisions in complex environments. Here we develop and describe a framework for intelligent data acquisition and real-time database searching and showcase selected examples. The intelligent agent is implemented in the MaxQuant computational proteomics environment, termed MaxQuant Real-Time. It analyzes data as it is acquired on the mass spectrometer, constructs isotope patterns and SILAC pair information as well as controls MS and tandem MS events based on real-time and prior MS data or external knowledge. Re-implementing a top10 method in the intelligent agent yields similar performance to the data dependent methods running on the mass spectrometer itself. We demonstrate the capabilities of MaxQuant Real-Time by creating a real-time search engine capable of identifying peptides “on-the-fly” within 30 ms, well within the time constraints of a shotgun fragmentation “topN” method. The agent can focus sequencing events onto peptides of specific interest, such as those originating from a specific gene ontology (GO) term, or peptides that are likely modified versions of already identified peptides. Finally, we demonstrate enhanced quantification of SILAC pairs whose ratios were poorly defined in survey spectra. MaxQuant Real-Time is flexible and can be applied to a large number of scenarios that would benefit from intelligent, directed data acquisition. Our framework should be especially useful for new instrument types, such as the quadrupole-Orbitrap, that are currently becoming available. PMID:22171319

  16. The early diagenetic and PETROphysical behaviour of recent cold-water CARbonate mounds in Deep Environments (PETROCARDE)

    NASA Astrophysics Data System (ADS)

    Foubert, Anneleen; Pirlet, Hans; Thierens, Mieke; de Mol, Ben; Henriet, Jean-Pierre; Swennen, Rudy

    2010-05-01

    Sub-recent cold-water carbonate mounds localized in deeper slope settings on the Atlantic continental margins can no longer be neglected in the study of carbonate systems. They clearly play a major role in the dynamics of mixed siliciclastic-carbonate and/or carbonate-dominated continental slopes. Carbonate accumulation rates of cold-water carbonate mounds are about 4 to 12% of those of tropical shallow-water reefs but exceed the carbonate accumulation rates of their slope settings by a factor of 4 to 12 (Titschack et al., 2009). These findings emphasize the importance of these carbonate factories as carbonate niches on the continental margins. The primary environmental architecture of such carbonate bodies is well characterized. However, despite proven evidence of early diagenesis overprinting the primary environmental record (e.g. aragonite dissolution) (Foubert & Henriet, 2009), the extent of the early diagenetic and biogeochemical processes shaping the petrophysical nature of mounds is not yet fully understood. Understanding (1) the functioning of a carbonate mound as a biogeochemical reactor triggering early diagenetic processes and (2) the impact of early diagenesis on the petrophysical behaviour of a carbonate mound in space and through time is vital for the reliable prediction of potential late diagenetic processes. Approaching the fossil carbonate mound record through a profound study of recent carbonate bodies is innovative and will help to better understand processes observed in the fossil mound record (such as cementation, brecciation and fracturing). In this study, the 155-m high Challenger mound (Porcupine Seabight, SW of Ireland), drilled during IODP Expedition 307 aboard the R/V Joides Resolution (Foubert & Henriet, 2009), and mounds from the Gulf of Cadiz (Moroccan margin) are discussed in terms of early diagenetic processes and petrophysical behaviour. Early differential diagenesis overprints the primary environmental signals in Challenger mound, with extensive coral dissolution and the genesis of small-scale semi-lithified layers in the Ca-rich intervals. The low cementation rates compared to the extensive dissolution patterns can be explained by an open-system diagenetic model. Moreover, Pirlet et al. (2009) emphasize the occurrence of gypsum and dolomite in another mound system (Mound Perseverance) in the Porcupine Seabight, which may also be related to fluid oxidation events in a semi-open diagenetic system. Along the Moroccan margins, fluid seepage and pore-water fluxes affect the development of mound structures, enhancing extensive cold-water coral dissolution and the precipitation of diagenetic minerals such as dolomite, calcite and pyrite (Foubert et al., 2008). Recent carbonate mounds thus provide an excellent opportunity to study early diagenetic processes in carbonate systems without the complications of burial and/or later meteoric diagenesis. References: Foubert, A. and Henriet, J.P. (2009) Nature and Significance of the Recent Carbonate Mound Record: The Mound Challenger Code. Lecture Notes in Earth Sciences, Vol. 126. Springer, 298 pp. ISBN: 978-3-642-00289-2. Pirlet, H., Wehrmann, L., Brunner, B., Frank, N., Dewanckele, J., Van Rooij, D., Foubert, A., Swennen, R., Naudts, L., Boone, M., Cnudde, V. and Henriet, J.P. (2009) Diagenetic formation of gypsum and dolomite in a cold-water coral mound in the Porcupine Seabight, off Ireland. Sedimentology, doi: 10.1111/j.1365-3091.2009.01119.x. Titschack, J., Thierens, M., Dorschel, B., Schulbert, C., Freiwald, A., Kano, A., Takashima, C., Kawagoe, N., Li, X. and the IODP Expedition 307 Scientific Party (2009) Carbonate budget of a cold-water coral mound (Challenger Mound, IODP Exp. 307). Marine Geology, 259, 36-46.

  17. Investigating the potential of metal-organic framework material as an adsorbent for matrix solid-phase dispersion extraction of pesticides during analysis of dehydrated Hyptis pectinata medicinal plant by GC/MS.

    PubMed

    Aquino, Adriano; Ferreira, Jordana Alves; Navickiene, Sandro; Wanderley, Kaline A; de Sá, Gilberto F; Júnior, Severino A

    2012-01-01

    The metal-organic frameworks aluminum terephthalate (MIL-53) and Cu-benzene-1,3,5-tricarboxylate (Cu-BTC) were tested for extraction of pyrimethanil, ametryn, dichlofluanid, tetraconazole, flumetralin, kresoxim-methyl, and tebuconazole from the medicinal plant Hyptis pectinata, with analysis by GC/MS in the selected ion monitoring mode. Experiments carried out at different fortification levels (0.1, 0.5, and 1.0 microg/g) resulted in recoveries in the range 61 to 107%, with RSD values between 3 and 12%, for the metal-organic framework materials. Detection and quantification limits ranged from 0.02 to 0.07 and 0.05 to 0.1 microg/g, respectively, for the different pesticides studied. The method was linear over the range tested (0.04-20.0 microg/g), with correlation coefficients ranging from 0.9987 to 0.9998. Comparison of MIL-53 and Cu-BTC with C18-bonded silica showed good performance of the MIL-53 metal-organic framework as a sorbent for the pesticides tested.

  18. Noise Propagation and Uncertainty Quantification in Hybrid Multiphysics Models: Initiation and Reaction Propagation in Energetic Materials

    DTIC Science & Technology

    2016-05-23

    Two needs motivate this work: (i) the lack of a general model for heterogeneous granular media under compaction and (ii) the lack of a reliable multiscale discrete-to-continuum framework for ... dynamics. The models developed include a continuum-discrete model of heat dissipation/diffusion and a continuum-discrete model of compaction of a granular material with ...

  19. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.

  20. Information theoretic quantification of diagnostic uncertainty.

    PubMed

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
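
    As a minimal numerical companion to the ideas summarized above (not code from the paper), the Python sketch below applies Bayes' rule to update a pre-test probability with a test result and reports the binary Shannon entropy before and after, one simple information-theoretic measure of diagnostic uncertainty; the sensitivity, specificity, and prior are invented values.

      import math

      def post_test_probability(prior, sensitivity, specificity, positive=True):
          """Bayes' rule for a binary test result."""
          if positive:
              num = sensitivity * prior
              den = num + (1.0 - specificity) * (1.0 - prior)
          else:
              num = (1.0 - sensitivity) * prior
              den = num + specificity * (1.0 - prior)
          return num / den

      def binary_entropy(p):
          """Shannon entropy (bits) of a probability p of disease."""
          if p in (0.0, 1.0):
              return 0.0
          return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

      prior = 0.20                       # hypothetical pre-test probability
      post = post_test_probability(prior, sensitivity=0.90, specificity=0.85)
      print(f"post-test probability: {post:.3f}")
      print(f"uncertainty before: {binary_entropy(prior):.3f} bits, "
            f"after: {binary_entropy(post):.3f} bits")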

  1. A novel framework for the local extraction of extra-axial cerebrospinal fluid from MR brain images

    NASA Astrophysics Data System (ADS)

    Mostapha, Mahmoud; Shen, Mark D.; Kim, SunHyung; Swanson, Meghan; Collins, D. Louis; Fonov, Vladimir; Gerig, Guido; Piven, Joseph; Styner, Martin A.

    2018-03-01

    The quantification of cerebrospinal fluid (CSF) in the human brain has been shown to play an important role in early postnatal brain development. Extra-axial fluid (EA-CSF), which comprises the CSF in the subarachnoid space, is promising for the early detection of children at risk for neurodevelopmental disorders. Currently, though, there is no tool to extract local EA-CSF measurements in a way that is suitable for localized analysis. In this paper, we propose a novel framework for the localized, cortical-surface-based analysis of EA-CSF. In our proposed processing, we combine probabilistic brain tissue segmentation, cortical surface reconstruction, and streamline-based local EA-CSF quantification. For streamline computation, we employ the vector field generated by solving a Laplacian partial differential equation (PDE) between the cortical surface and the outer CSF hull. To achieve sub-voxel accuracy while minimizing numerical errors, fourth-order Runge-Kutta (RK4) integration is used to generate the streamlines. Finally, the local EA-CSF is computed by integrating the CSF probability along the generated streamlines. The proposed local EA-CSF extraction tool was used to study early postnatal brain development in typically developing infants. The results show that the proposed localized EA-CSF extraction pipeline can produce statistically significant regions that are not observed with previous global approaches.
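
    The streamline step described above (RK4 integration through the Laplacian-derived vector field, followed by integration of the CSF probability along each streamline) can be sketched generically as follows; the interpolators, step size, and stopping rule are simplifying assumptions rather than the authors' implementation.

      import numpy as np
      from scipy.interpolate import RegularGridInterpolator

      def rk4_streamline(start, vx, vy, vz, step=0.25, n_steps=50):
          """Trace a streamline through a 3-D vector field with classical RK4.

          vx, vy, vz are scalar interpolators, one per velocity component.
          """
          def v(p):
              vec = np.array([float(vx(p)), float(vy(p)), float(vz(p))])
              norm = np.linalg.norm(vec)
              return vec / norm if norm > 0 else vec
          pts = [np.asarray(start, dtype=float)]
          for _ in range(n_steps):
              p = pts[-1]
              k1 = v(p)
              k2 = v(p + 0.5 * step * k1)
              k3 = v(p + 0.5 * step * k2)
              k4 = v(p + step * k3)
              pts.append(p + step / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
          return np.array(pts)

      def local_ea_csf(streamline, csf_prob, step=0.25):
          """Integrate a CSF probability map along the streamline (trapezoidal rule)."""
          probs = np.array([float(csf_prob(p)) for p in streamline])
          return float(np.sum(0.5 * (probs[1:] + probs[:-1])) * step)

      # Hypothetical usage: a synthetic field pointing along +x and a random CSF map.
      grid = [np.arange(32.0)] * 3
      ones, zeros = np.ones((32, 32, 32)), np.zeros((32, 32, 32))
      vx, vy, vz = (RegularGridInterpolator(grid, a) for a in (ones, zeros, zeros))
      csf = RegularGridInterpolator(grid, np.random.default_rng(0).random((32, 32, 32)))
      line = rk4_streamline((5.0, 5.0, 5.0), vx, vy, vz)
      print(local_ea_csf(line, csf))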

  2. Ecosystem services in urban water investment.

    PubMed

    Kandulu, John M; Connor, Jeffery D; MacDonald, Darla Hatton

    2014-12-01

    Increasingly, water agencies and utilities have an obligation to consider the broad environmental impacts associated with investments. To aid in understanding water cycle interdependencies when making urban water supply investment decisions, an ecosystem services typology was augmented with the concept of integrated water resources management. This framework is applied to stormwater harvesting in a case study catchment in Adelaide, South Australia. Results show that this methodological framework can effectively facilitate systematic consideration and quantitative assessment of the broad environmental impacts of water supply investments. Five ecosystem service impacts were quantified: (1) provision of urban recreational amenity, (2) regulation of coastal water quality, (3) regulation of salinity, (4) regulation of greenhouse gas emissions, and (5) support of estuarine habitats. This study shows that ignoring broad environmental impacts can underestimate the ecosystem service benefits of water supply investments by up to A$1.36/kL, or three times the operation and maintenance cost of stormwater harvesting. Rigorous assessment of the public welfare impacts of water infrastructure investments is required to guide long-term optimal water supply investment decisions. Numerous challenges remain in the quantification of the broad environmental impacts of a water supply investment, including a lack of peer-reviewed studies of environmental impacts, aggregation of incommensurable impacts, potential for double-counting errors, uncertainties in available impact estimates, and how to determine the most suitable quantification technique. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Liange; Rutqvist, Jonny; Kim, Kunhwi

    The focus of research within the UFD Campaign is on repository-induced interactions that may affect the key safety characteristics of an argillaceous rock. These include thermal-hydrological-mechanical-chemical (THMC) process interactions that occur as a result of repository construction and waste emplacement. Key questions addressed in this report include the development of fracturing in the excavation damaged zone (EDZ) and THMC effects on the mineralogy and petrophysical characteristics of the near-field argillaceous rock and buffer, particularly the impacts of the temperature rise induced by waste heat.

  4. Working Smarter Not Harder - Developing a Virtual Subsurface Data Framework for U.S. Energy R&D

    NASA Astrophysics Data System (ADS)

    Rose, K.; Baker, D.; Bauer, J.; Dehlin, M.; Jones, T. J.; Rowan, C.

    2017-12-01

    The data revolution has resulted in a proliferation of resources that span beyond commercial and social networking domains. Research, scientific, and engineering data resources, including subsurface characterization, modeling, and analytical datasets, are increasingly available through online portals, warehouses, and systems. Data for subsurface systems are still challenging to access, discontinuous, and variable in resolution. However, with the proliferation of online data there are significant opportunities to advance access to, and knowledge of, subsurface systems. The Energy Data eXchange (EDX) is an online platform designed to address research data needs by improving access to energy R&D products through advanced search capabilities. In addition, EDX hosts private, virtualized computational workspaces in support of multi-organizational R&D. These collaborative workspaces allow teams to share working data resources and connect to a growing number of analytical tools to support research efforts. One recent application, a team digital data notebook tool called DataBook, was introduced within EDX workspaces to allow teams to capture contextual and structured data resources. Starting with DOE's subsurface R&D community, the EDX team has been developing DataBook to support scientists and engineers working on subsurface energy research, allowing them to contribute and curate both structured and unstructured data and knowledge about subsurface systems. These resources span petrophysical, geologic, engineering, and geophysical data, as well as interpretations, models, and analyses associated with carbon storage, water, oil, gas, geothermal, induced seismicity and other subsurface systems, and support the development of a virtual subsurface data framework. The integration of EDX and DataBook allows the two systems to leverage each other's best features, such as the ability to interact with other systems (Earthcube, OpenEI.net, NGDS, etc.) and to apply custom machine learning algorithms that enhance the user experience and make access and connection to relevant subsurface data resources more efficient for research teams to use, analyze, and draw insights from. Ultimately, the public and private resources in EDX seek to make subsurface energy research more efficient, reduce redundancy, and drive innovation.

  5. Inferring lateral density variations in Great Geneva Basin, western Switzerland from wells and gravity data.

    NASA Astrophysics Data System (ADS)

    Carrier, Aurore; Lupi, Matteo; Clerc, Nicolas; Rusillon, Elme; Do Couto, Damien

    2017-04-01

    In the framework of sustainable energy development, Switzerland supports the growth of renewable energies. SIG (Services Industriels de Genève) and the Canton of Geneva intend to develop the use of hydrothermal energy in western Switzerland. As a sedimentary basin formed during the Mesozoic, the Great Geneva Basin (GGB) shares geological and petrophysical similarities with the Munich area (Bavaria, Germany) and the Paris Basin (France), both of which already provide significant amounts of geothermal energy for district heating. The prospection phase was launched in 2014 by SIG and aims at identifying relevant geological units and defining their geometries. Lower Cretaceous and Tertiary geological units have first been targeted as potential layers. At the depths at which these units are found, and assuming a normal geothermal gradient, low-enthalpy geothermal resources are expected. In this framework, our study aims at constraining and refining the lateral and vertical heterogeneities of the Quaternary to Cretaceous sedimentary layers in the GGB. A linear velocity law is inverted at the wells and then interpolated to the whole basin for each geological layer. Using time picks from the available data and Quaternary information from previous studies, time-to-depth conversion is performed and a thickness map of each geological unit is produced. Tertiary thickness ranges from 0 m at the NW border of the GGB, at the foothill of the Jura Mountains, to 3000 m in the SE of the GGB, at the border with the French Alps. These observations are consistent with field and well observations. The produced thickness maps will be used as geometric support for gravity data inversion and the subsequent estimation of lateral density variations. Unconstrained and a priori constrained inversions have been performed in the GGB using Gauss-Newton algorithms. Velocity-density relationships will then enable refinement of the velocity law interpolation. Our procedure allowed us to reduce the uncertainty on key target formations and represents an important step towards the development of geothermal energy in the Great Geneva Basin.
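
    The Gauss-Newton inversion mentioned above is a standard iterative scheme; the Python sketch below shows a generic damped Gauss-Newton loop on a toy linear "gravity" kernel (an invented stand-in, not the study's forward model or data).

      import numpy as np

      def gauss_newton(forward, jacobian, d_obs, m0, damping=1e-2, n_iter=10):
          """Damped Gauss-Newton iterations for a small nonlinear inverse problem.

          forward(m)  -> predicted data vector
          jacobian(m) -> sensitivity matrix d(forward)/d(m)
          """
          m = np.asarray(m0, dtype=float)
          for _ in range(n_iter):
              r = d_obs - forward(m)                       # data residual
              J = jacobian(m)
              lhs = J.T @ J + damping * np.eye(m.size)     # damped normal equations
              m = m + np.linalg.solve(lhs, J.T @ r)
          return m

      # Hypothetical example: recover two density contrasts from four "gravity"
      # readings produced by a made-up linear kernel.
      G = np.array([[1.0, 0.2], [0.8, 0.4], [0.5, 0.7], [0.2, 1.0]])
      true_m = np.array([150.0, -80.0])                    # kg/m^3 density contrasts
      d_obs = G @ true_m
      m_est = gauss_newton(lambda m: G @ m, lambda m: G, d_obs, m0=np.zeros(2))
      print(m_est)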

  6. Provenance, diagenesis, tectonic setting and reservoir quality of the sandstones of the Kareem Formation, Gulf of Suez, Egypt

    NASA Astrophysics Data System (ADS)

    Zaid, Samir M.

    2013-09-01

    The Middle Miocene Kareem sandstones are important oil reservoirs in the southwestern part of the Gulf of Suez basin, Egypt. However, their diagenesis and provenance, and their impact on reservoir quality, are virtually unknown. Samples from the Zeit Bay Oil Field and the East Zeit Oil Field, representing the Lower Kareem (Rahmi Member) and the Upper Kareem (Shagar Member), were studied using a combination of petrographic, mineralogical and geochemical techniques. The Lower Rahmi sandstones have an average framework composition of Q95F3.4R1.6, and 90% of the quartz grains are monocrystalline. By contrast, the Upper Shagar sandstones are only slightly less quartzose, with an average framework composition of Q76F21R3, and 82% of the quartz grains are monocrystalline. The Kareem sandstones are mostly quartzarenite with subordinate subarkose and arkose. Petrographical and geochemical data indicate that the sandstones were derived mainly from granitic and metamorphic terrains, with a subordinate contribution from quartzose recycled sedimentary rocks, and were deposited on a passive continental margin of a syn-rift basin. The sandstones of the Kareem Formation show an upward decrease in maturity. Petrographic study revealed that dolomite is the dominant cement; it generally occurs as fine to medium rhombs in a pore-occluding phase and locally as a grain-replacive phase. Authigenic quartz occurs as small euhedral crystals and locally as large pyramidal crystals in the primary pores. Authigenic anhydrite typically occurs as poikilotopic rhombs or elongate laths infilling pores, but also as vein-filling cement. The kaolinite is a by-product of feldspar leaching in the presence of acidic fluids produced during the maturation of organic matter in the adjacent Miocene rocks. Diagenetic features include compaction; dolomite, silica and anhydrite cementation with minor iron-oxide, illite, kaolinite and pyrite cements; and dissolution of feldspars and rock fragments. Silica dissolution, grain replacement and carbonate dissolution greatly enhance the petrophysical properties of many sandstone samples.

  7. Application of information-theoretic measures to quantitative analysis of immunofluorescent microscope imaging.

    PubMed

    Shutin, Dmitriy; Zlobinskaya, Olga

    2010-02-01

    The goal of this contribution is to apply model-based information-theoretic measures to the quantification of relative differences between immunofluorescent signals. Several models for approximating the empirical fluorescence intensity distributions are considered, namely Gaussian, Gamma, Beta, and kernel densities. As distance measures, the Hellinger distance and the Kullback-Leibler divergence are considered. For the Gaussian, Gamma, and Beta models, closed-form expressions for evaluating the distance as a function of the model parameters are obtained. The advantages of the proposed quantification framework as compared to simple mean-based approaches are analyzed with numerical simulations. Two biological experiments are also considered. The first is the functional analysis of the p8 subunit of the TFIIH complex responsible for a rare hereditary multi-system disorder, trichothiodystrophy group A (TTD-A). In the second experiment the proposed methods are applied to assess the UV-induced DNA lesion repair rate. A good agreement between our in vivo results and those obtained with an alternative in vitro measurement is established. We believe that the computational simplicity and the effectiveness of the proposed quantification procedure will make it very attractive for different analysis tasks in functional proteomics, as well as in high-content screening. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
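
    For the Gaussian model mentioned above, the Hellinger distance has a well-known closed form; the short sketch below evaluates it for two hypothetical intensity distributions and is an independent illustration, not the authors' code.

      import math

      def hellinger_gaussian(mu1, sigma1, mu2, sigma2):
          """Closed-form Hellinger distance between two univariate Gaussians."""
          s1, s2 = sigma1**2, sigma2**2
          bc = math.sqrt(2.0 * sigma1 * sigma2 / (s1 + s2)) \
               * math.exp(-0.25 * (mu1 - mu2)**2 / (s1 + s2))   # Bhattacharyya coefficient
          return math.sqrt(1.0 - bc)

      # Hypothetical fluorescence intensity models for two cell populations:
      print(hellinger_gaussian(mu1=120.0, sigma1=15.0, mu2=135.0, sigma2=18.0))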

  8. Dependence of quantitative accuracy of CT perfusion imaging on system parameters

    NASA Astrophysics Data System (ADS)

    Li, Ke; Chen, Guang-Hong

    2017-03-01

    Deconvolution is a popular method to calculate parametric perfusion parameters from four-dimensional CT perfusion (CTP) source images. During the deconvolution process, the four-dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and prior knowledge is often used to suppress the noise associated with the process. These additional complexities confound understanding of the deconvolution-based CTP imaging system and of how its quantitative accuracy depends on the parameters and sub-operations involved in the image formation process. Meanwhile, there is a strong clinical need to answer this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly in emergent clinical situations (e.g., diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of perfusion parameters to CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, the arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide developments in CTP imaging technology toward better quantification accuracy and lower radiation dose.
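
    As a hedged, generic illustration of deconvolution-based perfusion quantification (not the cascaded-systems framework of the abstract), the sketch below builds a convolution matrix from an assumed arterial input function (AIF) and recovers the flow-scaled residue function from a synthetic tissue curve by Tikhonov-regularized least squares; all curves and the regularization strength are placeholders.

      import numpy as np

      def tikhonov_deconvolve(aif, tissue, dt, lam=0.1):
          """Recover the flow-scaled residue function k(t), with CBF ~ max k(t).

          Solves tissue = dt * A @ k, where A is the lower-triangular convolution
          matrix built from the arterial input function (AIF).
          """
          n = len(aif)
          A = dt * np.array([[aif[i - j] if i >= j else 0.0
                              for j in range(n)] for i in range(n)])
          lhs = A.T @ A + lam * np.eye(n)                 # Tikhonov regularization
          k = np.linalg.solve(lhs, A.T @ tissue)
          return k, k.max()                               # residue function, CBF-like scale

      # Synthetic example: gamma-variate AIF, exponential residue function.
      dt = 1.0
      t = np.arange(40) * dt
      aif = (t / 4.0)**3 * np.exp(-t / 4.0)
      true_k = 0.6 * np.exp(-t / 8.0)
      tissue = dt * np.convolve(aif, true_k)[:len(t)]
      k_est, cbf_like = tikhonov_deconvolve(aif, tissue, dt, lam=0.05)
      print(round(cbf_like, 3))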

  9. A python framework for environmental model uncertainty analysis

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
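
    pyEMU exposes these analyses through its own classes; rather than guess at that API, the sketch below shows the underlying first-order, second-moment (FOSM) calculation in plain numpy: a Schur-complement posterior parameter covariance and linear propagation to a forecast. The Jacobian, noise, and prior values are invented for illustration.

      import numpy as np

      def fosm_posterior_cov(jac, prior_cov, obs_cov):
          """Schur-complement (FOSM) posterior parameter covariance."""
          jcp = jac @ prior_cov
          gain = prior_cov @ jac.T @ np.linalg.inv(jcp @ jac.T + obs_cov)
          return prior_cov - gain @ jcp

      def forecast_variance(sens, cov):
          """Linear (first-order) propagation of parameter covariance to a forecast."""
          sens = np.asarray(sens)
          return float(sens @ cov @ sens)

      # Invented 3-observation, 2-parameter example:
      J = np.array([[1.0, 0.5], [0.2, 1.5], [0.7, 0.7]])   # d(obs)/d(par)
      Cp = np.diag([1.0, 4.0])                              # prior parameter covariance
      Co = 0.1 * np.eye(3)                                  # observation noise covariance
      Cpost = fosm_posterior_cov(J, Cp, Co)
      s = [0.3, 1.2]                                        # d(forecast)/d(par)
      print(forecast_variance(s, Cp), forecast_variance(s, Cpost))  # prior vs posterior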

  10. Uncertainty quantification in LES of channel flow

    DOE PAGES

    Safta, Cosmin; Blaylock, Myra; Templeton, Jeremy; ...

    2016-07-12

    In this paper, we present a Bayesian framework for estimating joint densities for large eddy simulation (LES) sub-grid scale model parameters based on canonical forced isotropic turbulence direct numerical simulation (DNS) data. The framework accounts for noise in the independent variables, and we present alternative formulations for accounting for discrepancies between model and data. To generate probability densities for flow characteristics, posterior densities for sub-grid scale model parameters are propagated forward through LES of channel flow and compared with DNS data. Synthesis of the calibration and prediction results demonstrates that the model parameters have an explicit filter-width dependence and are highly correlated. Discrepancies between DNS and calibrated LES results point to additional model-form inadequacies that need to be accounted for.

  11. Traits Without Borders: Integrating Functional Diversity Across Scales.

    PubMed

    Carmona, Carlos P; de Bello, Francesco; Mason, Norman W H; Lepš, Jan

    2016-05-01

    Owing to the conceptual complexity of functional diversity (FD), a multitude of different methods are available for measuring it, with most being operational at only a small range of spatial scales. This causes uncertainty in ecological interpretations and limits the potential to generalize findings across studies or compare patterns across scales. We solve this problem by providing a unified framework expanding on and integrating existing approaches. The framework, based on trait probability density (TPD), is the first to fully implement the Hutchinsonian concept of the niche as a probabilistic hypervolume in estimating FD. This novel approach could revolutionize FD-based research by allowing quantification of the various FD components from organismal to macroecological scales, and allowing seamless transitions between scales. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. The effects of dolomitization on petrophysical properties and fracture distribution within rift-related carbonates (Hammam Faraun Fault Block, Suez Rift, Egypt)

    NASA Astrophysics Data System (ADS)

    Korneva, I.; Bastesen, E.; Corlett, H.; Eker, A.; Hirani, J.; Hollis, C.; Gawthorpe, R. L.; Rotevatn, A.; Taylor, R.

    2018-03-01

    Petrographic and petrophysical data from different limestone lithofacies (skeletal packstones, matrix-supported conglomerates and foraminiferal grainstones) and their dolomitized equivalents within a slope carbonate succession (Eocene Thebes Formation) of the Hammam Faraun Fault Block (Suez Rift, Egypt) have been analyzed in order to link fracture distribution with the mechanical and textural properties of these rocks. Two phases of dolomitization resulted in facies-selective stratabound dolostones extending up to two and a half kilometers from the Hammam Faraun Fault, and massive dolostones in the vicinity of the fault (within about 100 metres). The stratabound dolostones are characterized by up to 8 times lower porosity and up to 6 times higher fracture frequency than the host limestones, and precursor lithofacies type has no significant effect on their fracture frequency. Within about 100 metres of the fault, massive dolostones are present that have half the porosity of the precursor limestones, and there lithofacies type exerts a stronger control on fracture frequency than dolomitization itself (undolomitized vs. dolomitized). Massive dolomitization corresponds to increased fracture intensity in conglomerates and grainstones but decreased fracture intensity in packstones, which matches a decrease of grain/crystal size in conglomerates and grainstones and an increase in packstones after massive dolomitization. Since fractures may contribute significantly to the flow properties of a carbonate rock, the work presented herein has significant applicability to hydrocarbon exploration and production from limestone and dolostone reservoirs, particularly where matrix porosities are low.

  13. Determining Representative Elementary Volume For Multiple Petrophysical Parameters using a Convex Hull Analysis of Digital Rock Data

    NASA Astrophysics Data System (ADS)

    Shah, S.; Gray, F.; Yang, J.; Crawshaw, J.; Boek, E.

    2016-12-01

    Advances in 3D pore-scale imaging and computational methods have allowed exceptionally detailed quantitative and qualitative analysis of fluid flow in complex porous media. A fundamental problem in pore-scale imaging and modelling is how to represent and model the range of scales encountered in porous media, starting from the smallest pore spaces. In this study, a novel method is presented for determining the representative elementary volume (REV) of a rock for several parameters simultaneously. We calculate the two main macroscopic petrophysical parameters, porosity and single-phase permeability, using micro-CT imaging and lattice Boltzmann (LB) simulations for 14 different porous media, including sandpacks, sandstones and carbonates. The concept of the convex hull is then applied to calculate the REV for both parameters simultaneously, using a plot of the area of the convex hull as a function of the sub-volume, capturing the different scales of heterogeneity from the pore-scale imaging. The results also show that the area of the convex hull (for well-chosen parameters such as the log of the permeability and the porosity) decays exponentially with sub-sample size, suggesting a computationally efficient way to determine the system size needed to calculate the parameters to high accuracy (small convex hull area). Finally, we propose using a characteristic length, such as the pore size, to choose an efficient absolute voxel size for the numerical rock.
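
    A minimal sketch of the convex-hull criterion described above, assuming porosity and log-permeability values have already been computed for many sub-volumes of each size (here they are randomly generated stand-ins): the hull area of the (porosity, log k) scatter is tracked as the sub-volume grows and is expected to shrink roughly exponentially.

      import numpy as np
      from scipy.spatial import ConvexHull

      def hull_area(porosity, log_perm):
          """Area of the convex hull of (porosity, log permeability) estimates."""
          pts = np.column_stack([porosity, log_perm])
          return ConvexHull(pts).volume      # in 2-D, .volume is the enclosed area

      rng = np.random.default_rng(1)
      for size in (64, 128, 256, 512):                   # sub-volume edge length (voxels)
          spread = 1.0 / np.sqrt(size)                   # stand-in for decreasing scatter
          phi = 0.20 + spread * rng.standard_normal(50)
          logk = -12.0 + 2.0 * spread * rng.standard_normal(50)
          print(size, round(hull_area(phi, logk), 4))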

  14. Hybrid network modeling and the effect of image resolution on digitally-obtained petrophysical and two-phase flow properties

    NASA Astrophysics Data System (ADS)

    Aghaei, A.

    2017-12-01

    Digital imaging and modeling of rocks, and the subsequent simulation of physical phenomena in digitally constructed rock models, are becoming an integral part of core analysis workflows. One of the inherent limitations of image-based analysis, at any given scale, is image resolution. This limitation becomes more evident when the rock has multiple scales of porosity, such as in carbonates and tight sandstones. Multi-scale imaging and the construction of hybrid models that encompass images acquired at multiple scales and resolutions have been proposed as a solution to this problem. In this study, we investigate the effect of image resolution and unresolved porosity on petrophysical and two-phase flow properties calculated from images. A helical X-ray micro-CT scanner with a high cone angle is used to acquire digital rock images that are free of geometric distortion. To remove subjectivity from the analyses, a semi-automated image processing technique is used to process and segment the acquired data into multiple phases. Direct and pore-network-based models are used to simulate physical phenomena and obtain absolute permeability, formation factor and two-phase flow properties such as relative permeability and capillary pressure. The effect of image resolution on each property is investigated. Finally, a hybrid network model incorporating images at multiple resolutions is built and used for simulations. The results from the hybrid model are compared against results from the model built at the highest resolution and those from laboratory tests.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    W. Lynn Watney; John H. Doveton

    GEMINI (Geo-Engineering Modeling through Internet Informatics) is a public-domain web application focused on analysis and modeling of petroleum reservoirs and plays (http://www.kgs.ukans.edu/Gemini/index.html). GEMINI creates a virtual project by ''on-the-fly'' assembly and analysis of on-line data either from the Kansas Geological Survey or uploaded from the user. GEMINI's suite of geological and engineering web applications for reservoir analysis includes: (1) petrofacies-based core and log modeling using an interactive relational rock catalog and log analysis modules; (2) a well profile module; (3) interactive cross sections to display ''marked'' wireline logs; (4) deterministic gridding and mapping of petrophysical data; (5) calculation and mapping of layer volumetrics; (6) material balance calculations; (7) a PVT calculator; (8) a DST analyst; (9) the Kansas Hydrocarbon Association Navigator (KHAN) for database mining; and (10) tutorial and help functions. The KHAN module utilizes petrophysical databases to estimate hydrocarbon pay or other constituents at play or field scale. Databases analyzed and displayed include digital logs, core analyses and photos, DSTs, and production data. GEMINI accommodates distant collaborations using secure password protection and authorized access. Assembled data, analyses, charts, and maps can readily be moved to other applications. GEMINI's target audience includes small independents and consultants seeking to find, quantitatively characterize, and develop subtle and bypassed pays by leveraging the growing base of digital data resources. Participating companies involved in the testing and evaluation of GEMINI included Anadarko, BP, Conoco-Phillips, Lario, Mull, Murfin, and Pioneer Resources.

  16. SU-D-218-05: Material Quantification in Spectral X-Ray Imaging: Optimization and Validation.

    PubMed

    Nik, S J; Thing, R S; Watts, R; Meyer, J

    2012-06-01

    To develop and validate a multivariate statistical method to optimize scanning parameters for material quantification in spectral x-ray imaging. An optimization metric was constructed by extensively sampling the thickness space for the expected number of counts for m (two or three) materials. This resulted in an m-dimensional confidence region of material quantities, e.g. thicknesses. Minimization of the ellipsoidal confidence region leads to the optimization of energy bins. For the given spectrum, the minimum counts required for effective material separation can be determined by predicting the signal-to-noise ratio (SNR) of the quantification. A Monte Carlo (MC) simulation framework using BEAM was developed to validate the metric. Projection data of the m materials was generated and material decomposition was performed for combinations of iodine, calcium and water by minimizing the z-score between the expected spectrum and binned measurements. The mean square error (MSE) and variance were calculated to measure the accuracy and precision of this approach, respectively. The minimum MSE corresponds to the optimal energy bins in the BEAM simulations. In the optimization metric, this is equivalent to the smallest confidence region. The SNR of the simulated images was also compared to the predictions from the metric. The MSE was dominated by the variance for the given material combinations, which demonstrates accurate material quantifications. The BEAM simulations revealed that the optimization of energy bins was accurate to within 1 keV. The SNRs predicted by the optimization metric yielded satisfactory agreement but were expectedly higher for the BEAM simulations due to the inclusion of scattered radiation. The validation showed that the multivariate statistical method provides accurate material quantification, correct location of optimal energy bins and adequate prediction of image SNR. The BEAM code system is suitable for generating spectral x-ray imaging simulations. © 2012 American Association of Physicists in Medicine.
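
    A generic way to reproduce the flavour of the optimization metric (not the authors' implementation) is a first-order, Poisson-statistics estimate of the thickness covariance for a two-material Beer-Lambert model; the determinant of that covariance tracks the size of the confidence ellipse and can be compared across candidate energy-bin choices. All attenuation values and counts below are invented.

      import numpy as np

      def thickness_covariance(n0, mu, thickness):
          """First-order covariance of thickness estimates for Poisson-binned counts.

          n0        : expected open-beam counts per energy bin (length B)
          mu        : attenuation coefficients, shape (B, M) for M basis materials
          thickness : nominal material thicknesses (length M)
          """
          counts = n0 * np.exp(-mu @ thickness)            # expected counts per bin
          jac = -mu * counts[:, None]                      # d(counts)/d(thickness)
          fisher = jac.T @ (jac / counts[:, None])         # Poisson Fisher information
          return np.linalg.inv(fisher)

      # Invented two-bin, two-material (iodine + water) example; values are placeholders.
      n0 = np.array([5e4, 5e4])
      mu_candidates = {
          "bin split A": np.array([[4.0, 0.25], [1.5, 0.20]]),
          "bin split B": np.array([[3.5, 0.24], [1.2, 0.19]]),
      }
      t = np.array([0.1, 10.0])                            # cm of iodine solution, water
      for label, mu in mu_candidates.items():
          cov = thickness_covariance(n0, mu, t)
          print(label, "confidence-ellipse size ~", np.sqrt(np.linalg.det(cov)))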

  17. Luminescence signal profiling: a new proxy for sedimentologically "invisible" marine Mass Transport Deposits (MTDs)

    NASA Astrophysics Data System (ADS)

    López, Gloria I.; Bialik, Or; Waldmann, Nicolas

    2017-04-01

    When dealing with fine-grained, organic-rich, colour-monotone marine sediment cores retrieved from the continental shelf or slope, the initial visual impression upon splitting open the vessels is often of a "disappointing" homogeneous, monotonous, continuous archive. Only after thorough micro- to macro-scale, multi-parameter investigations does the sediment reveal its treasures, initially through measurements on the intact core itself, which depict its contents for the first time, and subsequently through destructive, multi-proxy sample-based analyses. Usually, routine Multi-Sensor Core Logger (MSCL) measurements of petrophysical parameters (e.g. magnetic susceptibility, density, P-wave velocity) on un-split sediment cores are the first undertaken, while still on board in the field or back at the laboratory. Less often performed, but equally valuable, are continuous X-ray and CT scan imaging of the same intact archives. Upon splitting, routine granulometry, micro- and macro-fossil and invertebrate identification, and total organic/inorganic carbon content (TOC/TIC) analyses, among others, take place. The geochronology is usually established by AMS 14C dating of selected organic-rich units, and less commonly by Optically Stimulated Luminescence (OSL) dating of the coarser-grained, siliciclastic layers. A relatively new luminescence tool, the portable OSL reader, employed to rapidly assess the luminescence signal of untreated poly-mineral samples and to assist with targeted field sampling for full OSL dating, was used for the first time on marine sediment cores as a novel petrophysical characterization tool, with astonishing results. In this study, two 2 m-long underwater piston sediment cores recovered from 200 m water depth on the continental shelf off southern Israel were subjected to pulsed-photon stimulation (PPSL), yielding favourable luminescence signals along their entire lengths. Notably, luminescence signals were obtained on both already split-open cores. Both cores appeared monotonously homogeneous down-core according to most of the results obtained from the non-destructive and destructive tests. One of the cores showed several small higher-energy events, including a Mass Transport Deposit (MTD) within its first 10 cm, fully visible only on the CT scan imaging, the PPSL profile and the particle size distribution plot. This initial investigation demonstrates the feasibility and usefulness of luminescence profiling as a new sedimentological and petrophysical proxy to better visualize homogeneous yet complex, fine-grained, underwater archives. Moreover, it helps in understanding the continuity of the stratigraphy and the linearity of deposition, and assists in the estimation of relative ages, provided that good OSL ages are obtained throughout the recovered archive.

  18. Petrophysical analysis of geophysical logs of the National Drilling Company-U.S. Geological Survey ground-water research project for Abu Dhabi Emirate, United Arab Emirates

    USGS Publications Warehouse

    Jorgensen, Donald G.; Petricola, Mario

    1994-01-01

    A program of borehole-geophysical logging was implemented to supply geologic and geohydrologic information for a regional ground-water investigation of Abu Dhabi Emirate. Analysis of geophysical logs was essential to provide information on geohydrologic properties because drill cuttings were not always adequate to define lithologic boundaries. The standard suite of logs obtained at most project test holes consisted of caliper, spontaneous potential, gamma ray, dual induction, microresistivity, compensated neutron, compensated density, and compensated sonic. Ophiolitic detritus from the nearby Oman Mountains has unusual petrophysical properties that complicated the interpretation of geophysical logs. The density of coarse ophiolitic detritus is typically greater than 3.0 grams per cubic centimeter, porosity values are large, often exceeding 45 percent, and the clay fraction includes unusual clays, such as lizardite. Neither the spontaneous-potential log nor the natural gamma-ray log was a usable clay indicator. Because intrinsic permeability is a function of clay content, additional research in determining clay content was critical. A research program of geophysical logging was conducted to determine the petrophysical properties of the shallow subsurface formations. The logging included spectral-gamma and thermal-decay-time logs. These logs, along with the standard geophysical logs, were correlated to mineralogy and whole-rock chemistry as determined from sidewall cores. Thus, interpretation of lithology and fluids was accomplished. Permeability and specific yield were calculated from geophysical-log data and correlated to results from an aquifer test. On the basis of results from the research logging, a method of lithologic and water-resistivity interpretation was developed for the test holes at which the standard suite of logs was obtained. In addition, a computer program was developed to assist in the analysis of log data. Geohydrologic properties were estimated, including volume of clay matrix, volume of matrix other than clay, density of matrix other than clay, density of matrix, intrinsic permeability, specific yield, and specific storage. Geophysical logs were used to (1) determine lithology, (2) correlate lithologic and permeable zones, (3) calibrate seismic reprocessing, (4) calibrate transient-electromagnetic surveys, and (5) calibrate uphole-survey interpretations. Logs were used at the drill site to (1) determine permeability zones, (2) determine dissolved-solids content, which is a function of water resistivity, and (3) design wells accordingly. Data and properties derived from logs were used to determine the transmissivity and specific yield of aquifer materials.

  19. Quantitative impact of hydrothermal alteration on electrical resistivity in geothermal systems from a joint analysis of laboratory measurements and borehole data in Krafla area, N-E Iceland

    NASA Astrophysics Data System (ADS)

    Lévy, Léa; Páll Hersir, Gylfi; Flóvenz, Ólafur; Gibert, Benoit; Pézard, Philippe; Sigmundsson, Freysteinn; Briole, Pierre

    2016-04-01

    Rock permeability and fluid temperature are the two most decisive factors for successful geothermal drilling. While these parameters can only be measured by drilling, they may be estimated on the basis of their impact on electrical resistivity, which can be imaged from surface soundings, for example through TEM (Transient Electro Magnetic) soundings down to about one km depth. The electrical conductivity of reservoir rocks is the sum of a volume term depending on fluid parameters and a surface term related to rock alteration. Understanding the link between electrical resistivity and the key geothermal parameters requires knowledge of hydrothermal alteration and its petrophysical signature through the Cation Exchange Capacity (CEC). Fluid-rock interactions related to hydrothermal circulation trigger the precipitation of alteration minerals, which are both witnesses of the temperature at the time of reaction and new paths for the electrical current. Alteration minerals include zeolites, smectites, chlorites, epidotes and amphiboles, among which the low-temperature parageneses are often the most conductive. The CEC of these mineral phases accounts for the surface conductivity occurring at the water-rock interface. In cooling geothermal systems, these minerals therefore constitute, in petrophysical terms, a memory of the equilibrium phase that surface electrical conduction reveals through electrical probing at all scales. The qualitative impact of alteration minerals on resistivity structure has been studied over the years in the Icelandic geothermal context. In this work, the CEC impact on pore-surface electrical conductivity is studied quantitatively at the borehole scale, where several types of volcanic rocks are mixed together with various degrees of alteration and porosity. Five boreholes located within a few km of each other at the Krafla volcano, Northeast Iceland, constitute the basis for this study. The deepest and reference hole, KJ-18, provides rock cuttings and logging data down to 2215 m depth; CEC measurements have been performed on these cuttings. KH-1 and KH-3 have cores and logs in the top 200 m only. Boreholes KH-5 and KH-6 sample cores with higher-temperature alteration minerals down to 600 m. Together, these four shallow holes cover the diversity of rock types and alteration facies found in KJ-18. The petrophysical calibration obtained from cores will then be upscaled to log data analysis in KJ-18: porosity, formation factor, permeability, acoustic velocity, electrical surface conduction at different temperatures and CEC. This research is supported by the IMAGE FP7 EC project (Integrated Methods for Advanced Geothermal Exploration, grant agreement No. 608553).
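
    A common way to write the volume-plus-surface decomposition mentioned above is a Waxman-Smits-type relation, sketched below with placeholder values; this is a generic petrophysical model, not the calibration derived in the study.

      def waxman_smits_conductivity(phi, sigma_w, cec_meq_per_g,
                                    grain_density=2.85, m=2.0, B=4.6):
          """Waxman-Smits-type bulk conductivity (S/m): volume term plus surface term.

          phi            : porosity (fraction)
          sigma_w        : pore-fluid conductivity (S/m)
          cec_meq_per_g  : cation exchange capacity of the rock (meq/g)
          grain_density  : g/cm^3; B is the counter-ion equivalent conductance,
                           here a placeholder value in (S/m) per (meq/cm^3).
          """
          F = phi ** (-m)                                          # Archie formation factor
          Qv = cec_meq_per_g * grain_density * (1.0 - phi) / phi   # meq per cm^3 of pore volume
          return (sigma_w + B * Qv) / F

      # Placeholder altered-basalt values: 10% porosity, saline geothermal fluid.
      print(waxman_smits_conductivity(phi=0.10, sigma_w=1.0, cec_meq_per_g=0.05))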

  20. The injury List of All Deficits (LOAD) Framework--conceptualizing the full range of deficits and adverse outcomes following injury and violence.

    PubMed

    Lyons, Ronan A; Finch, Caroline F; McClure, Rod; van Beeck, Ed; Macey, Steven

    2010-09-01

    Over recent years, there has been increasing recognition that the burden of injuries and violence includes more than just the direct and indirect monetary costs associated with their medical outcomes. However, quantification of the total burden has been seriously hampered by lack of a framework for considering the range of outcomes which comprise the burden, poor identification of the outcomes and their imprecise measurement. This article proposes a new conceptual framework, the List of All Deficits (or LOAD) Framework, that has been developed from extensive expert discussion and consensus meetings to facilitate the measurement of the full burden of injuries and violence. The LOAD Framework recognises the multidimensional nature of injury burden across individual, family and societal domains. This classification of potential consequences of injury was built on the International Classification of Functioning concept of disability. Examples of empirical support for each consequence were obtained from the scientific literature. Determining the multidimensional injury burden requires the assessment and combination of 20 domains of potential consequences. The resulting LOAD Framework classification and concept diagram describes 12 groups of injury consequences for individuals, three for family and close friends and five for wider society. Understanding the extent of the negative implications (or deficits) of injury, through application of the LOAD Framework, is needed to put existing burden of injury studies into context and to highlight the inter-relationship between the direct and indirect burden of injury relative to other conditions.

  1. Development and Implementation of a Formal Framework for Bottom-up Uncertainty Analysis of Input Emissions: Case Study of Residential Wood Combustion

    NASA Astrophysics Data System (ADS)

    Zhao, S.; Mashayekhi, R.; Saeednooran, S.; Hakami, A.; Ménard, R.; Moran, M. D.; Zhang, J.

    2016-12-01

    We have developed a formal framework for documentation, quantification, and propagation of uncertainties in upstream emissions inventory data at various stages leading to the generation of model-ready gridded emissions through emissions processing software such as the EPA's SMOKE (Sparse Matrix Operator Kernel Emissions) system. To illustrate this framework we present a proof-of-concept case study of a bottom-up quantitative assessment of uncertainties in emissions from residential wood combustion (RWC) in the U.S. and Canada. Uncertainties associated with key inventory parameters are characterized based on existing information sources, including the American Housing Survey (AHS) from the U.S. Census Bureau, Timber Products Output (TPO) surveys from the U.S. Forest Service, TNS Canadian Facts surveys, and the AP-42 emission factor document from the U.S. EPA. The propagation of uncertainties is based on Monte Carlo simulation code external to SMOKE. Latin Hypercube Sampling (LHS) is implemented to generate a set of random realizations of each RWC inventory parameter, for which the uncertainties are assumed to be normally distributed. Random realizations are also obtained for each RWC temporal and chemical speciation profile and spatial surrogate field external to SMOKE using the LHS approach. SMOKE outputs for primary emissions (e.g., CO, VOC) using both RWC emission inventory realizations and perturbed temporal and chemical profiles and spatial surrogates show relative uncertainties of about 30-50% across the U.S. and about 70-100% across Canada. Positive skewness values (up to 2.7) and variable kurtosis values (up to 4.8) were also found. Spatial allocation contributes significantly to the overall uncertainty, particularly in Canada. By applying this framework we are able to produce random realizations of model-ready gridded emissions that along with available meteorological ensembles can be used to propagate uncertainties through chemical transport models. The approach described here provides an effective means for formal quantification of uncertainties in estimated emissions from various source sectors and for continuous documentation, assessment, and reduction of emission uncertainties.
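
    As a toy illustration of the Latin Hypercube step described above, the sketch below draws LHS realizations of a few inventory parameters with assumed normal marginals and propagates them through a deliberately simple emission equation. The parameter names, means, and relative uncertainties are hypothetical and are not taken from the AHS, TPO, TNS, or AP-42 sources.

      import numpy as np
      from scipy.stats import norm, qmc

      rng = np.random.default_rng(0)

      # Hypothetical RWC inventory parameters as (mean, relative standard deviation).
      params = {"wood_burned_tons": (1.0e6, 0.30),
                "co_emission_factor": (230.0, 0.25),   # kg CO per ton burned (illustrative)
                "appliance_fraction": (0.6, 0.15)}

      n_realizations = 1000
      sampler = qmc.LatinHypercube(d=len(params), seed=rng)
      u = sampler.random(n_realizations)               # uniform LHS samples in [0, 1)^d

      # Map each column to the assumed normal marginal of the corresponding parameter.
      realizations = np.column_stack([
          norm.ppf(u[:, j], loc=mu, scale=rsd * mu)
          for j, (mu, rsd) in enumerate(params.values())
      ])

      # Propagate: total CO emissions for each realization (simple multiplicative model).
      co_total = realizations[:, 0] * realizations[:, 1] * realizations[:, 2]
      print(np.percentile(co_total, [2.5, 50, 97.5]))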

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cornaggia, F.; Congo, S.A.; Agostino, M.

    Kitina field is located in the Marine VII permit, offshore Congo. The field was discovered in 1991 by a joint venture composed of Agip Recherches Congo (operator), Hydrocongo and Chevron International Limited. The field is a structural four-way dip closure trap shaped as a turtle-back. Halokinetic movements are responsible for the structuring. The seismic imaging of the reservoir is affected by strong lateral velocity variations caused by different sedimentation across the paleo-shelf edge in the post-Albian sequence. One-pass 3D poststack depth migration, performed with a velocity field obtained by means of geostatistical integration of 2D seismic and wellbore velocities, achieved a good compromise between high-dip reflector imaging and depths at well locations. Three main reservoirs of lower Albian age exist between -2100 and -3100 m. They are separated by tight mudstones which act as intraformational seals. Seismic trace inversion improved the resolution of petrophysical variations in some of the field reservoirs, which have the following characteristics (from top to bottom): reservoir 2A is composed of bioclastic and oolitic packstone-grainstone laid down during a regional regressive phase in isolated offshore bars on the crest of a structural high. Early diagenetic phenomena led to the development of a world-class permeability framework. Reservoirs 1A and 1B are composed of sandstone bodies which were deposited as shoreface to offshore bars during a short-term regressive pulse. The 1A-1B reservoirs are embedded in mudstones deposited during long-lasting phases of relative highstand in a relatively deep offshore setting characterised by high, halokinetically driven subsidence.

  3. Robust approaches to quantification of margin and uncertainty for sparse data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low-probability, high-consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
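
    The following sketch illustrates the tail-extrapolation risk discussed above: a normal distribution is fit to a small sample drawn from a (deliberately non-normal) lognormal population, the probability of exceeding a high-consequence threshold is extrapolated from the fit, and a bootstrap shows how unstable that extrapolated tail estimate is. The sample size, threshold, and distributions are illustrative only and are not the scaling study described in the record.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)

      # Hypothetical performance measurements; the true population is lognormal,
      # mimicking an unvalidated normal assumption in the analysis.
      data = rng.lognormal(mean=0.0, sigma=0.5, size=30)
      threshold = 5.0                                    # high-consequence threshold

      # Parametric tail extrapolation under the (possibly wrong) normal assumption.
      mu, sd = data.mean(), data.std(ddof=1)
      p_fail_normal = stats.norm.sf(threshold, loc=mu, scale=sd)

      # Bootstrap the same extrapolation to expose the variability of the tail estimate.
      boot = []
      for _ in range(2000):
          resample = rng.choice(data, size=data.size, replace=True)
          boot.append(stats.norm.sf(threshold, loc=resample.mean(), scale=resample.std(ddof=1)))

      true_p = stats.lognorm.sf(threshold, s=0.5)        # known here only because data are synthetic
      print(f"normal-fit tail estimate: {p_fail_normal:.2e}")
      print(f"bootstrap 90% interval:   {np.percentile(boot, [5, 95])}")
      print(f"true tail probability:    {true_p:.2e}")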

  4. Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.

    2016-01-01

    Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.

  5. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    NASA Astrophysics Data System (ADS)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.

  6. The hidden-Markov brain: comparison and inference of white matter hyperintensities on magnetic resonance imaging (MRI)

    NASA Astrophysics Data System (ADS)

    Pham, Tuan D.; Salvetti, Federica; Wang, Bing; Diani, Marco; Heindel, Walter; Knecht, Stefan; Wersching, Heike; Baune, Bernhard T.; Berger, Klaus

    2011-02-01

    Rating and quantification of cerebral white matter hyperintensities on magnetic resonance imaging (MRI) are important tasks in various clinical and scientific settings. As manual evaluation is time consuming and imprecise, much effort has been made to automate the quantification of white matter hyperintensities. There is rarely any report that attempts to study the similarity/dissimilarity of white matter hyperintensity patterns that have different sizes, shapes and spatial localizations on the MRI. This paper proposes an original computational neuroscience framework for such a conceptual study with a standpoint that the prior knowledge about white matter hyperintensities can be accumulated and utilized to enable a reliable inference of the rating of a new white matter hyperintensity observation. This computational approach for rating inference of white matter hyperintensities, which appears to be the first study, can be utilized as a computerized rating-assisting tool and can be very economical for diagnostic evaluation of brain tissue lesions.

  7. Uncertainty Quantification and Statistical Convergence Guidelines for PIV Data

    NASA Astrophysics Data System (ADS)

    Stegmeir, Matthew; Kassen, Dan

    2016-11-01

    As Particle Image Velocimetry has continued to mature, it has developed into a robust and flexible technique for velocimetry used by expert and non-expert users. While historical estimates of PIV accuracy have typically relied heavily on "rules of thumb" and analysis of idealized synthetic images, recently increased emphasis has been placed on better quantifying real-world PIV measurement uncertainty. Multiple techniques have been developed to provide per-vector instantaneous uncertainty estimates for PIV measurements. Often real-world experimental conditions introduce complications in collecting "optimal" data, and the effect of these conditions is important to consider when planning an experimental campaign. The current work utilizes the results of PIV Uncertainty Quantification techniques to develop a framework for PIV users to utilize estimated PIV confidence intervals to compute reliable data convergence criteria for optimal sampling of flow statistics. Results are compared using experimental and synthetic data, and recommended guidelines and procedures leveraging estimated PIV confidence intervals for efficient sampling for converged statistics are provided.
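
    A minimal sketch of one way per-vector uncertainty estimates could feed a convergence criterion is shown below: assuming statistically independent snapshots, the number of PIV samples needed for the time-averaged velocity to reach a prescribed confidence-interval half-width combines the flow variance with the estimated measurement variance. The numbers and the independence assumption are illustrative; the framework described above may use different statistics.

      import numpy as np

      def samples_for_converged_mean(sigma_flow, sigma_uq, tol, confidence_z=1.96):
          """Number of independent PIV samples so that the confidence interval of the
          time-averaged velocity has half-width no larger than tol.

          sigma_flow : standard deviation of the velocity fluctuations (turbulence)
          sigma_uq   : per-sample random measurement uncertainty from a PIV-UQ method
          tol        : acceptable half-width of the confidence interval on the mean
          """
          total_var = sigma_flow**2 + sigma_uq**2      # measurement noise adds to flow variance
          return int(np.ceil((confidence_z**2) * total_var / tol**2))

      # e.g. 0.8 m/s turbulence, 0.15 m/s per-vector uncertainty, mean wanted within 0.05 m/s
      print(samples_for_converged_mean(sigma_flow=0.8, sigma_uq=0.15, tol=0.05))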

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kress, Joel David

    The development and scale-up of cost-effective carbon capture processes is of paramount importance to enable the widespread deployment of these technologies to significantly reduce greenhouse gas emissions. The U.S. Department of Energy initiated the Carbon Capture Simulation Initiative (CCSI) in 2011 with the goal of developing a computational toolset that would enable industry to more effectively identify, design, scale up, operate, and optimize promising concepts. The first half of the presentation will introduce the CCSI Toolset consisting of basic data submodels, steady-state and dynamic process models, process optimization and uncertainty quantification tools, an advanced dynamic process control framework, and high-resolution filtered computational fluid dynamics (CFD) submodels. The second half of the presentation will describe a high-fidelity model of a mesoporous silica supported, polyethylenimine (PEI)-impregnated solid sorbent for CO2 capture. The sorbent model includes a detailed treatment of transport and amine-CO2-H2O interactions based on quantum chemistry calculations. Using a Bayesian approach for uncertainty quantification, we calibrate the sorbent model to thermogravimetric analysis (TGA) data.

  9. Towards robust quantification and reduction of uncertainty in hydrologic predictions: Integration of particle Markov chain Monte Carlo and factorial polynomial chaos expansion

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Baetz, B. W.; Ancell, B. C.

    2017-05-01

    Particle filtering techniques have been receiving increasing attention from the hydrologic community due to their ability to properly estimate model parameters and states of nonlinear and non-Gaussian systems. To facilitate a robust quantification of uncertainty in hydrologic predictions, it is necessary to explicitly examine the forward propagation and evolution of parameter uncertainties and their interactions that affect the predictive performance. This paper presents a unified probabilistic framework that merges the strengths of particle Markov chain Monte Carlo (PMCMC) and factorial polynomial chaos expansion (FPCE) algorithms to robustly quantify and reduce uncertainties in hydrologic predictions. A Gaussian anamorphosis technique is used to establish a seamless bridge between the data assimilation using the PMCMC and the uncertainty propagation using the FPCE through a straightforward transformation of posterior distributions of model parameters. The unified probabilistic framework is applied to the Xiangxi River watershed of the Three Gorges Reservoir (TGR) region in China to demonstrate its validity and applicability. Results reveal that the degree of spatial variability of soil moisture capacity is the most identifiable model parameter with the fastest convergence through the streamflow assimilation process. The potential interaction between the spatial variability in soil moisture conditions and the maximum soil moisture capacity has the most significant effect on the performance of streamflow predictions. In addition, parameter sensitivities and interactions vary in magnitude and direction over time due to temporal and spatial dynamics of hydrologic processes.
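
    A minimal, rank-based sketch of the Gaussian anamorphosis step mentioned above is given below: posterior samples of a (skewed) parameter are mapped to standard-normal scores through their empirical quantiles, and the inverse map returns transformed values to physical units. The gamma-distributed samples are purely illustrative, and the cited study may use a different anamorphosis estimator.

      import numpy as np
      from scipy import stats

      def gaussian_anamorphosis(samples):
          """Empirical Gaussian anamorphosis: return forward (physical -> Gaussian score)
          and inverse (score -> physical) maps built from the sample quantiles."""
          sorted_x = np.sort(samples)
          n = sorted_x.size
          probs = (np.arange(1, n + 1) - 0.5) / n        # plotting positions avoid 0 and 1
          z_scores = stats.norm.ppf(probs)

          def forward(x):
              return np.interp(x, sorted_x, z_scores)

          def inverse(z):
              return np.interp(z, z_scores, sorted_x)

          return forward, inverse

      # Skewed posterior samples of a hypothetical soil-moisture-capacity parameter.
      posterior = np.random.default_rng(2).gamma(shape=2.0, scale=50.0, size=5000)
      fwd, inv = gaussian_anamorphosis(posterior)
      print(stats.skew(fwd(posterior)))                  # approximately 0 after the transform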

  10. Quantitative Seismic Interpretation: Applying Rock Physics Tools to Reduce Interpretation Risk

    NASA Astrophysics Data System (ADS)

    Sondergeld, Carl H.

    This book is divided into seven chapters that cover rock physics, statistical rock physics, seismic inversion techniques, case studies, and work flows. On balance, the emphasis is on rock physics. Included are 56 color figures that greatly help in the interpretation of more complicated plots and displays. The domain of rock physics falls between petrophysics and seismics. It is the basis for interpreting seismic observations and therefore is pivotal to the understanding of this book. The first two chapters are dedicated to this topic (109 pages).

  11. Model-Data Fusion and Adaptive Sensing for Large Scale Systems: Applications to Atmospheric Release Incidents

    NASA Astrophysics Data System (ADS)

    Madankan, Reza

    All across the world, toxic material clouds emitted from sources such as industrial plants, vehicular traffic, and volcanic eruptions can contain chemical, biological or radiological material. With the growing fear of natural, accidental or deliberate release of toxic agents, there is tremendous interest in precise source characterization and in generating accurate hazard maps of toxic material dispersion for appropriate disaster management. In this dissertation, an end-to-end framework has been developed for probabilistic source characterization and forecasting of atmospheric release incidents. The proposed methodology consists of three major components which are combined together to perform the task of source characterization and forecasting. These components include Uncertainty Quantification, Optimal Information Collection, and Data Assimilation. Precise approximation of prior statistics is crucial to ensure performance of the source characterization process. In this work, an efficient quadrature-based method has been utilized for quantification of uncertainty in plume dispersion models that are subject to uncertain source parameters. In addition, a fast and accurate approach is utilized for the approximation of probabilistic hazard maps, based on a combination of polynomial chaos theory and the method of quadrature points. Besides precise quantification of uncertainty, having useful measurement data is also highly important to guarantee accurate source parameter estimation. The performance of source characterization is highly affected by the orientation of the sensors used for data observation. Hence, a general framework has been developed for the optimal allocation of data observation sensors to improve performance of the source characterization process. The key goal of this framework is to optimally locate a set of mobile sensors such that measurement of better data is guaranteed. This is achieved by maximizing the mutual information between model predictions and observed data, given a set of kinetic constraints on the mobile sensors. A dynamic programming method has been utilized to solve the resulting optimal control problem. To complete the loop of the source characterization process, two different estimation techniques, a minimum-variance estimation framework and a Bayesian inference method, have been developed to fuse model forecasts with measurement data. Incomplete information regarding the distribution of the noise signal associated with measurement data is another major challenge in the source characterization of plume dispersion incidents. This frequently happens in the assimilation of atmospheric data from satellite imagery, because satellite imagery data can be polluted with noise depending on weather conditions, clouds, humidity, etc. Unfortunately, there is no accurate procedure to quantify the error in recorded satellite data. Hence, using classical data assimilation methods in this situation is not straightforward. In this dissertation, the basic idea of a novel approach is proposed to tackle these types of real-world problems with more accuracy and robustness. A simple example demonstrating a real-world scenario is presented to validate the developed methodology.

  12. Toward a Multi-City Framework for Urban GHG Estimation in the United States: Methods, Uncertainties, and Future Goals

    NASA Astrophysics Data System (ADS)

    Mueller, K. L.; Callahan, W.; Davis, K. J.; Dickerson, R. R.; Duren, R. M.; Gurney, K. R.; Karion, A.; Keeling, R. F.; Kim, J.; Lauvaux, T.; Miller, C. E.; Shepson, P. B.; Turnbull, J. C.; Weiss, R. F.; Whetstone, J. R.

    2017-12-01

    City and State governments are increasingly interested in mitigating greenhouse gas (GHG) emissions to improve sustainability within their jurisdictions. Estimation of urban GHG emissions remains an active research area with many sources of uncertainty. To support the effort of improving measurement of trace gas emissions in city environments, several federal agencies along with academic, research, and private entities have been working within a handful of domestic metropolitan areas to improve both (1) the assessment of GHG emissions accuracy using a variety of measurement technologies, and (2) the tools that can better assess GHG inventory data at urban mitigation scales based upon these measurements. The National Institute of Standards and Technology (NIST) activities have focused on three areas, or testbeds: Indianapolis (INFLUX experiment), Los Angeles (the LA Megacities project), and the Northeastern Corridor areas encompassing Washington and Baltimore (the NEC/BW GHG Measurements project). These cities represent diverse meteorological, terrain, demographic, and emissions characteristics having a broad range of complexities. To date, this research has involved multiple measurement systems and integrated observing approaches, all aimed at advancing development of a robust science base upon which higher-accuracy quantification approaches can rest. Progress toward such scientifically robust, widely accepted emissions quantification methods will rely upon continuous performance assessment. Such assessment is challenged by the complexities of cities themselves (e.g., population, urban form) along with the many variables impacting a city's technological ability to estimate its GHG emissions (e.g., meteorology, density of observations). We present the different NIST testbeds and a proposal to initiate conceptual development of a reference framework supporting the comparison of multi-city GHG emissions estimates. Such a reference framework has the potential to provide the basis for city governments to choose the measurements and methods that can quantify their GHG and related trace gas emissions at levels commensurate with their needs.

  13. A Simple Demonstration of Concrete Structural Health Monitoring Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Sankaran; Agarwal, Vivek; Cai, Guowei

    Assessment and management of aging concrete structures in nuclear power plants require a more systematic approach than simple reliance on existing code margins of safety. Structural health monitoring of concrete structures aims to understand the current health condition of a structure based on heterogeneous measurements to produce high-confidence actionable information regarding structural integrity that supports operational and maintenance decisions. This ongoing research project is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in a nuclear power plant subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: damage modeling, monitoring, data analytics, and uncertainty quantification. This report describes a proof-of-concept example on a small concrete slab subjected to a freeze-thaw experiment that explores techniques in each of the four elements of the framework and their integration. An experimental set-up at Vanderbilt University’s Laboratory for Systems Integrity and Reliability is used to research an effective combination of full-field techniques that include infrared thermography, digital image correlation, and ultrasonic measurement. The measured data are linked to the probabilistic framework: the thermography, digital image correlation, and ultrasonic measurement data are used for Bayesian calibration of model parameters, for diagnosis of damage, and for prognosis of future damage. The proof-of-concept demonstration presented in this report highlights the significance of each element of the framework and their integration.

  14. Modeling and simulation of high dimensional stochastic multiscale PDE systems at the exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zabaras, Nicolas J.

    2016-11-08

    Predictive modeling of multiscale and multiphysics systems requires accurate data-driven characterization of the input uncertainties and understanding of how they propagate across scales and alter the final solution. This project develops a rigorous mathematical framework and scalable uncertainty quantification algorithms to efficiently construct realistic low-dimensional input models and low-complexity surrogate systems for the analysis, design, and control of physical systems represented by multiscale stochastic PDEs. The work can be applied to many areas including physical and biological processes, from climate modeling to systems biology.

  15. Upper Lithospheric Sources of Magnetic and Gravity Anomalies of The Fennoscandian Shield

    NASA Astrophysics Data System (ADS)

    Korhonen, J. V.; Koistinen, T.; Working GroupFennoscandian Geophysical Maps

    Magnetic total intensity anomalies (DGRF-65), Bouguer anomalies (d=2670 kg/m3) and geological units from 3400 Ma to present of the Fennoscandian Shield have been digitally compiled and printed as maps at 1:2 000 000. Insert maps at 1:15 000 000 compare anomaly components at different source scales: pseudogravimetric anomaly versus Bouguer anomaly, DGRF-65 anomaly versus pseudomagnetic anomaly, magnetic vertical derivative versus second derivative of the Bouguer anomaly. Data on bulk density, total magnetisation and lithology of samples have been presented as scatter diagrams and distribution maps of the average petrophysical properties in space and time. At the sample level, the bulk density correlates with the lithology and, together with magnetisation, establishes four principal populations of petrophysical properties. The average properties, calculated for 5 km x 5 km cells, correlate only weakly with the average Bouguer anomaly and magnetic anomaly, revealing major deep-seated sources of anomalies. Pseudogravimetric and Bouguer anomalies correlate only locally with each other. The correlation is negative in the area of felsic Palaeoproterozoic rocks in the W and NW parts of the Shield. In 2D models the sources of gravity anomalies are explained by lateral variation of density in the upper and lower crust. Smoothly varying regional components are explained by boundaries of the lower crust, the upper mantle and the asthenosphere. Magnetic anomalies are explained by lateral variation of magnetisation in the upper crust. Regional components are due to the lateral variation of magnetisation in the lower crust, the boundaries of the lower crust and mantle, and the Curie isotherm of magnetite.

  16. Evolution of petrophysical properties across natural faults: a study on cores from the Tournemire underground research laboratory (France)

    NASA Astrophysics Data System (ADS)

    Bonnelye, Audrey; David, Christian; Schubnel, Alexandre; Wassermann, Jérôme; Lefèvre, Mélody; Henry, Pierre; Guglielmi, Yves; Castilla, Raymi; Dick, Pierre

    2017-04-01

    Faults in general, and those in clay materials in particular, have complex structures that can be linked to both a polyphased tectonic history and the anisotropic nature of the material. Drilling through faults in shaly materials allows one to characterize their structure, mineralogical composition, stress orientation and physical properties. These relations can be investigated in the laboratory in order to gain a better understanding of in-situ mechanisms. In this study we used shales of Toarcian age from the Tournemire underground research laboratory (France). We coupled different petrophysical measurements on core samples retrieved from a borehole drilled perpendicular to a fault plane whose size is of the order of tens of meters. This 25-m-long borehole was sampled in order to perform several types of measurements: density, porosity and saturation directly in the field, and elastic wave velocity and magnetic susceptibility anisotropy in the laboratory. For all these measurements, special protocols were developed in order to preserve as much as possible the saturation state of the samples. All measurements were carried out in the three zones intersected by the borehole: the intact zone, the damaged zone and the fault core zone. From our measurements, we were able to associate specific properties with each zone of the fault. We then calculated Thomsen's parameters in order to quantify the elastic anisotropy across the fault. Our results show strong variations of the elastic anisotropy with distance to the fault core as well as the occurrence of anisotropy reversal.
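
    For reference, Thomsen's parameters for a transversely isotropic medium can be computed from the elastic stiffnesses as in the sketch below; the stiffness values shown are generic shale-like numbers, not the Tournemire measurements.

      def thomsen_parameters(c11, c33, c44, c66, c13):
          """Thomsen (1986) anisotropy parameters for a VTI medium from stiffnesses (GPa)."""
          epsilon = (c11 - c33) / (2.0 * c33)                          # P-wave anisotropy
          gamma   = (c66 - c44) / (2.0 * c44)                          # S-wave anisotropy
          delta   = ((c13 + c44)**2 - (c33 - c44)**2) / (2.0 * c33 * (c33 - c44))
          return epsilon, gamma, delta

      # Illustrative stiffnesses for a clay-rich shale (GPa):
      print(thomsen_parameters(c11=45.0, c33=30.0, c44=10.0, c66=15.0, c13=12.0))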

  17. Coupling geostatistics to detailed reservoir description allows better visualization and more accurate characterization/simulation of turbidite reservoirs: Elk Hills oil field, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan, M.E.; Wilson, M.L.; Wightman, J.

    1996-12-31

    The Elk Hills giant oilfield, located in the southern San Joaquin Valley of California, has produced 1.1 billion barrels of oil from Miocene and shallow Pliocene reservoirs. 65% of the current 64,000 BOPD production is from the pressure-supported, deeper Miocene turbidite sands. In the turbidite sands of the 31 S structure, large porosity & permeability variations in the Main Body B and Western 31 S sands cause problems with the efficiency of the waterflooding. These variations have now been quantified and visualized using geostatistics. The end result is a more detailed reservoir characterization for simulation. Traditional reservoir descriptions based on marker correlations, cross-sections and mapping do not provide enough detail to capture the short-scale stratigraphic heterogeneity needed for adequate reservoir simulation. These deterministic descriptions are inadequate to tie with production data as the thinly bedded sand/shale sequences blur into a falsely homogeneous picture. By studying the variability of the geologic & petrophysical data vertically within each wellbore and spatially from well to well, a geostatistical reservoir description has been developed. It captures the natural variability of the sands and shales that was lacking from earlier work. These geostatistical studies allow the geologic and petrophysical characteristics to be considered in a probabilistic model. The end product is a reservoir description that captures the variability of the reservoir sequences and can be used as a more realistic starting point for history matching and reservoir simulation.

  18. Coupling geostatistics to detailed reservoir description allows better visualization and more accurate characterization/simulation of turbidite reservoirs: Elk Hills oil field, California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allan, M.E.; Wilson, M.L.; Wightman, J.

    1996-01-01

    The Elk Hills giant oilfield, located in the southern San Joaquin Valley of California, has produced 1.1 billion barrels of oil from Miocene and shallow Pliocene reservoirs. 65% of the current 64,000 BOPD production is from the pressure-supported, deeper Miocene turbidite sands. In the turbidite sands of the 31 S structure, large porosity and permeability variations in the Main Body B and Western 31 S sands cause problems with the efficiency of the waterflooding. These variations have now been quantified and visualized using geostatistics. The end result is a more detailed reservoir characterization for simulation. Traditional reservoir descriptions based on marker correlations, cross-sections and mapping do not provide enough detail to capture the short-scale stratigraphic heterogeneity needed for adequate reservoir simulation. These deterministic descriptions are inadequate to tie with production data as the thinly bedded sand/shale sequences blur into a falsely homogeneous picture. By studying the variability of the geologic and petrophysical data vertically within each wellbore and spatially from well to well, a geostatistical reservoir description has been developed. It captures the natural variability of the sands and shales that was lacking from earlier work. These geostatistical studies allow the geologic and petrophysical characteristics to be considered in a probabilistic model. The end product is a reservoir description that captures the variability of the reservoir sequences and can be used as a more realistic starting point for history matching and reservoir simulation.

  19. Hardrock Elastic Physical Properties: Birch's Seismic Parameter Revisited

    NASA Astrophysics Data System (ADS)

    Wu, M.; Milkereit, B.

    2014-12-01

    Identifying rock composition and properties is imperative in a variety of fields including geotechnical engineering, mining, and petroleum exploration, in order to accurately make any petrophysical calculations. Density is, in particular, an important parameter that allows us to differentiate between lithologies and estimate or calculate other petrophysical properties. It is well established that compressional and shear wave velocities of common crystalline rocks increase with increasing densities (i.e. the Birch and Nafe-Drake relationships). Conventional empirical relations do not take into account S-wave velocity. Physical properties of Fe-oxides and massive sulfides, however, differ significantly from the empirical velocity-density relationships. Currently, acquiring in-situ density data is challenging and problematic, and therefore, developing an approximation for density based on seismic wave velocity and elastic moduli would be beneficial. With the goal of finding other possible or better relationships between density and the elastic moduli, a database of density, P-wave velocity, S-wave velocity, bulk modulus, shear modulus, Young's modulus, and Poisson's ratio was compiled based on a multitude of lab samples. The database is comprised of isotropic, non-porous metamorphic rock. Multi-parameter cross plots of the various elastic parameters have been analyzed in order to find a suitable parameter combination that reduces high density outliers. As expected, the P-wave velocity to S-wave velocity ratios show no correlation with density. However, Birch's seismic parameter, along with the bulk modulus, shows promise in providing a link between observed compressional and shear wave velocities and rock densities, including massive sulfides and Fe-oxides.
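
    For reference, Birch's seismic parameter is Phi = Vp^2 - (4/3)Vs^2 = K/rho, so an assumed or independently estimated bulk modulus yields a density estimate from the two wave velocities. The sketch below uses illustrative values that are not drawn from the compiled database.

      def birch_seismic_parameter(vp, vs):
          """Birch's seismic parameter Phi = Vp^2 - (4/3) Vs^2 = K/rho (velocities in m/s)."""
          return vp**2 - (4.0 / 3.0) * vs**2

      def density_from_bulk_modulus(vp, vs, bulk_modulus):
          """If the bulk modulus K (Pa) is known or assumed, rho = K / Phi (kg/m^3)."""
          return bulk_modulus / birch_seismic_parameter(vp, vs)

      # Illustrative numbers for a dense crystalline rock:
      vp, vs, k = 6500.0, 3700.0, 75e9
      print(birch_seismic_parameter(vp, vs))          # ~2.4e7 m^2/s^2
      print(density_from_bulk_modulus(vp, vs, k))     # ~3.1e3 kg/m^3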

  20. Time-Lapse Electrical Geophysical Monitoring of Amendment-Based Biostimulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Timothy C.; Versteeg, Roelof; Day-Lewis, Frederick D.

    Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance.

  1. Time-lapse electrical geophysical monitoring of amendment-based biostimulation

    USGS Publications Warehouse

    Johnson, Timothy C.; Versteeg, Roelof J.; Day-Lewis, Frederick D.; Major, William; Lane, John W.

    2015-01-01

    Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance.
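
    A minimal sketch of the kind of sampling-calibrated petrophysical relation described above is shown below: an apparent formation factor is fit by least squares to hypothetical colocated pairs of ERT-derived bulk conductivity and sampled fluid conductivity, and is then used to translate time-lapse ERT values into estimated groundwater indicators. The numbers are illustrative, not the Brandywine site calibration.

      import numpy as np

      # Hypothetical colocated calibration data: ERT-derived bulk conductivity (S/m)
      # versus groundwater specific conductance from monitoring wells (S/m).
      sigma_bulk  = np.array([0.012, 0.020, 0.035, 0.055, 0.080])
      sigma_fluid = np.array([0.11,  0.19,  0.34,  0.52,  0.78])

      # Archie-type linear relation sigma_bulk = sigma_fluid / F, fit through the origin
      # for the apparent formation factor F by least squares.
      F = 1.0 / np.linalg.lstsq(sigma_fluid[:, None], sigma_bulk, rcond=None)[0][0]

      def predict_fluid_conductivity(sigma_bulk_new):
          """Map an ERT-derived bulk conductivity to an estimated fluid conductivity."""
          return F * sigma_bulk_new

      print(round(F, 1), predict_fluid_conductivity(0.045))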

  2. Characterization of Unconventional Reservoirs: CO2 Induced Petrophysics

    NASA Astrophysics Data System (ADS)

    Verba, C.; Goral, J.; Washburn, A.; Crandall, D.; Moore, J.

    2017-12-01

    As concerns about human-driven CO2 emissions grow, it is critical to develop economically and environmentally effective strategies to mitigate impacts associated with fossil energy. Geologic carbon storage (GCS) is a potentially promising technique which involves the injection of captured CO2 into subsurface formations. Unconventional shale formations are attractive targets for GCS that can concurrently improve gas recovery. However, shales are inherently heterogeneous, and minor differences can impact the ability of the shale to effectively adsorb and store CO2. Understanding GCS capacity in the presence of such endemic heterogeneities is further complicated by the complex geochemical processes which can dynamically alter shale petrophysics. We investigated the size distribution, connectivity, and type (intraparticle, interparticle, and organic) of pores in shale; the mineralogy of cores from unconventional shale (e.g. Bakken); and the changes to these properties under simulated GCS conditions. Electron microscopy and dual-beam focused ion beam scanning electron microscopy were used to reconstruct 2D/3D digital matrix and pore structures. Comparison of pre- and post-reacted samples gives insights into CO2-shale interactions, such as the mechanism of CO2 sorption in shales, intended for enhanced oil recovery and GCS initiatives. These comparisons also show how geochemical processes proceed differently across shales based on their initial diagenesis. Results show that most shale pore sizes fall within the meso-macro pore classification (> 2 nm), but have variable porosity and organic content. The formation of secondary minerals (calcite, gypsum, and halite) may play a role in the infilling of fractures and pore spaces in the shale, which may reduce permeability and inhibit the flow of fluids.

  3. Spatial Multicriteria Decision Analysis of Flood Risks in Aging-Dam Management in China: A Framework and Case Study

    PubMed Central

    Yang, Meng; Qian, Xin; Zhang, Yuchao; Sheng, Jinbao; Shen, Dengle; Ge, Yi

    2011-01-01

    Approximately 30,000 dams in China are aging and are considered to be high-level risks. Developing a framework for analyzing spatial multicriteria flood risk is crucial to ranking management scenarios for these dams, especially in densely populated areas. Based on the theories of spatial multicriteria decision analysis, this report generalizes a framework consisting of scenario definition, problem structuring, criteria construction, spatial quantification of criteria, criteria weighting, decision rules, sensitivity analyses, and scenario appraisal. The framework is presented in detail by using a case study to rank dam rehabilitation, decommissioning and existing-condition scenarios. The results show that there was a serious inundation, and that a dam rehabilitation scenario could reduce the multicriteria flood risk by 0.25 in the most affected areas; this indicates a mean risk decrease of less than 23%. Although increased risk (<0.20) was found for some residential and commercial buildings, if the dam were to be decommissioned, the mean risk would not be greater than the current existing risk, indicating that the dam rehabilitation scenario had a higher rank for decreasing the flood risk than the decommissioning scenario, but that dam rehabilitation alone might be of little help in abating flood risk. With adjustments and improvement to the specific methods (according to the circumstances and available data) this framework may be applied to other sites. PMID:21655125

  4. Quantifying uncertainty and computational complexity for pore-scale simulations

    NASA Astrophysics Data System (ADS)

    Chen, C.; Yuan, Z.; Wang, P.; Yang, X.; Zhenyan, L.

    2016-12-01

    Pore-scale simulation is an essential tool to understand the complex physical processes in many environmental problems, from multi-phase flow in the subsurface to fuel cells. However, in practice, factors such as sample heterogeneity, data sparsity and, in general, our insufficient knowledge of the underlying process render many simulation parameters, and hence the prediction results, uncertain. Meanwhile, most pore-scale simulations (in particular, direct numerical simulation) incur high computational cost due to finely resolved spatio-temporal scales, which further limits our data/sample collection. To address those challenges, we propose a novel framework based on generalized polynomial chaos (gPC) and build a surrogate model representing the essential features of the underlying system. To be specific, we apply the novel framework to analyze the uncertainties of the system behavior based on a series of pore-scale numerical experiments, such as flow and reactive transport in 2D heterogeneous porous media and 3D packed beds. Compared with recent pore-scale uncertainty quantification studies using Monte Carlo techniques, our new framework requires fewer realizations and hence considerably reduces the overall computational cost, while maintaining the desired accuracy.
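
    A one-dimensional sketch of a gPC surrogate is given below: a probabilists' Hermite expansion is fit by least squares to a modest number of runs of a stand-in "expensive" model, after which statistics are computed cheaply on the surrogate. The stand-in model, the single input dimension, and the expansion order are illustrative and much simpler than the pore-scale framework described above.

      import numpy as np
      from numpy.polynomial.hermite_e import hermevander

      def expensive_model(xi):
          """Stand-in for a pore-scale simulation driven by a standard-normal input."""
          return np.exp(0.3 * xi) + 0.1 * xi**2

      rng = np.random.default_rng(3)
      order = 4

      # A modest number of training runs of the "expensive" model.
      xi_train = rng.standard_normal(40)
      y_train = expensive_model(xi_train)

      # Least-squares fit of the probabilists' Hermite expansion coefficients.
      coeffs, *_ = np.linalg.lstsq(hermevander(xi_train, order), y_train, rcond=None)

      def surrogate(xi):
          return hermevander(np.atleast_1d(xi), order) @ coeffs

      # Cheap Monte Carlo on the surrogate instead of the full model.
      xi_mc = rng.standard_normal(100_000)
      print(surrogate(xi_mc).mean(), expensive_model(xi_mc).mean())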

  5. Upscaling species richness and abundances in tropical forests

    PubMed Central

    Tovo, Anna; Suweis, Samir; Formentin, Marco; Favretti, Marco; Volkov, Igor; Banavar, Jayanth R.; Azaele, Sandro; Maritan, Amos

    2017-01-01

    The quantification of tropical tree biodiversity worldwide remains an open and challenging problem. More than two-fifths of the number of worldwide trees can be found either in tropical or in subtropical forests, but only ≈0.000067% of species identities are known. We introduce an analytical framework that provides robust and accurate estimates of species richness and abundances in biodiversity-rich ecosystems, as confirmed by tests performed on both in silico–generated and real forests. Our analysis shows that the approach outperforms other methods. In particular, we find that upscaling methods based on the log-series species distribution systematically overestimate the number of species and abundances of the rare species. We finally apply our new framework on 15 empirical tropical forest plots and quantify the minimum percentage cover that should be sampled to achieve a given average confidence interval in the upscaled estimate of biodiversity. Our theoretical framework confirms that the forests studied are comprised of a large number of rare or hyper-rare species. This is a signature of critical-like behavior of species-rich ecosystems and can provide a buffer against extinction. PMID:29057324

  6. Automatic segmentation of 4D cardiac MR images for extraction of ventricular chambers using a spatio-temporal approach

    NASA Astrophysics Data System (ADS)

    Atehortúa, Angélica; Zuluaga, Maria A.; Ourselin, Sébastien; Giraldo, Diana; Romero, Eduardo

    2016-03-01

    An accurate ventricular function quantification is important to support evaluation, diagnosis and prognosis of several cardiac pathologies. However, expert heart delineation, specifically for the right ventricle, is a time-consuming task with high inter- and intra-observer variability. A fully automatic 3D+time heart segmentation framework is herein proposed for short-axis cardiac MRI sequences. This approach estimates the heart using exclusively information from the sequence itself, without tuning any parameters. The proposed framework uses a coarse-to-fine approach, which starts by localizing the heart via spatio-temporal analysis, followed by a segmentation of the basal heart that is then propagated to the apex by using a non-rigid registration strategy. The obtained volume is then refined by estimating the ventricular muscle by locally searching a prior endocardium-pericardium intensity pattern. The proposed framework was applied to 48 patient datasets supplied by the organizers of the MICCAI 2012 Right Ventricle segmentation challenge. Results show the robustness, efficiency and competitiveness of the proposed method both in terms of accuracy and computational load.

  7. Stochastic Analysis and Design of Heterogeneous Microstructural Materials System

    NASA Astrophysics Data System (ADS)

    Xu, Hongyi

    An advanced materials system refers to new materials that are composed of multiple traditional constituents but have complex microstructure morphologies, which lead to superior properties over conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.

  8. Knowledge Extraction from Atomically Resolved Images.

    PubMed

    Vlcek, Lukas; Maksov, Artem; Pan, Minghu; Vasudevan, Rama K; Kalinin, Sergei V

    2017-10-24

    Tremendous strides in experimental capabilities of scanning transmission electron microscopy and scanning tunneling microscopy (STM) over the past 30 years made atomically resolved imaging routine. However, consistent integration and use of atomically resolved data with generative models is unavailable, so information on local thermodynamics and other microscopic driving forces encoded in the observed atomic configurations remains hidden. Here, we present a framework based on statistical distance minimization to consistently utilize the information available from atomic configurations obtained from an atomically resolved image and extract meaningful physical interaction parameters. We illustrate the applicability of the framework on an STM image of a FeSexTe1-x superconductor, with the segregation of the chalcogen atoms investigated using a nonideal interacting solid solution model. This universal method makes full use of the microscopic degrees of freedom sampled in an atomically resolved image and can be extended via Bayesian inference toward unbiased model selection with uncertainty quantification.

  9. Generalized concurrence measure for faithful quantification of multiparticle pure state entanglement using Lagrange's identity and wedge product

    NASA Astrophysics Data System (ADS)

    Bhaskara, Vineeth S.; Panigrahi, Prasanta K.

    2017-05-01

    Concurrence, introduced by Hill and Wootters (Phys Rev Lett 78:5022, 1997), provides an important measure of entanglement for a general pair of qubits that is faithful: strictly positive for entangled states and vanishing for all separable states. Such a measure captures the entire content of entanglement, providing necessary and sufficient conditions for separability. We present an extension of concurrence to multiparticle pure states in arbitrary dimensions through a new framework using Lagrange's identity and the wedge product representation of separability conditions, which coincides with the "I-concurrence" of Rungta et al. (Phys Rev A 64:042315, 2001), who proposed it by extending Wootters's spin-flip operator to a so-called universal inverter superoperator. Our framework exposes an inherent geometry of entanglement and may be useful for further extensions to mixed and continuous-variable states.
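
    For orientation, the standard published definitions that the above construction connects to can be written as follows; these are the Hill-Wootters two-qubit concurrence for a pure state and the pure-state I-concurrence of Rungta et al., not the multiparticle generalization introduced in the record itself.

      % Hill-Wootters concurrence of a two-qubit pure state |psi>:
      C(\psi) = \left| \langle \psi | \, \sigma_y \otimes \sigma_y \, | \psi^{*} \rangle \right|
      % I-concurrence of a pure bipartite state in arbitrary dimensions,
      % with reduced density matrix \rho_A = \mathrm{Tr}_B |\psi\rangle\langle\psi| :
      C(\psi) = \sqrt{\, 2 \left( 1 - \mathrm{Tr}\, \rho_A^{2} \right) }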

  10. Prediction of compression-induced image interpretability degradation

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Chen, Hua-Mei; Irvine, John M.; Wang, Zhonghai; Chen, Genshe; Nagy, James; Scott, Stephen

    2018-04-01

    Image compression is an important component in modern imaging systems as the volume of the raw data collected is increasing. To reduce the volume of data while collecting imagery useful for analysis, choosing the appropriate image compression method is desired. Lossless compression is able to preserve all the information, but it has limited reduction power. On the other hand, lossy compression, which may result in very high compression ratios, suffers from information loss. We model the compression-induced information loss in terms of the National Imagery Interpretability Rating Scale or NIIRS. NIIRS is a user-based quantification of image interpretability widely adopted by the Geographic Information System community. Specifically, we present the Compression Degradation Image Function Index (CoDIFI) framework that predicts the NIIRS degradation (i.e., a decrease of NIIRS level) for a given compression setting. The CoDIFI-NIIRS framework enables a user to broker the maximum compression setting while maintaining a specified NIIRS rating.

  11. Unified framework for information integration based on information geometry

    PubMed Central

    Oizumi, Masafumi; Amari, Shun-ichi

    2016-01-01

    Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner. PMID:27930289
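
    Schematically, and only as a reading aid for the description above, the proposed measure can be written as the minimum Kullback-Leibler divergence from the actual joint distribution of past and present states to a manifold of "disconnected" models in which cross-element causal influences are removed:

      \Phi \;=\; \min_{q \in \mathcal{M}} \; D_{\mathrm{KL}}\!\left( p(X_{t}, X_{t+1}) \,\Vert\, q(X_{t}, X_{t+1}) \right)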

  12. Environmental exposure assessment framework for nanoparticles in solid waste.

    PubMed

    Boldrin, Alessio; Hansen, Steffen Foss; Baun, Anders; Hartmann, Nanna Isabella Bloch; Astrup, Thomas Fruergaard

    2014-01-01

    Information related to the potential environmental exposure of engineered nanomaterials (ENMs) in the solid waste management phase is extremely scarce. In this paper, we define nanowaste as separately collected or collectable waste materials which are or contain ENMs, and we present a five-step framework for the systematic assessment of ENM exposure during nanowaste management. The framework includes deriving end-of-life (EOL) nanoproducts and evaluating the physicochemical properties of the nanostructure, matrix properties and nanowaste treatment processes, as well as transformation processes and environmental releases, eventually leading to a final assessment of potential ENM exposure. The proposed framework was applied to three selected nanoproducts: nanosilver polyester textile, nanoTiO2 sunscreen lotion and carbon nanotube tennis racquets. We found that the potential global environmental exposure of ENMs associated with these three products was an estimated 0.5-143 Mg/year, which can also be characterised qualitatively as medium, medium and low, respectively. Specific challenges remain and should be subject to further research: (1) analytical techniques for the characterisation of nanowaste and its transformation during waste treatment processes, (2) mechanisms for the release of ENMs, (3) the quantification of nanowaste amounts at the regional scale, (4) a definition of acceptable limit values for exposure to ENMs from nanowaste and (5) the reporting of nanowaste generation data.

  13. Stochastic simulation of ecohydrological interactions between vegetation and groundwater

    NASA Astrophysics Data System (ADS)

    Dwelle, M. C.; Ivanov, V. Y.; Sargsyan, K.

    2017-12-01

    The complex interactions between groundwater and vegetation in the Amazon rainforest may yield vital ecophysiological interactions in specific landscape niches such as buffering plant water stress during dry season or suppression of water uptake due to anoxic conditions. Representation of such processes is greatly impacted by both external and internal sources of uncertainty: inaccurate data and subjective choice of model representation. The models that can simulate these processes are complex and computationally expensive, and therefore make it difficult to address uncertainty using traditional methods. We use the ecohydrologic model tRIBS+VEGGIE and a novel uncertainty quantification framework applied to the ZF2 watershed near Manaus, Brazil. We showcase the capability of this framework for stochastic simulation of vegetation-hydrology dynamics. This framework is useful for simulation with internal and external stochasticity, but this work will focus on internal variability of groundwater depth distribution and model parameterizations. We demonstrate the capability of this framework to make inferences on uncertain states of groundwater depth from limited in situ data, and how the realizations of these inferences affect the ecohydrological interactions between groundwater dynamics and vegetation function. We place an emphasis on the probabilistic representation of quantities of interest and how this impacts the understanding and interpretation of the dynamics at the groundwater-vegetation interface.

  14. Defining an additivity framework for mixture research in inducible whole-cell biosensors

    NASA Astrophysics Data System (ADS)

    Martin-Betancor, K.; Ritz, C.; Fernández-Piñas, F.; Leganés, F.; Rodea-Palomares, I.

    2015-11-01

    A novel additivity framework for mixture effect modelling in the context of whole-cell inducible biosensors has been mathematically developed and implemented in R. The proposed method is a multivariate extension of the effective dose (EDp) concept. Specifically, the extension accounts for differential maximal effects among analytes and for response inhibition beyond the maximum permissive concentrations. This allows a multivariate extension of Loewe additivity, enabling direct application in a biphasic dose-response framework. The proposed additivity definition was validated, and its applicability illustrated, by studying the response of the cyanobacterial biosensor Synechococcus elongatus PCC 7942 pBG2120 to binary mixtures of Zn, Cu, Cd, Ag, Co and Hg. The novel method allowed, for the first time, complete dose-response profiles of an inducible whole-cell biosensor exposed to mixtures to be modelled. In addition, the approach allowed identification and quantification of departures from additivity (interactions) among analytes. The biosensor was found to respond in a near-additive way to heavy metal mixtures except when Hg, Co and Ag were present, in which case strong interactions occurred. The method is a useful contribution for the whole-cell biosensor discipline and related areas, allowing appropriate assessment of mixture effects in non-monotonic dose-response frameworks.
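    The classical, univariate Loewe additivity condition that the paper extends states that a binary mixture (d1, d2) produces effect level y when d1/EDy(1) + d2/EDy(2) = 1. A minimal sketch of solving that condition numerically for two hypothetical analytes with monotonic log-logistic (Hill) dose-response curves is shown below; it does not implement the paper's multivariate, biphasic extension, and all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

def ed(y, ec50, hill):
    """Dose of a single analyte producing fractional effect y under a log-logistic (Hill) curve."""
    return ec50 * (y / (1.0 - y)) ** (1.0 / hill)

def loewe_effect(d1, d2, ec50_1, h1, ec50_2, h2):
    """Effect level predicted for a binary mixture (d1, d2) under classical Loewe additivity."""
    def interaction_index_minus_one(y):
        return d1 / ed(y, ec50_1, h1) + d2 / ed(y, ec50_2, h2) - 1.0
    # The index decreases monotonically from +inf (y -> 0) to -1 (y -> 1), so a root exists.
    return brentq(interaction_index_minus_one, 1e-9, 1.0 - 1e-9)

# Hypothetical analytes: EC50 values and Hill slopes are illustrative only.
print(loewe_effect(d1=0.5, d2=0.8, ec50_1=1.0, h1=1.5, ec50_2=2.0, h2=1.0))
```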

  15. Environmental exposure assessment framework for nanoparticles in solid waste

    NASA Astrophysics Data System (ADS)

    Boldrin, Alessio; Hansen, Steffen Foss; Baun, Anders; Hartmann, Nanna Isabella Bloch; Astrup, Thomas Fruergaard

    2014-06-01

    Information related to the potential environmental exposure of engineered nanomaterials (ENMs) in the solid waste management phase is extremely scarce. In this paper, we define nanowaste as separately collected or collectable waste materials which are or contain ENMs, and we present a five-step framework for the systematic assessment of ENM exposure during nanowaste management. The framework includes deriving end-of-life (EOL) nanoproducts and evaluating the physicochemical properties of the nanostructure, matrix properties and nanowaste treatment processes, as well as transformation processes and environmental releases, eventually leading to a final assessment of potential ENM exposure. The proposed framework was applied to three selected nanoproducts: nanosilver polyester textile, nanoTiO2 sunscreen lotion and carbon nanotube tennis racquets. We found that the potential global environmental exposure of ENMs associated with these three products was an estimated 0.5-143 Mg/year, which can also be characterised qualitatively as medium, medium and low, respectively. Specific challenges remain and should be subject to further research: (1) analytical techniques for the characterisation of nanowaste and its transformation during waste treatment processes, (2) mechanisms for the release of ENMs, (3) the quantification of nanowaste amounts at the regional scale, (4) a definition of acceptable limit values for exposure to ENMs from nanowaste and (5) the reporting of nanowaste generation data.

  16. Applying image quality in cell phone cameras: lens distortion

    NASA Astrophysics Data System (ADS)

    Baxter, Donald; Goma, Sergio R.; Aleksic, Milivoje

    2009-01-01

    This paper describes the framework used in one of the pilot studies run under the I3A CPIQ initiative to quantify overall image quality in cell-phone cameras. The framework is based on a multivariate formalism which tries to predict overall image quality from individual image quality attributes, and it was validated in a CPIQ pilot program. The pilot study focuses on image quality distortions introduced in the optical path of a cell-phone camera, which may or may not be corrected in the image processing path. The assumption is that the captured image is JPEG compressed and the cell-phone camera is set to 'auto' mode. Because the framework requires the individual attributes to be relatively perceptually orthogonal, the attributes used in the pilot study are lens geometric distortion (LGD) and lateral chromatic aberration (LCA). The goal of this paper is to present the framework of this pilot project, from the definition of the individual attributes up to their quantification in JNDs of quality, a requirement of the multivariate formalism; therefore both objective and subjective evaluations were used. A major distinction of the objective part from the 'DSC imaging world' is that the LCA/LGD distortions found in cell-phone cameras rarely exhibit radial behavior, so a radial mapping/modeling cannot be used in this case.

  17. A Probabilistic Framework for Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chatterjee, Samrat; Tipireddy, Ramakrishna; Oster, Matthew R.

    Quantification and propagation of uncertainties in cyber attacker payoffs is a key aspect within multiplayer, stochastic security games. These payoffs may represent penalties or rewards associated with player actions and are subject to various sources of uncertainty, including: (1) cyber-system state, (2) attacker type, (3) choice of player actions, and (4) cyber-system state transitions over time. Past research has primarily focused on representing defender beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and mathematical intervals. For cyber-systems, probability distributions may help address statistical (aleatory) uncertainties where the defender may assume inherent variability or randomness in the factors contributing to the attacker payoffs. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as generalizations of probability boxes. This paper explores the mathematical treatment of such mixed payoff uncertainties. A conditional probabilistic reasoning approach is adopted to organize the dependencies between a cyber-system’s state, attacker type, player actions, and state transitions. This also enables the application of probabilistic theories to propagate various uncertainties in the attacker payoffs. An example implementation of this probabilistic framework and resulting attacker payoff distributions are discussed. A goal of this paper is also to highlight this uncertainty quantification problem space to the cyber security research community and encourage further advancements in this area.

  18. Advances in interpretation of subsurface processes with time-lapse electrical imaging

    USGS Publications Warehouse

    Singha, Kamini; Day-Lewis, Frederick D.; Johnson, Timothy C.; Slater, Lee D.

    2015-01-01

    Electrical geophysical methods, including electrical resistivity, time-domain induced polarization, and complex resistivity, have become commonly used to image the near subsurface. Here, we outline their utility for time-lapse imaging of hydrological, geochemical, and biogeochemical processes, focusing on new instrumentation, processing, and analysis techniques specific to monitoring. We review data collection procedures, parameters measured, and petrophysical relationships and then outline the state of the science with respect to inversion methodologies, including coupled inversion. We conclude by highlighting recent research focused on innovative applications of time-lapse imaging in hydrology, biology, ecology, and geochemistry, among other areas of interest.

  19. Advances in interpretation of subsurface processes with time-lapse electrical imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singha, Kamini; Day-Lewis, Frederick D.; Johnson, Timothy C.

    2015-03-15

    Electrical geophysical methods, including electrical resistivity, time-domain induced polarization, and complex resistivity, have become commonly used to image the near subsurface. Here, we outline their utility for time-lapse imaging of hydrological, geochemical, and biogeochemical processes, focusing on new instrumentation, processing, and analysis techniques specific to monitoring. We review data collection procedures, parameters measured, and petrophysical relationships and then outline the state of the science with respect to inversion methodologies, including coupled inversion. We conclude by highlighting recent research focused on innovative applications of time-lapse imaging in hydrology, biology, ecology, and geochemistry, among other areas of interest.

  20. Geological and geophysical properties of cap rock in a natural CO2 occurrence, Mihályi-Répcelak area, Western Hungary

    NASA Astrophysics Data System (ADS)

    Király, Csilla; Szamosfalvi, Ágnes; Sendula, Eszter; Páles, Mariann; Kovács, István; Kónya, Péter; Falus, György; Szabó, Csaba

    2015-04-01

    The physical and geochemical integrity of the cap rock is of primary importance for the safe geological storage of CO2. As a consequence of CO2 injection, reactions take place between the minerals of the reservoir, the cap rock and the CO2-saturated pore water. These reactions may change the mineral composition and petrophysical properties of the storage reservoir as well as of the cap rock, which provides the only physical barrier that retains carbon dioxide in the target reservoir formation. Study of natural CO2 occurrences delivers information for understanding which properties of a cap rock provide sustainable closure and retention. Knowledge of the long-term effect of CO2 on the behavior of the cap rock is an important input in the selection of a potential CO2 injection site. Yet very few data exist on the geochemical properties and reactivity of cap rocks: during normal commercial operations the reservoir is typically cored, but not the cap rock. This study may enhance our knowledge of the mineralogical reactions that can occur in clayey-aleuritic cap rocks. The Mihályi-Répcelak natural CO2 occurrence is believed to be leakage-safe; there is no known seepage at the surface. It is suggested that the aleuritic, clay-rich cap rock occurring at the natural reservoir can stop CO2 migration into other reservoirs or to the surface. The most important characteristics of cap rocks are low permeability (<0.1 mD), low effective porosity (~4%) and high clay content (approximately 80%). However, we demonstrate that, in addition to these parameters, the geochemical properties of the cap rock are also important. To characterize the natural CO2 occurrence we applied XRD, FTIR and SEM analyses. The petrophysical properties were determined from the interpretation of geophysical well logs and grain size distributions. The most important result of this study is that adequate petrophysical properties do not completely define the suitability of a cap rock. The effective porosity (~4%), permeability (0.026 mD) and clay content (~80%) data imply that the studied aleurolites are good cap rocks. The mineral composition of the cap rock is similar to that of the reservoir rock; however, the ratio of components is different. The mineralogical and petrographic analyses reveal reactions between CO2 and the cap rocks. The most visible effect of the presence of CO2 is dawsonite precipitation after albite dissolution within the cap rocks. Therefore, CO2 may migrate through the cap rocks on a geological time scale, yet the total system can still be leakage-safe.

  1. Identification and cause of decay of building materials used in the architectural heritage of Bizerte city (Tunisia)

    NASA Astrophysics Data System (ADS)

    Zoghlami, Karima; Lopez-Arce, Paula; Navarro, Antonia; Zornoza-Indart, Ainara; Gómez, David

    2017-04-01

    Monuments and historical buildings of Bizerte show a disturbing state of degradation. In order to propose compatible materials for restoration works, such as substitution stone and restoration mortars, the geological context was analysed with the objective of locating historical quarries, accompanied by a sedimentological study to identify the exploited geological formations. Petrophysical and chemical characterisation of both the stone and the mortars has been carried out. To determine the origin of the erosion and the degree of stone decay, a combination of micro-destructive and non-destructive techniques was used on-site and in the laboratory. Moisture measurements, ultrasonic velocity propagation and water absorption by the Karsten pipe test, together with polarized light and fluorescence optical microscopy, mercury intrusion porosimetry and ion chromatography analyses, were carried out to perform the petrophysical characterization of stone samples and the determination of soluble salts. For the characterization of the mortars, a granulometric study was performed to determine the nature of the components and their grain size distribution. Thin sections of mortar samples were examined for petrographical and mineralogical characterization. X-ray diffraction (XRD) analysis of finely pulverized samples was performed in order to identify the mineral crystalline phases of the mortars. Thermal analyses [thermogravimetry (TG)] were performed in order to determine the nature of the binder and its properties. Porosity was determined following the UNE-EN 1936 (2007) standard test. The geological and petrographical study showed that the historical buildings are essentially built with a highly porous bioclastic calcarenite, partially cemented by calcite, which is Würm in age and outcrops all along the northern coast of Bizerte, where several historical quarries were identified. Occasionally, two other types of lithology were used as building stones: two varieties of Oligocene sandstone (a brown quartz-arenite cemented by iron oxide and an ochre-green sandstone cemented by calcite) and an Eocene white limestone corresponding to a fine-grained globigerinid wackestone according to the Dunham classification. Results of the petrophysical study show that small variations in the petrographic characteristics of the building geomaterials, such as the type and degree of cementation, the configuration of the porous network and the presence or absence of soluble salts, lead to differential stone weathering. Results of the mortar study show that the original and restoration mortars have similar mineralogical compositions but different grain size distributions and binder/aggregate proportions. They also differ in the nature of their raw materials, as demonstrated by the thermal analyses. The study shows that small variations in these parameters can affect the durability and performance of the mortars and can accelerate the degradation of the building stones, especially the Oligocene and Eocene lithotypes.

  2. Drill Cuttings-based Methodology to Optimize Multi-stage Hydraulic Fracturing in Horizontal Wells and Unconventional Gas Reservoirs

    NASA Astrophysics Data System (ADS)

    Ortega Mercado, Camilo Ernesto

    Horizontal drilling and hydraulic fracturing techniques have become almost mandatory technologies for the economic exploitation of unconventional gas reservoirs. Key to commercial success is minimizing the risk while drilling and hydraulically fracturing these wells. Data collection is expensive and is therefore one of the first casualties during budget cuts; as a result, complete data sets in horizontal wells are nearly always scarce. To mitigate the data scarcity problem, the research addressed throughout this thesis concentrates on using drill cuttings, an inexpensive direct source of information, for developing: 1) a new methodology for multi-stage hydraulic fracturing optimization of horizontal wells without any significant increase in operational costs, and 2) a new method for petrophysical evaluation in those wells with a limited amount of log information. The methods are explained using drill cuttings from the Nikanassin Group collected in the Deep Basin of the Western Canada Sedimentary Basin (WCSB). Drill cuttings are the main source of information for the proposed methodology in Item 1, which involves the creation of three 'log tracks' containing the following parameters for improving the design of hydraulic fracturing jobs: (a) brittleness index, (b) measured permeability and (c) an indicator of natural fractures. The brittleness index is primarily a function of Poisson's ratio and Young's modulus, parameters that are obtained from drill cuttings and sonic log formulations. Permeability is measured on drill cuttings in the laboratory. The indication of natural fractures is obtained from direct observation of drill cuttings under the microscope. Drill cuttings are also the main source of information for the new petrophysical evaluation method mentioned above in Item 2 when well logs are not available, which is particularly important in horizontal wells, where log data are almost non-existent in the vast majority of wells. By combining data from drill cuttings with previously available empirical relationships developed from cores it is possible to estimate water saturations, pore throat apertures, capillary pressures, flow units, porosity (or cementation) exponent m, true formation resistivity Rt, distance to a water table (if present), and to distinguish the contributions of viscous and diffusion-like flow in the tight gas formation. The method further allows the construction of Pickett plots using porosity and permeability obtained from drill cuttings, without the previous availability of well logs. The method assumes the existence of intervals at irreducible water saturation, which is the case for the Nikanassin Group throughout the gas column. The new methods are not meant to replace detailed and sophisticated evaluation techniques, but they provide a valuable and practical aid in those cases where geomechanical and petrophysical information is scarce.
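    One commonly used way to combine Young's modulus and Poisson's ratio into a brittleness index (after Rickman et al., 2008) is to range-normalize each and average the two. The thesis's exact formulation may differ, so the sketch below is purely illustrative, and the range limits are placeholders.

```python
def brittleness_index(E, nu, E_min=1.0, E_max=8.0, nu_min=0.15, nu_max=0.40):
    """Range-normalized average of Young's modulus E (e.g., Mpsi) and Poisson's ratio nu.

    High modulus and low Poisson's ratio score as more brittle. The range limits are
    placeholders and should come from the local data set.
    """
    e_norm = (E - E_min) / (E_max - E_min)        # brittle rocks: high Young's modulus
    nu_norm = (nu_max - nu) / (nu_max - nu_min)   # brittle rocks: low Poisson's ratio
    return 50.0 * (e_norm + nu_norm)              # scale: 0 (ductile) to 100 (brittle)

print(brittleness_index(E=5.5, nu=0.22))
```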

  3. Tentacle: distributed quantification of genes in metagenomes.

    PubMed

    Boulund, Fredrik; Sjögren, Anders; Kristiansson, Erik

    2015-01-01

    In metagenomics, microbial communities are sequenced at increasingly high resolution, generating datasets with billions of DNA fragments. Novel methods that can efficiently process the growing volumes of sequence data are necessary for the accurate analysis and interpretation of existing and upcoming metagenomes. Here we present Tentacle, which is a novel framework that uses distributed computational resources for gene quantification in metagenomes. Tentacle is implemented using a dynamic master-worker approach in which DNA fragments are streamed via a network and processed in parallel on worker nodes. Tentacle is modular, extensible, and comes with support for six commonly used sequence aligners. It is easy to adapt Tentacle to different applications in metagenomics and easy to integrate into existing workflows. Evaluations show that Tentacle scales very well with increasing computing resources. We illustrate the versatility of Tentacle on three different use cases. Tentacle is written for Linux in Python 2.7 and is published as open source under the GNU General Public License (v3). Documentation, tutorials, installation instructions, and the source code are freely available online at: http://bioinformatics.math.chalmers.se/tentacle.

  4. Intrusive Method for Uncertainty Quantification in a Multiphase Flow Solver

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2016-11-01

    Uncertainty quantification (UQ) is a necessary, interesting, and often neglected aspect of fluid flow simulations. To determine the significance of uncertain initial and boundary conditions, a multiphase flow solver is being created that extends a single-phase, intrusive, polynomial chaos scheme to multiphase flows. Reliably estimating the impact of input uncertainty on design criteria can help identify and minimize unwanted variability in critical areas, and has the potential to help advance knowledge in atomizing jets, jet engines, pharmaceuticals, and food processing. Use of an intrusive polynomial chaos method has been shown to significantly reduce computational cost compared with non-intrusive methods such as Monte Carlo sampling. The method requires transforming the model equations into a weak form through substitution of stochastic (random) variables. Ultimately, the model deploys stochastic Navier-Stokes equations, a stochastic conservative level set approach including reinitialization, as well as stochastic normals and curvature. By implementing these approaches together in one framework, basic problems may be investigated which shed light on model expansion, uncertainty theory, and fluid flow in general. NSF Grant Number 1511325.
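    The solver described above is intrusive (stochastic variables are substituted into the governing equations), which is not reproduced here. As a minimal illustration of the underlying expansion only, the sketch below projects a simple scalar response onto probabilists' Hermite polynomials of a standard Gaussian input via non-intrusive quadrature and recovers its mean and variance from the coefficients.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def pce_coefficients(f, order, nquad=40):
    """Project f(xi), xi ~ N(0,1), onto probabilists' Hermite polynomials He_k (orthogonal basis)."""
    x, w = He.hermegauss(nquad)                 # Gauss-Hermite_e nodes/weights for exp(-x^2/2)
    w = w / sqrt(2.0 * pi)                      # renormalize to the standard normal density
    coeffs = []
    for k in range(order + 1):
        basis = He.hermeval(x, [0.0] * k + [1.0])                # He_k at the quadrature nodes
        coeffs.append(np.sum(w * f(x) * basis) / factorial(k))   # <He_k, He_k> = k!
    return np.array(coeffs)

# Toy "solver response" u(xi) = exp(0.3*xi): exact mean exp(0.045), variance exp(0.09)*(exp(0.09)-1).
c = pce_coefficients(lambda xi: np.exp(0.3 * xi), order=6)
mean = c[0]
variance = sum(c[k] ** 2 * factorial(k) for k in range(1, len(c)))
print(mean, variance)
```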

  5. The logical primitives of thought: Empirical foundations for compositional cognitive models.

    PubMed

    Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D

    2016-07-01

    The notion of a compositional language of thought (LOT) has been central in computational accounts of cognition from earliest attempts (Boole, 1854; Fodor, 1975) to the present day (Feldman, 2000; Penn, Holyoak, & Povinelli, 2008; Fodor, 2008; Kemp, 2012; Goodman, Tenenbaum, & Gerstenberg, 2015). Recent modeling work shows how statistical inferences over compositionally structured hypothesis spaces might explain learning and development across a variety of domains. However, the primitive components of such representations are typically assumed a priori by modelers and theoreticians rather than determined empirically. We show how different sets of LOT primitives, embedded in a psychologically realistic approximate Bayesian inference framework, systematically predict distinct learning curves in rule-based concept learning experiments. We use this feature of LOT models to design a set of large-scale concept learning experiments that can determine the most likely primitives for psychological concepts involving Boolean connectives and quantification. Subjects' inferences are most consistent with a rich (nonminimal) set of Boolean operations, including first-order, but not second-order, quantification. Our results more generally show how specific LOT theories can be distinguished empirically. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. multiUQ: An intrusive uncertainty quantification tool for gas-liquid multiphase flows

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2017-11-01

    Uncertainty quantification (UQ) can improve our understanding of the sensitivity of gas-liquid multiphase flows to variability about inflow conditions and fluid properties, creating a valuable tool for engineers. While non-intrusive UQ methods (e.g., Monte Carlo) are simple and robust, the cost associated with these techniques can render them unrealistic. In contrast, intrusive UQ techniques modify the governing equations by replacing deterministic variables with stochastic variables, adding complexity, but making UQ cost effective. Our numerical framework, called multiUQ, introduces an intrusive UQ approach for gas-liquid flows, leveraging a polynomial chaos expansion of the stochastic variables: density, momentum, pressure, viscosity, and surface tension. The gas-liquid interface is captured using a conservative level set approach, including a modified reinitialization equation which is robust and quadrature free. A least-squares method is leveraged to compute the stochastic interface normal and curvature needed in the continuum surface force method for surface tension. The solver is tested by applying uncertainty to one or two variables and verifying results against the Monte Carlo approach. NSF Grant #1511325.

  7. A proteomic insight into vitellogenesis during tick ovary maturation.

    PubMed

    Xavier, Marina Amaral; Tirloni, Lucas; Pinto, Antônio F M; Diedrich, Jolene K; Yates, John R; Mulenga, Albert; Logullo, Carlos; da Silva Vaz, Itabajara; Seixas, Adriana; Termignoni, Carlos

    2018-03-16

    Ticks are arthropod ectoparasites of importance for public and veterinary health. Understanding tick oogenesis and embryogenesis could contribute to the development of novel control methods. However, to date, studies on the temporal dynamics of proteins during ovary development have not been reported. In the present study we followed the protein profile during ovary maturation. Proteomic analysis of ovary extracts was performed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) using a shotgun strategy, in addition to dimethyl labelling-based protein quantification. A total of 3,756 proteins were identified and functionally annotated into 30 categories. Circa 80% of the annotated proteins belong to categories related to basal metabolism, such as protein synthesis and modification machineries, nuclear regulation, cytoskeleton, proteasome machinery, transcriptional machinery, energetic metabolism, extracellular matrix/cell adhesion, immunity, oxidation/detoxification metabolism, signal transduction, and storage. The abundance of selected proteins involved in yolk uptake and degradation, as well as vitellin accumulation during ovary maturation, was assessed using dimethyl-labelling quantification. In conclusion, the proteins identified in this study provide a framework for future studies to elucidate tick development and to validate candidate targets for novel control methods.

  8. Advances in life cycle assessment and emergy evaluation with case studies in gold mining and pineapple production

    NASA Astrophysics Data System (ADS)

    Ingwersen, Wesley W.

    Life cycle assessment (LCA) is an internationally standardized framework for assessing the environmental impacts of products that is rapidly evolving to improve understanding and quantification of how complex product systems depend upon and affect the environment. This dissertation contributes to that evolution through the development of new methods for measuring impacts, estimating the uncertainty of impacts, and measuring ranges of environmental performance, with a focus on product systems in non-OECD countries that have not been well characterized. The integration of a measure of total energy use, emergy, is demonstrated in an LCA of gold from the Yanacocha mine in Peru in the second chapter. A model for estimating the accuracy of emergy results is proposed in the following chapter. The fourth chapter presents a template for LCA-based quantification of the range of environmental performance for tropical agricultural products using the example of fresh pineapple production for export in Costa Rica that can be used to create product labels with environmental information. The final chapter synthesizes how each methodological contribution will together improve the science of measuring product environmental performance.

  9. Objective quantification of the tinnitus decompensation by synchronization measures of auditory evoked single sweeps.

    PubMed

    Strauss, Daniel J; Delb, Wolfgang; D'Amelio, Roberto; Low, Yin Fen; Falkai, Peter

    2008-02-01

    Large-scale neural correlates of the tinnitus decompensation might be used for an objective evaluation of therapies and neurofeedback-based therapeutic approaches. In this study, we try to identify large-scale neural correlates of the tinnitus decompensation using wavelet phase stability criteria of single-sweep sequences of late auditory evoked potentials as a synchronization stability measure. The extracted measure provided an objective quantification of the tinnitus decompensation and allowed for a reliable discrimination between a group of compensated and decompensated tinnitus patients. We provide an interpretation for our results by a neural model of top-down projections based on the Jastreboff tinnitus model combined with the adaptive resonance theory, which has not been applied to model tinnitus so far. Using this model, our stability measure of evoked potentials can be linked to the focus of attention on the tinnitus signal. It is concluded that the wavelet phase stability of late auditory evoked potential single sweeps might be used as an objective tinnitus decompensation measure and can be interpreted in the framework of the Jastreboff tinnitus model and adaptive resonance theory.
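    The paper's measure is a wavelet phase stability criterion over single-sweep sequences; as a simplified stand-in, the sketch below computes inter-trial phase coherence from the analytic (Hilbert) phase of band-limited single sweeps, which captures the same notion of phase synchronization stability across sweeps. The filter band and synthetic data are illustrative.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_stability(sweeps, fs, band=(4.0, 8.0)):
    """Inter-trial phase coherence of single sweeps (array of shape n_sweeps x n_samples).

    Returns one value in [0, 1] per time sample: 1 means perfectly stable phase across sweeps.
    """
    b, a = butter(4, np.asarray(band) / (fs / 2.0), btype="band")
    filtered = filtfilt(b, a, sweeps, axis=1)             # band-limit each sweep
    phase = np.angle(hilbert(filtered, axis=1))           # instantaneous (analytic) phase
    return np.abs(np.mean(np.exp(1j * phase), axis=0))    # resultant vector length across sweeps

# Synthetic example: 60 sweeps containing a phase-locked 6 Hz component plus noise.
fs = 500.0
t = np.arange(0.0, 0.6, 1.0 / fs)
rng = np.random.default_rng(0)
sweeps = np.sin(2 * np.pi * 6.0 * t) + rng.normal(0.0, 1.0, (60, t.size))
print(phase_stability(sweeps, fs).mean())
```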

  10. On uncertainty quantification of lithium-ion batteries: Application to an LiC6/LiCoO2 cell

    NASA Astrophysics Data System (ADS)

    Hadigol, Mohammad; Maute, Kurt; Doostan, Alireza

    2015-12-01

    In this work, a stochastic, physics-based model for Lithium-ion batteries (LIBs) is presented in order to study the effects of parametric model uncertainties on the cell capacity, voltage, and concentrations. To this end, the proposed uncertainty quantification (UQ) approach, based on sparse polynomial chaos expansions, relies on a small number of battery simulations. Within this UQ framework, the identification of most important uncertainty sources is achieved by performing a global sensitivity analysis via computing the so-called Sobol' indices. Such information aids in designing more efficient and targeted quality control procedures, which consequently may result in reducing the LIB production cost. An LiC6/LiCoO2 cell with 19 uncertain parameters discharged at 0.25C, 1C and 4C rates is considered to study the performance and accuracy of the proposed UQ approach. The results suggest that, for the considered cell, the battery discharge rate is a key factor affecting not only the performance variability of the cell, but also the determination of most important random inputs.
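    With an orthonormal polynomial chaos basis, first-order Sobol' indices follow directly from the expansion coefficients: the variance attributable to input i alone is the sum of squared coefficients whose multi-indices involve only variable i. The bookkeeping sketch below assumes such a basis and uses hypothetical coefficients; it is not the paper's sparse-PCE implementation.

```python
import numpy as np

def first_order_sobol(coeffs, multi_indices):
    """First-order Sobol' indices from PCE coefficients on an orthonormal basis.

    The first entry must be the mean term (all-zero multi-index).
    """
    c = np.asarray(coeffs, dtype=float)
    m = np.asarray(multi_indices, dtype=int)
    total_var = np.sum(c[1:] ** 2)
    indices = np.zeros(m.shape[1])
    for i in range(m.shape[1]):
        only_i = (m[:, i] > 0) & (m.sum(axis=1) == m[:, i])   # terms involving input i alone
        indices[i] = np.sum(c[only_i] ** 2) / total_var
    return indices

# Hypothetical two-input expansion truncated at total order 2.
multi_indices = [(0, 0), (1, 0), (0, 1), (2, 0), (1, 1), (0, 2)]
coeffs = [1.2, 0.8, 0.3, 0.1, 0.05, 0.02]
print(first_order_sobol(coeffs, multi_indices))
```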

  11. Regional flux analysis for discovering and quantifying anatomical changes: An application to the brain morphometry in Alzheimer's disease.

    PubMed

    Lorenzi, M; Ayache, N; Pennec, X

    2015-07-15

    In this study we introduce regional flux analysis, a novel approach to deformation-based morphometry built on the Helmholtz decomposition of deformations parameterized by stationary velocity fields. We use the scalar pressure map associated with the irrotational component of the deformation to discover the critical regions of volume change. These regions are used to consistently quantify the associated volume change by the probabilistic integration of the flux of the longitudinal deformations across their boundaries. The presented framework unifies voxel-based and regional approaches, and robustly describes volume changes at both the group-wise and subject-specific levels as a spatial process governed by consistently defined regions. Our experiments on the large cohorts of the ADNI dataset show that regional flux analysis is a powerful and flexible instrument for the study of Alzheimer's disease in a wide range of scenarios: cross-sectional deformation-based morphometry, longitudinal discovery and quantification of group-wise volume changes, and statistically powered and robust quantification of hippocampal and ventricular atrophy. Copyright © 2015 Elsevier Inc. All rights reserved.
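    By the divergence theorem, the flux of a deformation field across a region boundary equals the integral of its divergence over the region, so a crude deterministic analogue of the regional volume-change measure can be obtained by summing the divergence of a stationary velocity field inside a mask. The sketch below shows only that simplification; the paper's Helmholtz pressure maps and probabilistic boundary integration are not reproduced.

```python
import numpy as np

def regional_volume_change(velocity, mask, spacing=(1.0, 1.0, 1.0)):
    """Integral of div(v) over a binary mask, approximating the outward flux across its boundary.

    velocity: array of shape (3, nx, ny, nz), a stationary velocity field in physical units.
    """
    div = np.zeros(velocity.shape[1:])
    for axis in range(3):
        div += np.gradient(velocity[axis], spacing[axis], axis=axis)
    voxel_volume = float(np.prod(spacing))
    return np.sum(div[mask]) * voxel_volume

# Toy field: v = 0.01 * x is a uniform expansion with divergence 0.03 everywhere,
# so a 16^3-voxel region should report ~0.03 * 16**3 = 122.88.
nx = 32
coords = np.stack(np.meshgrid(*([np.arange(nx, dtype=float)] * 3), indexing="ij"))
velocity = 0.01 * coords
mask = np.zeros((nx, nx, nx), dtype=bool)
mask[8:24, 8:24, 8:24] = True
print(regional_volume_change(velocity, mask))
```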

  12. Characterization of the Marcellus Shale based on computer-assisted correlation of wireline logs in Virginia and West Virginia

    USGS Publications Warehouse

    Enomoto, Catherine B.; Olea, Ricardo A.; Coleman, James L.

    2014-01-01

    The Middle Devonian Marcellus Shale in the Appalachian basin extends from central Ohio on the west to eastern New York on the east, and from north-central New York on the north to northern Tennessee on the south. Its thickness ranges from 0 feet (ft), where it pinches out to the west, to as much as 700 ft in its eastern extent. Within the Broadtop synclinorium, the thickness of the Marcellus Shale ranges from 250 to 565 ft. Although stratigraphic complexities have been documented, a significant range in thickness is most likely due to tectonic thickening from folds and thrust faults. Outcrop studies in the Valley and Ridge and Appalachian Plateaus provinces illustrate the challenges of interpreting the relation of third-order faults, folds, and “disturbed” zones to the regional tectonic framework. Recent field work within the Valley and Ridge province determined that significant faulting and intraformational deformation are present within the Marcellus Shale at the outcrop scale. In an attempt to determine whether this scale of deformation is detectable with conventional wireline logs, petrophysical properties (primarily mineralogy and porosity) were estimated by interpretation of gamma-ray and bulk-density logs. The results of a statistical correlation of wireline logs from nine wells indicated that there are discontinuities within the Millboro Shale (undifferentiated Marcellus Shale and Mahantango Formation) where there are significant thickness differences between wells. Also, some intervals likely contain mineralogy that makes these zones more prone to layer-shortening cleavage duplexes. The Correlator program proved to be a useful tool in a region of contractional deformation.
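    The single-log relations typically used for this kind of interpretation are the linear gamma-ray shale-volume index and the density-porosity equation; the sketch below uses illustrative matrix/fluid constants, not values from the study.

```python
import numpy as np

def shale_volume(gr, gr_clean, gr_shale):
    """Linear gamma-ray index: Vsh = (GR - GR_clean) / (GR_shale - GR_clean), clipped to [0, 1]."""
    return np.clip((gr - gr_clean) / (gr_shale - gr_clean), 0.0, 1.0)

def density_porosity(rhob, rho_matrix=2.65, rho_fluid=1.0):
    """Density porosity: phi = (rho_matrix - rho_bulk) / (rho_matrix - rho_fluid), clipped to [0, 1]."""
    return np.clip((rho_matrix - rhob) / (rho_matrix - rho_fluid), 0.0, 1.0)

gr = np.array([35.0, 110.0, 180.0])     # gamma ray, API units (illustrative)
rhob = np.array([2.60, 2.50, 2.40])     # bulk density, g/cm^3 (illustrative)
print(shale_volume(gr, gr_clean=25.0, gr_shale=200.0))
print(density_porosity(rhob))
```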

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamlin, H.S.; Dutton, S.P.; Tyler, N.

    The Tirrawarra Sandstone contains 146 million bbl of oil in Tirrawarra field in the Cooper basin of South Australia. We used core, well logs, and petrophysical data to construct a depositional-facies-based flow-unit model of the reservoir, which describes rock properties and hydrocarbon saturations in three dimensions. Using the model to calculate volumes and residency of original and remaining oil in place, we identified an additional 36 million bbl of oil in place and improved understanding of past production patterns. The Tirrawarra Sandstone reservoir was deposited in a Carboniferous-Permian proglacial intracratonic setting and is composed of lacustrine and fluvial facies assemblages. The stratigraphic framework of these nonmarine facies is defined by distinctive stacking patterns and erosional unconformities. Mudstone-dominated zones that are analogous to marine maximum flooding surfaces bound the reservoir. At its base, a progradational lacustrine-delta system, composed of lenticular mud-clast-rich sandstones enclosed in mudstone, is truncated by an unconformity. Sandstones in these lower deltaic facies lost most of their porosity by mechanical compaction of ductile grains. Sediment reworking by channel migration and, locally, shore-zone processes created quartz-rich, multilateral sandstones, which retained the highest porosity and permeability of all the reservoir facies and contained most of the original oil in place. Braided-channel sandstones, however, are overlain by lenticular meandering-channel sandstones, which in turn grade upward into widespread mudstones and coals. Thus, the uppermost part of the reservoir displays a retrogradational stacking pattern and upward-decreasing reservoir quality. Our results demonstrate that depositional variables are the primary controls on reservoir quality and productivity in the Tirrawarra Sandstone.

  14. Dual-domain mass-transfer parameters from electrical hysteresis: theory and analytical approach applied to laboratory, synthetic streambed, and groundwater experiments

    USGS Publications Warehouse

    Briggs, Martin A.; Day-Lewis, Frederick D.; Ong, John B.; Harvey, Judson W.; Lane, John W.

    2014-01-01

    Models of dual-domain mass transfer (DDMT) are used to explain anomalous aquifer transport behavior such as the slow release of contamination and solute tracer tailing. Traditional tracer experiments to characterize DDMT are performed at the flow path scale (meters), which inherently incorporates heterogeneous exchange processes; hence, estimated “effective” parameters are sensitive to experimental design (i.e., duration and injection velocity). Recently, electrical geophysical methods have been used to aid in the inference of DDMT parameters because, unlike traditional fluid sampling, electrical methods can directly sense less-mobile solute dynamics and can target specific points along subsurface flow paths. Here we propose an analytical framework for graphical parameter inference based on a simple petrophysical model explaining the hysteretic relation between measurements of bulk and fluid conductivity arising in the presence of DDMT at the local scale. Analysis is graphical and involves visual inspection of hysteresis patterns to (1) determine the size of paired mobile and less-mobile porosities and (2) identify the exchange rate coefficient through simple curve fitting. We demonstrate the approach using laboratory column experimental data, synthetic streambed experimental data, and field tracer-test data. Results from the analytical approach compare favorably with results from calibration of numerical models and also independent measurements of mobile and less-mobile porosity. We show that localized electrical hysteresis patterns resulting from diffusive exchange are independent of injection velocity, indicating that repeatable parameters can be extracted under varied experimental designs, and these parameters represent the true intrinsic properties of specific volumes of porous media of aquifers and hyporheic zones.
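    A minimal synthetic illustration of why bulk versus fluid conductivity becomes hysteretic under DDMT: assume, as a simplification of the petrophysical models used in this literature, that bulk conductivity is a porosity-weighted sum of mobile and less-mobile pore-fluid conductivities, with first-order exchange between the two domains. During a tracer pulse the less-mobile conductivity lags the mobile one, opening a loop whose shape depends on the less-mobile porosity and on the exchange rate relative to the pulse duration. All constants are hypothetical.

```python
import numpy as np

def simulate_ddmt_hysteresis(theta_m=0.25, theta_im=0.10, alpha=0.05, dt=1.0, n=600):
    """Toy DDMT electrical hysteresis: bulk vs. mobile fluid conductivity during a tracer pulse.

    d(sigma_im)/dt = alpha * (sigma_m - sigma_im)
    sigma_bulk     = theta_m * sigma_m + theta_im * sigma_im
    All values are hypothetical and in arbitrary units.
    """
    t = np.arange(n) * dt
    sigma_m = 0.05 + 0.45 * np.exp(-(((t - 150.0) / 60.0) ** 2))   # mobile-domain tracer pulse
    sigma_im = np.empty(n)
    sigma_im[0] = sigma_m[0]
    for k in range(1, n):                                          # first-order mass transfer
        sigma_im[k] = sigma_im[k - 1] + dt * alpha * (sigma_m[k - 1] - sigma_im[k - 1])
    sigma_bulk = theta_m * sigma_m + theta_im * sigma_im
    return sigma_m, sigma_bulk

sigma_m, sigma_bulk = simulate_ddmt_hysteresis()
# Plotting sigma_bulk against sigma_m traces a hysteresis loop; the loop collapses to a line
# when theta_im -> 0 or when the exchange is either much faster or much slower than the pulse.
```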

  15. Hydrogeophysical investigations at Hidden Dam, Raymond, California

    USGS Publications Warehouse

    Minsley, Burke J.; Burton, Bethany L.; Ikard, Scott; Powers, Michael H.

    2011-01-01

    Self-potential and direct current resistivity surveys are carried out at the Hidden Dam site in Raymond, California to assess present-day seepage patterns and better understand the hydrogeologic mechanisms that likely influence seepage. Numerical modeling is utilized in conjunction with the geophysical measurements to predict variably-saturated flow through typical two-dimensional dam cross-sections as a function of reservoir elevation. Several different flow scenarios are investigated based on the known hydrogeology, as well as information about typical subsurface structures gained from the resistivity survey. The flow models are also used to simulate the bulk electrical resistivity in the subsurface under varying saturation conditions, as well as the self-potential response using petrophysical relationships and electrokinetic coupling equations. The self-potential survey consists of 512 measurements on the downstream area of the dam, and corroborates known seepage areas on the northwest side of the dam. Two direct-current resistivity profiles, each approximately 2,500 ft (762 m) long, indicate a broad sediment channel under the northwest side of the dam, which may be a significant seepage pathway through the foundation. A focusing of seepage in low-topography areas downstream of the dam is confirmed from the numerical flow simulations, which is also consistent with past observations. Little evidence of seepage is identified from the self-potential data on the southeast side of the dam, also consistent with historical records, though one possible area of focused seepage is identified near the outlet works. Integration of the geophysical surveys, numerical modeling, and observation well data provides a framework for better understanding seepage at the site through a combined hydrogeophysical approach.

  16. The deeper structure of the southern Dead Sea basin derived from neural network analysis of velocity and attenuation tomography

    NASA Astrophysics Data System (ADS)

    Braeuer, Benjamin; Haberland, Christian; Bauer, Klaus; Weber, Michael

    2014-05-01

    The Dead Sea basin (DSB) is a pull-apart basin at the Dead Sea transform fault, the boundary between the African and the Arabian plates. Though the DSB has been studied for a long time, the available knowledge - based mainly on surface geology, drilling and seismic reflection surveys - gives only a partial picture of its shallow structure. Therefore, within the framework of the international DESIRE (DEad Sea Integrated REsearch) project, a dense temporary local seismological network was operated in the southern Dead Sea area. Within 18 months of recording, 650 events were detected. In addition to an already published tomography study revealing the distribution of P velocities and Vp/Vs ratios, a 2D P-wave attenuation tomography (parameter Qp) was performed. The neural network technique of self-organizing maps (SOM) is used for the joint interpretation of these three parameters (Vp, Vp/Vs, Qp). The resulting clusters in the petrophysical parameter space are assigned to the main lithological units below the southern part of the Dead Sea basin: (1) the basin sediments, characterized by strong attenuation, high Vp/Vs ratios and low P velocities; (2) the pre-basin sediments, characterized by medium to strong attenuation, low Vp/Vs ratios and medium P velocities; (3) the basement, characterized by low to moderate attenuation, medium Vp/Vs ratios and high P velocities. Thus, the asymmetric southern Dead Sea basin is filled with basin sediments down to depths of 7 to 12 km. Below the basin sediments, the pre-basin sediments extend to a depth of between 13 and 18 km.

  17. Dual-domain mass-transfer parameters from electrical hysteresis: Theory and analytical approach applied to laboratory, synthetic streambed, and groundwater experiments

    NASA Astrophysics Data System (ADS)

    Briggs, Martin A.; Day-Lewis, Frederick D.; Ong, John B.; Harvey, Judson W.; Lane, John W.

    2014-10-01

    Models of dual-domain mass transfer (DDMT) are used to explain anomalous aquifer transport behavior such as the slow release of contamination and solute tracer tailing. Traditional tracer experiments to characterize DDMT are performed at the flow path scale (meters), which inherently incorporates heterogeneous exchange processes; hence, estimated "effective" parameters are sensitive to experimental design (i.e., duration and injection velocity). Recently, electrical geophysical methods have been used to aid in the inference of DDMT parameters because, unlike traditional fluid sampling, electrical methods can directly sense less-mobile solute dynamics and can target specific points along subsurface flow paths. Here we propose an analytical framework for graphical parameter inference based on a simple petrophysical model explaining the hysteretic relation between measurements of bulk and fluid conductivity arising in the presence of DDMT at the local scale. Analysis is graphical and involves visual inspection of hysteresis patterns to (1) determine the size of paired mobile and less-mobile porosities and (2) identify the exchange rate coefficient through simple curve fitting. We demonstrate the approach using laboratory column experimental data, synthetic streambed experimental data, and field tracer-test data. Results from the analytical approach compare favorably with results from calibration of numerical models and also independent measurements of mobile and less-mobile porosity. We show that localized electrical hysteresis patterns resulting from diffusive exchange are independent of injection velocity, indicating that repeatable parameters can be extracted under varied experimental designs, and these parameters represent the true intrinsic properties of specific volumes of porous media of aquifers and hyporheic zones.

  18. 3D Hydraulic tomography from joint inversion of the hydraulic heads and self-potential data. (Invited)

    NASA Astrophysics Data System (ADS)

    Jardani, A.; Soueid Ahmed, A.; Revil, A.; Dupont, J.

    2013-12-01

    Pumping tests are usually employed to estimate the hydraulic conductivity field from the inversion of head measurements. Nevertheless, the inverse problem is strongly underdetermined, and reliable imaging requires a considerable number of wells. We propose to add more information to the inversion of the heads by including (non-intrusive) streaming potential (SP) data. The SP signals correspond to perturbations in the local electrical field caused directly by the flow of the groundwater. They are obtained with a set of non-polarising electrodes installed at the ground surface. We developed a geostatistical method for the estimation of the hydraulic conductivity field from measurements of hydraulic heads and SP during pumping and injection experiments. We use the adjoint-state method and a recent petrophysical formulation of the streaming potential problem in which the streaming coupling coefficient is derived from the hydraulic conductivity, which reduces the number of unknown parameters. The geostatistical inverse framework is applied to three synthetic case studies with different numbers of wells and electrodes used to measure the hydraulic heads and the streaming potentials. To evaluate the benefit of incorporating the streaming potential data into the hydraulic data, we compared cases in which the two data types are or are not coupled to map the hydraulic conductivity. The results of the inversion revealed that a dense distribution of electrodes can be used to infer the heterogeneities in the hydraulic conductivity field. Incorporating the streaming potential information with the hydraulic head data improves the estimate of the hydraulic conductivity field, especially when the number of piezometers is limited.

  19. Controls on the quality of Miocene reservoirs, southern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Gutiérrez Paredes, Hilda Clarisa; Catuneanu, Octavian; Hernández Romano, Ulises

    2018-01-01

    An investigation was conducted to determine the main controls on the reservoir quality of the middle and upper Miocene sandstones in the southern Gulf of Mexico, based on core descriptions, thin-section petrography and petrophysical data, as well as to explore the possible link between the sequence stratigraphic framework, depositional facies and diagenetic alterations. The Miocene deep-marine sandstones are attributed to the falling-stage, lowstand, and transgressive systems tracts. The middle Miocene falling-stage systems tract includes medium- to very fine-grained, structureless sandstones deposited in channels and frontal splays, and muddy sandstones deposited in lobes of debrites. The lowstand and transgressive systems tracts consist of medium- to very fine-grained, massive and normally graded sandstones deposited in channel systems within frontal splay complexes. The upper Miocene falling-stage systems tract includes medium- to coarse-grained, structureless sandstones deposited in channel systems and frontal splays, as well as lobes of debrites formed by grain flows and hybrid-flow deposits. The lowstand and transgressive systems tracts include fine-grained sandstones deposited in overbank settings. The results reveal that the depositional elements with the best reservoir quality are the frontal splays deposited during the falling-stage systems tracts. The reservoir quality of the Miocene sandstones was controlled by a combination of depositional facies, sand composition and diagenetic factors (mainly compaction and calcite cementation). Sandstone texture, controlled primarily by depositional facies, appears more important than sandstone composition in determining reservoir quality, and compaction was more important than cementation in porosity destruction. Compaction stopped when complete calcite cementation occurred.

  20. Adaptive Quantification and Longitudinal Analysis of Pulmonary Emphysema with a Hidden Markov Measure Field Model

    PubMed Central

    Häme, Yrjö; Angelini, Elsa D.; Hoffman, Eric A.; Barr, R. Graham; Laine, Andrew F.

    2014-01-01

    The extent of pulmonary emphysema is commonly estimated from CT images by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions in the lung and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the present model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was used to quantify emphysema on a cohort of 87 subjects, with repeated CT scans acquired over a time period of 8 years using different imaging protocols. The scans were acquired approximately annually, and the data set included a total of 365 scans. The results show that the emphysema estimates produced by the proposed method have very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. In addition, the generated emphysema delineations promise great advantages for regional analysis of emphysema extent and progression, possibly advancing disease subtyping. PMID:24759984
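    The standard approach the paper improves upon, the percentage of low-attenuation area (fraction of lung voxels below a threshold such as -950 HU), is simple to state; the sketch below shows only that baseline metric, not the hidden Markov measure field model.

```python
import numpy as np

def percent_laa(ct_hu, lung_mask, threshold=-950.0):
    """Percentage of lung voxels with CT attenuation below `threshold` HU (classic emphysema index)."""
    return 100.0 * np.mean(ct_hu[lung_mask] < threshold)

# Synthetic example: a toy "lung" whose intensities put roughly 5% of voxels below -950 HU.
rng = np.random.default_rng(1)
ct = rng.normal(-850.0, 60.0, size=(64, 64, 64))
mask = np.ones(ct.shape, dtype=bool)
print(percent_laa(ct, mask))
```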

  1. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.

  2. Coupled Hydrogeophysical Inversion and Hydrogeological Data Fusion

    NASA Astrophysics Data System (ADS)

    Cirpka, O. A.; Schwede, R. L.; Li, W.

    2012-12-01

    Tomographic geophysical monitoring methods give the opportunity to observe hydrogeological tests at higher spatial resolution than is possible with classical hydraulic monitoring tools. This has been demonstrated in a substantial number of studies in which electrical resistivity tomography (ERT) has been used to monitor salt-tracer experiments. It is now accepted that inversion of such data sets requires a fully coupled framework, explicitly accounting for the hydraulic processes (groundwater flow and solute transport), the relationship between solute and geophysical properties (a petrophysical relationship such as Archie's law), and the governing equations of the geophysical surveying techniques (e.g., the Poisson equation) as a consistent coupled system. These data sets can be amended with data from other - more direct - hydrogeological tests to infer the distribution of hydraulic aquifer parameters. In the inversion framework, meaningful condensation of data not only contributes to inversion efficiency but also increases the stability of the inversion. In particular, transient concentration data themselves only weakly depend on hydraulic conductivity, and model improvement using gradient-based methods is only possible when a substantial agreement between measurements and model output already exists. The latter also holds when concentrations are monitored by ERT. Tracer arrival times, by contrast, show high sensitivity and a more monotonic dependence on hydraulic conductivity than concentrations themselves. Thus, even without using temporal-moment generating equations, inverting travel times rather than concentrations or related geoelectrical signals themselves is advantageous. We have applied this approach to concentrations measured directly or via ERT, and to heat-tracer data. We present a consistent inversion framework including temporal moments of concentrations, geoelectrical signals obtained during salt-tracer tests, drawdown data from hydraulic tomography and flowmeter measurements to identify mainly the hydraulic-conductivity distribution. By stating the inversion as a geostatistical conditioning problem, we obtain parameter sets together with their correlated uncertainty. While we have applied the quasi-linear geostatistical approach as the inverse kernel, other methods - such as ensemble Kalman methods - may suit the same purpose, particularly when many data points are to be included. In order to identify 3-D fields, discretized by about 50 million grid points, we use the high-performance-computing framework DUNE to solve the involved partial differential equations on a midrange computer cluster. We have quantified the worth of different data types in these inference problems. In practical applications, the constitutive relationships between geophysical, thermal, and hydraulic properties can pose a problem, requiring additional inversion. However, poorly constrained transient boundary conditions may call inversion efforts at larger (e.g., regional) scales even more into question. We envision that future hydrogeophysical inversion efforts will target boundary conditions, such as groundwater recharge rates, in conjunction with - or instead of - aquifer parameters. By this, the distinction between data assimilation and parameter estimation will gradually vanish.
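    The petrophysical link mentioned above (e.g., Archie's law) maps solute concentration, via fluid conductivity, to the bulk electrical conductivity sensed by ERT. A minimal sketch of that mapping is below; the linear concentration-to-fluid-conductivity calibration and all constants are assumptions.

```python
def bulk_conductivity(concentration, porosity, m=1.8, sigma_background=0.05, k_conc=0.2):
    """Archie-type mapping from solute concentration to saturated bulk conductivity (S/m).

    sigma_fluid = sigma_background + k_conc * concentration   (assumed linear calibration)
    sigma_bulk  = sigma_fluid * porosity**m                    (Archie's law, full saturation)
    All constants are illustrative only.
    """
    sigma_fluid = sigma_background + k_conc * concentration
    return sigma_fluid * porosity ** m

# Example: a 1 g/L salt tracer in a 25%-porosity aquifer.
print(bulk_conductivity(concentration=1.0, porosity=0.25))
```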

  3. Demand Response Resource Quantification with Detailed Building Energy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Horsey, Henry; Merket, Noel

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult; we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify the likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  4. Dynamic whole-body PET parametric imaging: I. Concept, acquisition protocol optimization and clinical application.

    PubMed

    Karakatsanis, Nicolas A; Lodge, Martin A; Tahari, Abdel K; Zhou, Y; Wahl, Richard L; Rahmim, Arman

    2013-10-21

    Static whole-body PET/CT, employing the standardized uptake value (SUV), is considered the standard clinical approach to diagnosis and treatment response monitoring for a wide range of oncologic malignancies. Alternative PET protocols involving dynamic acquisition of temporal images have been implemented in the research setting, allowing quantification of tracer dynamics, an important capability for tumor characterization and treatment response monitoring. Nonetheless, dynamic protocols have been confined to single-bed-coverage limiting the axial field-of-view to ~15-20 cm, and have not been translated to the routine clinical context of whole-body PET imaging for the inspection of disseminated disease. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. We investigate solutions to address the challenges of: (i) long acquisitions, (ii) small number of dynamic frames per bed, and (iii) non-invasive quantification of kinetics in the plasma. In the present study, a novel dynamic (4D) whole-body PET acquisition protocol of ~45 min total length is presented, composed of (i) an initial 6 min dynamic PET scan (24 frames) over the heart, followed by (ii) a sequence of multi-pass multi-bed PET scans (six passes × seven bed positions, each scanned for 45 s). Standard Patlak linear graphical analysis modeling was employed, coupled with image-derived plasma input function measurements. Ordinary least squares Patlak estimation was used as the baseline regression method to quantify the physiological parameters of tracer uptake rate Ki and total blood distribution volume V on an individual voxel basis. Extensive Monte Carlo simulation studies, using a wide set of published kinetic FDG parameters and GATE and XCAT platforms, were conducted to optimize the acquisition protocol from a range of ten different clinically acceptable sampling schedules examined. The framework was also applied to six FDG PET patient studies, demonstrating clinical feasibility. Both simulated and clinical results indicated enhanced contrast-to-noise ratios (CNRs) for Ki images in tumor regions with notable background FDG concentration, such as the liver, where SUV performed relatively poorly. Overall, the proposed framework enables enhanced quantification of physiological parameters across the whole body. In addition, the total acquisition length can be reduced from 45 to ~35 min and still achieve improved or equivalent CNR compared to SUV, provided the true Ki contrast is sufficiently high. In the follow-up companion paper, a set of advanced linear regression schemes is presented to particularly address the presence of noise, and attempt to achieve a better trade-off between the mean-squared error and the CNR metrics, resulting in enhanced task-based imaging.
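
    The ordinary least squares Patlak step can be illustrated with a short, self-contained sketch. The time-activity curves below are synthetic and constructed to obey the Patlak model exactly; the frame times, plasma input function, and crude running integral are illustrative assumptions, not the acquisition settings of the protocol described above.

    ```python
    import numpy as np

    # Under the Patlak model, C(t)/Cp(t) = Ki * (integral of Cp up to t)/Cp(t) + V
    # for times after tracer equilibration, so Ki and V follow from a line fit.
    t = np.arange(10.0, 45.0, 5.0)                  # frame mid-times (min), post-equilibration
    cp = 50.0 * np.exp(-0.05 * t) + 5.0             # synthetic plasma input function
    int_cp = np.cumsum(cp) * 5.0                    # crude running integral of Cp (5 min frames)
    ki_true, v_true = 0.02, 0.6
    ct = ki_true * int_cp + v_true * cp             # synthetic tissue curve obeying the model

    x = int_cp / cp                                 # Patlak abscissa ("normalized time")
    y = ct / cp                                     # Patlak ordinate
    A = np.column_stack([x, np.ones_like(x)])
    (ki_hat, v_hat), *_ = np.linalg.lstsq(A, y, rcond=None)
    print(f"Ki = {ki_hat:.4f} /min, V = {v_hat:.3f}")
    ```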

  5. Dynamic whole body PET parametric imaging: I. Concept, acquisition protocol optimization and clinical application

    PubMed Central

    Karakatsanis, Nicolas A.; Lodge, Martin A.; Tahari, Abdel K.; Zhou, Y.; Wahl, Richard L.; Rahmim, Arman

    2013-01-01

    Static whole body PET/CT, employing the standardized uptake value (SUV), is considered the standard clinical approach to diagnosis and treatment response monitoring for a wide range of oncologic malignancies. Alternative PET protocols involving dynamic acquisition of temporal images have been implemented in the research setting, allowing quantification of tracer dynamics, an important capability for tumor characterization and treatment response monitoring. Nonetheless, dynamic protocols have been confined to single bed-coverage limiting the axial field-of-view to ~15–20 cm, and have not been translated to the routine clinical context of whole-body PET imaging for the inspection of disseminated disease. Here, we pursue a transition to dynamic whole body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. We investigate solutions to address the challenges of: (i) long acquisitions, (ii) small number of dynamic frames per bed, and (iii) non-invasive quantification of kinetics in the plasma. In the present study, a novel dynamic (4D) whole body PET acquisition protocol of ~45 min total length is presented, composed of (i) an initial 6-min dynamic PET scan (24 frames) over the heart, followed by (ii) a sequence of multi-pass multi-bed PET scans (6 passes × 7 bed positions, each scanned for 45 s). Standard Patlak linear graphical analysis modeling was employed, coupled with image-derived plasma input function measurements. Ordinary least squares (OLS) Patlak estimation was used as the baseline regression method to quantify the physiological parameters of tracer uptake rate Ki and total blood distribution volume V on an individual voxel basis. Extensive Monte Carlo simulation studies, using a wide set of published kinetic FDG parameters and GATE and XCAT platforms, were conducted to optimize the acquisition protocol from a range of 10 different clinically acceptable sampling schedules examined. The framework was also applied to six FDG PET patient studies, demonstrating clinical feasibility. Both simulated and clinical results indicated enhanced contrast-to-noise ratios (CNRs) for Ki images in tumor regions with notable background FDG concentration, such as the liver, where SUV performed relatively poorly. Overall, the proposed framework enables enhanced quantification of physiological parameters across the whole body. In addition, the total acquisition length can be reduced from 45 min to ~35 min and still achieve improved or equivalent CNR compared to SUV, provided the true Ki contrast is sufficiently high. In the follow-up companion paper, a set of advanced linear regression schemes is presented to particularly address the presence of noise, and attempt to achieve a better trade-off between the mean-squared error (MSE) and the CNR metrics, resulting in enhanced task-based imaging. PMID:24080962

  6. Dynamic whole-body PET parametric imaging: I. Concept, acquisition protocol optimization and clinical application

    NASA Astrophysics Data System (ADS)

    Karakatsanis, Nicolas A.; Lodge, Martin A.; Tahari, Abdel K.; Zhou, Y.; Wahl, Richard L.; Rahmim, Arman

    2013-10-01

    Static whole-body PET/CT, employing the standardized uptake value (SUV), is considered the standard clinical approach to diagnosis and treatment response monitoring for a wide range of oncologic malignancies. Alternative PET protocols involving dynamic acquisition of temporal images have been implemented in the research setting, allowing quantification of tracer dynamics, an important capability for tumor characterization and treatment response monitoring. Nonetheless, dynamic protocols have been confined to single-bed-coverage limiting the axial field-of-view to ˜15-20 cm, and have not been translated to the routine clinical context of whole-body PET imaging for the inspection of disseminated disease. Here, we pursue a transition to dynamic whole-body PET parametric imaging, by presenting, within a unified framework, clinically feasible multi-bed dynamic PET acquisition protocols and parametric imaging methods. We investigate solutions to address the challenges of: (i) long acquisitions, (ii) small number of dynamic frames per bed, and (iii) non-invasive quantification of kinetics in the plasma. In the present study, a novel dynamic (4D) whole-body PET acquisition protocol of ˜45 min total length is presented, composed of (i) an initial 6 min dynamic PET scan (24 frames) over the heart, followed by (ii) a sequence of multi-pass multi-bed PET scans (six passes × seven bed positions, each scanned for 45 s). Standard Patlak linear graphical analysis modeling was employed, coupled with image-derived plasma input function measurements. Ordinary least squares Patlak estimation was used as the baseline regression method to quantify the physiological parameters of tracer uptake rate Ki and total blood distribution volume V on an individual voxel basis. Extensive Monte Carlo simulation studies, using a wide set of published kinetic FDG parameters and GATE and XCAT platforms, were conducted to optimize the acquisition protocol from a range of ten different clinically acceptable sampling schedules examined. The framework was also applied to six FDG PET patient studies, demonstrating clinical feasibility. Both simulated and clinical results indicated enhanced contrast-to-noise ratios (CNRs) for Ki images in tumor regions with notable background FDG concentration, such as the liver, where SUV performed relatively poorly. Overall, the proposed framework enables enhanced quantification of physiological parameters across the whole body. In addition, the total acquisition length can be reduced from 45 to ˜35 min and still achieve improved or equivalent CNR compared to SUV, provided the true Ki contrast is sufficiently high. In the follow-up companion paper, a set of advanced linear regression schemes is presented to particularly address the presence of noise, and attempt to achieve a better trade-off between the mean-squared error and the CNR metrics, resulting in enhanced task-based imaging.

  7. A case study on changes of petrophysical properties of Werkendam well-cores due to interaction with supercritical carbon dioxide

    NASA Astrophysics Data System (ADS)

    Nover, Georg; Hbib, Nasser; Mansfeld, Arne

    2017-04-01

    Changes of porosity, permeability, electrical conductivity and E-modulus were studied on sandstones from the Werkendam drillings WED2 (CO2-free) and WED3 (CO2-rich) (The Netherlands). WED2 and WED3 are separated by a fault. Porosities of the untreated samples range from <0.3% up to 16.5%, and permeabilities from <0.01 mD up to >160 mD. No significant differences between samples from the WED2 and WED3 wells were detected. The petrophysical properties of the whole set of samples were measured prior to any experiment; then, in total, 8 samples from WED2 and WED3 were selected for the subsequent experiments with supercritical CO2 (scCO2). These were performed at pressures of 10-12 MPa and temperatures ranging from 100 up to 120°C. The pores were partially saturated with brine (0.1 M NaCl). In a first step the autoclave experiments lasted about 45 days and were then extended in a second series up to 120 days total reaction time. An increase in porosity, permeability and electrical conductivity was measured after each experimental series with scCO2. Two of the samples failed along fractures due to dissolution, thereby causing a loss of stability. The frequency-dependent complex conductivity was measured in the frequency range 10⁻³ Hz up to 45 kHz, thus giving access to fluid/solid interactions at the inner surface of the pores. In a final sequence the uniaxial compressive strength and E-modulus were measured on untreated and processed samples. Thus we could estimate the weakening of mechanical stability caused by the scCO2 treatment.

  8. Gravimetric water distribution assessment from geoelectrical methods (ERT and EMI) in municipal solid waste landfill.

    PubMed

    Dumont, Gaël; Pilawski, Tamara; Dzaomuho-Lenieregue, Phidias; Hiligsmann, Serge; Delvigne, Frank; Thonart, Philippe; Robert, Tanguy; Nguyen, Frédéric; Hermans, Thomas

    2016-09-01

    The gravimetric water content of the waste material is a key parameter in waste biodegradation. Previous studies suggest a correlation between changes in water content and modification of electrical resistivity. This study, based on field work in Mont-Saint-Guibert landfill (Belgium), aimed, on one hand, at characterizing the relationship between gravimetric water content and electrical resistivity and on the other hand, at assessing geoelectrical methods as tools to characterize the gravimetric water distribution in a landfill. Using excavated waste samples obtained after drilling, we investigated the influences of the temperature, the liquid phase conductivity, the compaction and the water content on the electrical resistivity. Our results demonstrate that Archie's law and Campbell's law accurately describe these relationships in municipal solid waste (MSW). Next, we conducted a geophysical survey in situ using two techniques: borehole electromagnetics (EM) and electrical resistivity tomography (ERT). First, in order to validate the use of EM, EM values obtained in situ were compared to electrical resistivity of excavated waste samples from corresponding depths. The petrophysical laws were used to account for the change of environmental parameters (temperature and compaction). A rather good correlation was obtained between direct measurement on waste samples and borehole electromagnetic data. Second, ERT and EM were used to acquire a spatial distribution of the electrical resistivity. Then, using the petrophysical laws, this information was used to estimate the water content distribution. In summary, our results demonstrate that geoelectrical methods represent a pertinent approach to characterize spatial distribution of water content in municipal landfills when properly interpreted using ground truth data. These methods might therefore prove to be valuable tools in waste biodegradation optimization projects. Copyright © 2016 Elsevier Ltd. All rights reserved.
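
    A minimal sketch of how such petrophysical laws turn an inverted resistivity into a gravimetric water content is given below, assuming an Archie-type relation between bulk resistivity and volumetric water content followed by a simple density conversion; the exponent, leachate resistivity, and waste densities are illustrative values, not the parameters calibrated in this study.

    ```python
    import numpy as np

    rho_bulk = np.array([8.0, 12.0, 20.0])   # inverted bulk resistivities (ohm.m)
    rho_fluid = 0.5                          # leachate resistivity (ohm.m), assumed
    m = 2.0                                  # Archie exponent, assumed
    dry_density = 0.8                        # dry bulk density of waste (t/m3), assumed
    water_density = 1.0                      # t/m3

    theta = (rho_fluid / rho_bulk) ** (1.0 / m)    # volumetric water content (Archie form)
    w_grav = theta * water_density / dry_density   # gravimetric water content (dry basis)
    print(np.round(theta, 3), np.round(w_grav, 3))
    ```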

  9. Time-Lapse Electrical Geophysical Monitoring of Amendment-Based Biostimulation.

    PubMed

    Johnson, Timothy C; Versteeg, Roelof J; Day-Lewis, Frederick D; Major, William; Lane, John W

    2015-01-01

    Biostimulation is increasingly used to accelerate microbial remediation of recalcitrant groundwater contaminants. Effective application of biostimulation requires successful emplacement of amendment in the contaminant target zone. Verification of remediation performance requires postemplacement assessment and contaminant monitoring. Sampling-based approaches are expensive and provide low-density spatial and temporal information. Time-lapse electrical resistivity tomography (ERT) is an effective geophysical method for determining temporal changes in subsurface electrical conductivity. Because remedial amendments and biostimulation-related biogeochemical processes often change subsurface electrical conductivity, ERT can complement and enhance sampling-based approaches for assessing emplacement and monitoring biostimulation-based remediation. Field studies demonstrating the ability of time-lapse ERT to monitor amendment emplacement and behavior were performed during a biostimulation remediation effort conducted at the Department of Defense Reutilization and Marketing Office (DRMO) Yard, in Brandywine, Maryland, United States. Geochemical fluid sampling was used to calibrate a petrophysical relation in order to predict groundwater indicators of amendment distribution. The petrophysical relations were field validated by comparing predictions to sequestered fluid sample results, thus demonstrating the potential of electrical geophysics for quantitative assessment of amendment-related geochemical properties. Crosshole radar zero-offset profile and borehole geophysical logging were also performed to augment the data set and validate interpretation. In addition to delineating amendment transport in the first 10 months after emplacement, the time-lapse ERT results show later changes in bulk electrical properties interpreted as mineral precipitation. Results support the use of more cost-effective surface-based ERT in conjunction with limited field sampling to improve spatial and temporal monitoring of amendment emplacement and remediation performance. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  10. CO2/ brine substitution experiments at simulated reservoir conditions

    NASA Astrophysics Data System (ADS)

    Kummerow, Juliane; Spangenberg, Erik

    2015-04-01

    Capillary properties of rocks affect the mobility of fluids in a reservoir. Therefore, understanding the capillary pressure behaviour is essential to assess the long-term behaviour of CO2 reservoirs. Beyond this, a calibration of the petrophysical properties against the water saturation of reservoir rocks at simulated in situ conditions is crucial for a proper interpretation of field monitoring data. We present a set-up which allows for the combined measurement of capillary pressure, electrical resistivity, and elastic wave velocities under controlled reservoir conditions (p_conf = 400 bar, p_pore = 180 bar, T = 65 °C) at different brine-CO2 saturations. The capillary properties of the samples are measured using the micropore membrane technique. The sample is jacketed with a Viton tube (thickness = 4 mm) and placed between two current electrode endcaps, which also contain pore fluid ports and ultrasonic P- and S-wave transducers. Between the sample and the lower endcap, the hydrophilic semi-permeable micro-pore membrane (pore size = 100 nm) is integrated. It is embedded in filter papers to establish a good capillary contact and to protect the highly sensitive membrane against mechanical damage under load. Two high-precision syringe pumps are used to displace a quantified volume of brine by CO2 and to determine the corresponding sample saturation. The fluid displacement induces a pressure gradient along the sample, which corresponds to the capillary pressure at a particular sample saturation. It is measured with a differential pressure sensor in the range of 0-0.2 MPa. Drainage and imbibition cycles are performed to provide information on the efficiency of capillary trapping and to obtain a calibration of the petrophysical parameters of the sample.
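
    The saturation bookkeeping performed with the two syringe pumps amounts to a simple volume balance, sketched below with illustrative numbers (the pore volume and displaced brine volume are assumed, not measured values from this set-up).

    ```python
    # Convert the brine volume displaced by CO2 into phase saturations.
    pore_volume_ml = 12.5            # sample pore volume from porosity, assumed
    displaced_brine_ml = 4.0         # cumulative brine volume collected at the outlet pump
    s_co2 = displaced_brine_ml / pore_volume_ml
    s_w = 1.0 - s_co2
    print(f"S_CO2 = {s_co2:.2f}, S_w = {s_w:.2f}")
    ```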

  11. Petrophysical, Geochemical, and Hydrological Evidence for Extensive Fracture-Mediated Fluid and Heat Transport in the Alpine Fault's Hanging-Wall Damage Zone

    NASA Astrophysics Data System (ADS)

    Townend, John; Sutherland, Rupert; Toy, Virginia G.; Doan, Mai-Linh; Célérier, Bernard; Massiot, Cécile; Coussens, Jamie; Jeppson, Tamara; Janku-Capova, Lucie; Remaud, Léa.; Upton, Phaedra; Schmitt, Douglas R.; Pezard, Philippe; Williams, Jack; Allen, Michael John; Baratin, Laura-May; Barth, Nicolas; Becroft, Leeza; Boese, Carolin M.; Boulton, Carolyn; Broderick, Neil; Carpenter, Brett; Chamberlain, Calum J.; Cooper, Alan; Coutts, Ashley; Cox, Simon C.; Craw, Lisa; Eccles, Jennifer D.; Faulkner, Dan; Grieve, Jason; Grochowski, Julia; Gulley, Anton; Hartog, Arthur; Henry, Gilles; Howarth, Jamie; Jacobs, Katrina; Kato, Naoki; Keys, Steven; Kirilova, Martina; Kometani, Yusuke; Langridge, Rob; Lin, Weiren; Little, Tim; Lukacs, Adrienn; Mallyon, Deirdre; Mariani, Elisabetta; Mathewson, Loren; Melosh, Ben; Menzies, Catriona; Moore, Jo; Morales, Luis; Mori, Hiroshi; Niemeijer, André; Nishikawa, Osamu; Nitsch, Olivier; Paris, Jehanne; Prior, David J.; Sauer, Katrina; Savage, Martha K.; Schleicher, Anja; Shigematsu, Norio; Taylor-Offord, Sam; Teagle, Damon; Tobin, Harold; Valdez, Robert; Weaver, Konrad; Wiersberg, Thomas; Zimmer, Martin

    2017-12-01

    Fault rock assemblages reflect interaction between deformation, stress, temperature, fluid, and chemical regimes on distinct spatial and temporal scales at various positions in the crust. Here we interpret measurements made in the hanging-wall of the Alpine Fault during the second stage of the Deep Fault Drilling Project (DFDP-2). We present observational evidence for extensive fracturing and high hanging-wall hydraulic conductivity (~10⁻⁹ to 10⁻⁷ m/s, corresponding to permeability of ~10⁻¹⁶ to 10⁻¹⁴ m²) extending several hundred meters from the fault's principal slip zone. Mud losses, gas chemistry anomalies, and petrophysical data indicate that a subset of fractures intersected by the borehole are capable of transmitting fluid volumes of several cubic meters on time scales of hours. DFDP-2 observations and other data suggest that this hydrogeologically active portion of the fault zone in the hanging-wall is several kilometers wide in the uppermost crust. This finding is consistent with numerical models of earthquake rupture and off-fault damage. We conclude that the mechanically and hydrogeologically active part of the Alpine Fault is a more dynamic and extensive feature than commonly described in models based on exhumed faults. We propose that the hydrogeologically active damage zone of the Alpine Fault and other large active faults in areas of high topographic relief can be subdivided into an inner zone in which damage is controlled principally by earthquake rupture processes and an outer zone in which damage reflects coseismic shaking, strain accumulation and release on interseismic timescales, and inherited fracturing related to exhumation.
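
    The conductivity-to-permeability conversion quoted above can be checked with the standard relation k = K·mu/(rho·g); the snippet below uses nominal water properties (assumed here, since the in situ fluid properties are not given) and reproduces the stated order of magnitude.

    ```python
    # Hydraulic conductivity K (m/s) to intrinsic permeability k (m^2).
    mu, rho, g = 1.0e-3, 1000.0, 9.81            # Pa.s, kg/m3, m/s2 (nominal water values)
    for K in (1e-9, 1e-7):
        k = K * mu / (rho * g)
        print(f"K = {K:.0e} m/s  ->  k = {k:.1e} m^2")
    # prints ~1.0e-16 and ~1.0e-14 m^2, consistent with the range given above.
    ```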

  12. Volcanic settings and their reservoir potential: An outcrop analog study on the Miocene Tepoztlán Formation, Central Mexico

    NASA Astrophysics Data System (ADS)

    Lenhardt, Nils; Götz, Annette E.

    2011-07-01

    The reservoir potential of volcanic and associated sedimentary rocks is less well documented with regard to groundwater resources and oil and gas storage than that of siliciclastic and carbonate systems. Outcrop analog studies within a volcanic setting make it possible to identify spatio-temporal architectural elements and geometric features of different rock units and their petrophysical properties, such as porosity and permeability, which provide important information for reservoir characterization. Despite the wide distribution of volcanic rocks in Mexico, their reservoir potential has been little studied in the past. In the Valley of Mexico, situated 4000 m above the Neogene volcanic rocks, groundwater is a matter of major importance, as more than 20 million people and 42% of the industrial capacity of the Mexican nation depend on it for most of their water supply. Here, we present porosity and permeability data of 108 rock samples representing five different lithofacies types of the Miocene Tepoztlán Formation. This 800 m thick formation mainly consists of pyroclastic rocks, mass flow and fluvial deposits and is part of the southern Transmexican Volcanic Belt, cropping out south of the Valley of Mexico and within the two states of Morelos and Mexico State. Porosities range from 1.4% to 56.7%; the average porosity is 24.8%. Generally, permeabilities are low to moderate (0.2-933.3 mD) with an average permeability of 88.5 mD. The lavas are characterized by the highest porosity values, followed by tuffs, conglomerates, sandstones and tuffaceous breccias. In contrast, the highest permeabilities are found in the conglomerates, followed by tuffs, tuffaceous breccias, sandstones and lavas. Knowledge of these petrophysical rock properties provides important information on the reservoir potential of volcanic settings to be integrated into 3D subsurface models.

  13. Paleomagnetism and petrophysics of the Jänisjärvi impact structure, Russian Karelia

    NASA Astrophysics Data System (ADS)

    Salminen, J.; Donadini, F.; Pesonen, L. J.; Masaitis, V. L.; Naumov, M. V.

    Paleomagnetic, rock magnetic, and petrophysical results are presented for impactites and target rocks from the Lake Jänisjärvi impact structure, Russian Karelia. The impactites (tagamites, suevites, and lithic breccias) are characterized by increased porosity and magnetization, which is in agreement with observations performed at other impact structures. Thermomagnetic, hysteresis, and scanning electron microscope (SEM) analyses document the presence of primary multidomain titanomagnetite with additional secondary titanomaghemite and ilmenohematite. The characteristic impact-related remanent magnetization (ChRM) direction (D = 101.5°, I = 73.1°, α95 = 6.2°) yields a pole (Lat. = 45.0°N, Long. = 76.9°E, dp = 9.9°, dm = 11.0°). Additionally, the same component is observed as an overprint on some rocks located in the vicinity of the structure, which provides proof of its primary origin. An attempt was made to determine the ancient geomagnetic field intensity. Seven reliable results were obtained, yielding an ancient intensity of 68.7 ± 7.6 μT (corresponding to a VDM of 10.3 ± 1.1 × 10²² Am²). The intensity, however, appears to be biased toward high values, mainly because of the concave shape of the Arai diagrams. The new paleomagnetic data and published isotopic ages for the structure are in disagreement. According to the well-defined paleomagnetic data, two possible ages for the magnetization of the Jänisjärvi rocks exist: (1) a Late Sveconorwegian age (900-850 Myr) or (2) a Late Cambrian age (~500 Myr). However, published isotopic ages are 718 ± 5 Myr (K-Ar) and 698 ± 22 Myr (³⁹Ar-⁴⁰Ar), although such isotopic dating methods are often ambiguous for impactites.

  14. Cross-well 4-D resistivity tomography localizes the oil-water encroachment front during water flooding

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Revil, A.

    2015-04-01

    The early detection of the oil-water encroachment front is of prime interest during the water flooding of an oil reservoir, both to maximize oil production and to prevent the encroachment front from coming too close to production wells. We propose a new 4-D inversion approach, based on the Gauss-Newton method, to invert cross-well resistance data. The goal of this study is to image the position of the oil-water encroachment front in a heterogeneous clayey sand reservoir. The approach is based on explicitly connecting the change of resistivity to the petrophysical properties controlling the position of the front (porosity and permeability) and to the saturation of the water phase, through a petrophysical resistivity model accounting for bulk and surface conductivity contributions and saturation. The distributions of permeability and porosity are also inverted using the time-lapse resistivity data in order to better reconstruct the position of the oil-water encroachment front. In our synthetic test case, we recover a better position of the front, with porosity and permeability inferences near the flow trajectory and close to the wells obtained as by-products. The numerical simulations show that the position of the front is recovered well, but the distribution of the recovered porosity and permeability is only fair. A comparison with a commercial code based on a classical Gauss-Newton approach, with no information provided by the two-phase flow model, fails to recover the position of the front. The new approach could be used for the time-lapse monitoring of various processes in both geothermal fields and oil and gas reservoirs using a combination of geophysical methods.
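
    A resistivity model with bulk and surface conductivity contributions, of the general kind invoked above, can be sketched as follows; the functional form, porosity, exponents, and brine and surface conductivities are illustrative simplifications rather than the exact model and parameters of the paper.

    ```python
    import numpy as np

    def bulk_conductivity(phi, sw, sigma_w, sigma_s, m=2.0, n=2.0):
        # Archie-type bulk term plus a simple surface-conductivity term.
        F = phi ** (-m)                               # formation factor
        return (sw ** n) * sigma_w / F + (sw ** (n - 1.0)) * sigma_s

    phi = 0.25                      # porosity
    sigma_w = 5.0                   # brine conductivity (S/m), assumed
    sigma_s = 0.05                  # effective surface conductivity (S/m), assumed (clayey sand)
    for sw in (1.0, 0.7, 0.4):      # water saturation behind / ahead of the front
        sigma = bulk_conductivity(phi, sw, sigma_w, sigma_s)
        print(f"Sw = {sw:.1f}: sigma = {sigma:.3f} S/m, resistivity = {1.0/sigma:.1f} ohm.m")
    ```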

  15. Geometries of geoelectrical structures in central Tibetan Plateau from INDEPTH magnetotelluric data

    NASA Astrophysics Data System (ADS)

    Vozar, J.; Jones, A. G.; Le Pape, F.

    2012-12-01

    Magnetotelluric (MT) data collected on N-S profiles crossing the Banggong-Nujiang Suture (BNS), which separates the Qiangtang and Lhasa Terranes in central Tibet, as part of the InterNational DEep Profiling of Tibet and the Himalaya (INDEPTH) project, are modeled with 2D and 3D inversion codes and the 1D petrophysical package LitMod. The modeling reveals regional resistive and conductive structures correlated with the ShuangHu Suture, the Tanggula Mountains, and strike-slip faults such as the BengCo-Jiali fault in the south. The BNS is not manifested in the geoelectrical models as a strong crustal regional structure. The strike direction azimuth of mid and lower crustal structures estimated from horizontal slices of the 3D model (N110°E) is slightly different from that estimated by 2D strike analysis (N100°E). The orientation of crustal structures is perpendicular to the convergence direction in this area. The deepest lower crustal conductors are correlated with areas of maximum Moho depth obtained from satellite gravity data. The anisotropic 2D modeling reveals that the lower crustal conductor in the Lhasa Terrane is anisotropic. This anisotropy can be interpreted as evidence for crustal channel flow below the Lhasa Terrane. However, the same Lhasa lower crustal conductor obtained from isotropic 3D modeling is more plausibly interpreted as a 3D lower Indian crust structure located east of line 500 than as geoelectrically anisotropic crustal flow. From deep electromagnetic sounding, supported by an independent integrated petrophysical investigation, we estimate an upper-mantle conductive layer at depths of 200-250 km below the Lhasa Terrane and a less resistive Tibetan lithosphere below the Qiangtang Terrane, with a conductive upper mantle at depths of about 120 km.

  16. Convex hull approach for determining rock representative elementary volume for multiple petrophysical parameters using pore-scale imaging and Lattice-Boltzmann modelling

    NASA Astrophysics Data System (ADS)

    Shah, S. M.; Crawshaw, J. P.; Gray, F.; Yang, J.; Boek, E. S.

    2017-06-01

    In the last decade, the study of fluid flow in porous media has developed considerably due to the combination of X-ray Micro Computed Tomography (micro-CT) and advances in computational methods for solving complex fluid flow equations directly or indirectly on reconstructed three-dimensional pore space images. In this study, we calculate porosity and single phase permeability using micro-CT imaging and Lattice Boltzmann (LB) simulations for 8 different porous media: beadpacks (with bead sizes 50 μm and 350 μm), sandpacks (LV60 and HST95), sandstones (Berea, Clashach and Doddington) and a carbonate (Ketton). Combining the observed porosity and calculated single phase permeability, we shed new light on the existence and size of the Representative Elementary Volume (REV), capturing the different scales of heterogeneity from the pore-scale imaging. Our study applies the concept of the 'Convex Hull' to calculate the REV by considering the two main macroscopic petrophysical parameters, porosity and single phase permeability, simultaneously. The shape of the hull can be used to identify strong correlation between the parameters or greatly differing convergence rates. To further enhance computational efficiency, we note that the area of the convex hull (for well-chosen parameters such as the log of the permeability and the porosity) decays exponentially with sub-sample size, so that only a few small simulations are needed to determine the system size required to calculate the parameters to high accuracy (small convex hull area). Finally, we propose using a characteristic length such as the pore size to choose an efficient absolute voxel size for the numerical rock.
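
    The convex-hull criterion can be illustrated with a short script: for each sub-sample size, the area of the convex hull of (porosity, log10 permeability) pairs is computed with scipy, and the REV is reached when that area becomes acceptably small. The synthetic points below merely stand in for the image- and LB-derived values of the study.

    ```python
    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(1)
    for n_voxels in (50, 100, 200, 400):
        spread = 1.0 / np.sqrt(n_voxels)                 # smaller sub-samples scatter more
        phi = 0.20 + spread * rng.standard_normal(20)    # porosity of 20 sub-volumes
        logk = 2.0 + spread * rng.standard_normal(20)    # log10 permeability (mD)
        hull = ConvexHull(np.column_stack([phi, logk]))
        area = hull.volume                               # for 2-D point sets, .volume is the area
        print(f"sub-sample size {n_voxels:4d}: hull area = {area:.4f}")
    ```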

  17. A rock physics and seismic reservoir characterization study of the Rock Springs Uplift, a carbon dioxide sequestration site in Southwestern Wyoming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grana, Dario; Verma, Sumit; Pafeng, Josiane

    We present a reservoir geophysics study, including rock physics modeling and seismic inversion, of a carbon dioxide sequestration site in Southwestern Wyoming, namely the Rock Springs Uplift, and build a petrophysical model for the potential injection reservoirs for carbon dioxide sequestration. Our objectives include the facies classification and the estimation of the spatial model of porosity and permeability for two sequestration targets of interest, the Madison Limestone and the Weber Sandstone. The available dataset includes a complete set of well logs at the location of the borehole available in the area, a set of 110 core samples, and a seismic survey acquired in the area around the well. The proposed study includes a formation evaluation analysis and facies classification at the well location, the calibration of a rock physics model to link petrophysical properties and elastic attributes using well log data and core samples, the elastic inversion of the pre-stack seismic data, and the estimation of the reservoir model of facies, porosity and permeability conditioned by seismic inverted elastic attributes and well log data. In particular, the rock physics relations are facies-dependent and include granular media equations for clean and shaley sandstone, and inclusion models for the dolomitized limestone. The permeability model has been computed by applying a facies-dependent porosity-permeability relation calibrated using core sample measurements. Finally, the study shows that both formations show good storage capabilities. The Madison Limestone includes a homogeneous layer of high-porosity high-permeability dolomite; the Weber Sandstone is characterized by a lower average porosity but the layer is thicker than the Madison Limestone.

  18. Ultrasonic laboratory measurements of the seismic velocity changes due to CO2 injection

    NASA Astrophysics Data System (ADS)

    Park, K. G.; Choi, H.; Park, Y. C.; Hwang, S.

    2009-04-01

    Monitoring the behavior and movement of carbon dioxide (CO2) in the subsurface is quite important for the sequestration of CO2 in geological formations, because such information provides a basis for demonstrating the safety of CO2 sequestration. Several recent applications in commercial and pilot-scale projects show that 4D surface or borehole seismic methods are among the most promising techniques for this purpose. However, information interpreted from seismic velocity changes can be quite subjective and qualitative without petrophysical characterization of the effect of CO2 saturation on the seismic response, since seismic wave velocity depends on various factors and parameters such as mineralogical composition, hydrogeological factors, and in-situ conditions. In this respect, we have developed an ultrasonic laboratory measurement system and have carried out measurements on a porous sandstone sample to characterize the effects of CO2 injection on seismic velocity and amplitude. Measurements are made with ultrasonic piezoelectric transducers mounted on both ends of a cylindrical core sample under various pressure, temperature, and saturation conditions. According to our experiments, injected CO2 causes a decrease in seismic velocity and amplitude. We found that the velocity decreases by about 6% or more until the sample is fully saturated with CO2, but that the seismic amplitude is attenuated more drastically than the velocity decreases. We also found that Vs/Vp, or the elastic modulus, is more sensitive to CO2 saturation. This means that changes in seismic amplitude and elastic modulus can be alternative target anomalies for seismic techniques in CO2 sequestration monitoring. We therefore expect that further research will yield more quantitative petrophysical relationships between the changes in seismic attributes and CO2 concentration, which can provide a basis for the quantitative assessment of CO2 sequestration.
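
    One standard way to quantify the kind of velocity decrease reported above is Gassmann fluid substitution with a Reuss (Wood) mixture for the brine-CO2 fluid; the sketch below uses assumed textbook moduli and densities for a porous sandstone and is not the petrophysical model of this study.

    ```python
    import numpy as np

    k_min, k_dry, mu = 37e9, 10e9, 9e9        # mineral, dry-frame bulk and shear moduli (Pa)
    phi = 0.20                                # porosity
    k_brine, k_co2 = 2.5e9, 0.05e9            # fluid bulk moduli (Pa), assumed
    rho_min, rho_brine, rho_co2 = 2650.0, 1030.0, 600.0   # densities (kg/m3), assumed

    def vp(sw):
        k_fl = 1.0 / (sw / k_brine + (1.0 - sw) / k_co2)            # Reuss fluid mix
        k_sat = k_dry + (1.0 - k_dry / k_min) ** 2 / (
            phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2)  # Gassmann equation
        rho = (1.0 - phi) * rho_min + phi * (sw * rho_brine + (1.0 - sw) * rho_co2)
        return np.sqrt((k_sat + 4.0 * mu / 3.0) / rho)

    for sw in (1.0, 0.8, 0.5, 0.0):
        print(f"Sw = {sw:.1f}: Vp = {vp(sw):.0f} m/s")
    ```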

  19. Static reservoir modeling of the Bahariya reservoirs for the oilfields development in South Umbarka area, Western Desert, Egypt

    NASA Astrophysics Data System (ADS)

    Abdel-Fattah, Mohamed I.; Metwalli, Farouk I.; Mesilhi, El Sayed I.

    2018-02-01

    3D static reservoir modeling of the Bahariya reservoirs using seismic and well data can be a relevant part of an overall strategy for oilfield development in the South Umbarka area (Western Desert, Egypt). The seismic data are used to build the 3D grid, including fault sticks for fault modeling and horizon interpretations and surfaces for horizon modeling. The 3D grid is the digital representation of the structural geology of the Bahariya Formation. Once a reasonably accurate representation is obtained, the 3D grid is filled with facies and petrophysical properties and simulated to gain a more precise understanding of reservoir property behavior. Sequential Indicator Simulation (SIS) and Sequential Gaussian Simulation (SGS) are the stochastic algorithms used to spatially distribute discrete reservoir properties (facies) and continuous reservoir properties (shale volume, porosity, and water saturation), respectively, within the created 3D grid during property modeling. The structural model of the Bahariya Formation exhibits the trapping mechanism, which is a fault-assisted anticlinal closure trending NW-SE. This major fault breaks the reservoirs into two major fault blocks (North Block and South Block). Petrophysical models classify the Lower Bahariya as a moderate to good reservoir, better than the Upper Bahariya in terms of facies, with good porosity and permeability, low water saturation, and moderate net-to-gross. The Original Oil In Place (OOIP) values of the modeled Bahariya reservoirs indicate hydrocarbon accumulation in economic quantities, considering the high structural dips in the central part of the South Umbarka area. The power of the 3D static modeling technique has provided considerable insight into the future prediction of Bahariya reservoir performance and production behavior.

  20. Near Surface Geophysical Investigations of Potential Direct Recharge Zones in the Biscayne Aquifer within Everglades National Park, Florida.

    NASA Astrophysics Data System (ADS)

    Mount, G.; Comas, X.

    2017-12-01

    The karstic Miami Limestone of the Biscayne aquifer is characterized by water flow controlled by the presence of dissolution-enhanced porosity and mega-porous features. The dissolution features and other high-porosity areas create horizontal preferential flow paths and high rates of groundwater velocity, which may not be accurately conceptualized in groundwater flow models. In addition, recent research suggests the presence of numerous vertical dissolution features across Everglades National Park at Long Pine Key Trail that may act as areas of direct recharge to the aquifer. These vertical features have been identified through ground penetrating radar (GPR) surveys as areas of velocity pull-down, which have been modeled to have porosity values higher than the surrounding Miami Limestone. As climate change may induce larger and longer temporal variability between wet and dry periods in the Everglades, a more comprehensive understanding of preferential flow pathways from the surface to the aquifer would be of great benefit to modelers and planners. This research utilizes near surface geophysical techniques, such as GPR, to identify these vertical dissolution features and then estimate the spatial variability of porosity using petrophysical models. GPR transects collected over several kilometers along the Long Pine Key Trail show numerous pull-down areas that correspond to dissolution-enhanced porosity zones within the Miami Limestone. Additional 3D GPR surveys have attempted to delineate the boundaries of these features to elucidate their geometry for future modelling studies. We demonstrate the ability of near surface geophysics and petrophysical models to identify dissolution-enhanced porosity in shallow karstic limestones to better understand areas that may act as zones of direct recharge into the Biscayne Aquifer.
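
    One common petrophysical model for converting GPR velocity pull-downs into porosity in a saturated limestone is the complex refractive index model (CRIM); the sketch below uses assumed matrix and water permittivities and illustrative velocities, not the site-calibrated values of this study.

    ```python
    import numpy as np

    c = 0.2998            # speed of light in m/ns
    eps_matrix = 9.0      # relative permittivity of the limestone matrix, assumed
    eps_water = 80.0      # relative permittivity of fresh water

    def porosity_from_velocity(v):
        # CRIM for a fully water-saturated rock, solved for porosity.
        eps_bulk = (c / v) ** 2
        return (np.sqrt(eps_bulk) - np.sqrt(eps_matrix)) / (np.sqrt(eps_water) - np.sqrt(eps_matrix))

    for v in (0.09, 0.07, 0.06):     # slower velocity = stronger pull-down
        print(f"v = {v:.2f} m/ns -> porosity ~ {porosity_from_velocity(v):.2f}")
    ```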

  1. A rock physics and seismic reservoir characterization study of the Rock Springs Uplift, a carbon dioxide sequestration site in Southwestern Wyoming

    DOE PAGES

    Grana, Dario; Verma, Sumit; Pafeng, Josiane; ...

    2017-06-20

    We present a reservoir geophysics study, including rock physics modeling and seismic inversion, of a carbon dioxide sequestration site in Southwestern Wyoming, namely the Rock Springs Uplift, and build a petrophysical model for the potential injection reservoirs for carbon dioxide sequestration. Our objectives include the facies classification and the estimation of the spatial model of porosity and permeability for two sequestration targets of interest, the Madison Limestone and the Weber Sandstone. The available dataset includes a complete set of well logs at the location of the borehole available in the area, a set of 110 core samples, and a seismic survey acquired in the area around the well. The proposed study includes a formation evaluation analysis and facies classification at the well location, the calibration of a rock physics model to link petrophysical properties and elastic attributes using well log data and core samples, the elastic inversion of the pre-stack seismic data, and the estimation of the reservoir model of facies, porosity and permeability conditioned by seismic inverted elastic attributes and well log data. In particular, the rock physics relations are facies-dependent and include granular media equations for clean and shaley sandstone, and inclusion models for the dolomitized limestone. The permeability model has been computed by applying a facies-dependent porosity-permeability relation calibrated using core sample measurements. Finally, the study shows that both formations show good storage capabilities. The Madison Limestone includes a homogeneous layer of high-porosity high-permeability dolomite; the Weber Sandstone is characterized by a lower average porosity but the layer is thicker than the Madison Limestone.

  2. Seismic modeling of multidimensional heterogeneity scales of Mallik gas hydrate reservoirs, Northwest Territories of Canada

    NASA Astrophysics Data System (ADS)

    Huang, Jun-Wei; Bellefleur, Gilles; Milkereit, Bernd

    2009-07-01

    In hydrate-bearing sediments, the velocity and attenuation of compressional and shear waves depend primarily on the spatial distribution of hydrates in the pore space of the subsurface lithologies. Recent characterizations of gas hydrate accumulations based on seismic velocity and attenuation generally assume homogeneous sedimentary layers and neglect effects from large- and small-scale heterogeneities of hydrate-bearing sediments. We present an algorithm, based on stochastic medium theory, to construct heterogeneous multivariable models that mimic heterogeneities of hydrate-bearing sediments at the level of detail provided by borehole logging data. Using this algorithm, we model some key petrophysical properties of gas hydrates within heterogeneous sediments near the Mallik well site, Northwest Territories, Canada. The modeled density and P and S wave velocities, used in combination with a modified Biot-Gassmann theory, provide a first-order estimate of the in situ volume of gas hydrate near the Mallik 5L-38 borehole. Our results suggest a range of 528 to 768 × 10⁶ m³/km² of natural gas trapped within hydrates, nearly an order of magnitude lower than earlier estimates which did not include effects of small-scale heterogeneities. Further, the petrophysical models are combined with a 3-D finite difference modeling algorithm to study seismic attenuation due to scattering and leaky-mode propagation. Simulations of a near-offset vertical seismic profile and cross-borehole numerical surveys demonstrate that attenuation of seismic energy may not be directly related to the intrinsic attenuation of hydrate-bearing sediments but, instead, may be largely attributed to scattering from small-scale heterogeneities and highly attenuating leaky-mode propagation of seismic waves through larger-scale heterogeneities in the sediments.
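
    A much-simplified stand-in for the stochastic heterogeneity modeling described above is sketched below: a correlated random field is generated by smoothing white noise (in place of the authors' stochastic-medium algorithm), and hydrate saturation and Vp are co-simulated from it so that they remain correlated, as borehole logs would suggest; all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(2)
    nz, nx = 200, 200
    corr_len = 8                                     # correlation length in grid cells, assumed
    field = gaussian_filter(rng.standard_normal((nz, nx)), corr_len)
    field = (field - field.mean()) / field.std()     # standardize the correlated field

    hydrate_sat = np.clip(0.4 + 0.25 * field, 0.0, 1.0)          # hydrate saturation
    vp = 2200.0 + 1400.0 * hydrate_sat + 50.0 * rng.standard_normal((nz, nx))  # Vp (m/s)
    print(f"Sh mean = {hydrate_sat.mean():.2f}, Vp mean = {vp.mean():.0f} m/s, corr = "
          f"{np.corrcoef(hydrate_sat.ravel(), vp.ravel())[0, 1]:.2f}")
    ```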

  3. A combined microstructural and petrophysical study to analyse the mechanical behaviour of shales in the Flysch units, Glarus Alps, Switzerland

    NASA Astrophysics Data System (ADS)

    Akker, Vénice; Kaufmann, Josef; Berger, Alfons; Herwegh, Marco

    2017-04-01

    Crustal-scale deformation is strongly controlled by the rheological behaviour of sheet-silicate-rich rock types. Because these rocks have low strength, facilitated by their strong crystallographically controlled mechanical anisotropy and interstitial pore fluid in the aggregate, they are able to accommodate considerable amounts of strain. A close relationship is expected between microstructure, porosity and permeability as a function of metamorphic conditions and strain gradients. Fluids set free by compaction, mineral reactions or deformation thereby play an important role. Growing underground-storage industries such as nuclear waste disposal, shale gas exploration or geological carbon sequestration make use of the advantageous properties of such rock types. Therefore, there is a great demand for research on the interaction of these processes. This study uses samples from the Flysch units of the Glarus Alps (Switzerland), collected along a metamorphic gradient (150-400°C), to unravel the link between the mechanical behaviour of these sheet-silicate-rich rocks at geological conditions and their present-day physical parameters. Investigations include two topics: (1) characterization of such rock types in terms of mineralogy, microstructure and petrophysical properties; and (2) possible reconstruction of deformation processes from microstructures. Quantitative information on the porosity, i.e. the pore sizes, their distribution and their interconnectivity, is crucial for both topics. Porosity is therefore estimated by: (1) image analysis of high resolution SEM images, (2) He-pycnometry, and (3) Hg-porosimetry. In a first step, differences in present-day physical parameters between low- and high-temperature sampling sites are shown. The variations within and between the investigated samples are partly due to initial sedimentological heterogeneity and partly due to changes along the metamorphic gradient. This study demonstrates how the characterized present-day porosity evolved owing to these two factors.

  4. Mechanical Stability of Fractured Rift Basin Mudstones: from lab to basin scale

    NASA Astrophysics Data System (ADS)

    Zakharova, N. V.; Goldberg, D.; Collins, D.; Swager, L.; Payne, W. G.

    2016-12-01

    Understanding the petrophysical and mechanical properties of caprock mudstones is essential for ensuring good containment and mechanical formation stability at potential CO2 storage sites. Natural heterogeneity and the presence of fractures, however, create challenges for accurate prediction of mudstone behavior under injection conditions and at reservoir scale. In this study, we present a multi-scale geomechanical analysis for Mesozoic mudstones from the Newark Rift basin, integrating petrophysical core and borehole data, in situ stress measurements, and caprock stability modeling. The project, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), focuses on the Newark basin as a representative locality for a series of Mesozoic rift basins in eastern North America considered as potential CO2 storage sites. An extensive core characterization program, which included laboratory CT scans, XRD, SEM, MICP, porosity, permeability, acoustic velocity measurements, and geomechanical testing under a range of confining pressures, revealed large variability and heterogeneity in both petrophysical and mechanical properties. Estimates of unconfined compressive strength for these predominantly lacustrine mudstones range from 5,000 to 50,000 psi, with only a weak correlation to clay content. Thinly bedded intervals exhibit up to 30% strength anisotropy. Mineralized fractures, abundant in most formations, are characterized by compressive strengths as low as 10% of the matrix strength. Upscaling these observations from core to reservoir scale is challenging. No simple one-to-one correlation between mechanical and petrophysical properties exists, and we therefore develop multivariate empirical relationships among these properties. A large suite of geophysical logs, including new measurements of the in situ stress field, is used to extrapolate these relationships to a basin-scale geomechanical model and predict mudstone behavior under injection conditions.

  5. Electrofacies vs. lithofacies sandstone reservoir characterization Campanian sequence, Arshad gas/oil field, Central Sirt Basin, Libya

    NASA Astrophysics Data System (ADS)

    Burki, Milad; Darwish, Mohamed

    2017-06-01

    The present study focuses on the vertically stacked sandstones of the Arshad Sandstone in the Arshad gas/oil field, Central Sirt Basin, Libya, and is based on conventional core analysis and wireline log interpretation. Six lithofacies types (F1 to F6) were identified based on lithology, sedimentary structures and biogenic features, supported by wireline log calibration. Of these, four types (F1-F4) represent the main Campanian sandstone reservoirs in the Arshad gas/oil field. Lithofacies F5 comprises the basal conglomerates at the lower part of the Arshad sandstones. The Paleozoic Gargaf Formation is represented by lithofacies F6, which is the source provenance for the overlying lithofacies types. The Arshad sediments are interpreted to have been deposited in a shallow marginal and nearshore marine environment influenced by waves and storms, representing interactive shelf to fluvio-marine conditions. The main seal rocks are the Campanian Sirte shale, deposited during major flooding events associated with sea-level rise. It is contended that syn-depositional tectonics controlled the distribution of the reservoir facies in time and space, while post-depositional changes controlled reservoir quality and performance. Petrophysical interpretation from the porosity log values was confirmed by conventional core measurements of the different sandstone lithofacies types. Porosity ranges from 5 to 20% and permeability is between 0 and 20 mD. A petrophysical cut-off summary of the lower part of the clastic-dominated sequence (i.e. the Arshad Sandstone), calculated from six wells, includes net pay sand ranging from 19.5' to 202.05', average porosity from 7.7 to 15% and water saturation from 19 to 58%.

  6. Probabilistic inversion of electrical resistivity data from bench-scale experiments: On model parameterization for CO2 sequestration monitoring

    NASA Astrophysics Data System (ADS)

    Breen, S. J.; Lochbuehler, T.; Detwiler, R. L.; Linde, N.

    2013-12-01

    Electrical resistivity tomography (ERT) is a well-established method for geophysical characterization and has shown potential for monitoring geologic CO2 sequestration, due to its sensitivity to electrical resistivity contrasts generated by liquid/gas saturation variability. In contrast to deterministic ERT inversion approaches, probabilistic inversion provides not only a single saturation model but a full posterior probability density function for each model parameter. Furthermore, the uncertainty inherent in the underlying petrophysics (e.g., Archie's Law) can be incorporated in a straightforward manner. In this study, the data are from bench-scale ERT experiments conducted during gas injection into a quasi-2D (1 cm thick), translucent, brine-saturated sand chamber with a packing that mimics a simple anticlinal geological reservoir. We estimate saturation fields by Markov chain Monte Carlo sampling with the MT-DREAM(ZS) algorithm and compare them quantitatively to independent saturation measurements from a light transmission technique, as well as to results from deterministic inversions. Different model parameterizations are evaluated in terms of the recovered saturation fields and petrophysical parameters. The saturation field is parameterized (1) in Cartesian coordinates, (2) by means of its discrete cosine transform coefficients, and (3) by fixed saturation values and gradients in structural elements defined by a Gaussian bell of arbitrary shape and location. Synthetic tests reveal that a priori knowledge about the expected geologic structures (as in parameterization (3)) markedly improves the parameter estimates. The number of degrees of freedom thus strongly affects the inversion results. In an additional step, we explore the effects of assuming that the total volume of injected gas is known a priori and that no gas has migrated away from the monitored region.
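
    Parameterization (2), the discrete cosine transform, can be illustrated in a few lines: a synthetic 2-D saturation field is compressed to a small set of low-order DCT coefficients and then reconstructed, which is how the number of unknowns sampled by MCMC is reduced. The field, grid size, and number of retained coefficients below are arbitrary choices for the example, not the experimental settings.

    ```python
    import numpy as np
    from scipy.fft import dctn, idctn

    ny, nx = 40, 60
    y, x = np.mgrid[0:ny, 0:nx]
    sat = np.exp(-(((x - 30) / 12.0) ** 2 + ((y - 10) / 5.0) ** 2))  # synthetic gas saturation

    coeffs = dctn(sat, norm="ortho")
    k = 8                                            # keep only the k x k lowest-order coefficients
    truncated = np.zeros_like(coeffs)
    truncated[:k, :k] = coeffs[:k, :k]
    sat_approx = idctn(truncated, norm="ortho")

    rmse = np.sqrt(np.mean((sat - sat_approx) ** 2))
    print(f"{k * k} of {ny * nx} coefficients kept, RMSE = {rmse:.3f}")
    ```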

  7. Joint inversion of NMR and SIP data to estimate pore size distribution of geomaterials

    NASA Astrophysics Data System (ADS)

    Niu, Qifei; Zhang, Chi

    2018-03-01

    There is growing interest in using geophysical tools to characterize the microstructure of geomaterials because of their non-invasive nature and their applicability in the field. In these applications, multiple types of geophysical data sets are usually processed separately, which may be inadequate to constrain the key features of the target variables. Simultaneous processing of multiple data sets could therefore potentially improve the resolution. In this study, we propose a method to estimate pore size distribution by joint inversion of nuclear magnetic resonance (NMR) T2 relaxation and spectral induced polarization (SIP) spectra. The petrophysical relation between NMR T2 relaxation time and SIP relaxation time is incorporated in a nonlinear least squares problem formulation, which is solved using the Gauss-Newton method. The joint inversion scheme is applied to a synthetic sample and a Berea sandstone sample. The jointly estimated pore size distributions are very close to the true model and to results from other experimental methods. Even when knowledge of the petrophysical models of the sample is incomplete, the joint inversion can still capture the main features of the pore size distribution of the samples, including the general shape and relative peak positions of the distribution curves. It is also found from the numerical example that the surface relaxivity of the sample can be extracted from the joint inversion of NMR and SIP data if the diffusion coefficient of the ions in the electrical double layer is known. Compared to individual inversions, the joint inversion can improve the resolution of the estimated pore size distribution because of the addition of extra data sets. The proposed approach might constitute a first step towards a comprehensive joint inversion that can extract the full pore geometry information of a geomaterial from NMR and SIP data.
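
    The petrophysical link exploited by such a joint inversion can be sketched with the usual fast-diffusion NMR relation (1/T2 ≈ ρ2·S/V ≈ 2ρ2/r for a cylindrical pore) and a Schwarz-type SIP relaxation time (τ ≈ r²/2D); both map to an effective pore radius r. The surface relaxivity, diffusion coefficient, and relaxation times below are assumed values, not results of the paper.

    ```python
    import numpy as np

    rho2 = 30e-6           # surface relaxivity (m/s), assumed
    D = 1.3e-9             # ion diffusion coefficient in the EDL (m^2/s), assumed

    t2 = np.array([0.05, 0.2, 0.8])       # NMR T2 relaxation times (s), illustrative
    tau = np.array([0.002, 0.03, 0.5])    # SIP relaxation times (s), illustrative

    r_nmr = 2.0 * rho2 * t2               # pore radius implied by NMR (m)
    r_sip = np.sqrt(2.0 * D * tau)        # pore radius implied by SIP (m)
    print("r from NMR (um):", np.round(r_nmr * 1e6, 2))
    print("r from SIP (um):", np.round(r_sip * 1e6, 2))
    ```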

  8. Reservoir characterization and seal integrity of Jemir field in Niger Delta, Nigeria

    NASA Astrophysics Data System (ADS)

    Adagunodo, Theophilus Aanuoluwa; Sunmonu, Lukman Ayobami; Adabanija, Moruffdeen Adedapo

    2017-05-01

    Ignoring fault seal and depending solely on reservoir parameters and estimated hydrocarbon contacts can lead to an extremely unequal division of reserves, especially in oil fields dominated by structural traps where faults play an important role in trapping hydrocarbons. These faults may be sealing or act as conduits to fluid flow. In this study, three-dimensional seismic and well log data have been used to characterize the reservoirs and investigate the seal integrity of a fault plane trending NW-SE and dipping towards the south in the Jemir field, Niger Delta, for enhanced oil recovery. Petrophysical and volumetric analyses of the six reservoirs that were mapped, as well as structural interpretation of the faults, were carried out both qualitatively and quantitatively. To assess the sealing potential of each hydrocarbon-bearing sand, horizon-fault intersections were mapped, the volume of shale was determined, the thickness of each bed was estimated, and quality control involving throw analysis was performed. Shale Gouge Ratio (SGR) and Hydrocarbon Column Height (HCH) (supportable and structure-supported) were also determined to assess the seal integrity of the faults in the Jemir field. The petrophysical analysis indicated that the porosity of the traps in the Jemir field ranges from 0.20 to 0.29, and the volumetric analyses showed that the Stock Tank Original Oil in Place varies between 5.5 and 173.4 Mbbl. The SGR ranged from leaking (<20%) to sealing (>60%) along the fault plane, suggesting poor to moderate sealing. The supportable HCH of the Jemir field ranged from 98.3 to 446.2 m, while its structure-supported HCH ranged from 12.1 to 101.7 m. The porosities of the Jemir field are good enough for hydrocarbon production, as exemplified by its oil reserve estimates. However, improper sealing of the fault plane might enhance hydrocarbon leakage.
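
    The Shale Gouge Ratio used above is the throw-weighted average shale fraction of the beds that have slipped past a given point on the fault plane; a minimal calculation with illustrative bed thicknesses and Vshale values (not data from the Jemir field) is sketched below.

    ```python
    import numpy as np

    thickness = np.array([12.0, 8.0, 20.0, 15.0])   # bed thicknesses within the throw window (m)
    vshale = np.array([0.65, 0.15, 0.45, 0.80])     # shale volume fraction of each bed
    throw = thickness.sum()                          # here the window equals the fault throw

    sgr = 100.0 * np.sum(vshale * thickness) / throw
    print(f"SGR = {sgr:.1f}%  (<20% leaking, >60% sealing, following the criteria above)")
    ```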

  9. Petrophysical Properties of Twenty Drill Cores from the Los Azufres, Mexico, Geothermal Field

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iglesias, E.R.; Contreras L., E.; Garcia G., A.

    1987-01-20

    For this study we selected 20 drill cores covering a wide range of depths (400-3000 m), from 15 wells, that provide reasonable coverage of the field. Only andesite, the largely predominant rock type in the field, was included in this sample. We measured bulk density, grain (solids) density, effective porosity and (matrix) permeability on a considerable number of specimens taken from the cores, and inferred the corresponding total porosity and fraction of interconnected total porosity. We characterized the statistical distributions of the measured and inferred variables. The distributions of bulk density and grain density were approximately normal; the distributions of effective porosity, total porosity and fraction of total porosity were bimodal; the permeability distribution was highly skewed towards very small (1 mdarcy) values, although values as high as 400 mdarcies were measured. We also characterized the internal inhomogeneity of the cores by means of the ratio (standard deviation/mean) of the bulk density in each core (on average there are 9 specimens per core). The cores were found to exhibit clearly discernible inhomogeneity; this quantitative characterization will help design new experimental work and interpret currently available and forthcoming results. We also found statistically significant linear correlations between total density and density of solids, effective porosity and total density, total porosity and total density, fraction of interconnected total porosity and the inverse of the effective porosity, and total porosity and effective porosity; bulk density and total porosity also correlate with elevation. These results provide the first sizable and statistically detailed database on the petrophysical properties of the Los Azufres andesites. 1 tab., 16 figs., 4 refs.
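
    The inhomogeneity index described above is the coefficient of variation (standard deviation divided by mean) of bulk density within each core. A small sketch with hypothetical specimen values:

    ```python
    import numpy as np

    # Bulk-density measurements on the specimens of one core (hypothetical values,
    # g/cm^3); the study reports ~9 specimens per core on average.
    bulk_density = np.array([2.31, 2.28, 2.40, 2.35, 2.22, 2.38, 2.30, 2.27, 2.33])

    # Inhomogeneity index: standard deviation / mean within the core
    cv = bulk_density.std(ddof=1) / bulk_density.mean()
    print(f"coefficient of variation = {cv:.3f}")
    ```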

  10. Structural control on the deep hydrogeological and geothermal aquifers related to the fractured Campanian-Miocene reservoirs of north-eastern Tunisia foreland constrained by subsurface data

    NASA Astrophysics Data System (ADS)

    Khomsi, Sami; Echihi, Oussema; Slimani, Naji

    2012-03-01

    A set of different data, including high-resolution seismic sections, petroleum wire-line well-log data, borehole piezometry, structural cross-sections and outcrop analysis, allowed us to characterise the tectonic framework and its relationships with the deep aquifers seated in Cretaceous-Miocene reservoirs. The structural framework, based on major structures, controls the occurrence of deep aquifers and the distribution of sub-basin aquifers. Five structural domains with different morphostructural characteristics can be defined. The northernmost domain, lying on the north-south axis and the Zaghouan thrust system, is a domain of recharge by underflow of the different subsurface reservoirs and aquifers from outcrops of highly fractured reservoirs. The morphostructural configuration also controls the piezometry of underground flow in the Plio-Quaternary unconfined aquifer. In the subsurface, the Late Cretaceous-Miocene reservoirs are widespread, with high thicknesses in many places and high porosities and connectivities, especially along major fault corridors and on the crestal parts of major anticlines. Among all reservoirs, the Oligo-Miocene detrital series are widespread and present high cumulative thicknesses. Subsurface and field work outline the occurrence of 10 fractured sandy reservoirs in these series, with packages having high hydrodynamic and petrophysical characteristics. These series show low salinities (maximum 5 g/l) in the northern part of the study area and will constitute an important source of potable water for future generations. A regional structural cross-section compiled from all the different data sets is presented, allowing us to define the major characteristics of the hydrogeological-hydrogeothermal sub-basins. Eight hydrogeological provinces are defined from north-west to south-east. A major thermal anomaly is clearly identified in the south-eastern part of the study area at Sfax-Sidi Il Itayem; it is possibly related to major faults pertaining to the Sirt basin and controlled by a deep-seated thermal source. Many exploration targets are identified, especially along the Cherichira-Kondar thrust, where the subcropping Oligocene reservoirs are well developed; they are highly fractured and show good hydrodynamic characteristics.

  11. Towards a robust framework for catchment classification

    NASA Astrophysics Data System (ADS)

    Deshmukh, A.; Samal, A.; Singh, R.

    2017-12-01

    Classification of catchments based on various measures of similarity has emerged as an important technique for understanding regional-scale hydrologic behavior. Classification of catchment characteristics and/or streamflow response has been used to reveal which characteristics are more likely to explain the observed variability of hydrologic response. However, numerous algorithms for supervised or unsupervised classification are available, making it hard to identify the algorithm most suitable for the dataset at hand. Consequently, existing catchment classification studies vary significantly in the classification algorithms employed, with no previous attempt to understand the degree of uncertainty in classification due to this algorithmic choice. This hinders the generalizability of interpretations related to hydrologic behavior. Our goal is to develop a protocol that can be followed while classifying hydrologic datasets. We focus on a framework for unsupervised classification and provide a step-by-step classification procedure. The steps include testing the clusterability of the original dataset prior to classification, feature selection, validation of the clustered data, and quantification of the similarity between two clusterings. We test several commonly available methods within this framework to understand the level of similarity of classification results across algorithms. We apply the proposed framework to recently developed datasets for India to analyze to what extent catchment properties can explain observed catchment response. Our testing dataset includes watershed characteristics for over 200 watersheds, comprising both natural (physio-climatic) and socio-economic characteristics. This framework allows us to understand the controls on observed hydrologic variability across India.
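
    A minimal sketch of such a protocol, using scikit-learn and a hypothetical catchment-attribute matrix, is given below: a clusterability check (Hopkins statistic), clustering with two different algorithm choices, internal validation (silhouette score), and quantification of the agreement between the two clusterings (adjusted Rand index). The specific algorithms and settings are illustrative assumptions, not those prescribed by the study.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.mixture import GaussianMixture
    from sklearn.metrics import silhouette_score, adjusted_rand_score
    from sklearn.neighbors import NearestNeighbors
    from sklearn.preprocessing import StandardScaler

    def hopkins(X, m=50, seed=0):
        """Hopkins statistic: ~0.5 for spatially random data, -> 1 for clusterable data."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        nn = NearestNeighbors(n_neighbors=2).fit(X)
        idx = rng.choice(n, size=m, replace=False)
        # distance from sampled real points to their nearest *other* point
        w = nn.kneighbors(X[idx], n_neighbors=2)[0][:, 1]
        # distance from uniform random points to their nearest real point
        U = rng.uniform(X.min(0), X.max(0), size=(m, d))
        u = nn.kneighbors(U, n_neighbors=1)[0][:, 0]
        return u.sum() / (u.sum() + w.sum())

    # Hypothetical attribute matrix: rows = catchments, columns = standardized features
    rng = np.random.default_rng(1)
    X = StandardScaler().fit_transform(rng.normal(size=(200, 6)))

    print("clusterability (Hopkins):", round(hopkins(X), 2))

    # Cluster with two different algorithm choices and quantify their agreement
    labels_a = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
    labels_b = GaussianMixture(n_components=4, random_state=0).fit_predict(X)
    print("silhouette (KMeans):", round(silhouette_score(X, labels_a), 2))
    print("agreement of the two clusterings (ARI):",
          round(adjusted_rand_score(labels_a, labels_b), 2))
    ```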

  12. Her2Net: A Deep Framework for Semantic Segmentation and Classification of Cell Membranes and Nuclei in Breast Cancer Evaluation.

    PubMed

    Saha, Monjoy; Chakraborty, Chandan

    2018-05-01

    We present an efficient deep learning framework for identifying, segmenting, and classifying cell membranes and nuclei from human epidermal growth factor receptor-2 (HER2)-stained breast cancer images with minimal user intervention. Manual quantification of HER2 is a long-standing problem for pathologists because it is error-prone, costly, and time-consuming; hence, we propose a deep learning-based HER2 deep neural network (Her2Net) to solve this issue. The convolutional and deconvolutional parts of the proposed Her2Net framework consist mainly of multiple convolution layers, max-pooling layers, spatial pyramid pooling layers, deconvolution layers, up-sampling layers, and trapezoidal long short-term memory (TLSTM) units. A fully connected layer and a softmax layer are also used for classification and error estimation. Finally, HER2 scores are calculated based on the classification results. The main contributions of the proposed Her2Net framework are the implementation of TLSTM and a deep learning pipeline for cell membrane and nucleus detection, segmentation, classification, and HER2 scoring. Our proposed Her2Net achieved 96.64% precision, 96.79% recall, 96.71% F-score, 93.08% negative predictive value, 98.33% accuracy, and a 6.84% false-positive rate. These results demonstrate the high accuracy and wide applicability of the proposed Her2Net in the context of HER2 scoring for breast cancer evaluation.

  13. Environmental Impacts of Future Urban Deployment of Electric Vehicles: Assessment Framework and Case Study of Copenhagen for 2016-2030.

    PubMed

    Bohnes, Florence A; Gregg, Jay S; Laurent, Alexis

    2017-12-05

    To move toward environmentally sustainable transport systems, electric vehicles (EVs) are increasingly seen as viable alternatives to internal combustion vehicles (ICVs). To ensure effectiveness of such deployment, holistic assessments of environmental impacts can help decision-makers determine optimized urban strategies in a long-term perspective. However, explicit guidance and conduct of such assessments are currently missing. Here, we therefore propose a framework using life cycle assessment that enables the quantification of environmental impacts of a transport system at full urban scale from a fleet-based, foresight perspective. The analysis of the passenger car fleet development in the city of Copenhagen for the years 2016-2030 is used as a proof-of-concept. We modeled and compared five powertrain technologies, and we assessed four fleet-based scenarios for the entire city. Our results showed relative environmental benefits from range-extended and fuel-cell EVs over ICVs and standard EVs. These results were found to be sensitive to local settings, like electricity grid mix, which could alter the relative environmental performances across EV technologies. The comprehensive framework developed here can be applied to other geographic areas and contexts to assess the environmental sustainability of transport systems.

  14. Tiered Approach to Resilience Assessment.

    PubMed

    Linkov, Igor; Fox-Lent, Cate; Read, Laura; Allen, Craig R; Arnott, James C; Bellini, Emanuele; Coaffee, Jon; Florin, Marie-Valentine; Hatfield, Kirk; Hyde, Iain; Hynes, William; Jovanovic, Aleksandar; Kasperson, Roger; Katzenberger, John; Keys, Patrick W; Lambert, James H; Moss, Richard; Murdoch, Peter S; Palma-Oliveira, Jose; Pulwarty, Roger S; Sands, Dale; Thomas, Edward A; Tye, Mari R; Woods, David

    2018-04-25

    Regulatory agencies have long adopted a three-tier framework for risk assessment. We build on this structure to propose a tiered approach for resilience assessment that can be integrated into the existing regulatory processes. Comprehensive approaches to assessing resilience at appropriate and operational scales, reconciling analytical complexity as needed with stakeholder needs and resources available, and ultimately creating actionable recommendations to enhance resilience are still lacking. Our proposed framework consists of tiers by which analysts can select resilience assessment and decision support tools to inform associated management actions relative to the scope and urgency of the risk and the capacity of resource managers to improve system resilience. The resilience management framework proposed is not intended to supplant either risk management or the many existing efforts of resilience quantification method development, but instead provide a guide to selecting tools that are appropriate for the given analytic need. The goal of this tiered approach is to intentionally parallel the tiered approach used in regulatory contexts so that resilience assessment might be more easily and quickly integrated into existing structures and with existing policies. Published 2018. This article is a U.S. government work and is in the public domain in the USA.

  15. Preparation of porous aromatic framework/ionic liquid hybrid composite coated solid-phase microextraction fibers and their application in the determination of organochlorine pesticides combined with GC-ECD detection.

    PubMed

    Wu, Mingxue; Chen, Gang; Liu, Ping; Zhou, Weihong; Jia, Qiong

    2016-01-07

    A novel hybrid material incorporating porous aromatic frameworks and an ionic liquid, 1-(triethoxysilyl)propyl-3-aminopropyl imidazole hexafluorophosphate, was prepared as a solid-phase microextraction coating and employed for the extraction of organochlorine pesticides. Combining the advantages of porous aromatic frameworks and an ionic liquid, the fiber exhibited a high adsorption capacity for organochlorine pesticides. Under optimized experimental conditions, enhancement factors of 247-1696 were obtained, with good linearity in the range of 1-500 μg L-1. The detection and quantification limits were in the ranges of 0.11-0.29 μg L-1 and 0.35-0.93 μg L-1, respectively. The relative standard deviations for six replicates of organochlorine pesticides were in the ranges of 4.4%-7.2% and 5.7%-10.1% for a single fiber and fiber-to-fiber, respectively. Coupled with gas chromatography-electron capture detection, the novel fiber was successfully used for the determination of organochlorine pesticides in juice and milk samples, with recoveries of 76.1%-121.3%.
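
    Detection and quantification limits such as those reported above are usually derived from the calibration data; one widespread convention (assumed here, with purely hypothetical numbers) takes LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma is the standard deviation of the blank (or of the intercept) and S the calibration slope:

    ```python
    sigma = 0.012   # hypothetical standard deviation of the blank (peak-area units)
    slope = 0.135   # hypothetical calibration slope (peak area per ug/L)

    lod = 3.3 * sigma / slope    # limit of detection
    loq = 10.0 * sigma / slope   # limit of quantification
    print(f"LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")
    ```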

  16. Bayesian Hierarchical Grouping: perceptual grouping as mixture estimation

    PubMed Central

    Froyen, Vicky; Feldman, Jacob; Singh, Manish

    2015-01-01

    We propose a novel framework for perceptual grouping based on the idea of mixture models, called Bayesian Hierarchical Grouping (BHG). In BHG we assume that the configuration of image elements is generated by a mixture of distinct objects, each of which generates image elements according to some generative assumptions. Grouping, in this framework, means estimating the number and the parameters of the mixture components that generated the image, including estimating which image elements are “owned” by which objects. We present a tractable implementation of the framework, based on the hierarchical clustering approach of Heller and Ghahramani (2005). We illustrate it with examples drawn from a number of classical perceptual grouping problems, including dot clustering, contour integration, and part decomposition. Our approach yields an intuitive hierarchical representation of image elements, giving an explicit decomposition of the image into mixture components, along with estimates of the probability of various candidate decompositions. We show that BHG accounts well for a diverse range of empirical data drawn from the literature. Because BHG provides a principled quantification of the plausibility of grouping interpretations over a wide range of grouping problems, we argue that it provides an appealing unifying account of the elusive Gestalt notion of Prägnanz. PMID:26322548
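
    To illustrate the core idea of grouping as mixture estimation (this is a generic Gaussian-mixture sketch, not the BHG algorithm, which relies on Bayesian hierarchical clustering), the example below fits mixtures to a hypothetical dot-clustering configuration, selects the number of components by BIC, and reads off the posterior "ownership" of each image element:

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Hypothetical dot-clustering example: image elements generated by three
    # unknown "objects" (2-D Gaussian sources).
    rng = np.random.default_rng(0)
    centres = np.array([[0.0, 0.0], [4.0, 1.0], [1.0, 5.0]])
    dots = np.vstack([rng.normal(c, 0.6, size=(60, 2)) for c in centres])

    # Estimate the number of mixture components by BIC, then read off the
    # posterior "ownership" of each dot by each component.
    models = [GaussianMixture(k, random_state=0).fit(dots) for k in range(1, 7)]
    best = min(models, key=lambda m: m.bic(dots))
    ownership = best.predict_proba(dots)       # rows: dots, columns: components
    print("selected number of objects:", best.n_components)
    print("ownership of first dot:", np.round(ownership[0], 2))
    ```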

  17. Quantification of CO2 generation in sedimentary basins through carbonate/clays reactions with uncertain thermodynamic parameters

    NASA Astrophysics Data System (ADS)

    Ceriotti, G.; Porta, G. M.; Geloni, C.; Dalla Rosa, M.; Guadagnini, A.

    2017-09-01

    We develop a methodological framework and mathematical formulation to estimate the uncertainty associated with the amounts of CO2 generated by Carbonate-Clay Reactions (CCR) in large-scale subsurface systems, with the aim of characterizing the main features of this geochemical process. Our approach couples a one-dimensional compaction model, providing the evolution of porosity, temperature and pressure along the vertical direction, with a chemical model able to quantify the partial pressure of CO2 resulting from mineral and pore-water interaction. The modeling framework we propose allows (i) estimating the depth at which the source of gases is located and (ii) quantifying the amount of CO2 generated, based on the mineralogy of the sediments involved in the basin formation process. A distinctive objective of the study is to quantify how the uncertainty affecting chemical equilibrium constants propagates to model outputs, i.e., the flux of CO2. These parameters are considered key sources of uncertainty in our modeling approach because the temperature and pressure conditions associated with deep burial depths typically fall outside the range of validity of commonly employed geochemical databases and software. We also analyze the impact of the relative abundance of primary phases in the sediments on the activation of CCR processes. As a test bed, we consider a computational study where pressure and temperature conditions are representative of those observed in real sedimentary formations. Our results enable the probabilistic assessment of (i) the characteristic pressure and temperature at which CCR leads to generation of CO2 in sedimentary systems and (ii) the order of magnitude of the CO2 generation rate that can be associated with CCR processes.
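
    The propagation of equilibrium-constant uncertainty can be pictured with a plain Monte Carlo sketch: sample the uncertain log10 K values from assumed distributions and push them through the chemical model. The response function, means and standard deviations below are entirely hypothetical stand-ins, not the coupled compaction/chemistry model of the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000

    # Assumed uncertainty of two equilibrium constants (log10 units)
    logK1 = rng.normal(loc=-10.3, scale=0.5, size=n)
    logK2 = rng.normal(loc=-16.7, scale=0.8, size=n)

    def pco2_model(logK1, logK2, T_celsius=150.0):
        """Hypothetical stand-in for the chemical model: returns pCO2 in bar."""
        return 10.0 ** (0.4 * logK1 - 0.2 * logK2 - 0.01 * (T_celsius - 25.0))

    pco2 = pco2_model(logK1, logK2)
    lo, med, hi = np.percentile(pco2, [5, 50, 95])
    print(f"pCO2 5th/50th/95th percentiles: {lo:.2g} / {med:.2g} / {hi:.2g} bar")
    ```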

  18. ICN_Atlas: Automated description and quantification of functional MRI activation patterns in the framework of intrinsic connectivity networks.

    PubMed

    Kozák, Lajos R; van Graan, Louis André; Chaudhary, Umair J; Szabó, Ádám György; Lemieux, Louis

    2017-12-01

    Generally, the interpretation of functional MRI (fMRI) activation maps continues to rely on assessing their relationship to anatomical structures, mostly in a qualitative and often subjective way. Recently, the existence of persistent and stable brain networks of a functional nature has been revealed; in particular, these so-called intrinsic connectivity networks (ICNs) appear to link patterns of resting-state and task-related connectivity. These networks provide an opportunity for a functionally derived description and interpretation of fMRI maps, which may be especially important in cases where the maps are predominantly task-unrelated, such as studies of spontaneous brain activity, e.g., seizure-related fMRI maps in epilepsy patients or sleep states. Here we present a new toolbox (ICN_Atlas) aimed at facilitating the interpretation of fMRI data in the context of ICNs. More specifically, the new methodology was designed to describe fMRI maps in a function-oriented, objective and quantitative way, using a set of 15 metrics conceived to quantify the degree of 'engagement' of ICNs for any given fMRI-derived statistical map of interest. We demonstrate that the proposed framework provides a highly reliable quantification of fMRI activation maps using a publicly available longitudinal (test-retest) resting-state fMRI dataset. The utility of ICN_Atlas is also illustrated on a parametric task-modulation fMRI dataset, and on a dataset from a patient who had repeated seizures during resting-state fMRI, confirmed on simultaneously recorded EEG. The ICN_Atlas toolbox is freely available for download at http://icnatlas.com and at http://www.nitrc.org for researchers to use in their fMRI investigations. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Keypress-Based Musical Preference Is Both Individual and Lawful.

    PubMed

    Livengood, Sherri L; Sheppard, John P; Kim, Byoung W; Malthouse, Edward C; Bourne, Janet E; Barlow, Anne E; Lee, Myung J; Marin, Veronica; O'Connor, Kailyn P; Csernansky, John G; Block, Martin P; Blood, Anne J; Breiter, Hans C

    2017-01-01

    Musical preference is highly individualized and is an area of active study to develop methods for its quantification. Recently, preference-based behavior, associated with activity in brain reward circuitry, has been shown to follow lawful, quantifiable patterns, despite broad variation across individuals. These patterns, observed using a keypress paradigm with visual stimuli, form the basis for relative preference theory (RPT). Here, we sought to determine if such patterns extend to non-visual domains (i.e., audition) and dynamic stimuli, potentially providing a method to supplement psychometric, physiological, and neuroimaging approaches to preference quantification. For this study, we adapted our keypress paradigm to two sets of stimuli consisting of seventeenth to twenty-first century western art music (Classical) and twentieth to twenty-first century jazz and popular music (Popular). We studied a pilot sample and then a separate primary experimental sample with this paradigm, and used iterative mathematical modeling to determine if RPT relationships were observed with high R2 fits. We further assessed the extent of heterogeneity in the rank ordering of keypress-based responses across subjects. As expected, individual rank orderings of preferences were quite heterogeneous, yet we observed mathematical patterns fitting these data similar to those observed previously with visual stimuli. These patterns in music preference were recurrent across two cohorts and two stimulus sets, and scaled between individual and group data, adhering to the requirements for lawfulness. Our findings suggest a general neuroscience framework that predicts human approach/avoidance behavior, while also allowing for individual differences and the broad diversity of human choices; the resulting framework may offer novel approaches to advancing music neuroscience, or its applications to medicine and recommendation systems.

  20. Uncertainty Quantification for CO2-Enhanced Oil Recovery

    NASA Astrophysics Data System (ADS)

    Dai, Z.; Middleton, R.; Bauman, J.; Viswanathan, H.; Fessenden-Rahn, J.; Pawar, R.; Lee, S.

    2013-12-01

    CO2-Enhanced Oil Recovery (EOR) is currently an option for permanently sequestering CO2 in oil reservoirs while economically increasing oil/gas production. In this study we have developed a framework for understanding CO2 storage potential within an EOR-sequestration environment at the Farnsworth Unit of the Anadarko Basin in northern Texas. By coupling an EOR tool, SENSOR (CEI, 2011), with the uncertainty quantification tool PSUADE (Tong, 2011), we conduct an integrated Monte Carlo simulation of water, oil/gas components and CO2 flow and reactive transport in the heterogeneous Morrow formation to identify the key controlling processes and optimal parameters for CO2 sequestration and EOR. Global sensitivity and response surface analyses are conducted with PSUADE to build numerically the relationship among CO2 injectivity, oil/gas production, reservoir parameters and the distance between injection and production wells. The results indicate that reservoir permeability and porosity are the key parameters controlling the CO2 injection and the oil and gas (CH4) recovery rates. The distance between the injection and production wells has a large impact on oil and gas recovery and on net CO2 injection rates. The CO2 injectivity increases with increasing reservoir permeability and porosity. The distance between injection and production wells is the key parameter for designing an EOR pattern (such as a five- or nine-spot pattern). The optimal distance for a five-spot-pattern EOR at this site is estimated from the response surface analysis to be around 400 meters. Next, we are building the machinery into our risk assessment framework CO2-PENS to utilize these response surfaces and evaluate the operational risk for CO2 sequestration and EOR at this site.
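
    The response-surface step can be sketched as follows: sample the design variable (well spacing), record the simulated recovery, fit a low-order surface, and locate its optimum. The synthetic "simulator" below is a hypothetical stand-in for the SENSOR/PSUADE workflow, tuned only so that its optimum falls near the 400 m value quoted above.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    spacing = rng.uniform(150.0, 700.0, size=40)                # well spacing (m)
    # Stand-in for reservoir-simulator output: recovery that peaks near ~400 m
    recovery = 1.0 - ((spacing - 400.0) / 300.0) ** 2 + 0.03 * rng.standard_normal(40)

    coeffs = np.polyfit(spacing, recovery, deg=2)               # quadratic response surface
    optimum = -coeffs[1] / (2.0 * coeffs[0])                    # vertex of the parabola
    print(f"estimated optimal well spacing ~ {optimum:.0f} m")
    ```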

  1. Quantification of skeletal fraction volume of a soil pit by means of photogrammetry

    NASA Astrophysics Data System (ADS)

    Baruck, Jasmin; Zieher, Thomas; Bremer, Magnus; Rutzinger, Martin; Geitner, Clemens

    2015-04-01

    The grain size distribution of a soil is a key parameter determining soil water behaviour, soil fertility and land use potential. It plays an important role in soil classification and allows conclusions to be drawn on landscape development as well as soil formation processes. However, fine soil material (i.e. particle diameter ≤2 mm) is usually documented more thoroughly than the skeletal fraction (i.e. particle diameter >2 mm). While fine soil material is commonly analysed in the laboratory in order to determine the soil type, the skeletal fraction is typically estimated in the field at the profile. For a more precise determination of the skeletal fraction, other methods can be applied and combined. These methods can be volume-related (sampling rings, percussion coring tubes) or non-volume-related (sieving of spade excavations). In this study we present a framework for the quantification of skeletal fraction volumes of a soil pit by means of photogrammetry. As a first step, 3D point clouds of both the soil pit and the skeletal grains were generated. To this end, all skeletal grains of the pit were spread out onto a flat, clean plastic sheet in the field and numerous digital photos were taken using a reflex camera. With the help of the open-source tool VisualSFM (structure from motion), two scaled 3D point clouds were derived. As a second step, the skeletal fraction point cloud was segmented by radiometric attributes in order to determine the volumes of single skeletal grains. The comparison of the total skeletal fraction volume with the volume of the pit (closed by spline interpolation) yields an estimate of the volumetric proportion of skeletal grains. The presented framework therefore provides an objective reference value of the skeletal fraction in support of qualitative field records.
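
    A simplified sketch of the final volume comparison is given below, using convex hulls of the segmented point clouds as volume proxies (the study instead closes the pit surface by spline interpolation); the point clouds are randomly generated placeholders.

    ```python
    import numpy as np
    from scipy.spatial import ConvexHull

    rng = np.random.default_rng(0)

    # Hypothetical point clouds: one for the pit cavity, several for single grains
    pit_points = rng.uniform(-0.5, 0.5, size=(5000, 3))               # ~1 m^3 box
    grain_clouds = [rng.uniform(-0.05, 0.05, size=(300, 3)) + rng.uniform(-0.4, 0.4, 3)
                    for _ in range(40)]

    pit_volume = ConvexHull(pit_points).volume
    skeleton_volume = sum(ConvexHull(g).volume for g in grain_clouds)
    print(f"volumetric skeletal proportion ~ {100 * skeleton_volume / pit_volume:.1f}%")
    ```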

  2. Improved Uncertainty Quantification in Groundwater Flux Estimation Using GRACE

    NASA Astrophysics Data System (ADS)

    Reager, J. T., II; Rao, P.; Famiglietti, J. S.; Turmon, M.

    2015-12-01

    Groundwater change is difficult to monitor over large scales. One of the most successful approaches is the remote sensing of time-variable gravity using NASA Gravity Recovery and Climate Experiment (GRACE) mission data, and successful case studies have created the opportunity to move towards a global groundwater monitoring framework for the world's largest aquifers. To achieve these estimates, several approximations are applied, including those in GRACE processing corrections, the formulation of the formal GRACE errors, destriping and signal recovery, and the numerical model estimation of the snow water, surface water and soil moisture storage states used to isolate a groundwater component. A major weakness of these approaches is inconsistency: different studies have used different sources of primary and ancillary data, and may achieve different results based on alternative choices in these approximations. In this study, we present two cases of groundwater change estimation, in California and the Colorado River basin, selected for their good data availability and varied climates. We achieve a robust numerical estimate of the post-processing uncertainties resulting from land-surface model structural shortcomings and model resolution errors. Groundwater variations should exhibit less variability than the overlying soil moisture state, as groundwater has a longer memory of past events due to buffering by infiltration and drainage rate limits. We apply a model ensemble approach in a Bayesian framework constrained by the assumption of decreasing signal variability with depth in the soil column. We also discuss time-variable versus time-constant errors, across-scale versus across-model errors, and error spectral content (across scales and across models). More robust uncertainty quantification for GRACE-based groundwater estimates would take all of these issues into account, allowing for fairer use in management applications and for better integration of GRACE-based measurements with observations from other sources.
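
    The disaggregation and ensemble-spread idea can be sketched with a simplified water balance: the groundwater anomaly is the GRACE total water storage anomaly minus the model-estimated soil moisture, snow and surface water anomalies, and an ensemble of land-surface model estimates carries a spread through to the groundwater estimate. All series below are synthetic placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    months = 24

    tws = np.cumsum(rng.normal(0.0, 2.0, months))             # GRACE TWS anomaly (cm)
    # Hypothetical 5-member land-surface model ensemble of SM + SWE + SW anomalies (cm)
    ensemble = tws * 0.6 + rng.normal(0.0, 1.0, size=(5, months))

    gw_members = tws - ensemble                                # one groundwater series per member
    gw_mean = gw_members.mean(axis=0)
    gw_std = gw_members.std(axis=0, ddof=1)                    # post-processing spread
    print("groundwater anomaly, last month: "
          f"{gw_mean[-1]:.1f} +/- {gw_std[-1]:.1f} cm")
    ```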

  3. Keypress-Based Musical Preference Is Both Individual and Lawful

    PubMed Central

    Livengood, Sherri L.; Sheppard, John P.; Kim, Byoung W.; Malthouse, Edward C.; Bourne, Janet E.; Barlow, Anne E.; Lee, Myung J.; Marin, Veronica; O'Connor, Kailyn P.; Csernansky, John G.; Block, Martin P.; Blood, Anne J.; Breiter, Hans C.

    2017-01-01

    Musical preference is highly individualized and is an area of active study to develop methods for its quantification. Recently, preference-based behavior, associated with activity in brain reward circuitry, has been shown to follow lawful, quantifiable patterns, despite broad variation across individuals. These patterns, observed using a keypress paradigm with visual stimuli, form the basis for relative preference theory (RPT). Here, we sought to determine if such patterns extend to non-visual domains (i.e., audition) and dynamic stimuli, potentially providing a method to supplement psychometric, physiological, and neuroimaging approaches to preference quantification. For this study, we adapted our keypress paradigm to two sets of stimuli consisting of seventeenth to twenty-first century western art music (Classical) and twentieth to twenty-first century jazz and popular music (Popular). We studied a pilot sample and then a separate primary experimental sample with this paradigm, and used iterative mathematical modeling to determine if RPT relationships were observed with high R2 fits. We further assessed the extent of heterogeneity in the rank ordering of keypress-based responses across subjects. As expected, individual rank orderings of preferences were quite heterogeneous, yet we observed mathematical patterns fitting these data similar to those observed previously with visual stimuli. These patterns in music preference were recurrent across two cohorts and two stimulus sets, and scaled between individual and group data, adhering to the requirements for lawfulness. Our findings suggest a general neuroscience framework that predicts human approach/avoidance behavior, while also allowing for individual differences and the broad diversity of human choices; the resulting framework may offer novel approaches to advancing music neuroscience, or its applications to medicine and recommendation systems. PMID:28512395

  4. BRICK v0.2, a simple, accessible, and transparent model framework for climate and regional sea-level projections

    NASA Astrophysics Data System (ADS)

    Wong, Tony E.; Bakker, Alexander M. R.; Ruckert, Kelsey; Applegate, Patrick; Slangen, Aimée B. A.; Keller, Klaus

    2017-07-01

    Simple models can play pivotal roles in the quantification and framing of uncertainties surrounding climate change and sea-level rise. They are computationally efficient, transparent, and easy to reproduce. These qualities also make simple models useful for the characterization of risk. Simple model codes are increasingly distributed as open source, as well as actively shared and guided. Alas, computer codes used in the geosciences can often be hard to access, run, modify (e.g., with regard to assumptions and model components), and review. Here, we describe the simple model framework BRICK (Building blocks for Relevant Ice and Climate Knowledge) v0.2 and its underlying design principles. The paper adds detail to an earlier published model setup and discusses the inclusion of a land water storage component. The framework largely builds on existing models and allows for projections of global mean temperature as well as regional sea levels and coastal flood risk. BRICK is written in R and Fortran. BRICK gives special attention to the model values of transparency, accessibility, and flexibility in order to mitigate the above-mentioned issues while maintaining a high degree of computational efficiency. We demonstrate the flexibility of this framework through simple model intercomparison experiments. Furthermore, we demonstrate that BRICK is suitable for risk assessment applications by using a didactic example in local flood risk management.

  5. An efficient framework for optimization and parameter sensitivity analysis in arterial growth and remodeling computations

    PubMed Central

    Sankaran, Sethuraman; Humphrey, Jay D.; Marsden, Alison L.

    2013-01-01

    Computational models for vascular growth and remodeling (G&R) are used to predict the long-term response of vessels to changes in pressure, flow, and other mechanical loading conditions. Accurate predictions of these responses are essential for understanding numerous disease processes. Such models require reliable inputs of numerous parameters, including material properties and growth rates, which are often experimentally derived, and inherently uncertain. While earlier methods have used a brute force approach, systematic uncertainty quantification in G&R models promises to provide much better information. In this work, we introduce an efficient framework for uncertainty quantification and optimal parameter selection, and illustrate it via several examples. First, an adaptive sparse grid stochastic collocation scheme is implemented in an established G&R solver to quantify parameter sensitivities, and near-linear scaling with the number of parameters is demonstrated. This non-intrusive and parallelizable algorithm is compared with standard sampling algorithms such as Monte-Carlo. Second, we determine optimal arterial wall material properties by applying robust optimization. We couple the G&R simulator with an adaptive sparse grid collocation approach and a derivative-free optimization algorithm. We show that an artery can achieve optimal homeostatic conditions over a range of alterations in pressure and flow; robustness of the solution is enforced by including uncertainty in loading conditions in the objective function. We then show that homeostatic intramural and wall shear stress is maintained for a wide range of material properties, though the time it takes to achieve this state varies. We also show that the intramural stress is robust and lies within 5% of its mean value for realistic variability of the material parameters. We observe that prestretch of elastin and collagen are most critical to maintaining homeostasis, while values of the material properties are most critical in determining response time. Finally, we outline several challenges to the G&R community for future work. We suggest that these tools provide the first systematic and efficient framework to quantify uncertainties and optimally identify G&R model parameters. PMID:23626380
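
    The collocation idea can be illustrated on a full tensor grid (the paper itself uses an adaptive sparse grid, which scales far better with the number of parameters): Gauss-Hermite nodes and weights for two assumed standard-normal inputs give the mean and variance of a toy model output, checked against Monte Carlo. The toy model is a hypothetical stand-in for the G&R simulator.

    ```python
    import numpy as np

    def toy_model(g1, g2):
        """Hypothetical stand-in for the simulator output (e.g., a wall stress measure)."""
        return np.exp(0.3 * g1) * (1.0 + 0.1 * g2 ** 2)

    n = 7                                                    # quadrature nodes per dimension
    x, w = np.polynomial.hermite_e.hermegauss(n)             # probabilists' Hermite rule
    w = w / np.sqrt(2.0 * np.pi)                             # -> standard-normal weights

    X1, X2 = np.meshgrid(x, x, indexing="ij")
    W = np.outer(w, w)                                       # tensor-product weights
    Y = toy_model(X1, X2)

    mean = np.sum(W * Y)
    var = np.sum(W * (Y - mean) ** 2)
    print(f"collocation estimate: mean = {mean:.4f}, std = {np.sqrt(var):.4f}")

    # Monte Carlo check with many more model evaluations
    rng = np.random.default_rng(0)
    y_mc = toy_model(rng.standard_normal(100_000), rng.standard_normal(100_000))
    print(f"Monte Carlo estimate: mean = {y_mc.mean():.4f}, std = {y_mc.std():.4f}")
    ```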

  6. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    NASA Astrophysics Data System (ADS)

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for the model to be useful in operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method to screen out insensitive parameters, followed by MARS-based Sobol' sensitivity indices to quantify each parameter's contribution to the response variance through its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by an adaptive surrogate-based multi-objective optimization procedure, using a MARS model to approximate the parameter-response relationship and the SCE-UA algorithm to search for the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and enhances the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration provided satisfactory solutions for reproducing the observed streamflow in all watersheds. The final optimal solutions showed significant improvement over the default solutions, with about a 65-90% reduction in 1-NSE and a 60-95% reduction in |RB|. The validation exercise indicated a large improvement in model performance, with about a 40-85% reduction in 1-NSE and a 35-90% reduction in |RB|. Overall, this uncertainty quantification framework is robust, effective and efficient for parametric uncertainty analysis, and its results provide useful information that helps to understand model behaviors and improve model simulations.
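
    Variance-based (Sobol') indices of the kind used in the sensitivity step can be estimated with the standard Saltelli/Jansen pick-freeze estimators; the sketch below applies them to a three-parameter toy function standing in for the CREST model (the study itself evaluates a MARS surrogate rather than the model directly).

    ```python
    import numpy as np

    def model(p):
        """Hypothetical three-parameter toy model."""
        return np.sin(p[:, 0]) + 0.7 * p[:, 1] ** 2 + 0.1 * p[:, 2]

    rng = np.random.default_rng(0)
    n, d = 50_000, 3
    A = rng.uniform(-np.pi, np.pi, size=(n, d))
    B = rng.uniform(-np.pi, np.pi, size=(n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                           # A with column i taken from B
        fABi = model(ABi)
        S1 = np.mean(fB * (fABi - fA)) / var          # first-order index (Saltelli 2010)
        ST = 0.5 * np.mean((fA - fABi) ** 2) / var    # total-order index (Jansen)
        print(f"parameter {i}: S1 = {S1:.2f}, ST = {ST:.2f}")
    ```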

  7. Fish pass assessment by remote control: a novel framework for quantifying the hydraulics at fish pass entrances

    NASA Astrophysics Data System (ADS)

    Kriechbaumer, Thomas; Blackburn, Kim; Gill, Andrew; Breckon, Toby; Everard, Nick; Wright, Ros; Rivas Casado, Monica

    2014-05-01

    Fragmentation of aquatic habitats can lead to the extinction of migratory fish species, with severe negative consequences at the ecosystem level, and thus opposes the target of good ecological status of rivers defined in the EU Water Framework Directive (WFD). In the UK, the implementation of the EU WFD requires investments in fish pass facilities of an estimated 532 million GBP (i.e. 639 million Euros) until 2027 to ensure fish passage at around 3,000 barriers considered critical. Hundreds of passes have been installed in the past. However, monitoring studies of fish passes around the world indicate that on average less than half of the fish attempting to pass such facilities are actually successful. There is a need for frameworks that allow the rapid identification of facilities that are biologically effective and of those that require enhancement. Although many environmental characteristics can affect fish passage success, past research suggests that variations in hydrodynamic conditions, reflected in water velocities, velocity gradients and turbulence, are the major cues that fish use to seek migration pathways in rivers. This paper presents the first steps taken in the development of a framework for the rapid field-based quantification of the hydraulic conditions downstream of fish passes and the assessment of the attractiveness of fish passes for salmonids and coarse fish in UK rivers. For this purpose, a small remote-controlled platform carrying an acoustic Doppler current profiler (ADCP), a GPS unit, a stereo camera and an inertial measurement unit has been developed. The large amount of data on water velocities and depths measured by the ADCP within a relatively short time is used to quantify the spatial and temporal distribution of water velocities. By matching these hydraulic features with known preferences of migratory fish, we attempt to identify likely migration routes and aggregation areas at barriers, as well as hydraulic features that may distract fish away from fish pass entrances. The initial steps of the framework development have focused on the challenge of precise spatial data referencing in areas with limited sky view to navigation satellites. Platform tracking with a motorised total station, various satellite-based positioning solutions and simultaneous localisation and mapping (SLAM) based on stereo images have been tested. The effect of errors in spatial data referencing on ADCP-derived maps of flow features and bathymetry will be quantified through simultaneous deployment of these navigation technologies and the ADCP. This will inform the selection of a cost-effective platform positioning system in practice. Further steps will cover the quantification of uncertainties in ADCP data caused by highly turbulent flows and the identification of suitable ADCP data sampling strategies at fish passes. The final framework for fish pass assessment can contribute to an improved understanding of the interaction of fish with the complex hydraulic river environment.

  8. A Robust Error Model for iTRAQ Quantification Reveals Divergent Signaling between Oncogenic FLT3 Mutants in Acute Myeloid Leukemia*

    PubMed Central

    Zhang, Yi; Askenazi, Manor; Jiang, Jingrui; Luckey, C. John; Griffin, James D.; Marto, Jarrod A.

    2010-01-01

    The FLT3 receptor tyrosine kinase plays an important role in normal hematopoietic development and leukemogenesis. Point mutations within the activation loop and in-frame tandem duplications of the juxtamembrane domain represent the most frequent molecular abnormalities observed in acute myeloid leukemia. Interestingly these gain-of-function mutations correlate with different clinical outcomes, suggesting that signals from constitutive FLT3 mutants activate different downstream targets. In principle, mass spectrometry offers a powerful means to quantify protein phosphorylation and identify signaling events associated with constitutively active kinases or other oncogenic events. However, regulation of individual phosphorylation sites presents a challenging case for proteomics studies whereby quantification is based on individual peptides rather than an average across different peptides derived from the same protein. Here we describe a robust experimental framework and associated error model for iTRAQ-based quantification on an Orbitrap mass spectrometer that relates variance of peptide ratios to mass spectral peak height and provides for assignment of p value, q value, and confidence interval to every peptide identification, all based on routine measurements, obviating the need for detailed characterization of individual ion peaks. Moreover, we demonstrate that our model is stable over time and can be applied in a manner directly analogous to ubiquitously used external mass calibration routines. Application of our error model to quantitative proteomics data for FLT3 signaling provides evidence that phosphorylation of tyrosine phosphatase SHP1 abrogates the transformative potential, but not overall kinase activity, of FLT3-D835Y in acute myeloid leukemia. PMID:20019052

  9. Gas seepage in the Northern Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Matilde Ferrante, Giulia; Donda, Federica; Volpi, Valentina; Tinivella, Umberta

    2017-04-01

    In the Northern Adriatic Sea, the occurrence of gas seepage has been widely documented; however, the origin of the seeping gas has not been clearly constrained. Geophysical data with different scales of resolution, i.e. multichannel seismic profiles, CHIRP and morpho-bathymetry data collected in 2009 and 2014 by OGS, reveal that several of the gas-enriched fluid vents are deeply rooted. In fact, the entire Plio-Quaternary succession is characterized by widespread seismic anomalies represented by wipe-out zones and interpreted as gas chimneys. They commonly root at the base of the Pliocene sequence but also within the Paleogene succession, where they appear to be associated with deep-seated, Mesozoic-to-Paleogene faults. These chimneys originate and terminate at different stratigraphic levels; they also commonly reach the seafloor, where rock outcrops interpreted as authigenic carbonate deposits have been recognized. In places, gas is able to escape into the water column, as shown by numerous gas flares. Ongoing studies are addressed to: 1. re-examining the structural setting of the study area, in order to verify a possible structural control on chimney distribution and gas migration; 2. performing geochemical analyses on gas sampled at some key emission points; 3. a quantitative analysis of selected borehole well logs (made available through the public VidePi database, www.videpi.com) aimed at estimating the amount of gas present in the sediments. This work presents preliminary results on the latter aspect of our research. As a first step, the geophysical logs of each selected borehole were digitized. This procedure consists of manually picking the curves in a fixed reference system; static corrections for vertical offset are made at this stage. Logs are then sorted by type and converted to common scales, amplifications and units. Every log is resampled in order to remove high frequencies that are not useful in the comparison with seismic data. Estimation of gas content requires a petrophysical characterization of the sediments, but unfortunately the available wells are not sufficient for this purpose. For this reason, we are presently trying to establish empirical relationships between the available logs. All information available from wells and results from the literature are used to fit cross-plots, and the related chi-square tests are performed. Some correlations among our petrophysical logs and common trends in the investigated area have already been found, but the work is still in progress. This analysis will hopefully provide a petrophysical characterization of the study area and will be used to estimate density, velocity and porosity profiles. The next step will consist of an ad hoc processing of the seismic data, applying true amplitude recovery and keeping the amplitude information unaffected, which is the first requirement of our analysis. References: Donda et al. (2015), Deep-sourced gas seepage and methane-derived carbonates in the Northern Adriatic Sea; Hamilton et al. (1982), Sound velocity and related properties of marine sediments; Glover (2016), Archie's law - a reappraisal.
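
    The petrophysical step leans on Archie-type relations (cf. the Glover, 2016 reference above). A minimal sketch of the classical form is given below; the constants a, m, n and the log samples are typical textbook values assumed for illustration only, and the gas saturation would follow as 1 - Sw.

    ```python
    import numpy as np

    def archie_sw(Rt, Rw, phi, a=1.0, m=2.0, n=2.0):
        """Water saturation from Archie's relation Sw^n = a*Rw / (phi^m * Rt),
        with Rt the formation resistivity, Rw the brine resistivity and phi the
        porosity (fraction); a, m, n are assumed illustrative constants."""
        return (a * Rw / (phi ** m * Rt)) ** (1.0 / n)

    Rt = np.array([2.0, 5.0, 20.0])      # ohm.m, hypothetical log samples
    phi = np.array([0.30, 0.25, 0.22])   # porosity fraction, hypothetical
    sw = archie_sw(Rt, Rw=0.08, phi=phi)
    print("water saturation:", np.round(sw, 2), "-> gas saturation:", np.round(1 - sw, 2))
    ```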

  10. Characterization of heterogeneities from core X-ray scans and borehole wall images in a reefal carbonate reservoir: influence on the porosity structure.

    NASA Astrophysics Data System (ADS)

    Hebert, V.; Garing, C.; Pezard, P. A.; Gouze, P.; Maria-Sube, Y.; Camoin, G.; Lapointe, P.

    2009-04-01

    The petrophysical properties of rocks can be largely influenced by heterogeneities. This is particularly true in reefal carbonates, with heterogeneities due to the primary structure of the reef, the degradation of that structure into a fossil form, and fluid circulation with associated dissolution and recrystallization. We report here a study conducted on Miocene reefal carbonates drilled in the context of salt-water intrusion in coastal reservoirs. Salt-water intrusion along coastlines is highly influenced by geological and petrophysical structures; in particular, heterogeneities and anisotropy in porous media (karsts, vugs…) control fluid flow and dispersion. A new experimental site was developed in the south-east of Mallorca Island (Spain) in the context of the ALIANCE EC project (2002-2005). This project aimed at developing a strategy for the quantitative analysis and description of fluid flow and salt transport in coastal carbonate aquifers. The Ses Sitjoles site, 6 km inland near the city of Campos, drilled the Miocene carbonate reef platform; sea water is found there at 60 to 80 m depth. The geological structure presents multi-scale heterogeneities, often bound to either lateral variations of geological facies or dissolution patterns. The Campos site provides a unique laboratory to study the heterogeneities of carbonate rocks affected by a salt-water intrusion and to develop new borehole investigation methods in this context. The present study focuses on borehole geophysical measurements and images, and on core scans. New image analysis methods have been developed to better characterize the presence of heterogeneities in terms of grain-size distribution, formation factor changes and porosity. Core scans from X-ray tomography allow the extraction of petrophysical parameters from 3D images; for this, the AVIZO software was used here to represent the micro-porosity and vuggy porosity structure. Beyond core analyses, the optical and acoustic borehole wall images provide a direct look at meso-scale porosity beyond cm-scale heterogeneities, such as karstic channels and megapores. The reefal complex is dominated by moldic secondary porosity, particularly in the upper part of the boreholes; these heterogeneities are characterized by large and elongated molds mainly detected in the acoustic images. In the slope facies, corresponding to the lower part of the structure, moldic porosity is characterized by small, rounded molds. Vuggy porosity varies mainly from 5 to 40% along the 100 m deep boreholes, but in the slope part it drops to about 0%, although it can reach 90% in karstic zones. Overall, the distribution of pore types is strongly controlled by that of lithofacies and water paleo-levels, leading to extensive cementation processes and the resulting occlusion of pore space. The combined analysis of porosity with methods of complementary spatial resolution leads to the quantitative determination and description of microstructural heterogeneities in carbonate porous media, which is key to modelling the reservoir mesoscale structure and the fluid flow within it.

  11. Targeted Feature Detection for Data-Dependent Shotgun Proteomics

    PubMed Central

    2017-01-01

    Label-free quantification of shotgun LC–MS/MS data is the prevailing approach in quantitative proteomics but remains computationally nontrivial. The central data analysis step is the detection of peptide-specific signal patterns, called features. Peptide quantification is facilitated by associating signal intensities in features with peptide sequences derived from MS2 spectra; however, missing values due to imperfect feature detection are a common problem. A feature detection approach that directly targets identified peptides (minimizing missing values) but also offers robustness against false-positive features (by assigning meaningful confidence scores) would thus be highly desirable. We developed a new feature detection algorithm within the OpenMS software framework, leveraging ideas and algorithms from the OpenSWATH toolset for DIA/SRM data analysis. Our software, FeatureFinderIdentification (“FFId”), implements a targeted approach to feature detection based on information from identified peptides. This information is encoded in an MS1 assay library, based on which ion chromatogram extraction and detection of feature candidates are carried out. Significantly, when analyzing data from experiments comprising multiple samples, our approach distinguishes between “internal” and “external” (inferred) peptide identifications (IDs) for each sample. On the basis of internal IDs, two sets of positive (true) and negative (decoy) feature candidates are defined. A support vector machine (SVM) classifier is then trained to discriminate between the sets and is subsequently applied to the “uncertain” feature candidates from external IDs, facilitating selection and confidence scoring of the best feature candidate for each peptide. This approach also enables our algorithm to estimate the false discovery rate (FDR) of the feature selection step. We validated FFId based on a public benchmark data set, comprising a yeast cell lysate spiked with protein standards that provide a known ground-truth. The algorithm reached almost complete (>99%) quantification coverage for the full set of peptides identified at 1% FDR (PSM level). Compared with other software solutions for label-free quantification, this is an outstanding result, which was achieved at competitive quantification accuracy and reproducibility across replicates. The FDR for the feature selection was estimated at a low 1.5% on average per sample (3% for features inferred from external peptide IDs). The FFId software is open-source and freely available as part of OpenMS (www.openms.org). PMID:28673088

  12. Targeted Feature Detection for Data-Dependent Shotgun Proteomics.

    PubMed

    Weisser, Hendrik; Choudhary, Jyoti S

    2017-08-04

    Label-free quantification of shotgun LC-MS/MS data is the prevailing approach in quantitative proteomics but remains computationally nontrivial. The central data analysis step is the detection of peptide-specific signal patterns, called features. Peptide quantification is facilitated by associating signal intensities in features with peptide sequences derived from MS2 spectra; however, missing values due to imperfect feature detection are a common problem. A feature detection approach that directly targets identified peptides (minimizing missing values) but also offers robustness against false-positive features (by assigning meaningful confidence scores) would thus be highly desirable. We developed a new feature detection algorithm within the OpenMS software framework, leveraging ideas and algorithms from the OpenSWATH toolset for DIA/SRM data analysis. Our software, FeatureFinderIdentification ("FFId"), implements a targeted approach to feature detection based on information from identified peptides. This information is encoded in an MS1 assay library, based on which ion chromatogram extraction and detection of feature candidates are carried out. Significantly, when analyzing data from experiments comprising multiple samples, our approach distinguishes between "internal" and "external" (inferred) peptide identifications (IDs) for each sample. On the basis of internal IDs, two sets of positive (true) and negative (decoy) feature candidates are defined. A support vector machine (SVM) classifier is then trained to discriminate between the sets and is subsequently applied to the "uncertain" feature candidates from external IDs, facilitating selection and confidence scoring of the best feature candidate for each peptide. This approach also enables our algorithm to estimate the false discovery rate (FDR) of the feature selection step. We validated FFId based on a public benchmark data set, comprising a yeast cell lysate spiked with protein standards that provide a known ground-truth. The algorithm reached almost complete (>99%) quantification coverage for the full set of peptides identified at 1% FDR (PSM level). Compared with other software solutions for label-free quantification, this is an outstanding result, which was achieved at competitive quantification accuracy and reproducibility across replicates. The FDR for the feature selection was estimated at a low 1.5% on average per sample (3% for features inferred from external peptide IDs). The FFId software is open-source and freely available as part of OpenMS ( www.openms.org ).
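
    The classification step can be sketched as follows: train a support vector machine on feature candidates from internal IDs (true positives versus decoy negatives) and use its class probabilities to score the uncertain candidates from external IDs. The descriptors and values below are hypothetical placeholders, not the actual FFId descriptor set or the OpenMS API.

    ```python
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    # Hypothetical 3-dimensional descriptors (e.g., shape/intensity scores) for
    # positive candidates (internal IDs) and negative (decoy) candidates.
    pos = rng.normal([1.0, 0.8, 0.9], 0.3, size=(200, 3))
    neg = rng.normal([0.2, 0.3, 0.1], 0.3, size=(200, 3))
    X = np.vstack([pos, neg])
    y = np.r_[np.ones(200), np.zeros(200)]

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X, y)

    # Score "uncertain" candidates from external IDs with the class-1 probability
    uncertain = rng.normal([0.6, 0.5, 0.6], 0.4, size=(5, 3))
    print(np.round(clf.predict_proba(uncertain)[:, 1], 2))
    ```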

  13. Atlas-based liver segmentation and hepatic fat-fraction assessment for clinical trials.

    PubMed

    Yan, Zhennan; Zhang, Shaoting; Tan, Chaowei; Qin, Hongxing; Belaroussi, Boubakeur; Yu, Hui Jing; Miller, Colin; Metaxas, Dimitris N

    2015-04-01

    Automated assessment of hepatic fat fraction is clinically important. A robust and precise segmentation would enable accurate, objective and consistent measurement of hepatic fat fraction for disease quantification, therapy monitoring and drug development. However, segmenting the liver in clinical trials is a challenging task due to the variability of liver anatomy as well as the diverse sources from which the images were acquired. In this paper, we propose an automated and robust framework for liver segmentation and assessment. It uses single statistical atlas registration to initialize a robust deformable model and obtain a fine segmentation. The fat-fraction map is computed with a chemical-shift-based method in the delineated liver region. The proposed method is validated on 14 abdominal magnetic resonance (MR) volumetric scans. Qualitative and quantitative comparisons show that our proposed method achieves better segmentation accuracy with less variance than two other atlas-based methods. Experimental results demonstrate the promise of our assessment framework. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Geometrically Necessary Dislocation Density Evolution in Interstitial Free Steel at Small Plastic Strains

    NASA Astrophysics Data System (ADS)

    Kundu, Amrita; Field, David P.

    2018-06-01

    Measurement of geometrically necessary dislocation (GND) density using electron backscatter diffraction (EBSD) has become rather commonplace in modern metallurgical research, yet the utility of this measure as an indicator of the expected flow behavior of the material is not obvious. Incorporation of the total dislocation density into the Taylor equation relating flow stress to dislocation density is generally accepted, but this does not automatically extend to a similar relationship for the GND density. This is discussed in the present work using classical equations for isotropic metal plasticity in a rather straightforward theoretical framework. The investigation examines the development of GND structure in a commercially produced interstitial-free steel subjected to tensile deformation. Quantification of GND density was carried out using conventional EBSD at various strain levels on the surface of a standard dog-bone-shaped tensile specimen. The average GND density increases linearly with the imposed macroscopic strain, in agreement with the established framework.
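
    For reference, the Taylor relation mentioned above is commonly written as follows (standard form with commonly used symbols; the constant alpha and the Taylor factor M are material- and texture-dependent, and the symbol choices here are not necessarily those of the paper):

    ```latex
    % Taylor relation between flow stress and dislocation density:
    %   \sigma    flow stress          \sigma_0   friction (lattice) stress
    %   \alpha    constant (~0.2-0.5)  M          Taylor factor
    %   G         shear modulus        b          Burgers vector magnitude
    %   \rho      (total) dislocation density
    \[
      \sigma = \sigma_0 + \alpha \, M \, G \, b \, \sqrt{\rho}
    \]
    ```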

  15. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    NASA Astrophysics Data System (ADS)

    Marsden, Alison

    2009-11-01

    Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid-structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and we combine these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
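
    A minimal, non-adaptive sketch of the stochastic collocation idea mentioned above (the hemodynamic "simulation" and all parameter values are stand-ins; the actual framework is adaptive and multi-dimensional):

```python
# Propagate a Gaussian uncertainty in a single input through a surrogate "simulation"
# via Gauss-Hermite quadrature; only illustrates the collocation idea.
import numpy as np

def simulation(resistance):
    # Hypothetical stand-in for an expensive hemodynamics solver: returns a scalar output
    # (e.g., a mean pressure) as a nonlinear function of an uncertain input parameter.
    return 120.0 / (1.0 + 0.05 * resistance) + 0.3 * resistance

mu, sigma = 10.0, 1.5                       # uncertain input: mean and std of a resistance
nodes, weights = np.polynomial.hermite.hermgauss(7)
inputs = mu + np.sqrt(2.0) * sigma * nodes  # collocation points mapped to physical space

outputs = np.array([simulation(x) for x in inputs])
mean = np.sum(weights * outputs) / np.sqrt(np.pi)
var = np.sum(weights * outputs**2) / np.sqrt(np.pi) - mean**2
print(f"output mean ~ {mean:.2f}, std ~ {np.sqrt(var):.2f}")
```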

  16. Robustness Metrics: How Are They Calculated, When Should They Be Used and Why Do They Give Different Results?

    NASA Astrophysics Data System (ADS)

    McPhail, C.; Maier, H. R.; Kwakkel, J. H.; Giuliani, M.; Castelletti, A.; Westra, S.

    2018-02-01

    Robustness is being used increasingly for decision analysis in relation to deep uncertainty and many metrics have been proposed for its quantification. Recent studies have shown that the application of different robustness metrics can result in different rankings of decision alternatives, but there has been little discussion of what potential causes for this might be. To shed some light on this issue, we present a unifying framework for the calculation of robustness metrics, which assists with understanding how robustness metrics work, when they should be used, and why they sometimes disagree. The framework categorizes the suitability of metrics to a decision-maker based on (1) the decision-context (i.e., the suitability of using absolute performance or regret), (2) the decision-maker's preferred level of risk aversion, and (3) the decision-maker's preference toward maximizing performance, minimizing variance, or some higher-order moment. This article also introduces a conceptual framework describing when relative robustness values of decision alternatives obtained using different metrics are likely to agree and disagree. This is used as a measure of how "stable" the ranking of decision alternatives is when determined using different robustness metrics. The framework is tested on three case studies, including water supply augmentation in Adelaide, Australia, the operation of a multipurpose regulated lake in Italy, and flood protection for a hypothetical river based on a reach of the river Rhine in the Netherlands. The proposed conceptual framework is confirmed by the case study results, providing insight into the reasons for disagreements between rankings obtained using different robustness metrics.
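
    A toy numerical illustration of why the metric choice matters (invented performance values, not from the case studies): maximin performance and minimax regret can rank the same alternatives differently.

```python
# Two common robustness metrics applied to the same performance matrix.
import numpy as np

# Rows = decision alternatives, columns = plausible future scenarios (higher is better).
performance = np.array([
    [0.9, 0.5, 0.4],   # alternative A
    [0.7, 0.6, 0.55],  # alternative B
])

maximin = performance.min(axis=1)                # worst-case performance (higher is better)
regret = performance.max(axis=0) - performance   # regret relative to the best option per scenario
minimax_regret = regret.max(axis=1)              # worst-case regret (lower is better)

print("maximin ranking       :", np.argsort(-maximin))        # best first: B, then A
print("minimax regret ranking:", np.argsort(minimax_regret))  # best first: A, then B
```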

  17. `Dhara': An Open Framework for Critical Zone Modeling

    NASA Astrophysics Data System (ADS)

    Le, P. V.; Kumar, P.

    2016-12-01

    Processes in the Critical Zone, which sustain terrestrial life, are tightly coupled across hydrological, physical, biological, chemical, pedological, geomorphological and ecological domains over both short and long timescales. Observations and quantification of the Earth's surface across these domains using emerging high resolution measurement technologies such as light detection and ranging (lidar) and hyperspectral remote sensing are enabling us to characterize fine scale landscape attributes over large spatial areas. This presents a unique opportunity to develop novel approaches to model the Critical Zone that can capture fine scale intricate dependencies across the different processes in 3D. The development of interdisciplinary tools that transcend individual disciplines and capture new levels of complexity and emergent properties is at the core of Critical Zone science. Here we introduce `Dhara', an open, high-performance computing framework for modeling complex processes in the Critical Zone. The framework is modular in structure, with the aim of creating uniform and efficient tools that facilitate and leverage process modeling, and it provides the flexibility for the scientific community to maintain, collaborate on, and co-develop additional components. We show the core of the framework, which simulates ecohydrologic dynamics and surface/sub-surface coupling in 3D using hybrid CPU-GPU parallelism. We demonstrate that the open framework in Dhara is feasible for detailed, multi-process, large-scale modeling of the Critical Zone, which opens up exciting possibilities. We will also present outcomes from a Modeling Summer Institute led by the Intensively Managed Landscapes Critical Zone Observatory (IMLCZO) with representation from several CZOs and international participants.

  18. Through-Space Intervalence Charge Transfer as a Mechanism for Charge Delocalisation in Metal-Organic Frameworks.

    PubMed

    Hua, Carol; Doheny, Patrick William; Ding, Bowen; Chan, Bun; Yu, Michelle; Kepert, Cameron J; D'Alessandro, Deanna M

    2018-05-04

    Understanding the nature of charge transfer mechanisms in 3-dimensional Metal-Organic Frameworks (MOFs) is an important goal owing to the possibility of harnessing this knowledge to design conductive frameworks. These materials have been implicated as the basis for the next generation of technological devices for applications in energy storage and conversion, including electrochromic devices, electrocatalysts, and battery materials. After nearly two decades of intense research into MOFs, the mechanisms of charge transfer remain relatively poorly understood, and new strategies to achieve charge mobility remain elusive and challenging to experimentally explore, validate and model. We now demonstrate that aromatic stacking interactions in Zn(II) frameworks containing cofacial thiazolo[5,4-d]thiazole units lead to a mixed-valence state upon electrochemical or chemical reduction. This through-space Intervalence Charge Transfer (IVCT) phenomenon represents a new mechanism for charge delocalisation in MOFs. Computational modelling of the optical data combined with application of Marcus-Hush theory to the IVCT bands for the mixed-valence framework has enabled quantification of the degree of delocalisation using both in situ and ex situ electro- and spectro-electrochemical methods. A distance dependence for the through-space electron transfer has also been identified on the basis of experimental studies and computational calculations. This work provides a new window into electron transfer phenomena in 3-dimensional coordination space, of relevance to electroactive MOFs where new mechanisms for charge transfer are highly sought after, and to understanding biological light harvesting systems where through-space mixed-valence interactions are operative.
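
    For orientation, a classical Marcus-Hush band-shape analysis of an IVCT transition can be sketched as below; this is the generic textbook treatment with invented band parameters, not the computational modelling performed in the study.

```python
# Classical Hush analysis of an IVCT band (generic formulas; inputs are illustrative).
import math

def hush_coupling(eps_max, nu_max, dnu_half, r_angstrom):
    """Electronic coupling H_ab (cm^-1) from the IVCT band maximum (cm^-1), molar absorptivity
    (M^-1 cm^-1), bandwidth at half height (cm^-1) and donor-acceptor distance (Angstrom)."""
    return 2.06e-2 * math.sqrt(eps_max * nu_max * dnu_half) / r_angstrom

nu_max = 8000.0      # cm^-1, IVCT band maximum (illustrative)
eps_max = 1500.0     # M^-1 cm^-1
dnu_obs = 3500.0     # cm^-1, observed bandwidth at half height
r = 7.0              # Angstrom, assumed through-space donor-acceptor separation

dnu_theory = math.sqrt(2310.0 * nu_max)   # high-temperature-limit bandwidth expected for Class II
H_ab = hush_coupling(eps_max, nu_max, dnu_obs, r)
print(f"H_ab ~ {H_ab:.0f} cm^-1; observed/theoretical bandwidth = {dnu_obs/dnu_theory:.2f}")
```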

  19. Ductile strain rate recorded in the Symvolon syn-extensional plutonic body (Rhodope core complex, Greece)

    NASA Astrophysics Data System (ADS)

    Cirrincione, Rosolino; Fazio, Eugenio; Ortolano, Gaetano; Fiannacca, Patrizia; Kern, Hartmut; Mengel, Kurt; Pezzino, Antonino; Punturo, Rosalda

    2016-04-01

    The present contribution deals with a quantitative microstructural analysis performed on granodiorites of the syn-tectonic Symvolon pluton (Punturo et al., 2014) at the south-western boundary of the Rhodope Core Complex (Greece). Our purpose is to quantify the ductile strain rate achieved across the pluton, taking into account its cooling gradient from the centre to the periphery, by combining a paleopiezometer (Shimizu, 2008) with a quartz flow law (Hirth et al., 2001). The results, combined with a detailed cooling history (Dinter et al., 1995), allowed us to reconstruct the joint cooling and strain-gradient evolution of the pluton from its emplacement during the early Miocene (ca. 700°C at 22 Ma) to its subsequent cooling stage (ca. 500-300°C at 15 Ma). Shearing temperature values were constrained by means of a thermodynamic approach based on the recognition of syn-shear assemblages at incremental strain; to this aim, statistical handling of mineral chemistry X-ray maps was carried out on microdomains detected at the tails of porphyroclasts. Results indicate that the strain and cooling gradients evolve in step across the pluton, as also evidenced by the progressive development of a mylonitic fabric over the magmatic microstructures approaching the host rock. References • Dinter, D. A., Macfarlane, A., Hames, W., Isachsen, C., Bowring, S., and Royden, L. (1995). U-Pb and 40Ar/39Ar geochronology of the Symvolon granodiorite: Implications for the thermal and structural evolution of the Rhodope metamorphic core complex, northeastern Greece. Tectonics, 14 (4), 886-908. • Shimizu, I. (2008). Theories and applicability of grain size piezometers: The role of dynamic recrystallization mechanisms. Journal of Structural Geology, 30 (7), 899-917. • Hirth, G., Teyssier, C., and Dunlap, J. W. (2001). An evaluation of quartzite flow laws based on comparisons between experimentally and naturally deformed rocks. International Journal of Earth Sciences, 90 (1), 77-87. • Punturo, R., Cirrincione, R., Fazio, E., Fiannacca, P., Kern, H., Mengel, K., Ortolano G., and Pezzino, A. (2014). Microstructural, compositional and petrophysical properties of mylonitic granodiorites from an extensional shear zone (Rhodope Core complex, Greece). Geological Magazine, 151 (6), 1051-1071.
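
    The piezometer-plus-flow-law chain described above can be sketched generically as follows; the functional forms are standard, but the coefficients below are placeholders rather than the calibrated values of Shimizu (2008) or Hirth et al. (2001).

```python
# (1) a grain-size piezometer gives differential stress, (2) a power-law flow law gives
# strain rate at the inferred shearing temperature. All coefficients are illustrative.
import math

R = 8.314  # gas constant, J/(mol K)

def stress_from_grain_size(d_um, B=3631.0, p=1.26):
    """Piezometer of the generic form sigma = B * d**(-p) (sigma in MPa, d in micrometers)."""
    return B * d_um ** (-p)

def strain_rate(sigma_mpa, T_K, A=10 ** -11.2, n=4.0, Q=135e3):
    """Power-law creep: eps_dot = A * sigma**n * exp(-Q / (R*T)) (A in MPa^-n s^-1)."""
    return A * sigma_mpa ** n * math.exp(-Q / (R * T_K))

d = 50.0            # recrystallized quartz grain size in micrometers (illustrative)
T = 500.0 + 273.15  # shearing temperature in K, within the cooling range quoted above

sigma = stress_from_grain_size(d)
print(f"differential stress ~ {sigma:.1f} MPa, strain rate ~ {strain_rate(sigma, T):.2e} 1/s")
```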

  20. Extended generalized recurrence plot quantification of complex circular patterns

    NASA Astrophysics Data System (ADS)

    Riedl, Maik; Marwan, Norbert; Kurths, Jürgen

    2017-03-01

    The generalized recurrence plot is a modern tool for the quantification of complex spatial patterns. Its applications span the analysis of trabecular bone structures, Turing patterns, turbulent spatial plankton patterns, and fractals. Determinism is a central measure in this framework, quantifying the level of regularity of spatial structures. We show by basic examples of fully regular patterns of different symmetries that this measure underestimates the orderliness of circular patterns resulting from rotational symmetries. We overcome this crucial problem by checking additional structural elements of the generalized recurrence plot, which is demonstrated with the examples. Furthermore, we show the potential of the extended determinism measure by applying it to more irregular circular patterns generated by the complex Ginzburg-Landau equation, which can often be observed in real spatially extended dynamical systems. In this way we are able to reconstruct the main separations of the system's parameter space by analyzing single snapshots of the real part only, in contrast to the use of the original quantity. The proposed method thus also promises an improved description of other systems with complicated spatio-temporal dynamics, such as those typically occurring in fluid dynamics, climatology, biology, ecology, social sciences, etc.
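
    A simplified sketch of recurrence-based determinism for a 1-D series is given below; the generalized recurrence plot in the paper extends this idea to spatial (2-D) patterns, and the thresholds here are illustrative.

```python
# Recurrence matrix and determinism (DET): fraction of recurrence points on diagonal lines.
import numpy as np

def recurrence_matrix(x, eps):
    """R[i, j] = 1 where |x_i - x_j| < eps."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def determinism(R, lmin=2):
    """Fraction of recurrence points on diagonal lines of length >= lmin
    (the line of identity is included here for simplicity)."""
    n = R.shape[0]
    on_lines = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in np.append(np.diagonal(R, offset=k), 0):  # trailing 0 closes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / max(R.sum(), 1)

t = np.linspace(0, 8 * np.pi, 400)
periodic = np.sin(t)
noise = np.random.default_rng(1).normal(size=400)
eps = 0.1
print("DET periodic:", round(determinism(recurrence_matrix(periodic, eps)), 2))
print("DET noise   :", round(determinism(recurrence_matrix(noise, eps)), 2))
```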

  1. MFIX-DEM Phi: Performance and Capability Improvements Towards Industrial Grade Open-source DEM Framework with Integrated Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    GEL, Aytekin; Jiao, Yang; Emady, Heather

    Two major challenges hinder the effective use and adoption of multiphase computational fluid dynamics tools by industry. The first is the need for significant computational resources, driven by the computational intensity of the algorithms required for accurate solutions. The second barrier is assessing the prediction credibility and confidence in the simulation results. In this project, a multi-tiered approach was proposed under four broad activities to overcome these challenges while addressing all of the objectives outlined in FOA-0001238 through Phases 1 and 2 of the project. The present report consists of the results for Phase 1 only, which was the funded performance period. From the start of the project, all of the objectives outlined in the FOA were addressed through four major activity tasks in an integrated and balanced fashion to improve adoption of the MFIX suite of solvers for industrial use. The first task aimed to improve the performance of MFIX-DEM, specifically targeting peak performance on Intel Xeon and Xeon Phi based systems, which are expected to be among the primary high-performance computing platforms affordable and available to industrial users in the next two to five years. However, due to a number of changes in the course of the project, the scope of the performance-improvement task was significantly reduced to avoid duplicate work; hence, more emphasis was placed on the other three tasks as discussed below. The second task aimed at physical modeling enhancements through implementation of a polydispersity capability and validation of heat transfer models in MFIX. An extended verification and validation (V&V) study was performed for the new polydispersity feature implemented in MFIX-DEM, both for granular and coupled gas-solid flows. The features of the polydispersity capability and results for an industrially relevant problem were disseminated through journal papers (one published and one under review at the time of writing the final technical report). As part of the validation efforts, another industrially relevant problem of interest based on rotary drums was studied for several modes of heat transfer, and results were presented at conferences. The third task was aimed at an important and unique contribution of the project: developing a unified uncertainty quantification (UQ) framework by integrating MFIX-DEM with a graphical user interface (GUI) driven UQ engine, i.e., MFIX-GUI and PSUADE. The goal was to enable a user with only modest knowledge of statistics to effectively utilize the UQ framework offered with MFIX-DEM Phi to perform UQ analysis routinely. For Phase 1, a proof-of-concept demonstration of the proposed framework was completed and shared. Direct industry involvement, one of the key virtues of this project, was pursued through the fourth task. Even at the proposal stage, the project team received strong interest in the proposed capabilities from two major corporations; these collaborations were further expanded throughout Phase 1, and a new collaboration with another major corporation from the chemical industry was also initiated. The level of interest received and the continued collaboration during Phase 1 clearly show the relevance and potential impact of the project for industrial users.

  2. Monitoring hillslope moisture dynamics with surface ERT for enhancing spatial significance of hydrometric point measurements

    NASA Astrophysics Data System (ADS)

    Hübner, R.; Heller, K.; Günther, T.; Kleber, A.

    2015-01-01

    Besides floodplains, hillslopes are basic units that mainly control water movement and flow pathways within catchments of subdued mountain ranges. The structure of their shallow subsurface affects the water balance, e.g. infiltration, retention, and runoff. Nevertheless, there is still a gap in the knowledge of the hydrological dynamics on hillslopes, notably due to the lack of generalization and transferability. This study presents a robust multi-method framework of electrical resistivity tomography (ERT) in addition to hydrometric point measurements, transferring hydrometric data to larger spatial scales to obtain additional patterns of the distribution and dynamics of soil moisture on a hillslope. Geoelectrical monitoring in a small catchment in the eastern Ore Mountains was carried out at weekly intervals from May to December 2008 to image seasonal moisture dynamics on the hillslope scale. To link water content and electrical resistivity, the parameters of Archie's law were determined using different core samples. To optimize inversion parameters and methods, the derived spatial and temporal water content distribution was compared to tensiometer data. The results from the ERT measurements show a strong correlation with the hydrometric data. The response is congruent with the soil tension data, and the water content calculated from the ERT profile shows variations similar to those recorded by the soil moisture sensors. Consequently, soil moisture dynamics on the hillslope scale may be determined not only by expensive, invasive, point-scale hydrometric measurements, but also by minimally invasive time-lapse ERT, provided that the pedo-/petrophysical relationships are known. Since ERT integrates over larger spatial scales, its combination with hydrometric point measurements improves the understanding of the ongoing hydrological processes and is better suited to identifying heterogeneities.
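
    The resistivity-to-water-content conversion via Archie's law mentioned above can be sketched as below; the parameter values (a, m, n, pore-water resistivity, porosity) are illustrative, not the core-derived values of the study.

```python
# Archie's law: rho = a * rho_w * phi**(-m) * Sw**(-n)  ->  theta = phi * Sw.
import numpy as np

def water_content_from_resistivity(rho, rho_w=20.0, phi=0.35, a=1.0, m=1.8, n=2.0):
    """Volumetric water content from bulk resistivity (ohm*m), illustrative parameters."""
    Sw = (a * rho_w / (rho * phi ** m)) ** (1.0 / n)
    return phi * np.clip(Sw, 0.0, 1.0)

rho_profile = np.array([150.0, 300.0, 800.0])   # ohm*m, e.g. three depths in one ERT column
print(water_content_from_resistivity(rho_profile))  # -> approx [0.33, 0.23, 0.14]
```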

  3. Assessment of undiscovered oil and gas resources of the Devonian Marcellus Shale of the Appalachian Basin Province

    USGS Publications Warehouse

    Coleman, James L.; Milici, Robert C.; Cook, Troy A.; Charpentier, Ronald R.; Kirshbaum, Mark; Klett, Timothy R.; Pollastro, Richard M.; Schenk, Christopher J.

    2011-01-01

    Using a geology-based assessment methodology, the U.S. Geological Survey (USGS) estimated a mean undiscovered natural gas resource of 84,198 billion cubic feet and a mean undiscovered natural gas liquids resource of 3,379 million barrels in the Devonian Marcellus Shale within the Appalachian Basin Province. All this resource occurs in continuous accumulations. In 2011, the USGS completed an assessment of the undiscovered oil and gas potential of the Devonian Marcellus Shale within the Appalachian Basin Province of the eastern United States. The Appalachian Basin Province includes parts of Alabama, Georgia, Kentucky, Maryland, New York, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia. The assessment of the Marcellus Shale is based on the geologic elements of the formation's total petroleum system (TPS), characterized both as a petroleum source rock (source-rock richness, thermal maturation, petroleum generation, and migration) and as a reservoir rock (stratigraphic position, content, and petrophysical properties). Together, these components confirm the Marcellus Shale as a continuous petroleum accumulation. Using the geologic framework, the USGS defined one TPS and three assessment units (AUs) within this TPS and quantitatively estimated the undiscovered oil and gas resources within the three AUs. For the purposes of this assessment, the Marcellus Shale is considered to be that Middle Devonian interval that consists primarily of shale and lesser amounts of bentonite, limestone, and siltstone occurring between the underlying Middle Devonian Onondaga Limestone (or its stratigraphic equivalents, the Needmore Shale and Huntersville Chert) and the overlying Middle Devonian Mahantango Formation (or its stratigraphic equivalents, the upper Millboro Shale and middle Hamilton Group).

  4. Depositional and diagenetic variability within the Cambrian Mount Simon Sandstone: Implications for carbon dioxide sequestration

    USGS Publications Warehouse

    Bowen, B.B.; Ochoa, R.I.; Wilkens, N.D.; Brophy, J.; Lovell, T.R.; Fischietto, N.; Medina, C.R.; Rupp, J.A.

    2011-01-01

    The Cambrian Mount Simon Sandstone is the major target reservoir for ongoing geologic carbon dioxide (CO2) sequestration demonstrations throughout the midwestern United States. The potential CO2 reservoir capacity, reactivity, and ultimate fate of injected CO2 depend on textural and compositional properties determined by depositional and diagenetic histories that vary vertically and laterally across the formation. Effective and efficient prediction and use of the available pore space requires detailed knowledge of the depositional and diagenetic textures and mineralogy, how these variables control the petrophysical character of the reservoir, and how they vary spatially. Here, we summarize the reservoir characteristics of the Mount Simon Sandstone based on examination of geophysical logs, cores, cuttings, and analysis of more than 150 thin sections. These samples represent different parts of the formation and depth ranges of more than 9000 ft (>2743 m) across the Illinois Basin and surrounding areas. This work demonstrates that overall reservoir quality and, specifically, porosity do not exhibit a simple relationship with depth, but vary both laterally and with depth because of changes in the primary depositional facies, framework composition (i.e., feldspar concentration), and diverse diagenetic modifications. Diagenetic processes that have been significant in modifying the reservoir include formation of iron oxide grain coatings, chemical compaction, feldspar precipitation and dissolution, multiple generations of quartz overgrowth cementation, clay mineral precipitation, and iron oxide cementation. These variables provide important inputs for calculating CO2 capacity potential and modeling reactivity, and they also form an important baseline for comparisons after CO2 injection. Copyright © 2011. The American Association of Petroleum Geologists/Division of Environmental Geosciences. All rights reserved.

  5. IMPACT: Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking

    NASA Astrophysics Data System (ADS)

    Koller, J.; Brennan, S.; Godinez, H. C.; Higdon, D. M.; Klimenko, A.; Larsen, B.; Lawrence, E.; Linares, R.; McLaughlin, C. A.; Mehta, P. M.; Palmer, D.; Ridley, A. J.; Shoemaker, M.; Sutton, E.; Thompson, D.; Walker, A.; Wohlberg, B.

    2013-12-01

    Low-Earth orbiting satellites suffer from atmospheric drag due to thermospheric density, which changes by several orders of magnitude, especially during space weather events. Solar flares, precipitating particles and ionospheric currents cause the upper atmosphere to heat up, redistribute, and cool again. These processes are intrinsically included in empirical models, e.g. MSIS and Jacchia-Bowman type models. However, sensitivity analysis has shown that atmospheric drag has the highest influence on satellite conjunction analysis, and empirical models still do not deliver the desired accuracy. Space debris and collision avoidance have become an increasingly operational reality. It is paramount to accurately predict satellite orbits and include drag effects driven by space weather. The IMPACT project (Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking), funded with over $5 million by the Los Alamos Laboratory Directed Research and Development office, has the goal of developing an integrated system of atmospheric drag modeling, orbit propagation, and conjunction analysis with detailed uncertainty quantification to address the space debris and collision avoidance problem. Now, more than two years into the project, we have developed an integrated solution combining physics-based density modeling of the upper atmosphere between 120-700 km altitude, satellite drag forecasting for quiet and disturbed geomagnetic conditions, and conjunction analysis with non-Gaussian uncertainty quantification. We are employing several novel approaches, including a unique observational sensor developed at Los Alamos; machine learning, with a support-vector machine capturing the coupling between solar drivers of the upper atmosphere and satellite drag; rigorous data-assimilative modeling using a physics-based approach instead of empirical modeling of the thermosphere; and a computed-tomography method for extracting temporal maps of thermospheric densities using ground based observations. The IMPACT framework is an open research framework enabling the exchange and testing of a variety of atmospheric density models, orbital propagators, drag coefficient models, ground based observations, etc., and the study of their effect on conjunctions and uncertainty predictions. The framework is based on a modern service-oriented architecture controlled by a web interface and providing 3D visualizations. The goal of this project is to revolutionize the ability to monitor and track space objects during highly disturbed space weather conditions, provide suitable forecasts for satellite drag conditions and conjunction analysis, and enable the exchange of models, codes, and data in an open research environment. We will present capabilities and results of the IMPACT framework including a demo of the control interface and visualizations.

  6. A framework for quantification and physical modeling of cell mixing applied to oscillator synchronization in vertebrate somitogenesis.

    PubMed

    Uriu, Koichiro; Bhavna, Rajasekaran; Oates, Andrew C; Morelli, Luis G

    2017-08-15

    In development and disease, cells move as they exchange signals. One example is found in vertebrate development, during which the timing of segment formation is set by a 'segmentation clock', in which oscillating gene expression is synchronized across a population of cells by Delta-Notch signaling. Delta-Notch signaling requires local cell-cell contact, but in the zebrafish embryonic tailbud, oscillating cells move rapidly, exchanging neighbors. Previous theoretical studies proposed that this relative movement or cell mixing might alter signaling and thereby enhance synchronization. However, it remains unclear whether the mixing timescale in the tissue is in the right range for this effect, because a framework to reliably measure the mixing timescale and compare it with signaling timescale is lacking. Here, we develop such a framework using a quantitative description of cell mixing without the need for an external reference frame and constructing a physical model of cell movement based on the data. Numerical simulations show that mixing with experimentally observed statistics enhances synchronization of coupled phase oscillators, suggesting that mixing in the tailbud is fast enough to affect the coherence of rhythmic gene expression. Our approach will find general application in analyzing the relative movements of communicating cells during development and disease. © 2017. Published by The Company of Biologists Ltd.
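
    As a toy illustration of the coupled-phase-oscillator picture described above (not the study's simulation code; all parameter values are invented), the sketch below swaps random neighbors to mimic cell mixing and reports the Kuramoto order parameter:

```python
# Locally coupled, noisy phase oscillators on a line; random neighbor swaps mimic cell mixing.
import numpy as np

rng = np.random.default_rng(0)
N, steps, dt = 100, 4000, 0.01
coupling, noise = 0.5, 0.1   # illustrative values

def simulate(swaps_per_step):
    theta = rng.uniform(0, 2 * np.pi, N)
    for _ in range(steps):
        neighbor_sum = np.sin(np.roll(theta, 1) - theta) + np.sin(np.roll(theta, -1) - theta)
        theta += dt * coupling * neighbor_sum + np.sqrt(dt) * noise * rng.normal(size=N)
        for _ in range(swaps_per_step):            # "mixing": exchange two random neighbors
            i = rng.integers(N - 1)
            theta[[i, i + 1]] = theta[[i + 1, i]]
    return np.abs(np.mean(np.exp(1j * theta)))     # Kuramoto order parameter (1 = full synchrony)

print("order parameter, no mixing  :", round(simulate(swaps_per_step=0), 2))
print("order parameter, with mixing:", round(simulate(swaps_per_step=2), 2))
```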

  7. A framework for quantification and physical modeling of cell mixing applied to oscillator synchronization in vertebrate somitogenesis

    PubMed Central

    Bhavna, Rajasekaran; Oates, Andrew C.; Morelli, Luis G.

    2017-01-01

    ABSTRACT In development and disease, cells move as they exchange signals. One example is found in vertebrate development, during which the timing of segment formation is set by a ‘segmentation clock’, in which oscillating gene expression is synchronized across a population of cells by Delta-Notch signaling. Delta-Notch signaling requires local cell-cell contact, but in the zebrafish embryonic tailbud, oscillating cells move rapidly, exchanging neighbors. Previous theoretical studies proposed that this relative movement or cell mixing might alter signaling and thereby enhance synchronization. However, it remains unclear whether the mixing timescale in the tissue is in the right range for this effect, because a framework to reliably measure the mixing timescale and compare it with signaling timescale is lacking. Here, we develop such a framework using a quantitative description of cell mixing without the need for an external reference frame and constructing a physical model of cell movement based on the data. Numerical simulations show that mixing with experimentally observed statistics enhances synchronization of coupled phase oscillators, suggesting that mixing in the tailbud is fast enough to affect the coherence of rhythmic gene expression. Our approach will find general application in analyzing the relative movements of communicating cells during development and disease. PMID:28652318

  8. Quantifying understorey vegetation in the US Lake States: a proposed framework to inform regional forest carbon stocks

    USGS Publications Warehouse

    Russell, Matthew B.; D'Amato, Anthony W.; Schulz, Bethany K.; Woodall, Christopher W.; Domke, Grant M.; Bradford, John B.

    2014-01-01

    The contribution of understorey vegetation (UVEG) to forest ecosystem biomass and carbon (C) across diverse forest types has, to date, eluded quantification at regional and national scales. Efforts to quantify UVEG C have been limited to field-intensive studies or broad-scale modelling approaches lacking field measurements. Although large-scale inventories of UVEG C are not common, species- and community-level inventories of vegetation structure are available and may prove useful in quantifying UVEG C stocks. This analysis developed a general framework for estimating UVEG C stocks by employing per cent cover estimates of UVEG from a region-wide forest inventory coupled with an estimate of maximum UVEG C across the US Lake States (i.e. Michigan, Minnesota and Wisconsin). Estimates of UVEG C stocks from this approach reasonably align with expected C stocks in the study region, ranging from 0.86 ± 0.06 Mg ha-1 in red pine-dominated to 1.59 ± 0.06 Mg ha-1 for aspen/birch-dominated forest types. Although the data employed here were originally collected to assess broad-scale forest structure and diversity, this study proposes a framework for using UVEG inventories as a foundation for estimating C stocks in an often overlooked, yet important ecosystem C pool.

  9. Automated measurement of vocal fold vibratory asymmetry from high-speed videoendoscopy recordings.

    PubMed

    Mehta, Daryush D; Deliyski, Dimitar D; Quatieri, Thomas F; Hillman, Robert E

    2011-02-01

    In prior work, a manually derived measure of vocal fold vibratory phase asymmetry correlated to varying degrees with visual judgments made from laryngeal high-speed videoendoscopy (HSV) recordings. This investigation extended this work by establishing an automated HSV-based framework to quantify 3 categories of vocal fold vibratory asymmetry. HSV-based analysis provided for cycle-to-cycle estimates of left-right phase asymmetry, left-right amplitude asymmetry, and axis shift during glottal closure for 52 speakers with no vocal pathology producing comfortable and pressed phonation. An initial cross-validation of the automated left-right phase asymmetry measure was performed by correlating the measure with other objective and subjective assessments of phase asymmetry. Vocal fold vibratory asymmetry was exhibited to a similar extent in both comfortable and pressed phonations. The automated measure of left-right phase asymmetry strongly correlated with manually derived measures and moderately correlated with visual-perceptual ratings. Correlations with the visual-perceptual ratings remained relatively consistent as the automated measure was derived from kymograms taken at different glottal locations. An automated HSV-based framework for the quantification of vocal fold vibratory asymmetry was developed and initially validated. This framework serves as a platform for investigating relationships between vocal fold tissue motion and acoustic measures of voice function.

  10. The Community Cloud retrieval for CLimate (CC4CL) - Part 1: A framework applied to multiple satellite imaging sensors

    NASA Astrophysics Data System (ADS)

    Sus, Oliver; Stengel, Martin; Stapelberg, Stefan; McGarragh, Gregory; Poulsen, Caroline; Povey, Adam C.; Schlundt, Cornelia; Thomas, Gareth; Christensen, Matthew; Proud, Simon; Jerg, Matthias; Grainger, Roy; Hollmann, Rainer

    2018-06-01

    We present here the key features of the Community Cloud retrieval for CLimate (CC4CL) processing algorithm. We focus on the novel features of the framework: the optimal estimation approach in general, explicit uncertainty quantification through rigorous propagation of all known error sources into the final product, and the consistency of our long-term, multi-platform time series provided at various resolutions, from 0.5 to 0.02°. By describing all key input data and processing steps, we aim to inform the user about important features of this new retrieval framework and its potential applicability to climate studies. We provide an overview of the retrieved and derived output variables. These are analysed for four, partly very challenging, scenes collocated with CALIOP (Cloud-Aerosol lidar with Orthogonal Polarization) observations in the high latitudes and over the Gulf of Guinea-West Africa. The results show that CC4CL provides very realistic estimates of cloud top height and cover for optically thick clouds but, where optically thin clouds overlap, returns a height between the two layers. CC4CL is a unique, coherent, multi-instrument cloud property retrieval framework applicable to passive sensor data of several EO missions. Through its flexibility, CC4CL offers the opportunity for combining a variety of historic and current EO missions into one dataset, which, compared to single sensor retrievals, is improved in terms of accuracy and temporal sampling.
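
    The optimal estimation retrieval referred to above minimizes, for each pixel, a cost function of the standard form below (generic notation, not specific to CC4CL); the reported retrieval uncertainty typically follows from the curvature of J at its minimum:

```latex
J(\mathbf{x}) = \left[\mathbf{y} - F(\mathbf{x})\right]^{\mathsf{T}} \mathbf{S}_{\epsilon}^{-1}\left[\mathbf{y} - F(\mathbf{x})\right]
              + \left(\mathbf{x} - \mathbf{x}_{a}\right)^{\mathsf{T}} \mathbf{S}_{a}^{-1}\left(\mathbf{x} - \mathbf{x}_{a}\right)
```

    where x is the retrieved state (e.g. cloud optical thickness, effective radius, cloud-top pressure), y the measurement vector, F the forward model, x_a and S_a the a priori state and its covariance, and S_epsilon the measurement-plus-forward-model error covariance.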

  11. Quantification provides a conceptual basis for convergent evolution.

    PubMed

    Speed, Michael P; Arbuckle, Kevin

    2017-05-01

    While much of evolutionary biology attempts to explain the processes of diversification, there is an important place for the study of phenotypic similarity across life forms. When similar phenotypes evolve independently in different lineages this is referred to as convergent evolution. Although long recognised, evolutionary convergence is receiving a resurgence of interest. This is in part because new genomic data sets allow detailed and tractable analysis of the genetic underpinnings of convergent phenotypes, and in part because of renewed recognition that convergence may reflect limitations in the diversification of life. In this review we propose that although convergent evolution itself does not require a new evolutionary framework, none the less there is room to generate a more systematic approach which will enable evaluation of the importance of convergent phenotypes in limiting the diversity of life's forms. We therefore propose that quantification of the frequency and strength of convergence, rather than simply identifying cases of convergence, should be considered central to its systematic comprehension. We provide a non-technical review of existing methods that could be used to measure evolutionary convergence, bringing together a wide range of methods. We then argue that quantification also requires clear specification of the level at which the phenotype is being considered, and argue that the most constrained examples of convergence show similarity both in function and in several layers of underlying form. Finally, we argue that the most important and impressive examples of convergence are those that pertain, in form and function, across a wide diversity of selective contexts as these persist in the likely presence of different selection pressures within the environment. © 2016 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.

  12. Bayesian Methods for Effective Field Theories

    NASA Astrophysics Data System (ADS)

    Wesolowski, Sarah

    Microscopic predictions of the properties of atomic nuclei have reached a high level of precision in the past decade. This progress mandates improved uncertainty quantification (UQ) for a robust comparison of experiment with theory. With the uncertainty from many-body methods under control, calculations are now sensitive to the input inter-nucleon interactions. These interactions include parameters that must be fit to experiment, inducing both uncertainty from the fit and from missing physics in the operator structure of the Hamiltonian. Furthermore, the implementation of the inter-nucleon interactions is not unique, which presents the additional problem of assessing results using different interactions. Effective field theories (EFTs) take advantage of a separation of high- and low-energy scales in the problem to form a power-counting scheme that allows the organization of terms in the Hamiltonian based on their expected contribution to observable predictions. This scheme gives a natural framework for quantification of uncertainty due to missing physics. The free parameters of the EFT, called the low-energy constants (LECs), must be fit to data, but in a properly constructed EFT these constants will be natural-sized, i.e., of order unity. The constraints provided by the EFT, namely the size of the systematic uncertainty from truncation of the theory and the natural size of the LECs, are assumed information even before a calculation is performed or a fit is done. Bayesian statistical methods provide a framework for treating uncertainties that naturally incorporates prior information as well as putting stochastic and systematic uncertainties on an equal footing. For EFT UQ Bayesian methods allow the relevant EFT properties to be incorporated quantitatively as prior probability distribution functions (pdfs). Following the logic of probability theory, observable quantities and underlying physical parameters such as the EFT breakdown scale may be expressed as pdfs that incorporate the prior pdfs. Problems of model selection, such as distinguishing between competing EFT implementations, are also natural in a Bayesian framework. In this thesis we focus on two complementary topics for EFT UQ using Bayesian methods--quantifying EFT truncation uncertainty and parameter estimation for LECs. Using the order-by-order calculations and underlying EFT constraints as prior information, we show how to estimate EFT truncation uncertainties. We then apply the result to calculating truncation uncertainties on predictions of nucleon-nucleon scattering in chiral effective field theory. We apply model-checking diagnostics to our calculations to ensure that the statistical model of truncation uncertainty produces consistent results. A framework for EFT parameter estimation based on EFT convergence properties and naturalness is developed which includes a series of diagnostics to ensure the extraction of the maximum amount of available information from data to estimate LECs with minimal bias. We develop this framework using model EFTs and apply it to the problem of extrapolating lattice quantum chromodynamics results for the nucleon mass. We then apply aspects of the parameter estimation framework to perform case studies in chiral EFT parameter estimation, investigating a possible operator redundancy at fourth order in the chiral expansion and the appropriate inclusion of truncation uncertainty in estimating LECs.
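
    Schematically, the truncation-error model described above treats an observable as an expansion in the EFT ratio Q, with naturalness encoded as a prior on the expansion coefficients (generic notation in the spirit of the text, not reproduced from it):

```latex
y_{k} = y_{\mathrm{ref}} \sum_{n=0}^{k} c_{n} Q^{n}, \qquad
\delta y_{k} \approx y_{\mathrm{ref}}\, c_{k+1} Q^{k+1}, \qquad
c_{n} \mid \bar{c} \sim \mathcal{N}\!\left(0, \bar{c}^{2}\right)
```

    where Q is the ratio of the low-energy scale to the EFT breakdown scale and c-bar sets the natural size of the coefficients; marginalizing over the unseen higher-order coefficients yields a posterior for the truncation uncertainty.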

  13. A Modeling Framework for Optimal Computational Resource Allocation Estimation: Considering the Trade-offs between Physical Resolutions, Uncertainty and Computational Costs

    NASA Astrophysics Data System (ADS)

    Moslehi, M.; de Barros, F.; Rajagopal, R.

    2014-12-01

    Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails utilizing a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing hydrogeological characteristics of the field. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: time associated with computational costs, statistical convergence of the model predictions and physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a modeling framework for the overall error based on a joint statistical and numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified in this study by applying it to several computationally intensive examples. This framework helps hydrogeologists identify the optimum physical and statistical resolutions that minimize the error for a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions is investigated. We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
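
    The trade-off captured by the error model described above can be illustrated with a simple one-dimensional search (all constants invented): a discretization term that decays with grid spacing h competes with a Monte Carlo term that grows as the fixed budget buys fewer realizations on finer grids.

```python
# Total error modeled as C1*h**p (discretization) plus C2/sqrt(N) (Monte Carlo sampling),
# with the budget fixing how many realizations N of per-run cost ~ h**(-d) are affordable.
import numpy as np

C1, p = 5.0, 2.0        # discretization error constant and convergence order (assumed)
C2, d = 1.0, 2.0        # statistical error constant and spatial dimension (assumed)
budget = 1e6            # total computational budget in arbitrary work units

h = np.logspace(-3, -0.5, 200)          # candidate grid spacings
cost_per_run = h ** (-d)                # finer grids cost more per Monte Carlo realization
N = np.maximum(budget / cost_per_run, 1.0)
total_error = C1 * h ** p + C2 / np.sqrt(N)

best = np.argmin(total_error)
print(f"optimal h ~ {h[best]:.3g}, realizations N ~ {N[best]:.0f}, error ~ {total_error[best]:.3g}")
```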

  14. Quantitative volumetric Raman imaging of three dimensional cell cultures

    NASA Astrophysics Data System (ADS)

    Kallepitis, Charalambos; Bergholt, Mads S.; Mazo, Manuel M.; Leonardo, Vincent; Skaalure, Stacey C.; Maynard, Stephanie A.; Stevens, Molly M.

    2017-03-01

    The ability to simultaneously image multiple biomolecules in biologically relevant three-dimensional (3D) cell culture environments would contribute greatly to the understanding of complex cellular mechanisms and cell-material interactions. Here, we present a computational framework for label-free quantitative volumetric Raman imaging (qVRI). We apply qVRI to a selection of biological systems: human pluripotent stem cells with their cardiac derivatives, monocytes and monocyte-derived macrophages in conventional cell culture systems and mesenchymal stem cells inside biomimetic hydrogels that supplied a 3D cell culture environment. We demonstrate visualization and quantification of fine details in cell shape, cytoplasm, nucleus, lipid bodies and cytoskeletal structures in 3D with unprecedented biomolecular specificity for vibrational microspectroscopy.

  15. Effects of the local structure dependence of evaporation fields on field evaporation behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yao, Lan; Marquis, Emmanuelle A., E-mail: emarq@umich.edu; Withrow, Travis

    2015-12-14

    Accurate three dimensional reconstructions of atomic positions and full quantification of the information contained in atom probe microscopy data rely on understanding the physical processes taking place during field evaporation of atoms from needle-shaped specimens. However, the modeling framework for atom probe microscopy has only limited quantitative justification. Building on the continuum field models previously developed, we introduce a more physical approach with the selection of evaporation events based on density functional theory calculations. This model reproduces key features observed experimentally in terms of sequence of evaporation, evaporation maps, and depth resolution, and provides insights into the physical limit for spatial resolution.

  16. Integrated 3-D Ground-Penetrating Radar, Outcrop, and Borehole Data Applied to Reservoir Characterization and Flow Simulation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McMechan et al.

    2001-08-31

    Existing reservoir models are based on 2-D outcrops; 3-D aspects are inferred from correlation between wells and so are inadequately constrained for reservoir simulations. To overcome these deficiencies, we initiated a multidimensional characterization of reservoir analogs in the Cretaceous Ferron Sandstone in Utah. The study was conducted at two sites (Corbula Gulch and Coyote Basin); results from both sites are contained in this report. Detailed sedimentary facies maps of cliff faces define the geometry and distribution of potential reservoir flow units, barriers and baffles at the outcrop. High-resolution 2-D and 3-D ground-penetrating radar (GPR) images extend these reservoir characteristics into 3-D to allow development of realistic 3-D reservoir models. The models use geometric information from the mapping and the GPR data, petrophysical data from surface and cliff-face outcrops, lab analyses of outcrop and core samples, and petrography. The measurements are all integrated into a single coordinate system using GPS and laser mapping of the main sedimentologic features and boundaries. The final step is analysis of the results of 3-D fluid-flow modeling to demonstrate the applicability of our reservoir analog studies to well siting and reservoir engineering for maximization of hydrocarbon production. The main goals of this project were achieved: the construction of a deterministic 3-D reservoir analog model from a variety of geophysical and geologic measurements at the field sites, the integration of these into comprehensive petrophysical models, and flow simulation through these models. This unique approach represents a significant advance in the characterization and use of reservoir analogs. To date, the team has presented five papers at GSA and AAPG meetings, produced a technical manual, and completed 15 technical papers; the latter are the main content of this final report. In addition, the project became part of 5 PhD dissertations, 3 MS theses, and two senior undergraduate research projects.

  17. Model complexity in carbon sequestration:A design of experiment and response surface uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Li, S.

    2014-12-01

    Geologic carbon sequestration (GCS) is proposed for the Nugget Sandstone in Moxa Arch, a regional saline aquifer with a large storage potential. For a proposed storage site, this study builds a suite of increasingly complex conceptual "geologic" model families, using subsets of the site characterization data: a homogeneous model family, a stationary petrophysical model family, a stationary facies model family with sub-facies petrophysical variability, and a non-stationary facies model family (with sub-facies variability) conditioned to soft data. These families, representing alternative conceptual site models built with increasing data, were simulated with the same CO2 injection test (50 years at 1/10 Mt per year), followed by 2950 years of monitoring. Using the Design of Experiment, an efficient sensitivity analysis (SA) is conducted for all families, systematically varying uncertain input parameters. Results are compared among the families to identify parameters that have 1st order impact on predicting the CO2 storage ratio (SR) at both end of injection and end of monitoring. At this site, geologic modeling factors do not significantly influence the short-term prediction of the storage ratio, although they become important over monitoring time, but only for those families where such factors are accounted for. Based on the SA, a response surface analysis is conducted to generate prediction envelopes of the storage ratio, which are compared among the families at both times. Results suggest a large uncertainty in the predicted storage ratio given the uncertainties in model parameters and modeling choices: SR varies from 5-60% (end of injection) to 18-100% (end of monitoring), although its variation among the model families is relatively minor. Moreover, long-term leakage risk is considered small at the proposed site. In the lowest-SR scenarios, all families predict gravity-stable supercritical CO2 migrating toward the bottom of the aquifer. In the highest-SR scenarios, supercritical CO2 footprints are relatively insignificant by the end of monitoring.

  18. Best Practices for Mudweight Window Generation and Accuracy Assessment between Seismic Based Pore Pressure Prediction Methodologies for a Near-Salt Field in Mississippi Canyon, Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Mannon, Timothy Patrick, Jr.

    Improving well design has been, and always will be, the primary goal in drilling operations in the oil and gas industry. Oil and gas plays continue to move into increasingly hostile drilling environments, including near-salt and sub-salt settings. The ability to reduce the risk and uncertainty involved in drilling operations in unconventional geologic settings starts with improving the techniques for mudweight window modeling. To address this issue, an analysis of wellbore stability and well design improvement has been conducted. This study shows a systematic approach to well design by focusing on best practices for mudweight window projection for a field in Mississippi Canyon, Gulf of Mexico. The field includes depleted reservoirs and is in close proximity to salt intrusions. Analysis of offset wells has been conducted in the interest of developing an accurate picture of the subsurface environment by making connections between depth, non-productive time (NPT) events, and the mudweights used. Commonly practiced petrophysical methods of pore pressure, fracture pressure, and shear failure gradient prediction have been applied to key offset wells in order to enhance the well design for two proposed wells. For the first time in the literature, the commonly accepted seismic-interval-velocity-based methodology and the relatively new seismic-frequency-based methodology for pore pressure prediction are qualitatively and quantitatively compared for accuracy. Accuracy standards are based on the agreement of the seismic outputs with pressure data obtained while drilling and with petrophysically based pore pressure outputs for each well. The results show significantly higher accuracy for the seismic-frequency-based approach in wells in near-salt or sub-salt environments and higher overall accuracy for all of the wells in the study as a whole.
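
    One commonly practiced velocity-based route to pore pressure of the kind compared above is Eaton's relation; the sketch below is a generic illustration with an assumed normal-compaction trend and invented inputs, not the study's workflow.

```python
# Generic Eaton-type pore pressure estimate from seismic interval velocity (illustrative only):
# P = S - (S - P_hyd) * (V_obs / V_normal)**3, pressures in psi, gradients in psi/ft.
import numpy as np

depth_ft = np.array([5000.0, 10000.0, 15000.0])
overburden = 0.95 * depth_ft                 # assumed overburden gradient ~0.95 psi/ft
hydrostatic = 0.465 * depth_ft               # normal (hydrostatic) pressure gradient
v_normal = 6000.0 + 0.35 * depth_ft          # assumed normal-compaction velocity trend, ft/s
v_obs = np.array([7400.0, 8200.0, 9500.0])   # observed interval velocities, ft/s

pore_pressure = overburden - (overburden - hydrostatic) * (v_obs / v_normal) ** 3
print(np.round(pore_pressure / depth_ft, 3))  # equivalent pore-pressure gradient, psi/ft
```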

  19. Metaheuristic optimization approaches to predict shear-wave velocity from conventional well logs in sandstone and carbonate case studies

    NASA Astrophysics Data System (ADS)

    Emami Niri, Mohammad; Amiri Kolajoobi, Rasool; Khodaiy Arbat, Mohammad; Shahbazi Raz, Mahdi

    2018-06-01

    Seismic wave velocities, along with petrophysical data, provide valuable information during the exploration and development stages of oil and gas fields. The compressional-wave velocity (VP) is acquired using conventional acoustic logging tools in many drilled wells, but the shear-wave velocity (VS) is recorded using advanced logging tools only in a limited number of wells, mainly because of the high operational costs. In addition, laboratory measurements of seismic velocities on core samples are expensive and time consuming, so alternative methods are often used to estimate VS. To date, several empirical correlations that predict VS from well logging measurements and petrophysical data such as VP, porosity and density have been proposed; however, these empirical relations can only be used in limited cases. Intelligent systems and optimization algorithms are inexpensive, fast and efficient approaches for predicting VS. In this study, in addition to the widely used Greenberg–Castagna empirical method, we implement three relatively recently developed metaheuristic algorithms to construct linear and nonlinear models for predicting VS: teaching–learning based optimization, imperialist competitive and artificial bee colony algorithms. We demonstrate the applicability and performance of these algorithms for predicting VS from conventional well logs in two field data examples, a sandstone formation from an offshore oil field and a carbonate formation from an onshore oil field. We compared the VS estimated with each of the employed metaheuristic approaches with the observed VS and also with the values predicted by the Greenberg–Castagna relations. The results indicate that, for both the sandstone and carbonate case studies, all three implemented metaheuristic algorithms are more efficient and reliable than the empirical correlation for predicting VS. The results also demonstrate that, in both case studies, the performance of the artificial bee colony algorithm in VS prediction is slightly better than that of the two other employed approaches.
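
    For context, the empirical baseline mentioned above ties VS linearly to VP in brine-saturated clastics; the sketch below uses the widely quoted mudrock-line coefficients for illustration and omits the lithology-weighted, iterative parts of the full Greenberg–Castagna procedure.

```python
# Mudrock-line style estimate of shear-wave velocity from compressional-wave velocity (km/s).
import numpy as np

def vs_from_vp(vp_kms, a=0.8621, b=-1.1724):
    """VS = a*VP + b (km/s), using the commonly quoted mudrock-line coefficients."""
    return a * np.asarray(vp_kms) + b

vp_log = np.array([3.0, 3.8, 4.5])      # example VP values from a sonic log, km/s
print(np.round(vs_from_vp(vp_log), 2))  # -> approx [1.41, 2.10, 2.71] km/s
```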

  20. Characterization of concrete from Roman theatre and amphitheater in Emerita Augusta (Mérida, Spain)

    NASA Astrophysics Data System (ADS)

    Mota-Lopez, Maria Isabel; Fort, Rafael; Alvarez de Buergo, Monica; Pizzo, Antonio; Maderuelo-Sanz, Ruben; Meneses-Rodríguez, Juan Miguel

    2016-04-01

    The restoration of historical buildings is very important for the history and culture of cities and their populations, and it requires advanced knowledge of the building materials used in their construction. Prior to any intervention in a historical building, a historic-scientific study of the original material is necessary. Historic mortars and concretes can reveal different compositions, which depend on the geographical location and the time period of construction. Historical concretes are complex systems that contain aerial or hydraulic binders, or a blend of them, together with aggregates (not always crystalline) and other elements that interact with the binder. The use of different techniques for the microstructural characterization of materials, such as optical microscopy, X-ray diffractometry or petrophysical analysis, allows the determination of the composition and some properties of these concretes. However, each technique has its own limits and, in many cases, several characterization techniques must be combined to obtain coherent and reliable results. The present study focuses on the compositional characterization of Roman concrete from the Roman buildings for public spectacles of Emerita Augusta, Mérida, Spain. An advanced knowledge of the Roman concrete composition is required for a reliable restoration and preservation of these ancient monuments. Various samples of concrete were extracted from different zones of this archaeological site. The concrete was studied through mineralogical analysis (petrographic microscopy and XRD) and the determination of petrophysical properties (bulk and real density, open porosity, mercury intrusion porosimetry, compressive strength and ultrasound propagation velocity). The results obtained allow us to establish the original composition of the concrete and the provenance of the aggregates used in it. Acknowledgements: the Community of Madrid for financing the Geomateriales2 programme (P2013/MIT2914), the funding provided by the BIA 2014-53911-R project, and the Consortium for the Monumental City of Mérida for the permission granted to collect concrete samples.

  1. Characterization of rock thermal conductivity by high-resolution optical scanning

    USGS Publications Warehouse

    Popov, Y.A.; Pribnow, D.F.C.; Sass, J.H.; Williams, C.F.; Burkhardt, H.

    1999-01-01

    We compared three laboratory methods for thermal conductivity measurements: divided-bar, line-source and optical scanning. These methods are widely used in geothermal and petrophysical studies, particularly as applied to research on cores from deep scientific boreholes. The relatively new optical scanning method has recently been perfected and applied to geophysical problems. A comparison among these methods for determining the thermal conductivity tensor for anisotropic rocks is based on a representative collection of 80 crystalline rock samples from the KTB continental deep borehole (Germany). Despite substantial inhomogeneity of rock thermal conductivity (up to 40-50% variation) and high anisotropy (with ratios of principal values attaining 2 and more), the results of measurements agree very well among the different methods. The discrepancy for measurements along the foliation is negligible (<1%). The component of thermal conductivity normal to the foliation reveals somewhat larger differences (3-4%). Optical scanning allowed us to characterize the thermal inhomogeneity of rocks and to identify a three-dimensional anisotropy in thermal conductivity of some gneiss samples. The merits of optical scanning include minor random errors (1.6%), the ability to record the variation of thermal conductivity along the sample, the ability to sample deeply using a slow scanning rate, freedom from constraints on sample size, shape, and quality of mechanical treatment of the sample surface, a contactless mode of measurement, high speed of operation, and the ability to measure on a cylindrical sample surface. More traditional methods remain superior for characterizing bulk conductivity at elevated temperature.

  2. A multidisciplinary approach for the characterisation of fault zones in geothermal areas in central Mexico

    NASA Astrophysics Data System (ADS)

    Comina, Cesare; Ferrero, Anna Maria; Mandrone, Giuseppe; Vinciguerra, Sergio

    2017-04-01

    There are more than 500 geothermal areas in the Trans-Mexican Volcanic Belt of central Mexico. Of these, two are presently the object of a transnational project between the EU and Mexico (GEMex): Acoculco, where there is already commercial exploitation, and Los Humeros, at present not yet developed. The GEMex project aims to improve resource assessment and reservoir characterization using novel geophysical and geological methods and interpretations. One of the main factors controlling the geothermal system is the presence of pervasive fracture systems affecting the carbonate basement underlying the volcanic complex (basalts and andesites). We propose a characterization of the rock masses (rock and fractures) using a multiscale analysis, from the field to the outcrop down to the micro scale, integrating a number of techniques. In detail, the University of Torino unit will be responsible for: 1) technical field studies aimed at characterizing the mechanical transitions across brittle deformation zones, from the intact rock to the damage zone to the shear/slip zone, together with measurement of key geophysical parameters (seismic and electrical properties); 2) detailed petrophysical and minero-petrographic studies on representative samples at room temperature, verification of the mechanical properties of samples subjected to heating cycles up to reservoir temperatures (>400 °C), and measurement of the geophysical properties of the samples for comparison with in situ measurements; 3) numerical modeling to estimate the petrophysical, geophysical and geomechanical properties of the rock mass under the P and T conditions of the reservoir (e.g., using Comsol, VGeST, UDEC, 3DEC, ...). Detailed geological field studies and photogrammetry/laser scanner imaging of the studied outcrops are expected to be available soon, and the multiscale analysis will benefit from these new data. Results will be shared between the EU and Mexican partners to improve the general model of these two geothermal fields.

  3. Reservoir and aquifer characterization of fluvial architectural elements: Stubensandstein, Upper Triassic, southwest Germany

    NASA Astrophysics Data System (ADS)

    Hornung, Jens; Aigner, Thomas

    1999-12-01

    This paper aims at a quantitative sedimentological and petrophysical characterization of a terminal alluvial plain system exemplified by the Stubensandstein, South German Keuper Basin. The study follows the outcrop-analogue approach, in which information derived from outcrops is collected to enhance the interpretation of comparable subsurface successions. Quantitative data on sandbody geometries, porosities and permeabilities are presented in order to constrain modelling of subsurface sandbodies and permeability barriers. For sedimentological characterization the method of architectural element analysis (Miall, A.D., 1996. The Geology of Fluvial Deposits. Springer, Berlin) was used and modified to include poroperm facies. A special photo-technique with a precise theodolite survey was developed to create optically corrected photomosaics for outcrop wall maps from outcrops of up to 20,000 m². Nine architectural elements have been classified and quantified, and bedload, mixed-load and suspended-load channel fills are separated. The petrophysical characterization of the architectural elements integrated porosity and permeability measurements on core plugs with gamma-ray measurements along representative sections. It could be demonstrated that certain architectural elements show a characteristic poroperm facies. Four scales of sedimentary cycles have been recognized in the Stubensandstein. Cyclic sedimentation causes changing lithofacies patterns within the architectural elements, depending on their position in the sedimentary cycle. Stratigraphic position exerts only a minor influence, whereas paleogeographic position exerts a significant influence on the porosity and permeability of the sandbodies. The highest poroperm values were found in proximal areas of the alluvial plain and in the middle parts of sedimentary macrocycles. The strong internal heterogeneity of the alluvial plain system is important for its reservoir and aquifer characteristics. Compartments of bedload channel sandstones in medial positions of a stratigraphic cycle represent very good reservoirs or aquifers. The seals or aquicludes are formed by extensive floodplain claystones, lacustrine sediments, paleosols, and suspended-load deposits. Strongly cemented zones of sandstones represent aquitards.

  4. Seismic imaging in hardrock environments: The role of heterogeneity?

    NASA Astrophysics Data System (ADS)

    Bongajum, Emmanuel; Milkereit, Bernd; Adam, Erick; Meng, Yijian

    2012-10-01

    We investigate the effect of petrophysical scale parameters and structural dips on wave propagation and imaging in heterogeneous media. Seismic wave propagation effects within the heterogeneous media are studied for different velocity models with scale lengths determined via stochastic analysis of petrophysical logs from the Matagami mine, Quebec, Canada. The elastic modeling study reveals that, provided certain conditions of the velocity fluctuations are met, strong local distortions of the amplitude and arrival times of propagating waves are observed as the degree of scale-length anisotropy in the P-wave velocity increases. The location of these local amplitude anomalies is related to the dips characterizing the fabric of the host rocks. This result differs from the elliptical shape of direct waves often defined by effective anisotropic parameters used for layered media. Although estimates of anisotropic parameters suggest weak anisotropy in the investigated models, the effective anisotropic parameters often used in VTI/TTI do not sufficiently describe the effects of scale-length anisotropy in heterogeneous media that show such local amplitude, travel time, and phase distortions in the wavefields. Numerical investigations on the implications for reverse time migration (RTM) routines corroborate that the mean P-wave velocity of the host rocks produces reliable imaging results. Based on the RTM results, we postulate the following: weak anisotropy in hardrock environments is a sufficient assumption for processing seismic data, and seismic scattering effects due to velocity heterogeneity with a dip component are not sufficient to cause the mislocation errors of target structures observed in the discrepancy between the location of the strong seismic reflections associated with the Matagami sulfide orebody and its true location. Future work will investigate other factors that may provide plausible explanations for these mislocation problems, with the objective of providing a mitigation strategy for incorporation into the seismic data processing sequence when imaging in hardrock settings.

  5. Multi-scale Pore Imaging Techniques to Characterise Heterogeneity Effects on Flow in Carbonate Rock

    NASA Astrophysics Data System (ADS)

    Shah, S. M.

    2017-12-01

    Digital rock analysis and pore-scale studies have become an essential tool in the oil and gas industry to understand and predict the petrophysical and multiphase flow properties for the assessment and exploitation of hydrocarbon reserves. Carbonate reservoirs, which account for the majority of the world's hydrocarbon reserves, are well known for their heterogeneity and multiscale pore characteristics. Pore sizes in carbonate rock can vary over orders of magnitude, and the geometry and topology of pores at different scales have a great impact on flow properties. A pore-scale study typically comprises two key procedures: 3D pore-scale imaging and numerical modelling. The fundamental problem in pore-scale imaging and modelling is how to represent and model the range of scales encountered in porous media, from the pore scale to macroscopic petrophysical and multiphase flow properties. However, owing to the trade-off between image size and resolution, the desired detail is rarely captured at all relevant length scales using any single imaging technique. Similarly, direct simulations of transport properties in heterogeneous rocks with broad pore size distributions are computationally prohibitive. In this study, we present the advances and review the practical limitations of different imaging techniques, from the core scale (1 mm) using medical computed tomography (CT) to the pore scale (10 nm-50 µm) using micro-CT, confocal laser scanning microscopy (CLSM) and focussed ion beam (FIB), to characterise the complex pore structure of Ketton carbonate rock. The effect of pore structure and connectivity on flow properties is investigated on the acquired pore-scale images of Ketton carbonate using pore-network and lattice-Boltzmann simulation methods, in comparison with experimental data. We also shed new light on the existence and size of the representative elementary volume (REV) capturing the different scales of heterogeneity from the pore-scale imaging.
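
    The REV idea in the last sentence can be illustrated with a short computation (not from the study): porosity is evaluated over nested subvolumes of a segmented image, and an REV is suggested where the value stops drifting with subvolume size. A minimal sketch in Python, assuming a synthetic random binary volume in place of a real segmented micro-CT image:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic segmented volume: 1 = pore, 0 = grain (stand-in for a real micro-CT image).
    volume = (rng.random((200, 200, 200)) < 0.22).astype(np.uint8)

    def porosity_vs_subvolume(img, sizes):
        """Porosity of cubic subvolumes centred in the image, one value per edge length."""
        cz, cy, cx = (s // 2 for s in img.shape)
        out = []
        for L in sizes:
            h = L // 2
            sub = img[cz - h:cz + h, cy - h:cy + h, cx - h:cx + h]
            out.append(sub.mean())
        return np.array(out)

    sizes = np.arange(20, 201, 20)
    phi = porosity_vs_subvolume(volume, sizes)
    for L, p in zip(sizes, phi):
        print(f"edge {L:4d} voxels  porosity {p:.3f}")
    # A geometric REV is suggested where porosity stops drifting with subvolume size.
    ```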

  6. Microstructural characterization, petrophysics and upscaling - from porous media to fractural media

    NASA Astrophysics Data System (ADS)

    Liu, J.; Liu, K.; Regenauer-Lieb, K.

    2017-12-01

    We present an integrated study of the characterization of complex geometry, fluid transport features and mechanical deformation at the micro-scale, and of the upscaling of properties, using microtomographic data. We show how to integrate microstructural characterization through the volume fraction, specific surface area, connectivity (percolation), shape and orientation of microstructures with the identification of individual fractures from a 3D fracture network. In a first step we use stochastic analyses of microstructures to determine the geometric RVE (representative volume element) of samples. We proceed by determining the size of a thermodynamic RVE by computing upper/lower bounds of entropy production through finite element (FE) analyses on a series of models of increasing size. The minimum size for thermodynamic RVEs is identified on the basis of the convergence criteria of the FE simulations. Petrophysical properties (permeability and mechanical parameters, including plastic strength) are then computed numerically if the thermodynamic convergence criteria are fulfilled. Upscaling of properties is performed by means of percolation theory. The percolation threshold is detected by using a shrinking/expanding algorithm on static micro-CT images of rocks. Parameters of the scaling laws can be extracted from quantitative analyses and/or numerical simulations on a series of models with similar structures but different porosities close to the percolation threshold. Different rock samples are analyzed, and characterizing parameters of porous/fractured rocks are obtained. Synthetic derivative models of the microstructure are used to estimate the relationships between porosity and mechanical properties. Results obtained from synthetic sandstones show that yield stress, cohesion and the angle of friction are linearly proportional to porosity. Our integrated study shows that digital rock technology can provide meaningful parameters for effective upscaling if thermodynamic volume averaging satisfies the convergence criteria. For strongly heterogeneous rocks, however, the thermodynamic convergence criteria may not be met, and a continuum approach cannot be justified in this case.
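
    The shrinking/expanding percolation-threshold detection mentioned above can be sketched as follows, under assumptions that are not from the paper: a synthetic binary pore image, morphological erosion as the "shrinking" step, and face connectivity (scipy.ndimage.label) to test whether a pore cluster still spans the sample:

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(1)
    pores = rng.random((80, 80, 80)) < 0.35          # synthetic binary pore space (assumption)

    def spans_z(binary):
        """True if one connected pore cluster touches both the top and bottom faces."""
        labels, _ = ndimage.label(binary)
        top, bottom = set(np.unique(labels[0])), set(np.unique(labels[-1]))
        return len((top & bottom) - {0}) > 0

    def percolation_threshold(binary, max_steps=20):
        """Erode ('shrink') the pore phase step by step; report porosity when spanning is lost."""
        current = binary.copy()
        for step in range(max_steps):
            if not spans_z(current):
                return current.mean(), step
            current = ndimage.binary_erosion(current)
        return current.mean(), max_steps

    phi_c, steps = percolation_threshold(pores)
    print(f"spanning lost after {steps} erosion step(s), porosity ~ {phi_c:.3f}")
    ```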

  7. Use of nanotomographic images for structure analysis of carbonate rocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagata, Rodrigo; Appoloni, Carlos Roberto

    Carbonate rocks store more than 50% of the world's petroleum. These rocks' structures are highly complex and vary depending on many factors related to their formation, e.g., lithification and diagenesis. In order to perform an effective extraction of petroleum it is necessary to know petrophysical parameters, such as the total porosity, pore size and permeability of the reservoir rocks. Carbonate rocks usually have a range of pore sizes that goes from nanometers to meters or even dozens of meters. The nanopores and micropores may play an important role in the pore connectivity of carbonate rocks. X-ray computed tomography (CT) has been widely used to analyze petrophysical parameters in recent years. This technique can generate 2D images of a sample's inner structure and also allows the 3D reconstruction of the analyzed volume. CT is a powerful technique, but its results depend on the spatial resolution of the generated image. Spatial resolution is a measurement parameter that indicates the smallest object that can be detected, and there are great difficulties in generating images with nanoscale resolution (nanotomographic images). In this work three carbonate rocks, one dolomite and two limestones (referred to as limestone A and limestone B), were analyzed by nanotomography. The measurements were performed with the SkyScan 2011 nanotomograph, operated at 60 kV and 200 μA for the dolomite sample and 40 kV and 200 μA for the limestone samples. Each sample was measured with a given spatial resolution (270 nm for the dolomite sample, 360 nm for limestone A and 450 nm for limestone B). The total porosities obtained were 3.09% for the dolomite, 0.65% for limestone A and 3.74% for limestone B. This paper reports the difficulties in acquiring nanotomographic images and presents further analysis of the samples' pore sizes.

  8. Geomaterials and architecture of the medieval monuments of Sardinia (Italy): petrophysical investigations on their construction materials and documentation on the architectonic aspects using digital technologies

    NASA Astrophysics Data System (ADS)

    Columbu, Stefano; Verdiani, Giorgio

    2015-04-01

    The Sardinia Island lies in the core area of the Mediterranean Sea. Its position has made it the crossing point of many cultural and political events, but at the same time its isolation has favoured the manifestation of specific and unique Cultural Heritage phenomena. The network of medieval monuments (i.e., Romanesque churches) disseminated all around the island clearly shows how an architectural language can be declined according to site-specific materials and specific artistic and practical choices, while always preserving its original logic and grammar. On the basis of their different architectural characteristics and the petrophysical features of their lithologies, a significant number of churches were chosen from the different medieval geographical-political areas of Sardinia, named (at that time) "Giudicati". Each of these churches was surveyed using the following methods: photography; 3D laser scanning of the whole interior and exterior (using Leica HDS 6000 and Cam/2 Faro Photon units); photogrammetry (using high-resolution Nikon D700 and D800e cameras) of a selected set of significantly altered samples on the external surfaces, aimed at producing high-quality, highly detailed 3D digital models; and direct sampling of representative rocks and ancient mortars for geochemical and minero-petrographic analysis using optical polarized microscopy, electron microscopy (SEM), X-ray fluorescence (XRF) and X-ray diffractometry (XRD). The physical-mechanical properties (real and bulk densities, open and closed porosity, water absorption and saturation, vapour permeability, flexural and compressive strengths, etc.) of the various geomaterials were determined with helium pycnometry, microscopic image analysis, a gas-permeability thermostatic chamber, an oil-hydraulic press machine, the Point Load Test (PLT) and an abrasimeter. For each church, where the occasion arose, specific case studies were developed, matching the information about the materials with the specific events connected to a given Cultural Heritage element; this allowed comparison of differences in recently substituted capitals, important reconstruction events, and signs of the evolution of the building. All data were then treated and analysed to deepen the knowledge about the most meaningful aspects of the different construction techniques and uses of materials, the provenance of raw materials, stone alterations and static-structural decay. As a result, a base was created to read common behaviours, design choices, recursive constructive solutions, and the "models" guiding the ancient intentions. A great effort has been made to keep together all these different kinds of data, from the chemical and petrophysical information on the geomaterials to the geometrical description of the buildings, creating a robust relationship between the architectural and geological approaches to the subject. The very specific photogrammetric survey of the sampled stones will be presented in detail, with a clear description of the applied processing and of the results reached. This contribution presents the progress of this research, together with some additional detail on the most important churches among the investigated group, such as: St. Saturnino in Cagliari (South Sardinia), St. Trinità of Saccargia (near Sassari, North), St. Antioco of Bisarcio in Ozieri (North-central), and St. Giusta (near Oristano, central-Western). Keywords: Petrographic characterization, Physical properties, Medieval architecture, Laser scan, Digital survey

  9. Variations of the petrophysical properties of rocks with increasing hydrocarbons content and their implications at larger scale: insights from the Majella reservoir (Italy)

    NASA Astrophysics Data System (ADS)

    Trippetta, Fabio; Ruggieri, Roberta; Lipparini, Lorenzo

    2016-04-01

    Crustal processes such as deformation or faulting are strictly related to the petrophysical properties of the rocks involved. These properties depend on mineral composition, fabric, pores and any secondary features, such as cracks or infilling material, that may have been introduced during the diagenetic and tectonic history of the rock. In this work we investigate the role of hydrocarbons (HC) in changing the petrophysical properties of rocks by merging laboratory experiments, well data and static models, focusing on the carbonate-bearing Majella reservoir. This reservoir represents an interesting analogue for the several oil fields discovered in the subsurface of the region, allowing a comparison of a wide range of geological and geophysical data at different scales. The investigated lithology consists of high-porosity ramp calcarenites, structurally only slightly affected by a superimposed fracture system and displaced by a few major normal faults with some minor strike-slip movements. Sets of rock specimens were selected in the field, and two groups in particular were investigated: 1) clean rocks (without oil) and 2) HC-bearing rocks (with different saturations). For both groups, density, porosity, P- and S-wave velocity, permeability and elastic moduli were measured at increasing confining pressure on cylindrical specimens at the HP-HT Laboratory of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) in Rome, Italy. For clean samples at ambient pressure, laboratory porosity varies from 10% up to 26%, P-wave velocity (Vp) spans from 4.1 km/s to 4.9 km/s, and a very good correlation between Vp, Vs and porosity is observed. The P-wave velocity at 100 MPa of confining pressure ranges between 4.5 km/s and 5.2 km/s, with a pressure-independent Vp/Vs ratio of about 1.9. The presence of HC within the samples affects both Vp and Vs; in particular, velocities increase in proportion to the amount of porosity filled by hydrocarbons. Preliminary data also suggest a different behaviour at increasing confining pressure for clean and oil-bearing samples: an almost perfectly elastic behaviour for oil-bearing samples and a more inelastic behaviour for cleaner samples. Thus the HC appear to counteract the increase in confining pressure, acting as semi-fluids, reducing the inelastic compaction of the rock and enhancing its elastic behaviour. To upscale our rock-physics results, we started from well and laboratory data on stratigraphy, porosity and Vp in order to simulate the effect of the HC presence at larger scale, using Petrel® software. The resulting synthetic model highlights that Vp, which is primarily controlled by porosity, changes significantly within oil-bearing intervals, with a notable impact on the velocity model that should be adopted. Moreover, we are currently performing laboratory tests to evaluate the changes in the elastic parameters, with the aim of modelling the effects of the HC on the mechanical behaviour of the rocks involved at larger scale.
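
    The porosity-velocity correlation reported for the clean samples can be illustrated with a small regression sketch; the data points below are hypothetical values placed within the quoted ranges, not the actual measurements:

    ```python
    import numpy as np

    # Hypothetical ambient-pressure laboratory points within the ranges quoted in the abstract.
    porosity = np.array([0.10, 0.13, 0.16, 0.19, 0.22, 0.26])      # fraction
    vp = np.array([4.90, 4.78, 4.62, 4.48, 4.31, 4.12])            # km/s

    slope, intercept = np.polyfit(porosity, vp, 1)                 # linear Vp(phi) trend
    r = np.corrcoef(porosity, vp)[0, 1]
    print(f"Vp ~ {intercept:.2f} {slope:+.2f} * phi  (km/s), r = {r:.3f}")
    ```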

  10. Facies distribution, depositional environment, and petrophysical features of the Sharawra Formation, Old Qusaiba Village, Central Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Abbas, Muhammad Asif; Kaminski, Michael; Umran Dogan, A.

    2016-04-01

    The Silurian Sharawra Formation has great importance as it rests over the richest source rock of the Qusaiba Formation in central Saudi Arabia. The Sharawra Formation has four members: Jarish, Khanafriyah, Nayyal, and Zubliyat. The formation mainly consists of sandstone and siltstone with subordinate shale sequences. The lack of published research on this formation calls for fundamental studies that can lay the foundation for future research. Three outcrops were selected from the Old Qusaiba Village in Central Saudi Arabia for field observations and petrographical and petrophysical study. Thin-section study was aided by quantitative mineralogical characterization using scanning electron microscopy - energy dispersive spectroscopy and powder X-ray diffraction (XRD) for minerals, cements, and clay minerals (detrital and authigenic). The outcrops were logged in detail and nine different lithofacies were identified. The thin-section study revealed the Sharawra Formation to be mainly subarkosic, while the mica content increases near its contact with the Qusaiba Formation. The XRD data also revealed a prominent change in mineralogy, with the inclusion of minerals such as phlogopite and microcline with depth. Field observations and lithofacies correlation clearly show a prominent thinning of strata toward the southwest. The absence of outcrop exposures further supports the idea of southwestern thinning of strata. This is mainly attributed to local erosion and the presence of thicker shale interbeds in the southeastern section, which was probably subjected to more intense erosion than the northwestern one. The Sharawra Formation rests conformably over the thick transgressive shale sequence deposited during the post-glacial depositional cycle. The lowermost massive sandstone bed of the Sharawra Formation represents the beginning of the regressive period. The shale interbeds in the lower part are evidence of moderate-scale transgressive episodes, while the thin shale interbeds in the middle and upper parts of the Sharawra Formation represent small-scale transgressions. Overall, the Sharawra Formation contains a series of repetitive transgressive and regressive events and has been interpreted as a pro-deltaic deposit in previous studies. In the present study, the lowermost thickly bedded sandstone facies lie within the transition-zone environment. The siltstone facies and the horizontally stratified facies indicate a middle shoreface environment, which is present only locally. The bioturbation in the uppermost facies is indicative of an upper shoreface environment. The porosity values do not vary much: the average porosity for the sandstone facies is about 15%, and for the siltstones about 7%. The permeability is variable throughout the formation, with values ranging from 50 to 300 mD. Although the sandstone has good porosity and permeability, the siltstone facies exhibit poor petrophysical characteristics. In terms of reservoir characterization, the mineralogically mature, moderately well-sorted topmost sandstone facies, with appreciable porosity and permeability, can be considered a potential reservoir rock. This study provides a basis for future quantitative studies of this important formation in the area.

  11. An adaptive coupling strategy for joint inversions that use petrophysical information as constraints

    NASA Astrophysics Data System (ADS)

    Heincke, Björn; Jegen, Marion; Moorkamp, Max; Hobbs, Richard W.; Chen, Jin

    2017-01-01

    Joint inversion strategies for geophysical data have become increasingly popular as they allow for the efficient combination of complementary information from different data sets. The algorithm used for the joint inversion needs to be flexible in its description of the subsurface so as to be able to handle the diverse nature of the data. Hence, joint inversion schemes are needed that 1) adequately balance data from the different methods, 2) have stable convergence behavior, 3) consider the different resolution power of the methods used, and 4) link the parameter models in a way that is suited to a wide range of applications. Here, we combine active-source seismic P-wave tomography, gravity and magnetotelluric (MT) data in a petrophysical joint inversion that accounts for these issues. Data from the different methods are inverted separately but are linked through constraints accounting for parameter relationships. An advantage of performing the inversions separately is that no relative weighting between the data sets is required. To avoid perturbing the convergence behavior of the inversions through the coupling, the strengths of the constraints are readjusted at each iteration. The criterion we use to control the adaption of the coupling strengths is based on variations in the objective functions of the individual inversions from one iteration to the next. Adapting the coupling strengths also makes the joint inversion scheme applicable to subsurface conditions where the assumed relationships are not valid everywhere, because the individual inversions decouple if the assumed relationships do not allow adequately low data misfits to be reached. In addition, the coupling constraints depend on the relative resolutions of the methods, which leads to improved convergence behavior of the joint inversion. Another benefit of the proposed scheme is that structural information can easily be incorporated in the petrophysical joint inversion (no additional terms are added to the objective functions) by using mutually controlled structural weights for the smoothing constraints. We test our scheme using data generated from a synthetic 2-D sub-basalt model. We observe that the adaption of the coupling strengths makes the convergence of the inversions very robust (data misfits of all methods are close to the target misfits) and that the final results are always close to the true models, independent of the parameter choices. Finally, the scheme is applied to real data sets from the Faroe-Shetland Basin to image a basaltic sequence and underlying structures. The presence of a borehole and a 3-D reflection seismic survey in this region allows direct comparison and, hence, an evaluation of the quality of the joint inversion results. The results from the joint inversion are more consistent with results from other studies than those from the corresponding individual inversions, and the shape of the basaltic sequence is better resolved. However, due to the limited resolution of the individual methods used, it was not possible to resolve structures underneath the basalt in detail, indicating that additional geophysical information (e.g. CSEM, reflection onsets) needs to be included.
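
    A hedged sketch of the adaptive-coupling idea described above (not the authors' implementation): the coupling strength of each individual inversion is tightened while its data misfit keeps improving and relaxed when the misfit stalls or grows, so the coupling cannot destabilize convergence. The update factors, thresholds and bounds below are illustrative assumptions:

    ```python
    def update_coupling_strength(weight, misfit_prev, misfit_now,
                                 grow=1.5, shrink=0.5, stall_tol=0.01,
                                 w_min=1e-4, w_max=1e2):
        """Heuristic re-adjustment of a cross-parameter coupling weight (illustrative only).

        If the data misfit of an individual inversion improved over the last iteration,
        the coupling constraint can be tightened; if it stalled or degraded, the
        constraint is relaxed so that inversion can recover its own convergence.
        """
        rel_change = (misfit_prev - misfit_now) / max(misfit_prev, 1e-12)
        if rel_change > stall_tol:          # misfit still decreasing: tighten coupling
            weight *= grow
        else:                               # stalled or worse: loosen coupling
            weight *= shrink
        return min(max(weight, w_min), w_max)

    # Example: three methods (seismic, gravity, MT) tracked over one iteration.
    weights = {"seismic": 1.0, "gravity": 1.0, "mt": 1.0}
    misfit_prev = {"seismic": 120.0, "gravity": 45.0, "mt": 60.0}
    misfit_now = {"seismic": 100.0, "gravity": 44.9, "mt": 70.0}
    for m in weights:
        weights[m] = update_coupling_strength(weights[m], misfit_prev[m], misfit_now[m])
    print(weights)   # seismic tightened; gravity and MT relaxed
    ```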

  12. Geothermal prospection in the Greater Geneva Basin (Switzerland and France): Structural and reservoir quality assessment

    NASA Astrophysics Data System (ADS)

    Rusillon, Elme; Clerc, Nicolas; Makhloufi, Yasin; Brentini, Maud; Moscariello, Andrea

    2017-04-01

    A reservoir assessment was performed in the Greater Geneva Basin to evaluate the low- to medium-enthalpy geothermal resource potential (Moscariello, 2016). For this purpose, a detailed structural analysis of the basin was performed (Clerc et al., 2016) together with a reservoir appraisal study including an assessment of petrophysical properties in a consistent sedimentological and stratigraphical framework (Brentini et al., 2017). This multi-disciplinary study was organised in four steps: (1) investigation of the surrounding outcrops to understand the stratigraphy and lateral facies distribution of the sedimentary sequence from the Permo-Carboniferous to Lower Cretaceous units; (2) development of 3D geological models derived from 2D seismic and well data, focusing on the structural scheme of the basin to better constrain the tectonic influence on facies distribution and to assess potential hydraulic connectivity through faults between reservoir units; (3) evaluation of the distribution, geometry, sedimentology and petrophysical properties of potential reservoir units from well data; (4) identification and selection of the most promising reservoir units for in-depth rock-type characterization and 3D modeling. Petrophysical investigations revealed that the Kimmeridgian-Tithonian Reef Complex and the underlying Calcaires de Tabalcon units are the most promising geothermal reservoir targets (porosity range 10-20%; permeability up to 1 mD). The best reservoir properties are measured in patch reefs and high-energy peri-reefal depositional environments, which are surrounded by synchronous tight lagoonal deposits. Associated highly porous dolomitized intervals reported in the western part of the basin also provide enhanced reservoir quality. The distribution and geometry of the best reservoir bodies are complex and constrained by (1) palaeotopography, which can be affected by synsedimentary fault activity during Mesozoic times, (2) sedimentary factors such as hydrodynamics, sea-level variations and sedimentation rates, and (3) diagenetic history (Makhloufi et al., 2017). A detailed structural characterization of the basin using 2D seismic data reveals the existence of several wrench fault zones and intra-basinal thrusts across the basin, which could act as hydraulic conduits and play a key role in connecting the most productive reservoir facies. To understand the propagation of these heterogeneous reservoirs, rock types are currently being defined and will be integrated into 3D geological models. This integrated study allows us to better understand the distribution and properties of productive reservoir facies as well as hydraulic connectivity zones within the study area. This provides consistent knowledge for future geothermal exploration steps toward the successful development of this sustainable energy resource in the Greater Geneva Basin. Brentini et al. (2017): Geothermal prospection in the Greater Geneva Basin: integration of geological data in the new Information System. Abstract, EGU General Assembly 2017, Vienna, Austria. Clerc et al. (2016): Structural Modeling of the Geneva Basin for Geothermal Resource Assessment. Abstract, 14th Swiss Geoscience Meeting, Geneva, Switzerland. Makhloufi et al. (2017): Geothermal prospection in the Greater Geneva Basin (Switzerland and France): impact of diagenesis on reservoir properties of the Upper Jurassic carbonate sediments. Abstract, EGU General Assembly 2017, Vienna, Austria. Moscariello, A. (2016): Geothermal exploration in SW Switzerland. Proceedings, European Geothermal Congress 2016, Strasbourg, France.

  13. 31 P magnetic resonance fingerprinting for rapid quantification of creatine kinase reaction rate in vivo.

    PubMed

    Wang, Charlie Y; Liu, Yuchi; Huang, Shuying; Griswold, Mark A; Seiberlich, Nicole; Yu, Xin

    2017-12-01

    The purpose of this work was to develop a 31P spectroscopic magnetic resonance fingerprinting (MRF) method for fast quantification of the chemical exchange rate between phosphocreatine (PCr) and adenosine triphosphate (ATP) via creatine kinase (CK). A 31P MRF sequence (CK-MRF) was developed to quantify the forward rate constant of ATP synthesis via CK (kf,CK), the T1 relaxation time of PCr (T1,PCr), and the PCr-to-ATP concentration ratio (MR,PCr). The CK-MRF sequence used a balanced steady-state free precession (bSSFP)-type excitation with ramped flip angles and a unique saturation scheme sensitive to the exchange between PCr and γATP. Parameter estimation was accomplished by matching the acquired signals to a dictionary generated using the Bloch-McConnell equation. Simulation studies were performed to examine the susceptibility of the CK-MRF method to several potential error sources. The accuracy of nonlocalized CK-MRF measurements before and after an ischemia-reperfusion (IR) protocol was compared with the magnetization transfer (MT-MRS) method in rat hindlimb at 9.4 T (n = 14). The reproducibility of CK-MRF was also assessed by comparing CK-MRF measurements with both MT-MRS (n = 17) and four-angle saturation transfer (FAST) (n = 7). Simulation results showed that CK-MRF quantification of kf,CK was robust, with less than 5% error in the presence of model inaccuracies including dictionary resolution, metabolite T2 values, inorganic phosphate metabolism, and B1 miscalibration. Estimation of kf,CK by CK-MRF (0.38 ± 0.02 s^-1 at baseline and 0.42 ± 0.03 s^-1 post-IR) showed strong agreement with MT-MRS (0.39 ± 0.03 s^-1 at baseline and 0.44 ± 0.04 s^-1 post-IR). kf,CK estimation was also similar between CK-MRF and FAST (0.38 ± 0.02 s^-1 for CK-MRF and 0.38 ± 0.11 s^-1 for FAST). The coefficient of variation of a 20 s CK-MRF quantification of kf,CK was 42% of that of a 150 s MT-MRS acquisition and 12% of that of a 20 s FAST acquisition. This study demonstrates the potential of a 31P spectroscopic MRF framework for rapid, accurate and reproducible quantification of the chemical exchange rate of CK in vivo. Copyright © 2017 John Wiley & Sons, Ltd.
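
    The dictionary-matching step that underlies MRF methods such as CK-MRF can be illustrated with a minimal sketch (this is not the authors' Bloch-McConnell code): each dictionary entry is a simulated signal evolution for one parameter combination, and the acquired signal is assigned the parameters of the entry with the highest normalized inner product. The toy signal model and parameter grid below are placeholders:

    ```python
    import numpy as np

    def match_dictionary(signal, dictionary, params):
        """Return the parameter set whose dictionary entry best matches the signal.

        dictionary : (n_entries, n_timepoints) simulated signal evolutions
        params     : (n_entries, n_params) parameter combinations, one row per entry
        """
        d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
        s = signal / np.linalg.norm(signal)
        scores = np.abs(d @ s)                      # normalized inner products
        best = np.argmax(scores)
        return params[best], scores[best]

    # Toy two-parameter dictionary (placeholder signal model, not Bloch-McConnell).
    t = np.arange(30) * 0.1
    params = np.array([[kf, t1] for kf in (0.2, 0.3, 0.4) for t1 in (2.0, 3.0, 4.0)])
    dictionary = np.array([np.exp(-kf * t) * (1.0 - np.exp(-t / t1)) for kf, t1 in params])

    rng = np.random.default_rng(2)
    measured = dictionary[4] + 0.002 * rng.standard_normal(t.size)   # noisy copy of entry 4
    best_params, score = match_dictionary(measured, dictionary, params)
    print(best_params, round(float(score), 4))       # expected to recover ~ [0.3, 3.0]
    ```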

  14. A new approach to comprehensive quantification of linear landscape elements using biotope types on a regional scale

    NASA Astrophysics Data System (ADS)

    Hirt, Ulrike; Mewes, Melanie; Meyer, Burghard C.

    The structure of a landscape is highly relevant for research and planning (such as fulfilling the requirements of the Water Framework Directive (WFD) and implementing comprehensive catchment planning). There is a high potential for the restoration of linear landscape elements in most European landscapes. In the implementation of the WFD in Germany, the restoration of linear landscape elements could be a valuable measure, for example to reduce nutrient input into rivers. Despite the importance of landscape structures for water and nutrient fluxes, biodiversity and the appearance of a landscape, specific studies of linear elements are rare for larger catchment areas. Existing studies are limited because they either use remote sensing data, which do not adequately differentiate all types of linear landscape elements, or they focus only on a specific type of linear element. To address these limitations, we developed a framework allowing comprehensive quantification of linear landscape elements for catchment areas, using publicly available biotope-type data. We analysed the dependence of landscape structures on natural regions and regional soil characteristics. Three data sets (differing in biotopes, soil parameters and natural regions) were generated for the catchment area of the middle Mulde River (2700 km²) in Germany, using overlay processes in geographic information systems (GIS), followed by statistical evaluation. The linear landscape components of the total catchment area are divided into roads (55%), flowing water (21%), tree rows (14%), avenues (5%), and hedges (2%). The occurrence of these landscape components varies regionally among natural units and different soil regions. For example, the mixed deciduous stands (3.5 m/ha) are far more frequent in the foothills (6 m/ha) than in the hill country (0.9 m/ha). In contrast, fruit trees are more frequent in the hill country (5.2 m/ha) than in the cooler foothills (0.5 m/ha). Some 70% of avenues and 40% of tree rows are discontinuous; in contrast, only 20% of hedges are discontinuous. Using our framework, comprehensive information about landscape elements can now be obtained for regional applications. The approach can be applied to other regions and is highly relevant for landscape planning, erosion control, the protection of waters and the preservation of biotopes and species.

  15. Entangled states in the role of witnesses

    NASA Astrophysics Data System (ADS)

    Wang, Bang-Hai

    2018-05-01

    Quantum entanglement lies at the heart of quantum mechanics and quantum information processing. In this work, we show a framework where entangled states play the role of witnesses. We extend the notion of entanglement witnesses, developing a hierarchy of witnesses for classes of observables. This hierarchy captures the fact that entangled states act as witnesses for detecting entanglement witnesses and that separable states act as witnesses for the set of non-block-positive Hermitian operators. Indeed, more hierarchies of witnesses exist. We introduce the concept of finer and optimal entangled states. These definitions not only give an unambiguous and non-numeric quantification of entanglement and an alternative perspective on edge states, but also answer the open question of what the remainder of the best separable approximation of a density matrix is. Furthermore, we classify all entangled states into disjoint families, each with an optimal entangled state at its heart. This implies that, when investigating entangled states, we can focus on the study of a typical family with an optimal entangled state at its core. Our framework also assembles many seemingly different findings with simple arguments that do not require lengthy calculations.

  16. Uncertainty analyses of CO2 plume expansion subsequent to wellbore CO2 leakage into aquifers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Zhangshuan; Bacon, Diana H.; Engel, David W.

    2014-08-01

    In this study, we apply an uncertainty quantification (UQ) framework to CO2 sequestration problems. In one scenario, we look at the risk of wellbore leakage of CO2 into a shallow unconfined aquifer in an urban area; in another scenario, we study the effects of reservoir heterogeneity on CO2 migration. We combine various sampling approaches (quasi-Monte Carlo, probabilistic collocation, and adaptive sampling) in order to reduce the number of forward calculations while still fully exploring the input parameter space and quantifying the input uncertainty. The CO2 migration is simulated using the PNNL-developed simulator STOMP-CO2e (the water-salt-CO2 module). For computationally demanding simulations with 3D heterogeneity fields, we combined the framework with eSTOMP, a scalable version of the simulator, as the forward modeling code. We built response curves and response surfaces of model outputs with respect to input parameters to examine individual and combined effects, and to identify and rank the significance of the input parameters.
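
    As an illustration of the sampling-plus-surrogate idea (not the PNNL workflow itself), the sketch below draws quasi-Monte Carlo (Sobol) samples of two uncertain inputs, runs a placeholder forward model standing in for STOMP-CO2e, and fits a quadratic response surface. The forward model, parameter ranges and sample size are assumptions:

    ```python
    import numpy as np
    from scipy.stats import qmc

    # Placeholder forward model standing in for an expensive CO2 transport simulation:
    # output = plume extent as a function of (log-permeability, porosity).
    def forward_model(logk, phi):
        return 50.0 * np.exp(0.8 * logk) / np.sqrt(phi)

    # Quasi-Monte Carlo (Sobol) sampling of the 2-D input space.
    sampler = qmc.Sobol(d=2, scramble=True, seed=0)
    u = sampler.random_base2(m=7)                       # 2**7 = 128 samples in [0,1)^2
    lows, highs = np.array([-1.0, 0.05]), np.array([1.0, 0.35])
    x = qmc.scale(u, lows, highs)                       # map to assumed physical ranges
    y = forward_model(x[:, 0], x[:, 1])

    # Quadratic response surface: y ~ c0 + c1*logk + c2*phi + c3*logk^2 + c4*phi^2 + c5*logk*phi
    A = np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                         x[:, 0]**2, x[:, 1]**2, x[:, 0] * x[:, 1]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("response-surface coefficients:", np.round(coef, 3))
    ```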

  17. Towards a Unified Framework in Hydroclimate Extremes Prediction in Changing Climate

    NASA Astrophysics Data System (ADS)

    Moradkhani, H.; Yan, H.; Zarekarizi, M.; Bracken, C.

    2016-12-01

    Spatio-temporal analysis and prediction of hydroclimate extremes are of paramount importance in disaster mitigation and emergency management. The IPCC special report on managing the risks of extreme events and disasters emphasizes that global warming will change the frequency, severity, and spatial pattern of extremes. In addition to climate change, land use and land cover changes also influence extreme characteristics at the regional scale. Therefore, natural variability and anthropogenic changes to the hydroclimate system result in nonstationarity in hydroclimate variables. In this presentation, recent advancements in developing and using Bayesian approaches to account for non-stationarity in hydroclimate extremes are discussed. The implications of these approaches for flood frequency analysis, the treatment of spatial dependence, the impact of large-scale climate variability, the selection of cause-effect covariates, and the quantification of model errors in extreme prediction are also explained. Within this framework, the applicability and usefulness of ensemble data assimilation for extreme flood predictions are introduced. Finally, a practical and easy-to-use approach for better communication with decision-makers and emergency managers is presented.

  18. Spatial rule-based assessment of habitat potential to predict impact of land use changes on biodiversity at municipal scale.

    PubMed

    Scolozzi, Rocco; Geneletti, Davide

    2011-03-01

    In human-dominated landscapes, ecosystems are under increasing pressure from urbanization and infrastructure development. In Alpine valleys, remnant natural areas are increasingly affected by habitat fragmentation and loss. In these contexts, there is a growing risk of local extinction for wildlife populations; hence, assessing the consequences of proposed land use changes on biodiversity is extremely important. This article presents a methodology to assess the impacts of land use changes on target species at a local scale. The approach relies on the application of ecological profiles of target species for habitat potential (HP) assessment, using high-resolution GIS data within a multiple-level framework. The HP, in this framework, is based on a species-specific assessment of the suitability of a site as well as of the surrounding areas. This assessment is performed through spatial rules, structured as sets of queries on landscape objects. We show that by considering spatial dependencies in habitat assessment it is possible to achieve a better quantification of the impacts of local-level land use changes on habitats.

  19. Magnetic porous carbon derived from a bimetallic metal-organic framework for magnetic solid-phase extraction of organochlorine pesticides from drinking and environmental water samples.

    PubMed

    Liu, Yaxi; Gao, Zongjun; Wu, Ri; Wang, Zhenhua; Chen, Xiangfeng; Chan, T-W Dominic

    2017-01-06

    In this work, magnetic porous carbon material derived from a bimetallic metal-organic framework was explored as an adsorbent for magnetic solid-phase extraction of organochlorine pesticides (OCPs). The synthesized porous carbon possessed a high specific surface area and saturation magnetization. The OCPs in the samples were quantified using gas chromatography coupled with a triple quadrupole mass spectrometer. The experimental parameters, including the desorption solvent and conditions, amount of adsorbent, extraction time, extraction temperature, and ionic strength of the solution, were optimized. Under optimal conditions, the developed method displayed good linearity (r > 0.99) within the concentration range of 2-500 ng L^-1. Low limits of detection (0.39-0.70 ng L^-1, signal-to-noise ratio = 3:1) and limits of quantification (1.45-2.0 ng L^-1, signal-to-noise ratio = 10:1) as well as good precision (relative standard deviation < 10%) were also obtained. The developed method was applied in the analysis of OCPs in drinking and environmental water samples. Copyright © 2016 Elsevier B.V. All rights reserved.
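
    The figures of merit reported above can be reproduced in spirit with a small calculation (the numbers below are placeholders, not the paper's data): linearity is checked with the correlation coefficient of a calibration series, and detection/quantification limits are extrapolated from the signal-to-noise ratio measured at a low-level standard:

    ```python
    import numpy as np

    # Hypothetical calibration of one OCP: spiked concentrations (ng/L) vs. peak areas.
    conc = np.array([2, 10, 50, 100, 250, 500], dtype=float)
    area = np.array([410, 2050, 10100, 20600, 50900, 101500], dtype=float)

    r = np.corrcoef(conc, area)[0, 1]                  # linearity check (r > 0.99 expected)

    # S/N-based limits: assume the 2 ng/L standard gave a signal-to-noise ratio of 10.2.
    sn_low, c_low = 10.2, 2.0
    lod = c_low * 3.0 / sn_low                          # concentration where S/N ~ 3
    loq = c_low * 10.0 / sn_low                         # concentration where S/N ~ 10
    print(f"r = {r:.4f}, LOD ~ {lod:.2f} ng/L, LOQ ~ {loq:.2f} ng/L")
    ```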

  20. High-resolution monitoring of nutrients in groundwater and surface waters: process understanding, quantification of loads and concentrations, and management applications

    NASA Astrophysics Data System (ADS)

    van Geer, Frans C.; Kronvang, Brian; Broers, Hans Peter

    2016-09-01

    Four sessions on "Monitoring Strategies: temporal trends in groundwater and surface water quality and quantity" at the EGU conferences in 2012, 2013, 2014, and 2015 and a special issue of HESS form the background for this overview of the current state of high-resolution monitoring of nutrients. The overview includes a summary of technologies applied in high-frequency monitoring of nutrients in the special issue. Moreover, we present a new assessment of the objectives behind high-frequency monitoring as classified into three main groups: (i) improved understanding of the underlying hydrological, chemical, and biological processes (PU); (ii) quantification of true nutrient concentrations and loads (Q); and (iii) operational management, including evaluation of the effects of mitigation measures (M). The contributions in the special issue focus on the implementation of high-frequency monitoring within the broader context of policy making and management of water in Europe for support of EU directives such as the Water Framework Directive, the Groundwater Directive, and the Nitrates Directive. The overview presented enabled us to highlight the typical objectives encountered in the application of high-frequency monitoring and to reflect on future developments and research needs in this growing field of expertise.

  1. A framework for assessing the uncertainty in wave energy delivery to targeted subsurface formations

    NASA Astrophysics Data System (ADS)

    Karve, Pranav M.; Kallivokas, Loukas F.; Manuel, Lance

    2016-02-01

    Stress wave stimulation of geological formations has potential applications in petroleum engineering, hydrogeology, and environmental engineering. The stimulation can be applied using wave sources whose spatio-temporal characteristics are designed to focus the emitted wave energy into the target region. Typically, the design process involves numerical simulations of the underlying wave physics and assumes perfect knowledge of the material properties and the overall geometry of the geostructure. In practice, however, precise knowledge of the properties of the geological formations is elusive, and quantification of the reliability of a deterministic approach is crucial for evaluating the technical and economic feasibility of the design. In this article, we discuss a methodology that can be used to quantify the uncertainty in the wave energy delivery. We formulate the wave propagation problem for a two-dimensional, layered, isotropic, elastic solid truncated using hybrid perfectly matched layers (PMLs) and containing a target elastic or poroelastic inclusion. We define a wave motion metric to quantify the amount of delivered wave energy. We then treat the material properties of the layers as random variables and perform a first-order uncertainty analysis of the formation to compute the probabilities of failure to achieve threshold values of the motion metric. We illustrate the uncertainty quantification procedure using synthetic data.
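
    A plain Monte Carlo counterpart of the first-order analysis described above, as a hedged sketch: layer properties are drawn from assumed distributions, a placeholder motion metric is evaluated for each draw, and the probability of failing to reach a threshold is estimated by counting. The metric, distributions and threshold are illustrative only:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    # Random layer properties (lognormal shear-wave velocities, m/s) - illustrative only.
    vs_layer1 = rng.lognormal(mean=np.log(300.0), sigma=0.15, size=n)
    vs_layer2 = rng.lognormal(mean=np.log(600.0), sigma=0.20, size=n)

    def motion_metric(v1, v2):
        """Placeholder for the simulated wave-motion metric in the target inclusion."""
        contrast = v2 / v1
        return 1.0 / (1.0 + 0.5 * (contrast - 2.0) ** 2)

    metric = motion_metric(vs_layer1, vs_layer2)
    threshold = 0.8
    p_fail = np.mean(metric < threshold)               # probability of failing to deliver enough energy
    print(f"P(metric < {threshold}) ~ {p_fail:.3f}")
    ```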

  2. High temperature polymer degradation: Rapid IR flow-through method for volatile quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giron, Nicholas H.; Celina, Mathew C.

    Accelerated aging of polymers at elevated temperatures often involves the generation of volatiles. These can be formed as the products of oxidative degradation reactions or of intrinsic pyrolytic decomposition as part of polymer scission reactions. A simple analytical method for the quantification of water, CO2, and CO as fundamental signatures of degradation kinetics is required. Here, we describe an analytical framework and develop a rapid mid-IR-based gas analysis methodology to quantify volatiles contained in small ampoules after aging exposures. The approach requires the identification of unique spectral signatures, systematic calibration with known concentrations of volatiles, and a rapid-acquisition FTIR spectrometer for time-resolved successive spectra. Furthermore, the volatiles are flushed out of the ampoule with dry N2 carrier gas and are then quantified through spectral and time integration. This method is sufficiently sensitive to determine absolute yields of ~50 μg of water or CO2, which corresponds to probing mass losses of less than 0.01% for a 1 g sample, i.e. the early stages of the degradation process. Such quantitative gas analysis is not easily achieved with other approaches. Our approach opens up the possibility of quantitative monitoring of volatile evolution as an avenue to explore polymer degradation kinetics and its dependence on time and temperature.
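
    The quantification step (band integration over wavenumber, then over the flush-out time, converted with a calibration factor) can be sketched as follows; the synthetic spectra, band limits and calibration factor are assumed values, not the instrument calibration described in the paper:

    ```python
    import numpy as np
    from scipy.integrate import trapezoid

    # Time-resolved FTIR spectra of the N2 flush: absorbance[time, wavenumber] (synthetic here).
    wavenumber = np.linspace(2250.0, 2450.0, 401)              # cm^-1, around the CO2 band
    time_s = np.linspace(0.0, 60.0, 121)                       # s
    band = np.exp(-((wavenumber - 2349.0) / 15.0) ** 2)        # idealized CO2 band shape
    pulse = np.exp(-((time_s - 20.0) / 8.0) ** 2)              # gas slug passing the flow cell
    absorbance = np.outer(pulse, band) * 0.05

    # Integrate over the band for each spectrum, then over time.
    band_mask = (wavenumber > 2300.0) & (wavenumber < 2400.0)
    band_area = trapezoid(absorbance[:, band_mask], wavenumber[band_mask], axis=1)
    time_integral = trapezoid(band_area, time_s)               # absorbance * cm^-1 * s

    calibration = 0.12                                          # micrograms CO2 per unit integral (assumed)
    print(f"estimated CO2 yield ~ {calibration * time_integral:.1f} ug")
    ```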

  3. High temperature polymer degradation: Rapid IR flow-through method for volatile quantification

    DOE PAGES

    Giron, Nicholas H.; Celina, Mathew C.

    2017-05-19

    Accelerated aging of polymers at elevated temperatures often involves the generation of volatiles. These can be formed as the products of oxidative degradation reactions or of intrinsic pyrolytic decomposition as part of polymer scission reactions. A simple analytical method for the quantification of water, CO2, and CO as fundamental signatures of degradation kinetics is required. Here, we describe an analytical framework and develop a rapid mid-IR-based gas analysis methodology to quantify volatiles contained in small ampoules after aging exposures. The approach requires the identification of unique spectral signatures, systematic calibration with known concentrations of volatiles, and a rapid-acquisition FTIR spectrometer for time-resolved successive spectra. Furthermore, the volatiles are flushed out of the ampoule with dry N2 carrier gas and are then quantified through spectral and time integration. This method is sufficiently sensitive to determine absolute yields of ~50 μg of water or CO2, which corresponds to probing mass losses of less than 0.01% for a 1 g sample, i.e. the early stages of the degradation process. Such quantitative gas analysis is not easily achieved with other approaches. Our approach opens up the possibility of quantitative monitoring of volatile evolution as an avenue to explore polymer degradation kinetics and its dependence on time and temperature.

  4. A Multi-physics Approach to Understanding Low Porosity Soils and Reservoir Rocks

    NASA Astrophysics Data System (ADS)

    Prasad, M.; Mapeli, C.; Livo, K.; Hasanov, A.; Schindler, M.; Ou, L.

    2017-12-01

    We present recent results from our multiphysics approach to rock physics, in which we evaluate geophysical measurements by simultaneously measuring petrophysical properties or imaging strains. In this paper, we present simultaneously measured acoustic and electrical anisotropy data as functions of pressure. We also present strains and strain localization images acquired simultaneously with acoustic measurements, as well as NMR T2 relaxation measurements on pressurized fluids and on rocks saturated with these pressurized fluids. Such multiphysics experiments allow us to constrain rock physics models and to assign appropriate causative mechanisms when developing them. They also allow us to decouple various effects on geophysical measurements, for example fluid versus pressure effects. We show applications to reservoir characterization as well as to CO2 sequestration.

  5. Editorial

    NASA Astrophysics Data System (ADS)

    Agosta, Fabrizio; Luetkemeyer, P. Benjamin; Lamarche, Juliette; Crider, Juliet G.; Lacombe, Olivier

    2016-10-01

    The present Volume follows the 2015 EGU General Assembly, held in Vienna (Austria), where we convened a session entitled "The role of fluids in faulting and fracturing in carbonates and other upper crustal rocks". On that occasion, more than forty contributions were presented as oral and poster presentations. The invitation to contribute to this Volume was extended not only to the session participants but also to a wider spectrum of researchers working on related topics. As a result, a group of Earth scientists encompassing geologists, geophysicists, geochemists and petrologists contributed to this Volume, providing a sampling of the state of the science on fluids and faulting in carbonate, crystalline and siliciclastic rocks from studies that combine and integrate different methods, including rock mechanics, petrophysics, structural diagenesis and crustal permeability.

  6. Spreadsheet log analysis in subsurface geology

    USGS Publications Warehouse

    Doveton, J.H.

    2000-01-01

    Most of our direct knowledge of the geology of the subsurface is gained from the examination of core and drill cuttings recovered from boreholes drilled by the petroleum and water industries. Wireline logs run in these same boreholes have generally been restricted to tasks of lithostratigraphic correlation and the location of hydrocarbon pay zones. However, the range of petrophysical measurements has expanded markedly in recent years, so that log traces now can be transformed into estimates of rock composition. Increasingly, logs are available in a digital format that can be read easily by a desktop computer and processed by simple spreadsheet software methods. Taken together, these developments offer accessible tools for new insights into subsurface geology that complement the traditional, but limited, sources of core and cutting observations.
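
    In the spirit of the spreadsheet transforms described above, the short example below converts a digital gamma-ray trace into a shale-volume estimate using the standard linear index; the log values and the clean-sand/shale baselines are assumptions for illustration:

    ```python
    import numpy as np

    # Digital gamma-ray log (API units) as read from a LAS/CSV export - values are illustrative.
    depth = np.arange(1500.0, 1510.0, 0.5)
    gr = np.array([35, 42, 55, 78, 95, 110, 102, 88, 60, 44,
                   38, 36, 40, 52, 70, 85, 90, 75, 58, 45], dtype=float)

    gr_clean, gr_shale = 30.0, 120.0          # assumed clean-sand and shale baselines (API)
    igr = np.clip((gr - gr_clean) / (gr_shale - gr_clean), 0.0, 1.0)
    vshale = igr                               # linear index; nonlinear corrections (e.g. Larionov) also exist

    for d, v in zip(depth, vshale):
        print(f"{d:7.1f} m  Vsh = {v:.2f}")
    ```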

  7. Uncertainty quantification and propagation in dynamic models using ambient vibration measurements, application to a 10-story building

    NASA Astrophysics Data System (ADS)

    Behmanesh, Iman; Yousefianmoghadam, Seyedsina; Nozari, Amin; Moaveni, Babak; Stavridis, Andreas

    2018-07-01

    This paper investigates the application of hierarchical Bayesian model updating for uncertainty quantification and response prediction of civil structures. In this updating framework, structural parameters of an initial finite element (FE) model (e.g., stiffness or mass) are calibrated by minimizing error functions between the identified modal parameters and the corresponding parameters of the model. These error functions are assumed to have Gaussian probability distributions with unknown parameters to be determined. The estimated parameters of the error functions represent the uncertainty of the calibrated model in predicting the building's response (here, its modal parameters). The focus of this paper is to answer whether the model uncertainties quantified using dynamic measurements at the building's reference/calibration state can be used to improve the model prediction accuracy at a different structural state, e.g., the damaged structure. The effects of prediction error bias on the uncertainty of the predicted values are also studied. The test structure considered here is a ten-story concrete building located in Utica, NY. The modal parameters of the building at its reference state are identified from ambient vibration data and used to calibrate the parameters of the initial FE model as well as the error functions. Before the building was demolished, six of its exterior walls were removed and ambient vibration measurements were also collected from the structure after the wall removal. These data are not used to calibrate the model; they are only used to assess the predicted results. The model updating framework proposed in this paper is applied to estimate the modal parameters of the building at its reference state as well as at two damaged states: moderate damage (removal of four walls) and severe damage (removal of six walls). Good agreement is observed between the model-predicted modal parameters and those identified from the vibration tests. Moreover, it is shown that including prediction error bias in the updating process, instead of the commonly used zero-mean error functions, can significantly reduce the prediction uncertainties.
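
    A minimal sketch of the Gaussian error-function idea (not the paper's hierarchical formulation or FE model): identified natural frequencies are compared with predictions from a one-parameter surrogate model, and a stiffness factor, an error bias and an error standard deviation are estimated together by maximizing the likelihood. The frequencies and the surrogate are placeholders:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Identified natural frequencies (Hz) and a one-parameter surrogate "FE model"
    # whose predictions scale with the square root of a stiffness factor theta.
    f_identified = np.array([1.52, 4.61, 7.80])
    f_nominal = np.array([1.45, 4.40, 7.50])

    def predict(theta):
        return f_nominal * np.sqrt(theta)

    def neg_log_likelihood(params):
        theta, bias, log_sigma = params
        sigma = np.exp(log_sigma)
        resid = f_identified - predict(theta) - bias        # Gaussian error with bias and std sigma
        return 0.5 * np.sum((resid / sigma) ** 2) + resid.size * np.log(sigma)

    result = minimize(neg_log_likelihood, x0=[1.0, 0.0, np.log(0.1)], method="Nelder-Mead")
    theta_hat, bias_hat, log_sigma_hat = result.x
    print(f"stiffness factor ~ {theta_hat:.3f}, bias ~ {bias_hat:.3f} Hz, "
          f"sigma ~ {np.exp(log_sigma_hat):.3f} Hz")
    ```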

  8. Refining the Concepts of Self-quantification Needed for Health Self-management. A Thematic Literature Review.

    PubMed

    Almalki, Manal; Gray, Kathleen; Martin-Sanchez, Fernando J

    2017-01-09

    Questions like 'How is your health? How are you feeling? How have you been?' can now be answered in a different way thanks to innovative health self-quantification (SQ) apps and devices. These apps and devices generate data that enable individuals to be informed about, and more responsible for, their own health. The aim of this paper is to review studies on health SQ, firstly exploring the concepts associated with users' interaction with and around data for managing health, and secondly examining the potential benefits and challenges associated with the use of such data to maintain or promote health, as well as their impact on users' certainty or confidence in taking effective actions upon such data. To answer these questions, we conducted a comprehensive literature review to build our study sample. We searched a number of electronic bibliographic databases including Scopus, Web of Science, Medline, and Google Scholar. Thematic analysis was conducted for each study to find all the themes related to our research aims. In the reviewed literature, the conceptualisation of health SQ is messy and inconsistent. Personal tracking, personal analytics, personal experimentation, and personal health activation are different concepts within the practice of health SQ; thus, a new definition and structure is proposed to set out boundaries between them. Using the data generated by SQS for managing health has many advantages but also poses many challenges. The inconsistency in the conceptualisation of health SQ, as well as the challenges that users experience in health self-management, reveal the need for frameworks that can describe users' health SQ practice in a holistic and consistent manner. Our ongoing work toward developing these frameworks will help researchers in this domain gain a better understanding of this practice, and will enable the more systematic investigations that are needed to improve the use of SQS and their data in health self-management.

  9. Near-Field to Far-Field Uncertainty Propagation and Quantification of Ground Motions Generated by the Source Physics Experiments (SPE)

    NASA Astrophysics Data System (ADS)

    Antoun, T.; Ezzedine, S. M.; Vorobiev, O.; Pitarka, A.; Hurley, R.; Hirakawa, E. T.; Glenn, L.; Walter, W. R.

    2016-12-01

    LLNL has developed a framework for uncertainty propagation and quantification using HPC numerical codes to simulate, end-to-end from source to receivers, the ground motions observed during the Source Physics Experiments (SPE) conducted in fractured granitic rock at the Nevada National Security Site (NNSS). SPE includes six underground chemical explosions designed with different yields and initiated at different depths. To date we have successfully applied this framework to explain the near-field shear motions observed in the vicinity of SPE3 through SPE5. However, systematic uncertainty propagation to the far-field seismic receivers has not yet been addressed. In the current study, we used a coupling between the non-linear inelastic hydrodynamic regime in the near field and the seismic elastic regime in the far field to conduct the analysis. Several realizations of the stochastic discrete fracture network were generated conditional on the observed sparse data. These realizations were then used to calculate the ground motions generated from the SPE shots up to the elastic radius, which serves as the handshake interface for the far-field simulations. By creating several realizations of near-field responses, one can embed those sources into the far-field elastic wave code and carry the uncertainty propagation through to the receivers. We will present a full end-to-end assessment for the near- and far-field measurements. Separate analyses of the effect of the different conceptual geological models are also carried out using a nested Monte Carlo scheme. We compare the observed frequency content at several gages with the simulated ones. We conclude that the two regions sample frequencies differently: small features are relevant to near-field simulations, while larger features are more dominant in the far field. We finally rank the most sensitive parameters for both regions to drive and refine the field characterization data collection.

  10. A Computational Framework for Investigating the Positional Stability of Aortic Endografts

    PubMed Central

    Prasad, Anamika; Xiao, Nan; Gong, Xiao-Yan; Zarins, Christopher K.; Figueroa, C. Alberto

    2012-01-01

    Endovascular aneurysm repair (EVAR) techniques have revolutionized the treatment of thoracic and abdominal aortic aneurysm disease, greatly reducing the perioperative mortality and morbidity associated with open surgical repair techniques. However, EVAR is not free of important complications, such as late device migration, endoleak formation and fracture of device components, that may result in adverse events such as aneurysm enlargement, the need for long-term imaging surveillance and secondary interventions, or even death. These complications result from the device's inability to withstand the hemodynamics of blood flow and to keep its originally intended post-operative position over time. Understanding the in vivo biomechanical working environment experienced by endografts is a critical factor in improving their long-term performance. To date, no study has investigated the mechanics of contact between device and aorta in a three-dimensional setting. In this work, we developed a comprehensive Computational Solid Mechanics and Computational Fluid Dynamics framework to investigate the mechanics of endograft positional stability. The main building blocks of this framework are: i) three-dimensional non-planar aortic and stent-graft geometrical models, ii) realistic multi-material constitutive laws for aorta, stent, and graft, iii) physiological values for blood flow and pressure, and iv) a frictional model to describe the contact between the endograft and the aorta. We introduce a new metric for numerical quantification of the positional stability of the endograft. Lastly, in the results section, we test the framework by investigating the impact of several factors that are clinically known to affect endograft stability. PMID:23143353

  11. Coupled near-field and far-field exposure assessment framework for chemicals in consumer products.

    PubMed

    Fantke, Peter; Ernstoff, Alexi S; Huang, Lei; Csiszar, Susan A; Jolliet, Olivier

    2016-09-01

    Humans can be exposed to chemicals in consumer products through product use and environmental emissions over the product life cycle. Exposure pathways are often complex, where chemicals can transfer directly from products to humans during use or exchange between various indoor and outdoor compartments until sub-fractions reach humans. To consistently evaluate exposure pathways along product life cycles, a flexible mass balance-based assessment framework is presented structuring multimedia chemical transfers in a matrix of direct inter-compartmental transfer fractions. By matrix inversion, we quantify cumulative multimedia transfer fractions and exposure pathway-specific product intake fractions defined as chemical mass taken in by humans per unit mass of chemical in a product. Combining product intake fractions with chemical mass in the product yields intake estimates for use in life cycle impact assessment and chemical alternatives assessment, or daily intake doses for use in risk-based assessment and high-throughput screening. Two illustrative examples of chemicals used in personal care products and flooring materials demonstrate how this matrix-based framework offers a consistent and efficient way to rapidly compare exposure pathways for adult and child users and for the general population. This framework constitutes a user-friendly approach to develop, compare and interpret multiple human exposure scenarios in a coupled system of near-field ('user' environment), far-field and human intake compartments, and helps understand the contribution of individual pathways to overall human exposure in various product application contexts to inform decisions in different science-policy fields for which exposure quantification is relevant. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
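    A minimal numerical sketch of the matrix-inversion step described above, under the assumption that cumulative transfer fractions follow from the matrix T of direct inter-compartmental transfer fractions as (I - T)^-1 (a Neumann series over successive transfers). The compartments, values, and variable names are invented for illustration and are not the published parameterization.

```python
# Illustrative sketch only: cumulative transfer fractions from a matrix of
# direct inter-compartmental transfer fractions via (I - T)^-1 = I + T + T^2 + ...
# Compartment names and numbers are invented; this is not the published model.
import numpy as np

comps = ["product", "indoor air", "outdoor air", "human intake"]
T = np.array([                       # T[i, j] = direct fraction transferred i -> j (assumed)
    [0.00, 0.30, 0.05, 0.10],        # product releases to air and to the user
    [0.00, 0.00, 0.40, 0.05],        # indoor air ventilates outdoors, some is inhaled
    [0.00, 0.00, 0.00, 0.01],        # outdoor air: small inhalation fraction
    [0.00, 0.00, 0.00, 0.00],        # intake is treated as a sink here
])

cumulative = np.linalg.inv(np.eye(4) - T)   # cumulative multimedia transfer fractions
pif = cumulative[0, 3]                      # product intake fraction: product -> human intake
mass_in_product_mg = 250.0                  # assumed chemical mass in the product
print(f"product intake fraction ~ {pif:.4f}")
print(f"intake estimate ~ {pif * mass_in_product_mg:.2f} mg")
```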

  12. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
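    The rejection-sampling reinterpretation behind BUS can be illustrated in a few lines: prior samples are accepted when an auxiliary uniform variable falls below a scaled likelihood, so the posterior is reached by estimating the probability of that "rare" acceptance event. The sketch below is a brute-force Monte Carlo version (no FORM, IS or SuS), with an invented Gaussian model and data, and is not the authors' implementation.

```python
# Brute-force illustration of the rejection-sampling view behind BUS:
# accept prior samples theta when u <= c * L(theta), with c <= 1 / max L.
# The accepted set follows the posterior; P(accept) relates to the evidence.
# Model, data and constants below are invented for the sketch.
import numpy as np

rng = np.random.default_rng(1)

data = np.array([1.1, 0.9, 1.3])                 # assumed observations

def likelihood(theta):                           # unnormalized Gaussian likelihood, sigma = 0.2
    theta = np.atleast_1d(theta)
    resid = (data[None, :] - theta[:, None]) / 0.2
    return np.exp(-0.5 * np.sum(resid ** 2, axis=1))

c = 1.0                                          # valid because the unnormalized L <= 1 here
n = 200_000
theta = rng.normal(0.0, 1.0, size=n)             # prior: standard normal
u = rng.uniform(size=n)
accepted = theta[u <= c * likelihood(theta)]     # the "rare event" {u <= c L}

print("acceptance probability:", accepted.size / n)
print("posterior mean ~", accepted.mean())
```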

  13. Computational Framework for Analysis of Prey–Prey Associations in Interaction Proteomics Identifies Novel Human Protein–Protein Interactions and Networks

    PubMed Central

    Saha, Sudipto; Dazard, Jean-Eudes; Xu, Hua; Ewing, Rob M.

    2013-01-01

    Large-scale protein–protein interaction data sets have been generated for several species including yeast and human and have enabled the identification, quantification, and prediction of cellular molecular networks. Affinity purification-mass spectrometry (AP-MS) is the preeminent methodology for large-scale analysis of protein complexes, performed by immunopurifying a specific “bait” protein and its associated “prey” proteins. The analysis and interpretation of AP-MS data sets is, however, not straightforward. In addition, although yeast AP-MS data sets are relatively comprehensive, current human AP-MS data sets only sparsely cover the human interactome. Here we develop a framework for analysis of AP-MS data sets that addresses the issues of noise, missing data, and sparsity of coverage in the context of a current, real world human AP-MS data set. Our goal is to extend and increase the density of the known human interactome by integrating bait–prey and cocomplexed preys (prey–prey associations) into networks. Our framework incorporates a score for each identified protein, as well as elements of signal processing to improve the confidence of identified protein–protein interactions. We identify many protein networks enriched in known biological processes and functions. In addition, we show that integrated bait–prey and prey–prey interactions can be used to refine network topology and extend known protein networks. PMID:22845868
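    A toy sketch of the prey-prey association idea, with a simple Jaccard co-occurrence score standing in for the paper's scoring and signal-processing steps; the bait-prey lists below are invented.

```python
# Toy sketch of scoring prey-prey associations from AP-MS bait -> prey lists
# (not the paper's statistical model): preys that co-purify across multiple
# baits receive a higher association score, here a simple Jaccard index.
from itertools import combinations

# hypothetical pull-downs: bait -> set of identified prey proteins
pulldowns = {
    "BAIT1": {"P1", "P2", "P3"},
    "BAIT2": {"P2", "P3", "P4"},
    "BAIT3": {"P3", "P4", "P5"},
}

# invert to prey -> set of baits it was seen with
prey_baits = {}
for bait, preys in pulldowns.items():
    for prey in preys:
        prey_baits.setdefault(prey, set()).add(bait)

# prey-prey Jaccard scores; high scores suggest co-complexed proteins
scores = {}
for a, b in combinations(sorted(prey_baits), 2):
    inter = prey_baits[a] & prey_baits[b]
    union = prey_baits[a] | prey_baits[b]
    if inter:
        scores[(a, b)] = len(inter) / len(union)

for pair, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(pair, round(s, 2))
```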

  14. Biodiversity Offsets: Two New Zealand Case Studies and an Assessment Framework

    NASA Astrophysics Data System (ADS)

    Norton, David A.

    2009-04-01

    Biodiversity offsets are increasingly being used to secure biodiversity conservation outcomes as part of sustainable economic development, compensating for the residual unavoidable impacts of projects. Two recent New Zealand examples of biodiversity offsets are reviewed; while both are positive for biodiversity conservation, the process by which they were developed and approved was based more on the precautionary principle than on any formal framework. Based on this review and the broader offset literature, an environmental framework for developing and approving biodiversity offsets, comprising six principles, is outlined: (1) biodiversity offsets should only be used as part of a hierarchy of actions that first seeks to avoid impacts and then minimizes the impacts that do occur; (2) a guarantee is provided that the proposed offset will occur; (3) biodiversity offsets are inappropriate for certain ecosystem (or habitat) types because of their rarity or the presence of threatened species within them; (4) offsets most often involve the creation of new habitat, but can include protection of existing habitat where there is currently no protection; (5) a clear currency is required that allows transparent quantification of the values to be lost and gained in order to ensure ecological equivalency between cleared and offset areas; (6) offsets must take into account both the uncertainty involved in obtaining the desired outcome for the offset area and the time lag involved in reaching that point.

  15. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    PubMed

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  16. msCompare: A Framework for Quantitative Analysis of Label-free LC-MS Data for Comparative Candidate Biomarker Studies*

    PubMed Central

    Hoekman, Berend; Breitling, Rainer; Suits, Frank; Bischoff, Rainer; Horvatovich, Peter

    2012-01-01

    Data processing forms an integral part of biomarker discovery and contributes significantly to the ultimate result. To compare and evaluate various publicly available open source label-free data processing workflows, we developed msCompare, a modular framework that allows the arbitrary combination of different feature detection/quantification and alignment/matching algorithms in conjunction with a novel scoring method to evaluate their overall performance. We used msCompare to assess the performance of workflows built from modules of publicly available data processing packages such as SuperHirn, OpenMS, and MZmine and our in-house developed modules on peptide-spiked urine and trypsin-digested cerebrospinal fluid (CSF) samples. We found that the quality of results varied greatly among workflows, and interestingly, heterogeneous combinations of algorithms often performed better than the homogenous workflows. Our scoring method showed that the union of feature matrices of different workflows outperformed the original homogenous workflows in some cases. msCompare is open source software (https://trac.nbic.nl/mscompare), and we provide a web-based data processing service for our framework by integration into the Galaxy server of the Netherlands Bioinformatics Center (http://galaxy.nbic.nl/galaxy) to allow scientists to determine which combination of modules provides the most accurate processing for their particular LC-MS data sets. PMID:22318370

  17. A surrogate-based sensitivity quantification and Bayesian inversion of a regional groundwater flow model

    NASA Astrophysics Data System (ADS)

    Chen, Mingjie; Izady, Azizallah; Abdalla, Osman A.; Amerjeed, Mansoor

    2018-02-01

    Bayesian inference using Markov Chain Monte Carlo (MCMC) provides an explicit framework for stochastic calibration of hydrogeologic models accounting for uncertainties; however, the MCMC sampling entails a large number of model calls, and could easily become computationally unwieldy if the high-fidelity hydrogeologic model simulation is time consuming. This study proposes a surrogate-based Bayesian framework to address this notorious issue, and illustrates the methodology by inverse modeling a regional MODFLOW model. The high-fidelity groundwater model is approximated by a fast statistical model using Bagging Multivariate Adaptive Regression Spline (BMARS) algorithm, and hence the MCMC sampling can be efficiently performed. In this study, the MODFLOW model is developed to simulate the groundwater flow in an arid region of Oman consisting of mountain-coast aquifers, and used to run representative simulations to generate training dataset for BMARS model construction. A BMARS-based Sobol' method is also employed to efficiently calculate input parameter sensitivities, which are used to evaluate and rank their importance for the groundwater flow model system. According to sensitivity analysis, insensitive parameters are screened out of Bayesian inversion of the MODFLOW model, further saving computing efforts. The posterior probability distribution of input parameters is efficiently inferred from the prescribed prior distribution using observed head data, demonstrating that the presented BMARS-based Bayesian framework is an efficient tool to reduce parameter uncertainties of a groundwater system.
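    The surrogate-then-MCMC workflow can be sketched as follows, with a quadratic least-squares surrogate standing in for BMARS and a toy two-parameter response standing in for the MODFLOW model. Everything here (model, observation, bounds, tuning) is an illustrative assumption, not the study's setup.

```python
# Sketch of the surrogate-based Bayesian inversion workflow with invented pieces:
# a cheap polynomial surrogate stands in for BMARS, and a toy 2-parameter
# "groundwater model" stands in for MODFLOW. Only the workflow shape is meant to match.
import numpy as np

rng = np.random.default_rng(2)

def expensive_model(k, r):                 # pretend head at an observation well
    return 12.0 + 3.0 * np.log(k) - 0.5 * r + 0.1 * r ** 2

# 1) design runs of the high-fidelity model to build a training set
K = rng.uniform(1.0, 10.0, 200)
R = rng.uniform(0.0, 5.0, 200)
H = expensive_model(K, R)

# 2) fit a cheap surrogate (quadratic polynomial via least squares)
def features(k, r):
    return np.column_stack([np.ones_like(k), np.log(k), r, r ** 2, np.log(k) * r])
coef, *_ = np.linalg.lstsq(features(K, R), H, rcond=None)
surrogate = lambda k, r: features(np.atleast_1d(k), np.atleast_1d(r)) @ coef

# 3) MCMC on the surrogate only (no further expensive-model calls)
h_obs, sig = 16.0, 0.3                     # assumed observed head and error std
def log_post(k, r):
    if not (1.0 <= k <= 10.0 and 0.0 <= r <= 5.0):
        return -np.inf
    return -0.5 * ((h_obs - surrogate(k, r)[0]) / sig) ** 2

x = np.array([5.0, 2.5])
chain = []
for _ in range(20000):
    prop = x + rng.normal(scale=[0.3, 0.2])
    if np.log(rng.uniform()) < log_post(*prop) - log_post(*x):
        x = prop
    chain.append(x.copy())
print("posterior means [K, R]:", np.array(chain[5000:]).mean(axis=0))
```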

  18. Current frontiers and future directions of telecoupling research

    NASA Astrophysics Data System (ADS)

    Liu, J.

    2016-12-01

    The world has become increasingly interconnected over long distances through processes such as international trade, migration, telecommunication, and disease spread. However, previous studies have often focused on either the socioeconomic or the environmental issues of distant processes. While these studies have generated useful information for individual disciplines, integrating socioeconomic and environmental information is essential for a holistic understanding of complex global challenges and for unbiased decision making to address them. To advance integrated research, the framework of telecoupling (socioeconomic and environmental interactions over distances) has been developed to explicitly address socioeconomic and environmental issues simultaneously. Although the framework is relatively new, it has already been applied to tackle a variety of globally important issues, such as food security, water resources, energy sustainability, land use, international trade (e.g., food, forest products, energy, wildlife, industrial products), species invasion, investment, ecosystem services, conservation, information dissemination, and tourism. These applications have identified many important research gaps (e.g., spillover systems) and hidden linkages (e.g., feedbacks) among distant areas of the world, with profound implications for sustainable development, ecosystem health, and human well-being. While working with telecoupling presents more challenges than focusing only on disciplinary issues, support from funding agencies has helped accelerate research on telecoupling, and more efforts are being aimed at framework quantification and operationalization. The presenter will provide an overview of the current frontiers, discuss future research directions, and highlight emerging opportunities and challenges in telecoupling research and governance.

  19. Experimental investigations and geochemical modelling of site-specific fluid-fluid and fluid-rock interactions in underground storage of CO2/H2/CH4 mixtures: the H2STORE project

    NASA Astrophysics Data System (ADS)

    De Lucia, Marco; Pilz, Peter

    2015-04-01

    Underground gas storage is increasingly regarded as a technically viable option for meeting the energy demand and environmental targets of many industrialized countries. Besides long-term CO2 sequestration, energy can be chemically stored in the form of CO2/CH4/H2 mixtures, for example resulting from excess wind energy. Gas storage in salt caverns is nowadays a mature technology; in regions where favorable geologic structures such as salt diapirs are not available, however, gas storage can only be implemented in porous media such as depleted gas and oil reservoirs or suitable saline aquifers. In such settings, a significant amount of in-situ gas components such as CO2, CH4 (and N2) will always be present, making the CO2/CH4/H2 system of particular interest. A precise estimation of the impact of such gas mixtures on the mineralogical, geochemical and petrophysical properties of specific reservoirs and caprocks is therefore crucial for site selection and optimization of storage depth. In the framework of the collaborative research project H2STORE, the feasibility of industrial-scale gas storage in porous media is being investigated by means of experiments and modelling on actual core materials from several potential siliciclastic depleted gas and oil reservoirs and suitable saline aquifers. Among them are the Altmark depleted gas reservoir in Saxony-Anhalt and the Ketzin pilot site for CO2 storage in Brandenburg (Germany). Further sites are located in the Molasse basin in southern Germany and Austria. In particular, two work packages hosted at the German Research Centre for Geosciences (GFZ) focus on the fluid-fluid and fluid-rock interactions triggered by CO2, H2 and their mixtures. Laboratory experiments expose core samples to hydrogen and CO2/hydrogen mixtures under site-specific conditions (temperatures up to 200 °C and pressures up to 300 bar). The resulting qualitative and, where possible, quantitative data are expected to improve the precision of predictive geochemical and reactive transport modelling, which is also performed within the project. The combination of experiments, chemical and mineralogical analyses and models is needed to improve the knowledge about: (1) solubility models and mixing rules for multicomponent gas mixtures in highly saline formation fluids, since no literature data are available for H2-charged gas mixtures under the conditions expected at the potential sites; (2) the chemical reactivity of different mineral assemblages and formation fluids over a broad spectrum of P-T conditions and compositions of the stored gas mixtures; (3) the thermodynamics and kinetics of relevant reactions involving mineral dissolution or precipitation. The resulting improvement in site characterization and the overall enhancement in understanding the potential processes will benefit the operational reliability, the ecological tolerance, and the economic efficiency of future energy storage plants, all crucial aspects for public acceptance and for industrial investors.

  20. Geomechanical Framework for Secure CO2 Storage in Fractured Reservoirs and Caprocks for Sedimentary Basins in the Midwest United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sminchak, Joel

    This report presents final technical results for the project Geomechanical Framework for Secure CO2 Storage in Fractured Reservoirs and Caprocks for Sedimentary Basins in the Midwest United States (DE-FE0023330). The project was a three-year effort consisting of seven technical tasks focused on defining geomechanical factors for CO2 storage applications in deep saline rock formations in Ohio and the Midwest United States, because geomechanical issues have been identified as a significant risk factor for large-scale CO2 storage applications. A basin-scale stress-strain analysis was completed to describe the geomechanical setting for rock formations of Ordovician-Cambrian age in Ohio and adjacent areas of the Midwest United States in relation to geologic CO2 storage applications. The tectonic setting, stress orientation-magnitude, and geomechanical and petrophysical parameters for CO2 storage zones and caprocks in the region were cataloged. Ten geophysical image logs were analyzed for natural fractures, borehole breakouts, and drilling-induced fractures. The logs indicated mostly less than 10 fractures per 100 vertical feet in the borehole, with a principal stress orientation of mostly N65E through the section. Geophysical image logs and other logs were obtained for three wells located near the sites where specific models were developed for geomechanical simulations: the Arches site in Boone County, Kentucky; the Northern Appalachian Basin site in Chautauqua County, New York; and the East-Central Appalachian Basin site in Tuscarawas County, Ohio. For these three wells, 9,700 feet of image logs were processed and interpreted to provide a systematic review of the distribution within each well of natural fractures, wellbore breakouts, faults, and drilling-induced fractures. There were many borehole breakouts and drilling-induced tensile fractures but few natural fractures. Concentrated fractures were present at the Rome-basal sandstone and basal sandstone-Precambrian contacts at the Arches and East-Central Appalachian Basin sites. Geophysical logs were utilized to develop local-scale geologic models by determining geomechanical and petrophysical parameters within the geologic formations. These data were ported to coupled fluid-flow and reservoir geomechanics multi-phase CO2 injection simulations. The models were developed to emphasize the geomechanical layers within the CO2 storage zones and caprocks. A series of simulations was completed for each site to evaluate whether commercial-scale CO2 could be safely injected into each site, given site-specific geologic and geomechanical controls. This involved analyzing the simulation results for the integrity of the caprock, intermediate, and reservoir zones, as well as quantifying the areal uplift at the surface. Simulation results were also examined to ensure that the stress perturbations were isolated within the subsurface, and that there was only limited upward migration of the CO2. Simulations showed capacity to inject more than 10 million metric tons of CO2 in a single well at the Arches and East-Central Appalachian Basin sites without excessive geomechanical risks. Low-permeability rock layers at the Northern Appalachian Basin study area well resulted in very low CO2 injection capacity. Fracture models developed for the sites suggest that the sites have a sparse fracture network in the deeper Cambrian rocks. However, there were indicators in image logs of a moderate fracture matrix in the Rose Run Sandstone at the Northern Appalachian Basin site. Dual-permeability fracture-matrix simulations suggest that much higher injection rates may be feasible in the fractured interval. Guidance was developed for geomechanical site characterization in the areas of geophysical logging, rock core testing, well testing, and site monitoring. The guidance demonstrates that there is a suitable array of options for addressing geomechanical issues at CO2 storage sites. Finally, a review of Marcellus and Utica-Point Pleasant shale gas wells and CO2 storage intervals indicates that these are vertically separated, except for the Oriskany sandstone and Marcellus wells in southwest Pennsylvania and northern West Virginia. Together, project results present a more realistic portrayal of geomechanical risk factors related to CO2 storage for existing and future coal-fired power plants in Ohio.

  1. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
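    A generic Monte Carlo sketch of the kind of probabilistic evaluation summarized above: assumed uncertainties in primitive variables are propagated through a response function, giving an empirical CDF of the response and a failure probability against an assumed capacity. The response function, distributions and numbers are placeholders, not the SSME/HPFT analysis.

```python
# Generic Monte Carlo illustration (not NASA's probabilistic structural codes):
# propagate assumed uncertainties in primitive variables through a toy response
# function, then read off an empirical CDF quantile and a failure probability.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

pressure = rng.normal(3000.0, 150.0, n)        # psi, assumed distribution
temperature = rng.normal(1200.0, 60.0, n)      # K, assumed distribution
thickness = rng.lognormal(np.log(2.0), 0.05, n)

# toy response: a blade stress measure increasing with load, decreasing with thickness
stress = 0.8 * pressure / thickness + 0.05 * temperature

capacity = 1450.0                              # assumed allowable stress (placeholder)
p_fail = np.mean(stress > capacity)
print("P(failure) ~", p_fail)
print("95th percentile response ~", np.quantile(stress, 0.95))
```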

  2. The sociogeometry of inequality: Part I

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo

    2015-05-01

    The study of socioeconomic inequality is of prime economic and social importance, and the key quantitative gauges of socioeconomic inequality are Lorenz curves and inequality indices, the most notable of the latter being the popular Gini index. In this series of papers we present a sociogeometric framework for the study of socioeconomic inequality. In this part we shift from the notion of Lorenz curves to the notion of Lorenz sets, define inequality indices in terms of Lorenz sets, and introduce and explore a collection of distance-based and width-based inequality indices stemming from the geometry of Lorenz sets. In particular, three principal diameters of Lorenz sets are established as meaningful quantitative gauges of socioeconomic inequality, thus providing a geometric quantification of socioeconomic inequality.
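    For orientation, the classic Gini index can be computed directly from a sample's Lorenz curve, as in the sketch below; the paper's Lorenz-set diameters are not reproduced here, and the income data are invented.

```python
# Minimal numeric sketch: Gini index from the Lorenz curve of a sample of incomes
# (the classic index only; data are invented).
import numpy as np

income = np.array([12_000, 18_000, 25_000, 40_000, 105_000], dtype=float)

x = np.sort(income)
cum_pop = np.arange(1, x.size + 1) / x.size        # cumulative population share
cum_inc = np.cumsum(x) / x.sum()                   # cumulative income share (Lorenz curve)

# Gini = 1 - 2 * area under the Lorenz curve (trapezoidal rule, including the origin)
lx = np.concatenate(([0.0], cum_pop))
ly = np.concatenate(([0.0], cum_inc))
area = np.sum((lx[1:] - lx[:-1]) * (ly[1:] + ly[:-1]) / 2.0)
gini = 1.0 - 2.0 * area
print(f"Gini index ~ {gini:.3f}")
```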

  3. Quantification of Behavioral Stereotypy in Flies

    NASA Astrophysics Data System (ADS)

    Manley, Jason; Berman, Gordon; Shaevitz, Joshua

    A commonly accepted assumption in the study of behavior is that an organism's behavioral repertoire can be represented by a relatively small set of stereotyped actions. Here, "stereotypy" is defined as a measure of the similarity of repetitions of a behavior. Our group utilizes data-driven analyses on videos of ground-based Drosophila to organize the set of spontaneous behaviors into a two-dimensional map, or behavioral space. We utilize this framework to define a metric for behavioral stereotypy. This measure quantifies the variance in a given behavior's periodic trajectory through a space representing its postural degrees of freedom. This newly developed behavioral metric has confirmed a high degree of stereotypy among most behaviors, and we correlate stereotypy with various physiological effects.

  4. Composable security proof for continuous-variable quantum key distribution with coherent states.

    PubMed

    Leverrier, Anthony

    2015-02-20

    We give the first composable security proof for continuous-variable quantum key distribution with coherent states against collective attacks. Crucially, in the limit of large blocks the secret key rate converges to the usual value computed from the Holevo bound. Combining our proof with either the de Finetti theorem or the postselection technique then shows the security of the protocol against general attacks, thereby confirming the long-standing conjecture that Gaussian attacks are optimal asymptotically in the composable security framework. We expect that our parameter estimation procedure, which does not rely on any assumption about the quantum state being measured, will find applications elsewhere, for instance, for the reliable quantification of continuous-variable entanglement in finite-size settings.

  5. Geometric quantification of features in large flow fields.

    PubMed

    Kendall, Wesley; Huang, Jian; Peterka, Tom

    2012-01-01

    Interactive exploration of flow features in large-scale 3D unsteady-flow data is one of the most challenging visualization problems today. To comprehensively explore the complex feature spaces in these datasets, a proposed system employs a scalable framework for investigating a multitude of characteristics from traced field lines. This capability supports the examination of various neighborhood-based geometric attributes in concert with other scalar quantities. Such an analysis wasn't previously possible because of the large computational overhead and I/O requirements. The system integrates visual analytics methods by letting users procedurally and interactively describe and extract high-level flow features. An exploration of various phenomena in a large global ocean-modeling simulation demonstrates the approach's generality and expressiveness as well as its efficacy.

  6. Gaussian geometric discord in terms of Hellinger distance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suciu, Serban, E-mail: serban.suciu@theory.nipne.ro; Isar, Aurelian

    2015-12-07

    In the framework of the theory of open systems based on completely positive quantum dynamical semigroups, we address the quantification of general non-classical correlations in Gaussian states of continuous variable systems from a geometric perspective. We give a description of the Gaussian geometric discord by using the Hellinger distance as a measure for quantum correlations between two non-interacting non-resonant bosonic modes embedded in a thermal environment. We evaluate the Gaussian geometric discord by taking two-mode squeezed thermal states as initial states of the system and show that it has finite values between 0 and 1 and that it decays asymptotically to zero in time under the effect of the thermal bath.
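    For reference, the quantum Hellinger distance underlying such a geometric discord is commonly defined through the affinity between the two states; the exact convention and normalization adopted in the paper may differ:

    \[
    d_H^2(\rho,\sigma) \;=\; 2\left[\,1 - \operatorname{Tr}\!\left(\sqrt{\rho}\,\sqrt{\sigma}\right)\right]
    \]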

  7. Application of MRIL-WD (Magnetic Resonance Imaging Logging While Drilling) for irreducible water saturation, total reservoir, free-fluid, bound-fluid porosity measurements and its value for the petrophysical analysis of RT/RM data from the Shah Deniz well

    NASA Astrophysics Data System (ADS)

    Amirov, Elnur

    2016-04-01

    Sperry-Sun (Sperry Drilling Services), the leader in MWD/LWD reliability, has developed the industry's first LWD NMR/MRIL-WD (nuclear magnetic resonance) tool. The MRIL-WD (magnetic resonance imaging logging-while-drilling) service directly measures the T1 component of hydrogen in subsurface rock units while drilling to obtain total reservoir porosity and to dissect the observed total porosity into its respective components of free-fluid and bound-fluid porosity. These T1 data are used to secure accurate total, free-fluid, capillary-bound water, and clay-bound water porosities of the reservoir sections, which can be drilled in several runs. Over the last decade, results from magnetic resonance imaging (NMR) logs have added significant value to petrophysical analysis and understanding by providing total, free-fluid and bound-fluid porosities, combined with fluid typing capabilities. With MRIL-WD, valuable real-time or recorded-memory data are now available during or shortly after the drilling operation (formation property measurements can be taken right after drill bit penetration), as well as while tripping in and out. A key requirement for utilizing MRIL in an LWD environment is motion-tolerant measurement. Recent MRIL-WD logging runs from the Shah Deniz wells located in the Khazarian-Caspian Sea of the Azerbaijan Republic helped to delineate and assess hydrocarbon-bearing zones. The acquired results demonstrate how MRIL data can be acquired while drilling and provide reliable, high-quality measurements. Magnetic resonance imaging logs at some development wells have become a cornerstone in formation evaluation and petrophysical understanding. By providing total, free-fluid, and bound-fluid porosities together with fluid typing, MRIL results have significantly added to the assessment of reservoirs. In order to reduce NPT (non-productive time) and save rig operating time, there is always a desire to obtain logging results as soon as possible, preferably while drilling new wells (logging-while-drilling, LWD). The MRIL-WD tool can accomplish these tasks reliably and in a timely manner, thus saving drilling time and reducing the overall risk for the well. Control of water production and identification of pay zones with high irreducible water saturation are also very important for formation evaluation and petrophysical analysis in oil fields located in the Azerbaijan Republic and in other fields around the world. These problems can sometimes delay completion decisions, creating additional expenses for field management. In many wells, breakthroughs in reservoir characterization have been achieved in directly determining hydrocarbon volumes, net permeability thickness, and hydrocarbon type, thus circumventing the problems associated with obtaining wireline data and the considerable amount of rig time required (so MRIL-WD can considerably reduce NPT). Some reservoir zones with relatively low water saturation, as calculated from conventional logs, can produce with a relatively high water cut, primarily because much of the water is movable. However, other zones with high calculated water saturation produce water-free hydrocarbons. The difficulty in predicting water production can be related to production from complex lithologies, which can contain low-permeability, medium- to fine-grained shaly sands. Where grains are small, the formations have high surface-to-volume ratios that result in high irreducible water saturation and, consequently, low resistivity values. As a result, when resistivity logs are used as a pay indicator, low-resistivity pay zones may be overlooked and net field pay underestimated. In the last few years, nuclear magnetic resonance logs have shown great promise in solving problems of formation evaluation that could not be directly resolved with conventional logs. The capabilities of MRIL-WD can help engineers differentiate between immovable and movable water in oil reservoirs in many fields. MRIL-WD has also been capable of providing better formation permeability estimates than conventional logs, a feature which can save time and expense in well-completion decisions. The RT and RM bound-fluid and total porosity measurements can provide tremendous new insight into the formation evaluation of shaly sands and low-resistivity pays. Unlike traditional porosity devices, which are affected by rock matrix changes, the MRIL-WD tool can be used in complex or mixed lithology sequences and provide measurements of porosity that are lithology-independent.

  8. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  9. Log evaluation in wells drilled with inverted oil emulsion mud. [GLOBAL program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, D.P.; Lacour-Gayet, P.J.; Suau, J.

    1981-01-01

    As greater use is made of inverted oil emulsion muds in the development of North Sea oil fields, the need for more precise log evaluation in this environment becomes apparent. This paper demonstrates an approach using the Dual Induction Log, taking into account invasion and boundary effects. Lithology and porosity are derived from the Formation Density or Litho-Density Log, Compensated Neutron Log, Sonic Log and the Natural Gamma Ray Spectrometry log. The effect of invasion by the oil component of the mud filtrate is treated in the evaluation, and a measurement of Moved Water is made. Computations of petrophysical properties are implemented by means of the GLOBAL interpretation program, taking advantage of its capability of adaptation to any combination of logging sensors. 8 refs.

  10. Petrochemical and petrophysical characterization of the lower crust and the Moho beneath the West African Craton, based on Xenoliths from Kimberlites

    NASA Technical Reports Server (NTRS)

    Haggerty, Stephen E.; Toft, Paul B.

    1988-01-01

    Additional evidence on the composition of the lower crust and uppermost mantle was presented in the form of xenolith data. Xenoliths from the 2.7-Ga West African Craton indicate that the Moho beneath this shield is a chemically and physically gradational boundary, with intercalations of garnet granulite and garnet eclogite. Inclusions in diamonds indicate a depleted upper mantle source, and xenolith barometry and thermometry data suggest a high mantle geotherm with a kink near the Moho. Metallic iron in the xenoliths indicates that the uppermost mantle has a significant magnetization, and that the depth to the Curie isotherm, which is usually considered to be at or above the Moho, may be deeper than the Moho.

  11. Seismic evidence for a crustal magma reservoir beneath the upper east rift zone of Kilauea volcano, Hawaii

    USGS Publications Warehouse

    Lin, Guoqing; Amelung, Falk; Lavallee, Yan; Okubo, Paul G.

    2014-01-01

    An anomalous body with low Vp (compressional wave velocity), low Vs (shear wave velocity), and high Vp/Vs anomalies is observed at 8–11 km depth beneath the upper east rift zone of Kilauea volcano in Hawaii by simultaneous inversion of seismic velocity structure and earthquake locations. We interpret this body to be a crustal magma reservoir beneath the volcanic pile, similar to those widely recognized beneath mid-ocean ridge volcanoes. Combined seismic velocity and petrophysical models suggest the presence of 10% melt in a cumulate magma mush. This reservoir could have supplied the magma that intruded into the deep section of the east rift zone and caused its rapid expansion following the 1975 M7.2 Kalapana earthquake.

  12. Petrophysical laboratory investigations of carbon dioxide storage in a subsurface saline aquifer in Ketzin/Germany within the scope of CO2SINK

    NASA Astrophysics Data System (ADS)

    Zemke, K.; Kummmerow, J.; Wandrey, M.; Co2SINK Group

    2009-04-01

    Since June 2008, carbon dioxide has been injected into a saline aquifer at the Ketzin test site [Würdemann et al., this volume]. The food-grade CO2 is injected into a sandstone zone of the Stuttgart formation at ca. 650 m depth, at 35°C reservoir temperature and 62 bar reservoir pressure. With the injection of CO2 into the geological formation, chemical and physical reservoir characteristics change depending on pressure, temperature, fluid chemistry and rock composition. Fluid-rock interaction could comprise dissolution of non-resistant minerals in CO2-bearing pore fluids, cementing of the pore space by substances precipitating from the pore fluid, and drying and disintegration of clay minerals, and could thus influence the composition and activities of the deep biosphere. To test the injection behaviour of CO2 in water-saturated rock and to evaluate the geophysical signature depending on the thermodynamic conditions, flow experiments with water and CO2 have been performed on cores of the Stuttgart formation from different locations, including new wells at the Ketzin test site. The studied core material is an unconsolidated fine-grained sandstone with porosity values from 15 to 32%. Permeability, electrical resistivity, and sonic wave velocities, and their changes with pressure, saturation and time, have been studied under simulated in situ conditions. The flow experiments conducted over several weeks with brine and CO2 showed no significant changes in resistivity and velocity and a slightly decreasing permeability. Pore fluid analysis showed mobilization of clay and some other components. A main objective of the CO2SINK laboratory program is the assessment of the effect of long-term CO2 exposure on reservoir rocks in order to predict the long-term behaviour of geological CO2 storage. For these CO2 exposure experiments, reservoir rock samples were exposed to CO2-saturated reservoir fluid in corrosion-resistant high-pressure vessels under in situ temperature and pressure conditions over a period of several months. Before and after the CO2 exposure experiment, cyclic measurements of physical properties were carried out on these cores in a mechanical testing system. After experimental runs of up to 3 months, no significant changes in flow and petrophysical data were observed. [For the microbiological studies see Wandrey et al., this volume.] To study the impact of fluid-rock interactions on petrophysical parameters, porosity and pore radii distribution have been investigated before and after the experiment by NMR relaxation and mercury injection. NMR measurements on rock core plugs saturated with brine may return valuable information on the porous structure of the rock core. The distribution of NMR T2 values (CPMG) reflects the pore sizes within the rock core; the NMR pore size is derived from the pore surface-to-volume ratio. The mercury injection pore size is an area-equivalent diameter of the throats connecting the pore system. In the NMR measurements, most of the tested samples show a slight increase in porosity and a higher proportion of large pores. The mercury measurements and thin-section analyses for microstructural characterisation after the CO2 exposure will be done at a later date.
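    The stated link between NMR T2 and the pore surface-to-volume ratio is usually written, in the fast-diffusion limit, through a surface relaxivity term; the symbols below are the standard generic ones, not values calibrated in this study:

    \[
    \frac{1}{T_2} \;\approx\; \frac{1}{T_{2,\mathrm{bulk}}} + \rho_2\,\frac{S}{V}
    \]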

  13. Integrated application of in situ non destructive techniques for the evaluation of the architectural elements of monumental structures.

    NASA Astrophysics Data System (ADS)

    Fais, Silvana; Casula, Giuseppe; Cuccuru, Francesco; Ligas, Paola; Bianchi, Maria Giovanna; Marraccini, Alessandro

    2017-04-01

    The need to integrate different non-invasive geophysical datasets for an effective diagnostic process of the stone materials of cultural heritage buildings is due to the complexity of the intrinsic characteristics of the different types of stones and of their degradation processes. Consequently, integration between different geophysical techniques is required for the characterization of stone building materials. In order to perform the diagnostic process with different non-invasive techniques and interpret the different geophysical parameters in a realistic way, it is necessary to link the petrophysical characteristics of the stones with the geophysical ones. In this study the complementary application of three different non-invasive techniques (terrestrial laser scanner (TLS), infrared thermography, and ultrasonic surface and tomography measurements) was carried out to analyse the conservation state and quality of the carbonate building materials of three inner columns of the precious old church of San Lorenzo in the historical city center of Cagliari (Sardinia). In previous works (Casula et al., 2009; Fais et al., 2015), the integrated application of TLS and ultrasonic techniques in particular has been demonstrated to be a powerful tool for evaluating the quality of stone building materials by solving or limiting the uncertainties typical of all indirect methods. Thanks to the terrestrial laser scanner (TLS) technique it was possible to build 3D models of the investigated columns and their surface geometrical anomalies. The TLS measurements were complemented by several ultrasonic in situ and laboratory tests in the 24 kHz - 54 kHz range. The ultrasonic parameters, especially longitudinal and transversal velocities, allow information related to the mechanical properties of the materials to be recovered. A good correlation between TLS surface geometrical anomalies and ultrasonic velocity anomalies is evident at the surface and in the shallow parts of the investigated architectural elements. To calibrate the geophysical results and provide reliable data for the interpretation, the petrophysical properties (porosity, density, water absorption) and petrographical characteristics (especially texture) of the carbonate building materials under study were examined. By combining petrographical, petrophysical, terrestrial laser scanner and ultrasonic techniques, a consistent diagnostic process of the carbonate building materials can be achieved to detect the presence of defects, fissures, fractures, weathering processes or compositional variations. The above diagnostic process is also very useful for evaluating the behavior of the carbonate building materials, facilitating the planning of urgent and long-term conservation programs and monitoring over time. References: Casula G, Fais S, Ligas P (2009) Experimental application of 3-D laser scanning and acoustic techniques in assessing the quality of stones used in monumental structures. Int J Microstruct. Mater. Prop. 4:45-56. doi: 10.1504/IJMMP.2009.028432; Fais S, Cuccuru F, Ligas P, Casula G, Bianchi MG (2015) Integrated ultrasonic, laser scanning and petrographical characterisation of carbonate building materials on an architectural structure of a historic building. Bull Eng Geol Environ. doi: 10.1007/s10064-015-0815-9. Acknowledgements: This work was supported by Regione Autonoma della Sardegna (RAS), Regional Law 7th August 2007, n. 7. The authors would also like to thank the Archidiocesi di Cagliari and Mons. Mario Ledda for their kind permission to work on the San Lorenzo Church.

  14. Subsurface multidisciplinary research results at ICTJA-CSIC downhole lab and test site

    NASA Astrophysics Data System (ADS)

    Jurado, Maria Jose; Crespo, Jose; Salvany, Josep Maria; Teixidó, Teresa

    2017-04-01

    Two scientific boreholes, Almera-1 and Almera-2, were drilled in the Barcelona University campus area in 2011. The main purpose of this drilling was to create a new geophysical logging and downhole monitoring research facility and infrastructure. We present results obtained in the framework of multidisciplinary studies and experiments carried out since 2011 at the ICTJA "Borehole Geophysical Logging Lab - Scientific Boreholes Almera" downhole lab facilities. First results obtained from the scientific drilling, coring and logging allowed us to characterize the urban subsurface geology and hydrology adjacent to the Institute of Earth Sciences Jaume Almera (ICTJA-CSIC) in Barcelona. The subsurface geological and structural picture has been completed with recent geophysical studies and monitoring results. The upper section of the 214 m deep Almera-1 hole was cased with PVC after the drilling and logging operations; an open-hole interval was left from 112 m to TD (Paleozoic section). Almera-2 drilling reached 46 m, and the hole was also cased with PVC to 44 m. Since completion of the drilling in 2011, both Almera-1 and Almera-2 have been extensively used for research purposes, tests, training, and hydrological and geophysical monitoring. A complete set of geophysical logging measurements and borehole oriented images was acquired in open-hole mode over the entire Almera-1 section. Open-hole measurements included acoustic and optical imaging, spectral natural gamma ray, full wave acoustic logging, magnetic susceptibility, hydrochemical-temperature logs and fluid sampling. Through-casing (PVC casing) measurements included spectral gamma ray logging, full wave sonic and acoustic televiewer. A Quaternary to Paleozoic section was characterized based on the interpretation of the geophysical logging and borehole images and on the complete set of (wireline) cores of the entire section. Sample availability allowed detailed characterization of geological macro- and micro-facies as well as mineralogical and petrophysical tests and analyses. The interpretation of the geophysical logging data, borehole oriented images and core data allowed us to define the stratigraphy, structures and petrophysical properties in the subsurface. Quaternary sediments unconformably overlie weathered, deformed and partially metamorphosed Paleozoic rocks. A gap in the Tertiary rocks at the drillsite was detected. Structures in intensely fractured and faulted sections were measured and have yielded valuable data for understanding the subsurface geology, hydrology and geological evolution of the area. Logging, borehole imaging and monitoring carried out in the scientific boreholes Almera-1 and Almera-2 have also allowed three preferential groundwater flow paths in the subsurface to be identified. Geophysical logging data combined with groundwater monitoring allowed us to identify three zones of high permeability in the subsurface. Logging data combined with core analysis were used to characterize the aquifer lithologies and their respective petrophysical properties. We also analyzed the aquifer dynamics and potential relationships between variations in groundwater levels and rainfall by comparing the groundwater monitoring results with rainfall records. A seismic survey, a vertical reverse pseudo-3D (2.5D) seismic tomography experiment, was carried out to outline the geological structures beyond the Almera-1 borehole. The results allowed us to define the geological structure beyond the borehole wall and to correlate the different geological units in the borehole with their geometry and spatial geophysical and seismic image.

  15. Crosswell seismic studies in gas hydrate-bearing sediments: P wave velocity and attenuation tomography

    NASA Astrophysics Data System (ADS)

    Bauer, K.; Haberland, Ch.; Pratt, R. G.; Ryberg, T.; Weber, M. H.; Mallik Working Group

    2003-04-01

    We present crosswell seismic data from the Mallik 2002 Production Research Well Program, an international research project on gas hydrates in the Northwest Territories of Canada. The program participants include eight partners: the Geological Survey of Canada (GSC), the Japan National Oil Corporation (JNOC), GeoForschungsZentrum Potsdam (GFZ), the United States Geological Survey (USGS), the United States Department of Energy (USDOE), the India Ministry of Petroleum and Natural Gas (MOPNG)/Gas Authority of India (GAIL), and the Chevron-BP-Burlington joint venture group. The crosswell seismic measurements were carried out by making use of two 1160 m deep observation wells (Mallik 3L-38 and 4L-38), both 45 m from and co-planar with the 1188 m deep production research well (5L-38). A high-power piezo-ceramic source was used to generate swept signals with frequencies between 100 and 2000 Hz, recorded with arrays of 8 hydrophones per depth level. A depth range between 800 and 1150 m was covered, with shot and receiver spacings of 0.75 m. High-quality data were collected during the survey, allowing for the application of a wide range of crosswell seismic methods. The initial data analysis included suppression of tube wave energy and picking of first arrivals. A damped least-squares algorithm was used to derive P-wave velocities from the travel time data. Next, t* values were derived from the decay of the amplitude spectra, which served as input parameters for a damped least-squares attenuation tomography. The initial results of the P-wave velocity and attenuation tomography reveal significant features reflecting the stratigraphic environment and allow for the detection and, eventually, quantification of gas hydrate-bearing sediments. A prominent correlation between P velocity and attenuation was found for the gas hydrate layers. This contradicts the apparently more meaningful inverse correlation determined for the gas hydrates at the Blake Ridge, but supports the results from the Mallik 2L-38 sonic log data. The P velocities and attenuation values, if combined with other information, can be important for the quantitative evaluation of the gas hydrate saturation, and may further constrain petrophysical models of the hydrate-bearing sediment formation.
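    The damped least-squares step mentioned above amounts to a Tikhonov-regularized normal-equations solve for slowness from picked travel times. The sketch below illustrates that solve on an invented ray geometry and damping value, not the Mallik acquisition.

```python
# Schematic damped least-squares (Tikhonov) travel-time inversion; the ray
# matrix, velocities, noise and damping are all invented for the illustration.
import numpy as np

rng = np.random.default_rng(4)

n_cells, n_rays = 50, 120
# G[i, j] = path length of ray i in cell j (m); sparse-ish by construction
G = rng.uniform(0.0, 20.0, (n_rays, n_cells)) * (rng.uniform(size=(n_rays, n_cells)) < 0.2)

s_true = np.full(n_cells, 1.0 / 2200.0)        # background slowness (s/m)
s_true[20:30] = 1.0 / 2600.0                   # a faster (hydrate-like) zone
t_obs = G @ s_true + rng.normal(0.0, 1e-4, n_rays)   # travel times + noise

# damped least squares: minimize ||G s - t||^2 + eps^2 ||s - s0||^2
s0 = np.full(n_cells, 1.0 / 2300.0)
eps = 5.0
A = G.T @ G + eps ** 2 * np.eye(n_cells)
s_est = np.linalg.solve(A, G.T @ t_obs + eps ** 2 * s0)

print("mean estimated velocity in anomaly:", 1.0 / s_est[20:30].mean())
```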

  16. Concise biomarker for spatial-temporal change in three-dimensional ultrasound measurement of carotid vessel wall and plaque thickness based on a graph-based random walk framework: Towards sensitive evaluation of response to therapy.

    PubMed

    Chiu, Bernard; Chen, Weifu; Cheng, Jieyu

    2016-12-01

    Rapid progression in total plaque area and volume measured from ultrasound images has been shown to be associated with an elevated risk of cardiovascular events. Since atherosclerosis is focal and occurs predominantly at the bifurcation, biomarkers that are able to quantify the spatial distribution of vessel-wall-plus-plaque thickness (VWT) change may allow for more sensitive detection of treatment effect. The goal of this paper is to develop simple and sensitive biomarkers to quantify the responsiveness to therapies based on the spatial distribution of VWT-Change on the entire 2D carotid standardized map previously described. Point-wise VWT-Changes computed for each patient were reordered lexicographically into a high-dimensional data node in a graph. A graph-based random walk framework was applied with a novel Weighted Cosine (WCos) similarity function tailored for quantification of responsiveness to therapy. The converging probability of each data node to the VWT regression template in the random walk process served as a scalar descriptor for VWT responsiveness to treatment. The WCos-based biomarker was 14 times more sensitive than the mean VWT-Change in discriminating responsive and unresponsive subjects, based on the p-values obtained in t-tests. The proposed framework was extended to quantify where VWT-Change occurred by including multiple VWT-Change distribution templates representing focal changes at different regions. Experimental results show that the framework was effective in classifying carotid arteries with focal VWT-Change at different locations and may facilitate future investigations correlating the risk of cardiovascular events with the location where focal VWT-Change occurs. Copyright © 2016 Elsevier Ltd. All rights reserved.
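
    The converging-probability idea can be illustrated with a generic absorbing random walk on a small graph. The sketch below uses a plain cosine-based edge weight and synthetic VWT-change vectors as stand-ins; the paper's Weighted Cosine (WCos) function and its templates are not reproduced here.

```python
import numpy as np

# Converging probability of each patient node to a "regression template" node in an
# absorbing random walk. Edge weights use a plain cosine similarity mapped to [0, 1]
# as a stand-in for the paper's WCos function; all data below are synthetic.

def cos_sim(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(1)
patients = rng.normal(0.0, 0.1, size=(5, 200)) + np.linspace(-0.2, 0.2, 5)[:, None]
templates = np.stack([-0.2 * np.ones(200),     # regression template (VWT decrease)
                       0.2 * np.ones(200)])    # progression template (VWT increase)

nodes = np.vstack([patients, templates])
n_abs = len(templates)

W = np.array([[(1.0 + cos_sim(x, y)) / 2.0 for y in nodes] for x in nodes])
np.fill_diagonal(W, 0.0)
P = W / W.sum(axis=1, keepdims=True)           # row-stochastic transition matrix

# Absorbing-chain algebra: B = (I - Q)^(-1) R gives absorption probabilities.
Q = P[:-n_abs, :-n_abs]                        # transient (patient) -> transient
R = P[:-n_abs, -n_abs:]                        # transient -> absorbing (templates)
B = np.linalg.solve(np.eye(len(Q)) - Q, R)
print("probability of converging to the regression template:", B[:, 0].round(3))
```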

  17. Bayesian analysis of rare events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
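
    The rejection-sampling reinterpretation that BUS builds on can be illustrated with a toy one-parameter problem. In the sketch below the prior, likelihood and likelihood bound c are all assumed for illustration; in practice the brute-force accept/reject loop is replaced by FORM, importance sampling or Subset Simulation, which estimate the (rare) acceptance event efficiently.

```python
import numpy as np

# Rejection-sampling view of Bayesian updating: draw theta from the prior and
# u ~ U(0,1); accept when u <= L(theta)/c, with c an upper bound on the likelihood.
# Accepted samples follow the posterior, and the acceptance event itself is a
# "rare event" that structural reliability methods can estimate efficiently.

rng = np.random.default_rng(42)

def likelihood(theta, obs=1.5, sigma=0.5):
    return np.exp(-0.5 * ((obs - theta) / sigma) ** 2)   # toy Gaussian likelihood

c = 1.0                                                  # upper bound of this likelihood
theta = rng.normal(0.0, 1.0, size=200_000)               # prior: standard normal
u = rng.uniform(0.0, 1.0, size=theta.size)
accepted = theta[u <= likelihood(theta) / c]

print(f"acceptance rate: {accepted.size / theta.size:.3f}")
print(f"posterior mean ~ {accepted.mean():.3f} (analytic: {1.5 / (1 + 0.25):.3f})")
```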

  18. Consistent Simulation Framework for Efficient Mass Discharge and Source Depletion Time Predictions of DNAPL Contaminants in Heterogeneous Aquifers Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Koch, J.

    2014-12-01

    Predicting DNAPL fate and transport in heterogeneous aquifers is challenging and subject to an uncertainty that needs to be quantified. Models for this task need to be equipped with an accurate source zone description, i.e., the distribution of mass of all partitioning phases (DNAPL, water, and soil) in all possible states ((im)mobile, dissolved, and sorbed), mass-transfer algorithms, and the simulation of transport processes in the groundwater. Such detailed models tend to be computationally cumbersome when used for uncertainty quantification. Therefore, a selective choice of the relevant model states, processes, and scales is both sensible and indispensable. We investigate two questions: what is a meaningful level of model complexity, and how can an efficient model framework be obtained that is still physically and statistically consistent? In our proposed model, aquifer parameters and the contaminant source architecture are conceptualized jointly as random space functions. The governing processes are simulated in a three-dimensional, highly resolved, stochastic, and coupled model that can predict probability density functions of mass discharge and source depletion times. We apply a stochastic percolation approach as an emulator to simulate the contaminant source formation, a random walk particle tracking method to simulate DNAPL dissolution and solute transport within the aqueous phase, and a quasi-steady-state approach to solve for DNAPL depletion times. Using this novel model framework, we test whether and to what degree the desired model predictions are sensitive to simplifications often found in the literature. In this way we identify aquifer heterogeneity, groundwater flow irregularity, uncertain and physically based contaminant source zones, and their mutual interlinkages as indispensable components of a sound model framework.
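
    The random walk particle tracking component can be sketched independently of the coupled DNAPL model. The toy example below assumes a uniform velocity field and an isotropic dispersion coefficient, purely to show the advective-plus-Gaussian displacement rule.

```python
import numpy as np

# Random-walk particle tracking: each particle is advected by the local velocity and
# perturbed by a Gaussian step encoding dispersion/diffusion:
#   x_{k+1} = x_k + v(x_k) dt + sqrt(2 D dt) * xi,   xi ~ N(0, I)

rng = np.random.default_rng(0)
n_particles, n_steps = 5_000, 200
dt, D = 0.1, 1e-2                          # time step [d], dispersion coefficient [m^2/d]

x = np.zeros((n_particles, 2))             # all particles start at the source (0, 0)

def velocity(x):
    # Toy uniform flow in +x; a real model would interpolate a heterogeneous field here.
    v = np.zeros_like(x)
    v[:, 0] = 0.5                          # m/d
    return v

for _ in range(n_steps):
    x += velocity(x) * dt + np.sqrt(2.0 * D * dt) * rng.standard_normal(x.shape)

print("mean plume position:", x.mean(axis=0).round(2))
print("plume spread (std): ", x.std(axis=0).round(2))
```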

  19. Dynamic and quantitative evaluation of degenerative mitral valve disease: a dedicated framework based on cardiac magnetic resonance imaging.

    PubMed

    Sturla, Francesco; Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto

    2017-04-01

    Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow's disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and to quantitatively assess valvular morphology and papillary muscle (PAPs) function over the cardiac cycle. Multiple comparisons were used to investigate the hallmarks associated with MV degenerative prolapse and to evaluate the feasibility of anatomical and functional distinction between the FED and BD phenotypes. On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects, while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole, with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of the MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment.

  20. Groundwater modelling as a tool for the European Water Framework Directive (WFD) application: The Llobregat case

    NASA Astrophysics Data System (ADS)

    Vázquez-Suñé, E.; Abarca, E.; Carrera, J.; Capino, B.; Gámez, D.; Pool, M.; Simó, T.; Batlle, F.; Niñerola, J. M.; Ibáñez, X.

    The European Water Framework Directive establishes the basis for Community action in the field of water policy. Water authorities in Catalonia, together with users, are designing a management program to improve groundwater status and to assess the impact of infrastructures and city-planning activities on the aquifers and their associated natural systems. The objective is to describe the role of groundwater modelling in addressing the issues raised by the Water Framework Directive, and its application to the Llobregat Delta, Barcelona, Spain. In this case modelling was used to address the Water Framework Directive in the following ways: (1) Characterisation of aquifers and the status of groundwater by integration of existing knowledge and new hydrogeological information. Inverse modelling allowed us to reach an accurate description of the paths and mechanisms for the evolution of seawater intrusion. (2) Quantification of the groundwater budget (mass balance). This is especially relevant for those terms that are difficult to assess, such as recharge from river infiltration during floods, which we have found to be very important. (3) Evaluation of groundwater-related environmental needs in aquatic ecosystems. The model allows quantifying groundwater input under natural conditions, which can be used as a reference level for stressed conditions. (4) Evaluation of possible impacts of territory planning (Llobregat river course modification, new railway tunnels, airport and docks enlargement, etc.). (5) Definition of management areas. (6) Assessment of possible future scenarios combined with optimization processes to quantify sustainable pumping rates and design measures to control seawater intrusion. The resulting model has been coupled to a user-friendly interface to allow water managers to design and address corrective measures in an agile and effective way.

  1. Snow Process Estimation Over the Extratropical Andes Using a Data Assimilation Framework Integrating MERRA Data and Landsat Imagery

    NASA Technical Reports Server (NTRS)

    Cortes, Gonzalo; Girotto, Manuela; Margulis, Steven

    2016-01-01

    A data assimilation framework was implemented with the objective of obtaining high resolution retrospective snow water equivalent (SWE) estimates over several Andean study basins. The framework integrates Landsat fractional snow covered area (fSCA) images, a land surface and snow depletion model, and the Modern Era Retrospective Analysis for Research and Applications (MERRA) reanalysis as a forcing data set. The outputs are SWE and fSCA fields (1985-2015) at a resolution of 90 m that are consistent with the observed depletion record. Verification using in-situ snow surveys showed significant improvements in the accuracy of the SWE estimates relative to forward model estimates, with increases in correlation (0.49-0.87) and reductions in root mean square error (0.316 m to 0.129 m) and mean error (-0.221 m to 0.009 m). A sensitivity analysis showed that the framework is robust to variations in physiography, fSCA data availability and a priori precipitation biases. Results from the application to the headwater basin of the Aconcagua River showed how the forward model versus the fSCA-conditioned estimate resulted in different quantifications of the relationship between runoff and SWE, and different correlation patterns between pixel-wise SWE and ENSO. The illustrative results confirm the influence that ENSO has on snow accumulation for Andean basins draining into the Pacific, with ENSO explaining approximately 25% of the variability in near-peak (1 September) SWE values. Our results show how the assimilation of fSCA data results in a significant improvement upon MERRA-forced modeled SWE estimates, further increasing the utility of the MERRA data for high-resolution snow modeling applications.
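
    The verification statistics quoted above (correlation, RMSE, mean error) are straightforward to compute for any estimate/observation pair; the sketch below uses synthetic SWE values in place of the snow-survey data.

```python
import numpy as np

# Verification metrics used above: correlation, root mean square error, mean error.
def verify(estimate, observed):
    err = estimate - observed
    return {
        "correlation": float(np.corrcoef(estimate, observed)[0, 1]),
        "rmse_m": float(np.sqrt(np.mean(err ** 2))),
        "mean_error_m": float(np.mean(err)),
    }

# Toy stand-ins for in-situ SWE surveys, a forward-model estimate and a conditioned estimate.
rng = np.random.default_rng(3)
obs = rng.uniform(0.1, 1.5, size=50)                    # observed SWE [m]
prior = obs * 0.7 + rng.normal(-0.2, 0.3, size=50)      # forward-model estimate
posterior = obs + rng.normal(0.0, 0.12, size=50)        # fSCA-conditioned estimate

print("prior:    ", verify(prior, obs))
print("posterior:", verify(posterior, obs))
```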

  2. Automatic detection of diseased regions in knee cartilage

    NASA Astrophysics Data System (ADS)

    Qazi, Arish A.; Dam, Erik B.; Olsen, Ole F.; Nielsen, Mads; Christiansen, Claus

    2007-03-01

    Osteoarthritis (OA) is a degenerative joint disease characterized by articular cartilage degradation. A central problem in clinical trials is quantification of progression and early detection of the disease. The accepted standard for evaluating OA progression is to measure the joint space width from radiographs; however, the cartilage itself is not visible there. Recently, cartilage volume and thickness measures from MRI have become popular, but these measures do not account for the biochemical changes occurring in the cartilage before cartilage loss even occurs and are therefore not optimal for early detection of OA. As a first step, we quantify cartilage homogeneity (computed as the entropy of the MR intensities) from 114 automatically segmented medial compartments of tibial cartilage sheets from Turbo 3D T1 sequences, from subjects with no, mild or severe OA symptoms. We show that homogeneity is a more sensitive technique than volume quantification for detecting early OA and for separating healthy individuals from diseased ones. During OA certain areas of the cartilage are affected more, and it is believed that these are the load-bearing regions located at the center of the cartilage. Based on the homogeneity framework, we present an automatic technique that partitions out the region of the cartilage that contributes most to homogeneity discrimination. These regions, however, lie more towards the non-central parts of the cartilage. Our observation will provide valuable clues to OA research and may lead to improving treatment efficacy.
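
    The homogeneity measure described above, entropy of the MR intensities within a segmented compartment, can be sketched directly. The intensity distributions below are synthetic stand-ins chosen only to show that a broader intensity spread yields a higher entropy; they are not modeled on the study data.

```python
import numpy as np

# Cartilage homogeneity as the Shannon entropy of MR intensities inside a segmented
# compartment: lower entropy corresponds to a more homogeneous intensity distribution.
def homogeneity_entropy(intensities, n_bins=64):
    hist, _ = np.histogram(intensities, bins=n_bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))     # entropy in bits

rng = np.random.default_rng(7)
narrow = rng.normal(100.0, 5.0, size=5_000)    # toy compartment with a narrow spread
broad = rng.normal(100.0, 15.0, size=5_000)    # toy compartment with a broader spread

print(f"narrow-spread entropy: {homogeneity_entropy(narrow):.2f} bits")
print(f"broad-spread entropy:  {homogeneity_entropy(broad):.2f} bits")
```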

  3. Parameter-induced uncertainty quantification of crop yields, soil N2O and CO2 emission for 8 arable sites across Europe using the LandscapeDNDC model

    NASA Astrophysics Data System (ADS)

    Santabarbara, Ignacio; Haas, Edwin; Kraus, David; Herrera, Saul; Klatt, Steffen; Kiese, Ralf

    2014-05-01

    When using biogeochemical models to estimate greenhouse gas emissions at site to regional/national levels, the assessment and quantification of the uncertainties of simulation results are of significant importance. The uncertainties in simulation results of process-based ecosystem models may result from uncertainties of the process parameters that describe the processes of the model, from model structure inadequacy, as well as from uncertainties in the observations. Data for development and testing of the uncertainty analysis were crop yield observations and measurements of soil fluxes of nitrous oxide (N2O) and carbon dioxide (CO2) from 8 arable sites across Europe. Using the process-based biogeochemical model LandscapeDNDC for simulating crop yields, N2O and CO2 emissions, our aim is to assess the simulation uncertainty by setting up a Bayesian framework based on the Metropolis-Hastings algorithm. Gelman convergence statistics and parallel computing techniques enable multiple Markov chains to run independently in parallel and create a random walk to estimate the joint model parameter distribution. From this distribution we limit the parameter space, obtain probabilities of parameter values and find the complex dependencies among them. With this parameter distribution, which determines soil-atmosphere C and N exchange, we are able to obtain the parameter-induced uncertainty of simulation results and compare it with the measurement data.
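
    A minimal random-walk Metropolis-Hastings sampler, the building block of the Bayesian framework described above, is sketched below with a toy two-parameter Gaussian log-posterior standing in for the LandscapeDNDC likelihood. Multiple such chains, monitored with the Gelman convergence statistic, would be run in parallel as the abstract describes.

```python
import numpy as np

# Random-walk Metropolis-Hastings sketch for sampling a model-parameter posterior.
# The log-posterior is a toy 2-parameter Gaussian; in practice it would be built
# from the model-observation misfit for yields, N2O and CO2 fluxes.

rng = np.random.default_rng(11)

def log_posterior(theta):
    return -0.5 * np.sum(((theta - np.array([1.0, -0.5])) / np.array([0.2, 0.4])) ** 2)

n_iter, step = 20_000, 0.15
chain = np.empty((n_iter, 2))
theta = np.zeros(2)
logp = log_posterior(theta)

accepted = 0
for i in range(n_iter):
    proposal = theta + step * rng.standard_normal(2)
    logp_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < logp_prop - logp:      # Metropolis acceptance rule
        theta, logp = proposal, logp_prop
        accepted += 1
    chain[i] = theta

burn = n_iter // 4
print("acceptance rate:", round(accepted / n_iter, 2))
print("posterior mean: ", chain[burn:].mean(axis=0).round(3))
```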

  4. The role of attention in the tinnitus decompensation: reinforcement of a large-scale neural decompensation measure.

    PubMed

    Low, Yin Fen; Trenado, Carlos; Delb, Wolfgang; Corona-Strauss, Farah I; Strauss, Daniel J

    2007-01-01

    Large-scale neural correlates of the tinnitus decompensation have been identified by using wavelet phase stability criteria of single sweep sequences of auditory late responses (ALRs). The suggested measure provided an objective quantification of the tinnitus decompensation and allowed for a reliable discrimination between a group of compensated and decompensated tinnitus patients. By interpreting our results with an oscillatory tinnitus model, our synchronization stability measure of ALRs can be linked to the focus of attention on the tinnitus signal. In the following study, we examined in detail the correlates of this attentional mechanism in healthy subjects. The results support our previous findings that the phase synchronization stability measure reflects neural correlates of the fixation of attention to the tinnitus signal, in this case enabling the differentiation between the attended and unattended conditions. It is concluded that the wavelet phase synchronization stability of ALR single sweeps can be used as an objective tinnitus decompensation measure and can be interpreted in the framework of the Jastreboff tinnitus model and adaptive resonance theory. Our studies confirm that the synchronization stability in ALR sequences is linked to attention. This measure is not only able to serve as an objective quantification of the tinnitus decompensation, but can also be applied in online and real-time neurofeedback therapeutic approaches where direct stimulus-locked attention monitoring is required, as it is based on single-sweep processing.

  5. Biofilm development of an opportunistic model bacterium analysed at high spatiotemporal resolution in the framework of a precise flow cell

    PubMed Central

    Lim, Chun Ping; Mai, Phuong Nguyen Quoc; Roizman Sade, Dan; Lam, Yee Cheong; Cohen, Yehuda

    2016-01-01

    The life of bacteria is governed by the physics of the microscale, which is dominated by fast diffusion and flow at low Reynolds numbers. Microbial biofilms are structurally and functionally heterogeneous, and their development is suggested to be interactively related to their microenvironments. In this study, we were guided by the challenging requirements of precise tools and engineered procedures to achieve reproducible experiments at high spatial and temporal resolutions. Here, we developed a robust precision-engineering approach allowing for the quantification of real-time, high-content imaging of biofilm behaviour under well-controlled flow conditions. Through the merging of engineering and microbial ecology, we present a rigorous methodology to quantify biofilm development at resolutions of a single micrometre and a single minute, using a newly developed flow cell. We designed and fabricated a high-precision flow cell to create defined and reproducible flow conditions. We applied high-content confocal laser scanning microscopy and developed image quantification using a model biofilm of a defined opportunistic strain, Pseudomonas putida OUS82. We observed complex patterns in the early events of biofilm formation, which were followed by total dispersal. These patterns were closely related to the flow conditions. These biofilm behavioural phenomena were found to be highly reproducible, despite the heterogeneous nature of biofilm. PMID:28721252

  6. Uncertainty quantification for personalized analyses of human proximal femurs.

    PubMed

    Wille, Hagen; Ruess, Martin; Rank, Ernst; Yosibash, Zohar

    2016-02-29

    Computational models for the personalized analysis of human femurs contain uncertainties in bone material properties and loads, which affect the simulation results. To quantify their influence, we developed a probabilistic framework based on polynomial chaos (PC) that propagates stochastic input variables through any computational model. We considered a stochastic E-ρ relationship and a stochastic hip contact force, representing realistic variability of experimental data. Their influence on the prediction of principal strains (ϵ1 and ϵ3) was quantified for one human proximal femur, including sensitivity and reliability analysis. Large variabilities in the principal strain predictions were found in the cortical shell of the femoral neck, with coefficients of variation of ≈40%. Between 60 and 80% of the variance in ϵ1 and ϵ3 is attributable to the uncertainty in the E-ρ relationship, while ≈10% is caused by the load magnitude and 5-30% by the load direction. Principal strain directions were unaffected by material and loading uncertainties. The antero-superior and medial inferior sides of the neck exhibited the largest probabilities for tensile and compression failure; however, all were very small (pf<0.001). In summary, uncertainty quantification with PC has been demonstrated to efficiently and accurately describe the influence of very different stochastic inputs, which increases the credibility and explanatory power of personalized analyses of human proximal femurs. Copyright © 2015 Elsevier Ltd. All rights reserved.
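
    A non-intrusive polynomial chaos propagation step can be sketched for a single standard-normal input. The toy response below (a load-over-stiffness strain) stands in for the femur finite element model; the projection onto probabilists' Hermite polynomials and the mean/variance read-out are the generic PC machinery, not the study's implementation.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Non-intrusive polynomial chaos: project a model output onto probabilists' Hermite
# polynomials of a standard-normal input and read off mean and variance.
# The "model" is a toy strain = load / stiffness response, not the femur FE model.

def model(xi):
    E = 15.0 + 2.0 * xi                 # toy stochastic stiffness (GPa)
    return 1.5e-3 * 15.0 / E            # toy principal strain response

order, n_quad = 4, 10
nodes, weights = hermegauss(n_quad)     # Gauss quadrature for weight exp(-x^2/2)
norm = np.sqrt(2.0 * np.pi)

coeffs = []
for n in range(order + 1):
    He_n = hermeval(nodes, np.eye(order + 1)[n])                 # He_n at the nodes
    coeffs.append(np.sum(weights * model(nodes) * He_n) / (norm * factorial(n)))

mean = coeffs[0]
variance = sum(c ** 2 * factorial(n) for n, c in enumerate(coeffs) if n > 0)
print(f"PC mean: {mean:.6f}, PC std: {np.sqrt(variance):.2e}")

# Monte Carlo check of the PC moments
xi = np.random.default_rng(0).standard_normal(200_000)
print(f"MC mean: {model(xi).mean():.6f}, MC std: {model(xi).std():.2e}")
```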

  7. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.
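
    The Gaussian-process emulation step mentioned above can be sketched with a generic surrogate: fit a GP to a few runs of an expensive model, then push posterior parameter samples through the emulator to propagate uncertainty into a predicted observable. The two-parameter toy model, the design, and the assumed posterior below are all illustrative stand-ins, not the Skyrme-functional setup of the study.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Fit a cheap GP surrogate to a handful of "expensive" model runs, then propagate
# posterior parameter samples through it to get a prediction with uncertainty.

rng = np.random.default_rng(8)

def expensive_model(theta):
    # Toy smooth observable of two parameters (stand-in for a computed nuclear mass).
    return 100.0 + 3.0 * theta[:, 0] - 2.0 * theta[:, 1] + 0.5 * theta[:, 0] * theta[:, 1]

train_theta = rng.uniform(-1.0, 1.0, size=(30, 2))            # design points
train_y = expensive_model(train_theta)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=1.0),
                              normalize_y=True)
gp.fit(train_theta, train_y)

# Assumed posterior over the two parameters (would come from the Bayesian calibration).
posterior_theta = rng.multivariate_normal([0.2, -0.1], 0.05 * np.eye(2), size=5_000)
pred_mean, pred_std = gp.predict(posterior_theta, return_std=True)

print(f"propagated prediction: {pred_mean.mean():.2f} +/- {pred_mean.std():.2f} (parametric)")
print(f"mean emulator std:     {pred_std.mean():.3f}")
```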

  8. Performance of U-net based pyramidal lucas-kanade registration on free-breathing multi-b-value diffusion MRI of the kidney.

    PubMed

    Lv, Jun; Huang, Wenjian; Zhang, Jue; Wang, Xiaoying

    2018-06-01

    In free-breathing multi-b-value diffusion-weighted imaging (DWI), a series of images typically requires several minutes to collect. During respiration the kidney is routinely displaced and may also undergo deformation. These respiratory motion effects generate artifacts, which are the main sources of error in the quantification of intravoxel incoherent motion (IVIM) derived parameters. This work proposes a fully automated framework that combines kidney segmentation with registration to improve registration accuracy. 10 healthy subjects were recruited to participate in this experiment. For the segmentation, U-net was adopted to acquire the kidney's contour. The segmented kidney then served as a region of interest (ROI) for the registration method, known as pyramidal Lucas-Kanade. Our proposed framework confines the kidney's solution range, thus increasing the pyramidal Lucas-Kanade method's accuracy. To demonstrate the feasibility of our presented framework, eight regions of interest were selected in the cortex and medulla, and data stability was estimated by comparing the normalized root-mean-square error (NRMSE) values of the data fitted with the bi-exponential intravoxel incoherent motion model pre- and post-registration. The results show that the NRMSE was significantly lower after registration both in the cortex (p < 0.05) and medulla (p < 0.01) during free-breathing measurements. In addition, expert visual scoring of the derived apparent diffusion coefficient (ADC), f, D and D* maps indicated there were significant improvements in the alignment of the kidney in the post-registered images. The proposed framework can effectively reduce the motion artifacts of misaligned multi-b-value DWIs and the inaccuracies of the ADC, f, D and D* estimations. Advances in knowledge: This study demonstrates the feasibility of our proposed fully automated framework combining U-net based segmentation and the pyramidal Lucas-Kanade registration method for improving the alignment of multi-b-value diffusion-weighted MRIs and reducing the inaccuracy of parameter estimation during free-breathing.
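
    The bi-exponential IVIM model and the NRMSE comparison described above can be sketched for a single voxel or ROI signal. The b-values, parameter bounds and noise level below are illustrative assumptions, and NRMSE is normalized here by the signal range (conventions vary).

```python
import numpy as np
from scipy.optimize import curve_fit

# Bi-exponential IVIM model: S(b)/S0 = f * exp(-b * D_star) + (1 - f) * exp(-b * D).
# The signal below is synthetic; registered multi-b-value DWI data would be used in practice.

def ivim(b, s0, f, d_star, d):
    return s0 * (f * np.exp(-b * d_star) + (1.0 - f) * np.exp(-b * d))

b_values = np.array([0, 10, 20, 50, 100, 200, 400, 600, 800], dtype=float)  # s/mm^2
rng = np.random.default_rng(5)
signal = ivim(b_values, 1.0, 0.2, 0.02, 0.0018) + rng.normal(0, 0.01, b_values.size)

p0 = (1.0, 0.1, 0.01, 0.001)
bounds = ([0, 0, 0.003, 0], [2, 1, 0.5, 0.003])     # keep D* > D (illustrative bounds)
popt, _ = curve_fit(ivim, b_values, signal, p0=p0, bounds=bounds)
fit = ivim(b_values, *popt)

nrmse = np.sqrt(np.mean((fit - signal) ** 2)) / (signal.max() - signal.min())
print("S0, f, D*, D:", np.round(popt, 4))
print(f"NRMSE: {nrmse:.4f}")
```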

  9. Food waste quantification in primary production - The Nordic countries as a case study.

    PubMed

    Hartikainen, Hanna; Mogensen, Lisbeth; Svanes, Erik; Franke, Ulrika

    2018-01-01

    Our understanding of food waste in the food supply chain has increased, but very few studies have been published on food waste in primary production. The overall aims of this study were to quantify the total amount of food waste in primary production in Finland, Sweden, Norway and Denmark, and to create a framework for how to define and quantify food waste in primary production. The quantification of food waste was based on case studies conducted in the present study and estimates published in scientific literature. The chosen scope of the study was to quantify the amount of edible food (excluding inedible parts like peels and bones) produced for human consumption that did not end up as food. As a result, the quantification was different from the existing guidelines. One of the main differences is that food that ends up as animal feed is included in the present study, whereas this is not the case for the recently launched food waste definition of the FUSIONS project. To distinguish the 'food waste' definition of the present study from the existing definitions and to avoid confusion with established usage of the term, a new term 'side flow' (SF) was introduced as a synonym for food waste in primary production. A rough estimate of the total amount of food waste in primary production in Finland, Sweden, Norway and Denmark was made using SF and 'FUSIONS Food Waste' (FFW) definitions. The SFs in primary production in the four Nordic countries were an estimated 800,000 tonnes per year with an additional 100,000 tonnes per year from the rearing phase of animals. The 900,000 tonnes per year of SF corresponds to 3.7% of the total production of 24,000,000 tonnes per year of edible primary products. When using the FFW definition proposed by the FUSIONS project, the FFW amount was estimated at 330,000 tonnes per year, or 1% of the total production. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Quantification of in vivo metabolic kinetics of hyperpolarized pyruvate in rat kidneys using dynamic 13C MRSI.

    PubMed

    Xu, Tao; Mayer, Dirk; Gu, Meng; Yen, Yi-Fen; Josan, Sonal; Tropp, James; Pfefferbaum, Adolf; Hurd, Ralph; Spielman, Daniel

    2011-10-01

    With signal-to-noise ratio enhancements on the order of 10,000-fold, hyperpolarized MRSI of metabolically active substrates allows the study of both the injected substrate and downstream metabolic products in vivo. Although hyperpolarized [1-(13)C]pyruvate, in particular, has been used to demonstrate metabolic activities in various animal models, robust quantification and metabolic modeling remain important areas of investigation. Enzyme saturation effects are routinely seen with commonly used doses of hyperpolarized [1-(13)C]pyruvate; however, most metrics proposed to date, including metabolite ratios, time-to-peak of metabolic products and single exchange rate constants, fail to capture these saturation effects. In addition, the widely used small-flip-angle excitation approach does not correctly model the inflow of fresh downstream metabolites generated proximal to the target slice, which is often a significant factor in vivo. In this work, we developed an efficient quantification framework employing a spiral-based dynamic spectroscopic imaging approach. The approach overcomes the aforementioned limitations and demonstrates that the in vivo (13)C labeling of lactate and alanine after a bolus injection of [1-(13)C]pyruvate is well approximated by saturatable kinetics, which can be mathematically modeled using a Michaelis-Menten-like formulation, with the resulting estimated apparent maximal reaction velocity V(max) and apparent Michaelis constant K(M) being unbiased with respect to critical experimental parameters, including the substrate dose, bolus shape and duration. Although the proposed saturatable model has a similar mathematical formulation to the original Michaelis-Menten kinetics, it is conceptually different. In this study, we focus on the (13)C labeling of lactate and alanine and do not differentiate the labeling mechanism (net flux or isotopic exchange) or the respective contribution of various factors (organ perfusion rate, substrate transport kinetics, enzyme activities and the size of the unlabeled lactate and alanine pools) to the labeling process. Copyright © 2011 John Wiley & Sons, Ltd.
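
    The saturatable, Michaelis-Menten-like labeling kinetics can be sketched as a small ODE. The bolus shape, the added signal-decay term and all rate constants below are illustrative assumptions, not fitted rat-kidney values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Saturatable (Michaelis-Menten-like) 13C labeling sketch:
#   dL/dt = Vmax * P(t) / (Km + P(t)) - kd * L
# with P(t) the pyruvate signal from a toy bolus and kd a simple signal-decay term.

Vmax, Km, kd = 0.8, 2.0, 0.05            # a.u./s, a.u., 1/s (toy values)

def pyruvate(t, t0=8.0, width=4.0, amp=10.0):
    return amp * np.exp(-0.5 * ((t - t0) / width) ** 2)   # Gaussian-shaped bolus

def rhs(t, y):
    L = y[0]
    P = pyruvate(t)
    return [Vmax * P / (Km + P) - kd * L]

sol = solve_ivp(rhs, (0.0, 60.0), [0.0], t_eval=np.linspace(0.0, 60.0, 121))
print(f"peak lactate signal: {sol.y[0].max():.2f} a.u. at t = {sol.t[np.argmax(sol.y[0])]:.0f} s")
```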

  11. Factors influencing the quality of Myrmecia pilosula (Jack Jumper) ant venom for use in in vitro and in vivo diagnoses of allergen sensitization and in allergen immunotherapy.

    PubMed

    Wanandy, T; Dwyer, H E; McLean, L; Davies, N W; Nichols, D; Gueven, N; Brown, S G A; Wiese, M D

    2017-11-01

    Allergen immunotherapy uses pharmaceutical preparations derived from naturally occurring source materials, which contain water-soluble allergenic components responsible for allergic reactions. The success of in vivo and in vitro diagnoses of allergen sensitization and of allergen immunotherapy largely depends on the quality, composition and uniformity of the allergenic materials used to produce the active ingredients, and on the formulation employed to prepare finished products. We aimed to examine the factors influencing batch-to-batch consistency of Jack Jumper (Myrmecia pilosula) ant venom (JJAV) in the form of active pharmaceutical ingredient (AI) and to determine whether factors such as temperature, artificial light and container materials influence the quality of JJAV AIs. We also aimed to establish handling and storage requirements for JJAV AIs to ensure preservation of allergenic activities during usage in the diagnosis of allergen sensitization and in allergen immunotherapy. The quality and consistency of JJAV AIs were analysed using a combination of the bicinchoninic acid assay for total protein quantification, HPLC-UV for JJAV allergen peptide quantification, ELISA inhibition for total allergenic potency, SDS-PAGE, AU-PAGE and immunoblot for qualitative assessment of JJAV components, and the Limulus Amebocyte Lysate assay for the quantification of endotoxin concentration. API-ZYM and Zymogram assays were used to probe the presence of enzymatic activities in JJAV. Pharmaceutical-grade JJAV for allergen immunotherapy has good batch-to-batch consistency. Temporary storage at 4°C and light exposure do not affect the quality of JJAV. Exposure to temperatures above 40°C degrades high MW allergens in JJAV. Vials containing JJAV must be stored frozen and in an upright position during long-term storage. We have identified factors that can influence the quality and consistency of JJAV AIs, and provided a framework for appropriate handling, transport and storage of JJAV to be used for the diagnosis of allergen sensitization and in AIT. © 2017 John Wiley & Sons Ltd.

  12. Upscaling mixing in porous media from an experimental quantification of pore scale Lagrangian deformation statistics

    NASA Astrophysics Data System (ADS)

    Turuban, R.; Jimenez-Martinez, J.; De Anna, P.; Tabuteau, H.; Meheust, Y.; Le Borgne, T.

    2014-12-01

    As dissolved chemical elements are transported in the subsurface, their mixing with other compounds and their potential reactivity depend on the creation of local-scale chemical gradients, which ultimately drive diffusive mass transfer and reaction. The distribution of concentration gradients is in turn shaped by the spatial gradients of flow velocity arising from the random distribution of solid grains. We present an experimental investigation of the relationship between the microscale flow stretching properties and the effective large-scale mixing dynamics in porous media. We use a flow cell that models a horizontal quasi two-dimensional (2D) porous medium, the grains of which are cylinders randomly positioned between two glass plates [de Anna et al. 2013]. In this setup, we perform both non-diffusive and diffusive transport tests, by injecting respectively microsphere solid tracers and a fluorescent dye. While the dye front propagates through the medium, it undergoes in time a kinematic stretching that is controlled by the flow heterogeneity, as it encounters stagnation zones and high-velocity channels between the grains. The spatial distribution of the dye can then be described as a set of stretched lamellae whose rate of diffusive smoothing is locally enhanced by kinematic stretching [Le Borgne et al., 2013]. We show that this representation allows predicting the temporal evolution of the mixing rate and the probability distribution of concentration gradients for a range of Peclet numbers. This upscaling framework hence provides a quantification of the dynamics of effective mixing from the microscale Lagrangian velocity statistics. References: [1] P. de Anna, J. Jimenez-Martinez, H. Tabuteau, R. Turuban, T. Le Borgne, M. Derrien, and Y. Méheust, Mixing and reaction kinetics in porous media: an experimental pore scale quantification, Environ. Sci. Technol. 48, 508-516, 2014. [2] Le Borgne, T., M. Dentz, E. Villermaux, Stretching, coalescence and mixing in porous media, Phys. Rev. Lett., 110, 204501 (2013).

  13. A Python Interface for the Dakota Iterative Systems Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E.; Syvitski, J. P.

    2016-12-01

    Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user must not only understand the structure and syntax of the Dakota input file, but also develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six Dakota analysis methods, drawn from the much larger Dakota library, have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method. Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and cumulative distribution functions for the response.
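
    The Basic Model Interface idea referenced above can be illustrated schematically: a model exposing initialize/update/finalize plus value getters and setters can be driven by an external tool such as the CDI. Only a simplified subset of BMI-like methods with loose signatures is shown; the official BMI specification and the CDI's exact requirements differ.

```python
# Schematic of the BMI idea: a model wrapped with initialize/update/finalize and
# get_value/set_value methods can be controlled by an external driver (e.g. a UQ tool).
# This is a toy subset of BMI-like methods, not the official specification.


class ToyBmiModel:
    """A toy water-balance model exposing BMI-like methods."""

    def initialize(self, config=None):
        self._time = 0.0
        self._dt = 1.0
        self._precip = 1.0          # adjustable input parameter
        self._storage = 0.0         # state variable

    def update(self):
        self._storage += self._dt * (self._precip - 0.1 * self._storage)
        self._time += self._dt

    def finalize(self):
        pass

    def get_value(self, name):
        return {"precipitation": self._precip, "storage": self._storage}[name]

    def set_value(self, name, value):
        if name == "precipitation":
            self._precip = float(value)


# A driver only needs these methods to perturb inputs and collect responses.
model = ToyBmiModel()
model.initialize()
model.set_value("precipitation", 1.5)
for _ in range(100):
    model.update()
print("storage after 100 steps:", round(model.get_value("storage"), 2))
model.finalize()
```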

  14. Applying time series Landsat data for vegetation change analysis in the Florida Everglades Water Conservation Area 2A during 1996-2016

    NASA Astrophysics Data System (ADS)

    Zhang, Caiyun; Smith, Molly; Lv, Jie; Fang, Chaoyang

    2017-05-01

    Mapping plant communities and documenting their changes are critical to the on-going Florida Everglades restoration project. In this study, a framework was designed to map dominant vegetation communities and inventory their changes in the Florida Everglades Water Conservation Area 2A (WCA-2A) using time series Landsat images spanning 1996-2016. An object-based change analysis technique was incorporated into the framework. A hybrid pixel/object-based change detection approach was developed to effectively collect training samples for historical images with sparse reference data. An object-based quantification approach was also developed to assess the expansion/reduction of a specific class, such as cattail (an invasive species in the Everglades), from the object-based classifications of two dates of imagery. The study confirmed results in the literature that cattail expanded substantially during 1996-2007. It also revealed that cattail expansion was constrained after 2007. Application of time series Landsat data is valuable for documenting vegetation changes in the WCA-2A impoundment. The digital techniques developed will benefit global wetland mapping and change analysis in general, and the Florida Everglades WCA-2A in particular.
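
    The expansion/reduction quantification for a class such as cattail reduces to per-class area bookkeeping between two classified maps. The sketch below uses synthetic 3-class rasters and an illustrative legend; the real inputs would be the object-based classifications of two Landsat dates.

```python
import numpy as np

# Per-class area change between two classified rasters. The maps and the legend are
# synthetic placeholders; 30 m Landsat pixels are assumed for the area conversion.

CLASSES = {1: "sawgrass", 2: "cattail", 3: "open water"}   # illustrative legend
PIXEL_AREA_HA = 0.09                                       # 30 m x 30 m pixel in hectares

rng = np.random.default_rng(2016)
map_date1 = rng.choice([1, 2, 3], p=[0.7, 0.1, 0.2], size=(500, 500))
map_date2 = rng.choice([1, 2, 3], p=[0.5, 0.3, 0.2], size=(500, 500))

for code, name in CLASSES.items():
    a0 = np.count_nonzero(map_date1 == code) * PIXEL_AREA_HA
    a1 = np.count_nonzero(map_date2 == code) * PIXEL_AREA_HA
    print(f"{name:>10}: {a0:9.1f} ha -> {a1:9.1f} ha  ({a1 - a0:+.1f} ha)")
```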

  15. Data-driven reduced order models for effective yield strength and partitioning of strain in multiphase materials

    NASA Astrophysics Data System (ADS)

    Latypov, Marat I.; Kalidindi, Surya R.

    2017-10-01

    There is a critical need for the development and verification of practically useful multiscale modeling strategies for simulating the mechanical response of multiphase metallic materials with heterogeneous microstructures. In this contribution, we present data-driven reduced order models for effective yield strength and strain partitioning in such microstructures. These models are built employing the recently developed framework of Materials Knowledge Systems that employ 2-point spatial correlations (or 2-point statistics) for the quantification of the heterostructures and principal component analyses for their low-dimensional representation. The models are calibrated to a large collection of finite element (FE) results obtained for a diverse range of microstructures with various sizes, shapes, and volume fractions of the phases. The performance of the models is evaluated by comparing the predictions of yield strength and strain partitioning in two-phase materials with the corresponding predictions from a classical self-consistent model as well as results of full-field FE simulations. The reduced-order models developed in this work show an excellent combination of accuracy and computational efficiency, and therefore present an important advance towards computationally efficient microstructure-sensitive multiscale modeling frameworks.
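
    The two building blocks named above, 2-point statistics and principal component analysis, can be sketched with FFT-based autocorrelations of toy two-phase microstructures followed by an SVD-based PCA. The microstructure generator and ensemble size below are illustrative, not the FE-calibrated dataset of the study.

```python
import numpy as np

# FFT-based periodic 2-point autocorrelation of an indicator field, followed by a PCA
# of the resulting statistics over an ensemble of toy two-phase microstructures.

rng = np.random.default_rng(4)

def two_point_autocorr(phase_map):
    """Periodic 2-point autocorrelation of a 0/1 indicator field via FFT."""
    F = np.fft.fftn(phase_map)
    corr = np.fft.ifftn(F * np.conj(F)).real / phase_map.size
    return np.fft.fftshift(corr)

def toy_microstructure(n=64, vf=0.3):
    """Smoothed, thresholded noise as a stand-in two-phase microstructure."""
    kx, ky = np.meshgrid(np.fft.fftfreq(n) * n, np.fft.fftfreq(n) * n)
    field = np.fft.ifftn(np.fft.fftn(rng.standard_normal((n, n))) *
                         np.exp(-0.05 * np.hypot(kx, ky))).real
    return (field > np.quantile(field, 1.0 - vf)).astype(float)

stats = np.array([two_point_autocorr(toy_microstructure(vf=v)).ravel()
                  for v in rng.uniform(0.2, 0.5, size=40)])

# PCA via SVD of the centered 2-point statistics.
centered = stats - stats.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S                                   # low-dimensional microstructure descriptors
explained = (S ** 2) / np.sum(S ** 2)
print("variance explained by first 3 PCs:", explained[:3].round(3))
print("PC scores shape:", scores[:, :3].shape)
```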

  16. Structural and parametric uncertainty quantification in cloud microphysics parameterization schemes

    NASA Astrophysics Data System (ADS)

    van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.; Martinkus, C.

    2017-12-01

    Atmospheric model parameterization schemes employ approximations to represent the effects of unresolved processes. These approximations are a source of error in forecasts, caused in part by considerable uncertainty about the optimal value of parameters within each scheme -- parametric uncertainty. Furthermore, there is uncertainty regarding the best choice of the overarching structure of the parameterization scheme -- structural uncertainty. Parameter estimation can constrain the first, but may struggle with the second because structural choices are typically discrete. We address this problem in the context of cloud microphysics parameterization schemes by creating a flexible framework wherein structural and parametric uncertainties can be simultaneously constrained. Our scheme makes no assumptions about drop size distribution shape or the functional form of parametrized process rate terms. Instead, these uncertainties are constrained by observations using a Markov Chain Monte Carlo sampler within a Bayesian inference framework. Our scheme, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), has the flexibility to predict various sets of prognostic drop size distribution moments as well as varying complexity of process rate formulations. We compare idealized probabilistic forecasts from versions of BOSS with varying levels of structural complexity. This work has applications in ensemble forecasts with model physics uncertainty, data assimilation, and cloud microphysics process studies.

  17. Adaptation of pancreatic islet cyto-architecture during development

    NASA Astrophysics Data System (ADS)

    Striegel, Deborah A.; Hara, Manami; Periwal, Vipul

    2016-04-01

    Plasma glucose in mammals is regulated by hormones secreted by the islets of Langerhans embedded in the exocrine pancreas. Islets consist of endocrine cells, primarily α, β, and δ cells, which secrete glucagon, insulin, and somatostatin, respectively. β cells form irregular locally connected clusters within islets that act in concert to secrete insulin upon glucose stimulation. Varying demands and available nutrients during development produce changes in the local connectivity of β cells in an islet. We showed in earlier work that graph theory provides a framework for the quantification of the seemingly stochastic cyto-architecture of β cells in an islet. Quantifying the dynamics of endocrine connectivity during development requires a framework for characterizing changes in the probability distribution on the space of possible graphs, essentially a Fokker-Planck formalism on graphs. With large-scale imaging data for hundreds of thousands of islets containing millions of cells from human specimens, we show that these dynamics can be determined quantitatively. Requiring that rearrangement and cell addition processes match the observed dynamic developmental changes in quantitative topological graph characteristics strongly constrained the possible processes. Our results suggest that there is a transient shift in preferred connectivity for β cells between 1-35 weeks and 12-24 months.
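
    A distance-threshold graph is one simple way to encode the β-cell connectivity that such an analysis quantifies. The cell coordinates and contact distance below are synthetic assumptions used only to show the construction and a few summary statistics.

```python
import numpy as np
import networkx as nx

# Build a contact graph of beta cells (edge if centers are within a threshold distance)
# and summarize its connectivity. Coordinates and threshold are toy values.

rng = np.random.default_rng(9)
cells = rng.uniform(0.0, 100.0, size=(300, 2))     # toy beta-cell centers (micrometres)
contact_distance = 8.0                             # illustrative contact threshold

G = nx.Graph()
G.add_nodes_from(range(len(cells)))
for i in range(len(cells)):
    for j in range(i + 1, len(cells)):
        if np.linalg.norm(cells[i] - cells[j]) <= contact_distance:
            G.add_edge(i, j)

degrees = np.array([d for _, d in G.degree()])
components = [len(c) for c in nx.connected_components(G)]
print(f"mean degree: {degrees.mean():.2f}")
print(f"number of clusters: {len(components)}, largest cluster: {max(components)} cells")
```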

  18. Improving Soil Seed Bank Management.

    PubMed

    Haring, Steven C; Flessner, Michael L

    2018-05-08

    Problems associated with simplified weed management motivate efforts for diversification. Integrated weed management uses fundamentals of weed biology and applied ecology to provide a framework for diversified weed management programs; the soil seed bank comprises a necessary part of this framework. By targeting seeds, growers can inhibit the propagule pressure on which annual weeds depend for agricultural invasion. Some current management practices affect weed seed banks, such as crop rotation and tillage, but these tools are often used without specific intention to manage weed seeds. Difficulties quantifying the weed seed bank, understanding seed bank phenology, and linking seed banks to emerged weed communities challenge existing soil seed bank management practices. Improved seed bank quantification methods could include DNA profiling of the soil seed bank, mark and recapture, or 3D LIDAR mapping. Successful and sustainable soil seed bank management must constrain functionally diverse and changing weed communities. Harvest weed seed controls represent a step forward, but over-reliance on this singular technique could make it short-lived. Researchers must explore tools inspired by other pest management disciplines, such as gene drives or habitat modification for predatory organisms. Future weed seed bank management will combine multiple complementary practices that enhance diverse agroecosystems. This article is protected by copyright. All rights reserved.

  19. Designing monitoring programs for chemicals of emerging concern in potable reuse--what to include and what not to include?

    PubMed

    Drewes, J E; Anderson, P; Denslow, N; Olivieri, A; Schlenk, D; Snyder, S A; Maruya, K A

    2013-01-01

    This study discussed a proposed process to prioritize chemicals for reclaimed water monitoring programs, selection of analytical methods required for their quantification, toxicological relevance of chemicals of emerging concern regarding human health, and related issues. Given that thousands of chemicals are potentially present in reclaimed water and that information about those chemicals is rapidly evolving, a transparent, science-based framework was developed to guide prioritization of which compounds of emerging concern (CECs) should be included in reclaimed water monitoring programs. The recommended framework includes four steps: (1) compile environmental concentrations (e.g., measured environmental concentration or MEC) of CECs in the source water for reuse projects; (2) develop a monitoring trigger level (MTL) for each of these compounds (or groups thereof) based on toxicological relevance; (3) compare the environmental concentration (e.g., MEC) to the MTL; CECs with a MEC/MTL ratio greater than 1 should be prioritized for monitoring, compounds with a ratio less than 1 should only be considered if they represent viable treatment process performance indicators; and (4) screen the priority list to ensure that a commercially available robust analytical method is available for that compound.
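
    Steps (1)-(3) of the framework reduce to a MEC/MTL ratio screen. The compound names and concentrations below are hypothetical placeholders used only to show the bookkeeping; step (4) would then filter the prioritized list by analytical-method availability.

```python
# MEC/MTL prioritization sketch: flag compounds whose measured environmental
# concentration exceeds their monitoring trigger level. All values are hypothetical.

candidates = {
    # name: (MEC in ng/L, MTL in ng/L) -- hypothetical values
    "compound_A": (250.0, 100.0),
    "compound_B": (12.0, 500.0),
    "compound_C": (80.0, 75.0),
}

prioritized = []
for name, (mec, mtl) in candidates.items():
    ratio = mec / mtl
    if ratio > 1.0:
        prioritized.append((name, round(ratio, 2)))

# Step (4) would then screen this list for compounds with robust analytical methods.
print("prioritized for monitoring:", prioritized)
```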

  20. A general unified framework to assess the sampling variance of heritability estimates using pedigree or marker-based relationships.

    PubMed

    Visscher, Peter M; Goddard, Michael E

    2015-01-01

    Heritability is a population parameter of importance in evolution, plant and animal breeding, and human medical genetics. It can be estimated using pedigree designs and, more recently, using relationships estimated from markers. We derive the sampling variance of the estimate of heritability for a wide range of experimental designs, assuming that estimation is by maximum likelihood and that the resemblance between relatives is solely due to additive genetic variation. We show that well-known results for balanced designs are special cases of a more general unified framework. For pedigree designs, the sampling variance is inversely proportional to the variance of relationship in the pedigree and it is proportional to 1/N, whereas for population samples it is approximately proportional to 1/N(2), where N is the sample size. Variation in relatedness is a key parameter in the quantification of the sampling variance of heritability. Consequently, the sampling variance is high for populations with large recent effective population size (e.g., humans) because this causes low variation in relationship. However, even using human population samples, low sampling variance is possible with high N. Copyright © 2015 by the Genetics Society of America.
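
    A commonly cited approximation consistent with the stated proportionalities is var(h²_hat) ≈ 2 / (N² · var(relatedness)) for population samples of nominally unrelated individuals; the constant and the variance-of-relatedness value used below (≈2×10⁻⁵, often quoted for unrelated humans) should be treated as approximate assumptions rather than the paper's exact expressions.

```python
import numpy as np

# Approximate standard error of a marker-based heritability estimate in population
# samples of nominally unrelated individuals:
#   var(h2_hat) ~ 2 / (N^2 * var_r)
# where var_r is the variance of pairwise (genomic) relatedness. Both the constant
# and var_r (~2e-5 for unrelated humans) are approximations used for illustration.

def se_h2(n, var_relatedness=2e-5):
    return np.sqrt(2.0 / (n ** 2 * var_relatedness))

for n in (3_000, 10_000, 50_000):
    print(f"N = {n:6d}: approx. SE(h^2) = {se_h2(n):.3f}")
```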
